
Bolstering environmental data science with equity-centered approaches – EurekAlert

Image: Graphical abstract. Credit: Joe F. Bozeman III

A new perspective article (DOI: 10.1007/s11783-024-1825-2), published in Frontiers of Environmental Science & Engineering, advocates a paradigm shift towards integrating socioecological equity into environmental data science and machine learning (ML). Authored by Joe F. Bozeman III of the Georgia Institute of Technology, the paper emphasizes the importance of understanding and addressing socioecological inequity to enhance the integrity of environmental data science.

This study introduces and validates the Systemic Equity Framework and the Wells-Du Bois Protocol, essential tools for integrating equity in environmental data science and machine learning. These methodologies extend beyond traditional approaches by emphasizing socioecological impacts alongside technical accuracy. The Systemic Equity Framework focuses on the concurrent consideration of distributive, procedural, and recognitional equity, ensuring fair benefits for all communities, particularly the marginalized. It encourages researchers to embed equity throughout the project lifecycle, from inception to implementation. The Wells-Du Bois Protocol offers a structured method to assess and mitigate biases in datasets and algorithms, guiding researchers to critically evaluate potential societal bias reinforcement in their work, which could lead to skewed outcomes.

Highlights

Socioecological inequity must be understood to improve environmental data science.

The Systemic Equity Framework and Wells-Du Bois Protocol mitigate inequity.

Addressing irreproducibility in machine learning is vital for bolstering integrity.

Future directions include policy enforcement and systematic programming.

"Our work is not just about improving technology but ensuring it serves everyone justly," said Joe F. Bozeman III, lead researcher and professor at Georgia Institute of Technology. "Incorporating an equity lens into environmental data science is crucial for the integrity and relevance of our research in real-world settings."

This pioneering research not only highlights existing challenges in environmental data science and machine learning but also offers practical solutions to overcome them. It sets a new standard for conducting research that is just, equitable, and inclusive, thereby paving the way for more responsible and impactful environmental science practices.

Journal: Frontiers of Environmental Science & Engineering

Method of Research: Experimental study

Subject of Research: Not applicable

Article Title: Bolstering integrity in environmental data science and machine learning requires understanding socioecological inequity

Article Publication Date: 8-Feb-2024

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.


Going global as Mason Korea’s first computational and data sciences graduate – George Mason University

Traveling abroad has been part of Jimin Jeon's life for as long as she can remember. She traveled with her mom during every school vacation, which allowed her to visit 23 countries by the time she was a college student. Being exposed to different cultures from a young age helped her develop a desire to pursue her college education abroad. That brought her to Mason Korea after 12 years of Korean public school education.

"While the thought of studying abroad was exciting, I felt burdened by the language barrier to study abroad in the U.S. right after graduating high school," Jeon said. "Mason Korea was an alternative to ease that transition by improving my English skills in a more familiar setting in South Korea."

Jeon was part of the first cohort in the newly established Computational and Data Sciences Department at Mason Korea. Although her frequent travels around the world prompted her to major in global affairs, she had had her mind set on the world of big data since high school. Thus, after her freshman year, once the new major opened, she made the jump to the STEM field.

Jeon found she had direct opportunities to engage in data analysis. Her favorite part of Mason Korea was the Career Development Center, which exposed students like her to opportunities in data analytics and hands-on technical experience. Her first work experience was through the center as a data science intern at a real estate AI valuation startup during her junior year.

"It was a special opportunity to see how the knowledge of programming languages I acquired in the classroom could be applied in the real workforce, and to identify the areas I need to continue improving to become a more competent data scientist," said Jeon.

Transitioning to the Fairfax Campus in the fall semester of 2023, Jeon stayed true to her goal of diversifying her experiences. Her last semester at George Mason included working as a teaching assistant for the Computational and Data Sciences Department in the College of Science, performing data cleaning for an on-campus project, and helping students practice their Korean through the language exchange program. She took advantage of the language environment so that she could build her English skills.

Jeon is now a proud graduate in computational and data sciences, one of the few who enrolled in the major in 2020. She is excited about the job opportunities ahead of her and wants to encourage all those who have just completed their own four-year journeys.

"For students just like myself, who have spent their whole life in the Korean education system, going to Mason Korea alone is a challenge," she said. "Learning about various topics at a more sophisticated level in a language that you are not familiar with was also not an easy task for me. Yet, the four-year voyage of diverse experiences and success itself shows that I can take on any challenge at any point in my life."


Aristotle, AI and the Data Scientist | The ILR School – Cornell University | ILR School

Nearly two and a half millennia after his time, Aristotle and his "virtue ethics" (in short, living a life of good character) are every bit as relevant to budding statisticians as the technical skills they learn to build AI models, according to Elizabeth Karns, senior lecturer of statistics and data science at Cornell Bowers CIS and at the ILR School.

An epidemiologist and lawyer, Karns launched Integrated Ethics in Data Science (STSCI 3600), a seven-week course offered twice each spring semester, several years ago in response to what she viewed as a disconnect between statisticians and the high-powered, high-consequence statistical models they were being asked to build.

"I started thinking more about algorithms and how we are not preparing students sufficiently to be confronted with workplace pressures to just get the model done. Put in the data, don't question it, and just use it," she said.

The problem, as she sees it, is that these models are largely unregulated, have no governing body, and thus skirt rigorous scientific testing and evaluation. Lacking such oversight, ethics and fairness, then, become a matter of discretion on the part of the statisticians developing the models; personal values and virtues are brought into the equation, and this is where Aristotle's wisdom proves vital, she said.

"At this point in our lack of regulation, we need to depend on ethical people," Karns said. "I want students to learn to pause and reflect before making decisions, and to ask: How well does this align with my values? Is this a situation that could lead to problems for the company or users? Is this something I want to be associated with? That's the core of the class."

For the course, Karns, with the help of Cornell's Center for Teaching Innovation (CTI), developed an immersive video, "Nobody's Fault: An Interactive Experience in Data Science Practice," which challenges students to consider a moral conflict brought about by a bad model.

"I tell my students that we're going to be in situations in this class where there's not a clear right or wrong answer," she said. "And that's the point: to struggle with that ambiguity now and get some comfort in that gray space. That way, when they get out into the workplace, they can be more effective."

To read more about the work Bowers CIS is doing to develop responsible AI, click here.

Louis DiPietro is a public relations and content specialist for Cornell Bowers CIS.


UK and Korea agree closer ties on AI and data science – Research Professional News

Image: Wayan Vota [CC BY-NC-SA 2.0], via Getty Images

Deal on closer research collaboration comes after two countries co-host summit on artificial intelligence

The UK and the Republic of Korea are set to strengthen their research ties on artificial intelligence and data science following a deal between institutes in the two countries.

The UK's national institute for data science and AI, the Alan Turing Institute, signed a deal on 23 May with the Korea Advanced Institute of Science and Technology (Kaist).



Analyzing Geo-based Campaigns with Synthetic Control | by Mandy Liu | May, 2024 – Towards Data Science


Navigating the complexities of location-based testing can be challenging.

Luckily, GeoLift is here to optimize the process.
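GeoLift's core idea is the synthetic control method: build a weighted blend of untreated markets that tracks the test market's pre-campaign behavior, then measure lift as the post-campaign gap between the test market and that blend. Below is a minimal Python sketch of that idea, not GeoLift itself (GeoLift is an R package from Meta); the DataFrame layout, column names, and dates are assumptions for illustration.

```python
# Minimal synthetic-control sketch (illustrative; not the GeoLift API).
# Assumes `df` has one row per day (DatetimeIndex) and one column per geo.
import numpy as np
import pandas as pd
from scipy.optimize import minimize

def fit_synthetic_control(df: pd.DataFrame, treated: str, pre_end: str) -> pd.Series:
    """Find non-negative weights summing to 1 over control geos that best
    reproduce the treated geo's series during the pre-campaign period."""
    pre = df.loc[:pre_end]
    controls = pre.drop(columns=[treated])
    X, y = controls.to_numpy(), pre[treated].to_numpy()
    n = X.shape[1]
    res = minimize(
        lambda w: np.sum((y - X @ w) ** 2),      # pre-period fit error
        x0=np.full(n, 1.0 / n),                  # start from equal weights
        bounds=[(0.0, 1.0)] * n,
        constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0},
    )
    return pd.Series(res.x, index=controls.columns)

def estimate_lift(df: pd.DataFrame, treated: str, pre_end: str, weights: pd.Series) -> float:
    """Lift = treated geo minus its synthetic control, summed over the post-period."""
    post = df.loc[pre_end:].iloc[1:]             # everything after the pre-period
    synthetic = post.drop(columns=[treated]) @ weights
    return float((post[treated] - synthetic).sum())

# Usage (hypothetical data):
# weights = fit_synthetic_control(df, "Seattle", "2024-03-31")
# lift = estimate_lift(df, "Seattle", "2024-03-31", weights)
```

GeoLift layers the pieces a sketch like this omits, such as power analysis, market selection, and inference on the estimated effect.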


The Two Documents Every Data Scientist Must Write Before Taking Interviews – Towards Data Science


Data science interviews can be tough.

As data scientists, we can be expected to serve a sort of Swiss Army knife function across business intelligence, software engineering, machine learning, data analysis, and product. On top of that, all …


Unlock real-time insights with AI-powered analytics in Microsoft Fabric – Microsoft

The data and analytics landscape is changing faster than ever. From the emergence of generative AI to the proliferation of citizen analysts to the increasing importance of real-time, autonomous action, keeping up with the latest trends can feel overwhelming. Every trend requires new services that customers must manually stitch into their data estate, driving up both cost and complexity.

With Microsoft Fabric, we are simplifying and future-proofing your data estate with an ever-evolving, AI-powered data analytics platform. Fabric will keep up with the trends for you and seamlessly integrate each new capability so you can spend less time integrating and managing your data estate and more time unlocking value from your data.


Aurizon, Australia's largest rail freight operator, turned to Fabric to modernize its data estate and analytics system.

"With Microsoft Fabric, we've answered many of our questions about navigating future growth, removing legacy systems, and streamlining and simplifying our architecture. A trusted data platform sets us up to undertake complex predictive analytics and optimizations that will give greater surety for our business and drive commercial benefits for Aurizon and our customers in the very near future."

Aurizon is just one among thousands of customers who have already used Fabric to revolutionize how they connect to and analyze their data. In fact, a 2024 commissioned Total Economic Impact (TEI) study conducted by Forrester Consulting found that Microsoft Fabric customers saw a three-year 379% return on investment (ROI) with a payback period of less than six months. We are thrilled to share a huge range of new capabilities coming to Fabric. These innovations will help you more effectively uncover insights and keep you at the forefront of the trends in data and analytics. Check out a quick overview of the biggest changes coming to Fabric.

Prepare your data for AI innovation with Microsoft Fabric, now generally available

Fabric is a complete data platform, giving your data teams the ability to unify, transform, analyze, and unlock value from data within a single, integrated software-as-a-service (SaaS) experience. We are excited to announce additions to the Fabric workloads that will make Fabric's capabilities even more robust and customizable to meet the unique needs of each organization. These enhancements include:

When we introduced Fabric, it launched with seven core workloads, including Synapse Real-Time Analytics for data streaming analysis and Data Activator for monitoring and triggering actions in real time. We are unveiling an enhanced workload called Real-Time Intelligence that combines these workloads and brings an array of additional new features, in preview, to help organizations make better decisions with up-to-the-minute insights. From ingestion to transformation, querying, and taking immediate action, Real-Time Intelligence is an end-to-end experience that enables seamless handling of real-time data without the need to land it first. With Real-Time Intelligence, you can ingest streaming data with high granularity, dynamically transform it, query it in real time for instant insights, and trigger actions such as alerting a production manager when equipment is overheating or rerunning jobs when data pipelines fail. And with both simple low-code or no-code interfaces and powerful code-rich ones, Real-Time Intelligence empowers every user to work with real-time data.
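To make that trigger pattern concrete, here is a conceptual sketch in plain Python of the watch-a-stream-and-act loop that Real-Time Intelligence automates; it is not the Fabric API, and the event shape, threshold, and alert channel are invented for the example.

```python
# Conceptual sketch of a real-time trigger (illustrative; not the Fabric API).
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Reading:
    machine_id: str
    temperature_c: float

OVERHEAT_THRESHOLD_C = 90.0  # hypothetical limit for this example

def alert_production_manager(reading: Reading) -> None:
    # Placeholder action: in Fabric, a Data Activator rule would send an
    # email or Teams message, or rerun a failed pipeline, instead.
    print(f"ALERT: {reading.machine_id} at {reading.temperature_c:.1f} C")

def monitor(stream: Iterable[Reading]) -> None:
    """Evaluate each event as it arrives and act when the condition holds."""
    for reading in stream:
        if reading.temperature_c > OVERHEAT_THRESHOLD_C:
            alert_production_manager(reading)

monitor([Reading("press-7", 88.2), Reading("press-7", 93.5)])  # fires once
```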

Behind this powerful workload is the Real-time hub, a single place to discover, manage, and use event-streaming data from Fabric and from other sources across Microsoft, third-party cloud providers, and external systems. Just like the OneLake data hub makes it easy to discover, manage, and use data at rest, the Real-time hub does the same for data in motion. All events that flow through the Real-time hub can be easily transformed and routed to any Fabric data store, and users can create new streams that can be discovered and consumed. From the Real-time hub, users can gain insights through the data profile, configure the right level of endorsement, set alerts on changing conditions, and more, all without leaving the hub. While the existing Real-Time Analytics capabilities remain generally available, the Real-time hub and the other new capabilities coming to the Real-Time Intelligence workload are currently in preview. Watch this demo video to check out the redesigned Real-Time Intelligence experience:

Elcome, one of the world's largest marine electronics companies, built a new service on Fabric called Welcome that helps maritime crews stay connected to their families and friends.

"Microsoft Fabric Real-Time Intelligence has been the essential building block that's enabled us to monitor, manage, and enhance the services we provide. With the help of the Real-time hub for centrally managing data in motion from our diverse sources and Data Activator for event-based triggers, Fabric's end-to-end cloud solution has empowered us to easily understand and act on high-volume, high-granularity events in real time with fewer resources."

Real-time insights are becoming increasingly critical across industries: route optimization in transportation and logistics, grid monitoring in energy and utilities, predictive maintenance in manufacturing, and inventory management in retail. And since Real-Time Intelligence comes fully optimized and integrated in a SaaS platform, adoption is seamless. Strathan Campbell, Channel Environment Technology Lead at One NZ, the largest mobile carrier in New Zealand, said they went from a concept to a delivered product in just two weeks. To learn more about the Real-Time Intelligence workload, watch the "Ingest, analyze and act in real time with Microsoft Fabric" Microsoft Build session or read the Real-Time Intelligence blog.

Fabric was built from the ground up to be extensible, customizable, and open. Now, we are making it even easier for software developers and customers to design, build, and interoperate applications within Fabric with the new Fabric Workload Development Kit, currently in preview. Applications built with this kit appear as native workloads within Fabric, providing a consistent experience for users directly in their Fabric environment without any manual effort. Software developers can publish and monetize their custom workloads through Azure Marketplace. And, coming soon, we are creating a workload hub experience in Fabric where users can discover, add, and manage these workloads without ever leaving the Fabric environment. We already have industry-leading partners building on Fabric, including SAS, Esri, Informatica, Teradata, and Neo4j.

You can also learn more about the Workload Development Kit by watching the "Extend and enhance your analytics applications with Microsoft Fabric" Microsoft Build session.

We are also excited to announce two new features, both in preview, created with developers in mind: API for GraphQL and user data functions in Fabric. API for GraphQL is a flexible and powerful API that allows data professionals to access data from multiple sources in Fabric with a single query. With API for GraphQL, you can streamline requests to reduce network overhead and accelerate response rates. User data functions are user-defined functions built for Fabric experiences across all data services, such as notebooks, pipelines, and event streams. These features let developers more easily build experiences and applications on Fabric data sources such as lakehouses, data warehouses, and mirrored databases, with native code ability, custom logic, and seamless integration. You can watch these features in action in the "Introducing API for GraphQL and User Data Functions in Microsoft Fabric" Microsoft Build session.
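As a rough illustration of what a single-query surface looks like from client code, the snippet below posts a GraphQL query over HTTP from Python. The endpoint URL, schema fields, and token handling are placeholders rather than the documented Fabric endpoint; check the Fabric documentation for your item's actual address and authentication flow.

```python
# Hypothetical GraphQL-over-HTTP call; endpoint, fields, and token are placeholders.
import requests

ENDPOINT = "https://<your-fabric-graphql-endpoint>/graphql"  # placeholder
TOKEN = "<access-token>"  # placeholder; typically acquired via MSAL

# One query spanning data that may live in several Fabric sources.
QUERY = """
query {
  customers(first: 10) {
    items { customerId name totalSpend }
  }
}
"""

resp = requests.post(
    ENDPOINT,
    json={"query": QUERY},                        # standard GraphQL request body
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["data"])                        # only the selected fields come back
```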

You can also learn more about the Workload Development Kit, the API for GraphQL, user data functions, and more by reading the Integrating ISV apps with Microsoft Fabric blog.

We are also announcing the preview of Data workflows in Fabric as part of the Data Factory experience. Data workflows allow customers to define Directed Acyclic Graph (DAG) files for complex workflow orchestration in Fabric. Data workflows are powered by the Apache Airflow runtime and designed to help you author, schedule, and monitor workflows or data pipelines using Python. Learn more by reading the data workflows blog.
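Because Data workflows run on the Apache Airflow runtime, a pipeline authored for them is a standard Airflow DAG written in Python. A minimal sketch follows; the DAG id, task names, and task bodies are illustrative assumptions.

```python
# A minimal Apache Airflow DAG of the kind Data workflows executes.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("pulling raw events")        # stand-in for a real extraction step

def transform() -> None:
    print("cleaning and joining")      # stand-in for a real transform step

with DAG(
    dag_id="daily_sales_pipeline",     # illustrative name
    start_date=datetime(2024, 5, 1),
    schedule="@daily",                 # run once per day
    catchup=False,                     # don't backfill missed runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task     # transform runs only after extract succeeds
```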

The typical data estate has grown organically over time to span multiple clouds, accounts, databases, domains, and engines with a multitude of vendors and specialized services. OneLake, Fabric's unified, multi-cloud data lake built to span an entire organization, can connect to data from across your data estate and reduce data duplication and sprawl.

We are excited to announce the expansion of OneLake shortcuts to connect to on-premises and network-restricted data sources beyond just Azure Data Lake Storage Gen2, now in preview. With an on-premises data gateway, you can now create shortcuts to Google Cloud Storage, Amazon S3, and S3-compatible storage buckets that are either on-premises or otherwise network-restricted. To learn more about these announcements, watch the Microsoft Build session "Unify your data with OneLake and Microsoft Fabric".

Insights drive impact only when they reach those who can use them to inform actions and decisions. Professional and citizen analysts bridge the gap between data and business results, and with Fabric, they have the tools to quickly manage, analyze, visualize, and uncover insights that can be shared with the entire organization. We are excited to help analysts work even faster and more effectively by releasing the model explorer and the DAX query view in Microsoft Power BI Desktop into general availability.

The model explorer in Microsoft Power BI provides a rich view of all the semantic model objects in the data pane, helping you find items in your data fast. You can also use the model explorer to create calculation groups and reduce the number of measures by reusing calculation logic and simplifying semantic model consumption.

The DAX query view in Power BI Desktop lets users discover, analyze, and see the data in their semantic model using the DAX query language. Users working with a model can validate data and measures without having to build a visual or use an additional tool, similar to the Explore feature. Changes made to measures can be seamlessly updated directly back to the semantic model.

To learn more about these announcements and others coming to Power BI, check out the Power BI blog.

When ChatGPT launched, it gained over 100 million users in just over two months, the steepest adoption curve in the history of technology.1 It's been a year and a half since that launch, and organizations are still trying to translate the benefit of generative AI from novelty into actual business results. By infusing generative AI into every layer of Fabric, we can empower your data professionals to employ its benefits in the right context and in the right scenario to get more done, faster.

Copilot in Fabric was designed to help users unlock the full potential of their data by helping data professionals be more productive and business users explore their data more easily. With Copilot in Fabric, you can use conversational language to create dataflows, generate code and entire functions, build machine learning models, or visualize results. We are excited to share that Copilot in Fabric is now generally available, starting with the Power BI experience. This includes the ability to create stunning reports and summarize your insights into narrative summaries in seconds. Copilot in Fabric is also now enabled by default for all eligible tenants, including the Copilot in Fabric experiences for Data Factory, Data Engineering, Data Science, Data Warehouse, and Real-Time Intelligence, which are all still in preview. The general availability of Copilot in Fabric for the Power BI experience will roll out over the coming weeks to all customers with Power BI Premium capacity (P1 or higher) or Fabric capacity (F64 or higher).

We are also thrilled to announce a new Copilot in Fabric experience for Real-Time Intelligence, currently in preview, that enables users to explore real-time data with ease. Starting with a Kusto Query Language (KQL) Queryset connected to a KQL Database in an Eventhouse or a standalone Azure Data Explorer database, you can type your question in conversational language and Copilot will automatically translate it into a KQL query you can execute. This experience is especially powerful for users who are less familiar with writing KQL queries but still want to get the most from their time-series data stored in Eventhouse.

We are also thrilled to release a new AI capability in preview called AI skills, an innovative experience designed to provide any user with a conversational Q&A experience about their data. AI skills allow you to simply select the data source in Fabric you want to explore and immediately start asking questions about your data, even without any configuration. When answering questions, the generative AI experience will show the query it generated to find the answer, and you can enhance the Q&A experience by adding more tables, setting additional context, and configuring settings. AI skills can empower everyone to explore data, build and configure AI experiences, and get the answers and insights they need.

AI skills will honor existing security permissions and can be configured to respect the unique language and nuances of your organization, ensuring that responses are not just data-driven but steeped in the context of your business operations. And, coming soon, they can also enrich the creation of new copilots in Microsoft Copilot Studio and be interacted with from Copilot for Microsoft 365. It's about making your data not just accessible but approachable, inviting users to explore insights through natural dialogue and shortening the time to insight.

With the launch of Fabric, we've committed to open data formats, standards, and interoperability with our partners to give our customers the flexibility to do what makes sense for their business. We are taking this commitment a step further by broadening our existing partnership with Snowflake to expand interoperability between Snowflake and Fabric's OneLake. We are excited to announce future support for Apache Iceberg in Fabric OneLake and bi-directional data access between Snowflake and Fabric. This integration will enable users to analyze their Fabric and Snowflake data written in Iceberg format in any engine within either platform, and to access data across apps like Microsoft 365, Microsoft Power Platform, and Microsoft Azure AI Studio.

With the upcoming availability of shortcuts for Iceberg in OneLake, Fabric users will be able to access all data sources in Iceberg format, including Iceberg sources from Snowflake, and translate metadata between Iceberg and Delta formats. This means you can work with a single copy of your data across Snowflake and Fabric. Since all OneLake data can be accessed in Snowflake as well as in Fabric, this integration will let you spend less time stitching together applications and your data estate, and more time uncovering insights. To learn more about this announcement, read the Fabric and Snowflake partnership blog.

We are also excited to announce we are expanding our existing relationship with Adobe. Adobe Experience Platform (AEP) and Adobe Campaign will have the ability to federate enterprise data from Fabric. Our joint customers will soon have the capability to connect to Fabric and use the Fabric Data Warehouse for query federation to create and enrich audiences for engagement, without having to transfer or extract the data from Fabric.

We are excited to announce that we are expanding the integration between Fabric and Azure Databricks, allowing you to have a truly unified experience across both products and pick the right tools for any scenario.

Coming soon, you will be able to access Azure Databricks Unity Catalog tables directly in Fabric, making it even easier to unify Azure Databricks with Fabric. From the Fabric portal, you can create and configure a new Azure Databricks Unity Catalog item in Fabric with just a few clicks. You can add a full catalog, a schema, or even individual tables to link, and the management of this Azure Databricks item in OneLake (a shortcut connected to Unity Catalog) is automatically taken care of for you.

This data acts like any other data in OneLake: you can write SQL queries or use it with any other workload in Fabric, including Power BI through Direct Lake mode. When the data is modified or tables are added, removed, or renamed in Azure Databricks, the data in Fabric will always remain in sync. This new integration makes it simple to unify Azure Databricks data in Fabric and seamlessly use it across every Fabric workload.

Also coming soon, Fabric users will be able to access Fabric data items like lakehouses as a catalog in Azure Databricks. While the data remains in OneLake, you can access and view data lineage and other metadata in Azure Databricks and leverage the full power of Unity Catalog. This includes extending Unity Catalog's unified governance over data and AI into Azure Databricks Mosaic AI. In total, you will be able to combine this data with other native and federated data in Azure Databricks, perform analysis assisted by generative AI, and publish the aggregated data back to Power BI, making this integration complete across the entire data and AI lifecycle.

Join us at Microsoft Build from May 21 to 23, 2024, to see all of these announcements in action across a range of sessions.

You can also try out these new capabilities and everything Fabric has to offer yourself by signing up for a free 60-day trial, no credit card information required. To start your free trial, sign up for a free account (Power BI customers can use their existing account), and once signed in, select "start trial" within the account manager tool in the Fabric app. Existing Power BI Premium customers can already access Fabric by simply turning on Fabric in their Fabric admin portal. Learn more on the Fabric get started page.

We are excited to announce a European Microsoft Fabric Community Conference that will be held in Stockholm, Sweden, from September 23 to 26, 2024. You can see firsthand how Fabric and the rest of the data and AI products at Microsoft can help your organization prepare for the era of AI. You will hear from leading Microsoft and community experts from around the world and get hands-on experience with the latest features from Fabric, Power BI, Azure Databases, Azure AI, Microsoft Purview, and more. You will also have the opportunity to learn from top data experts and AI leaders while having the chance to interact with your peers and share your story. We hope you will join us and see how cutting-edge technologies from Microsoft can enable your business success with the power of Fabric.

If you want to learn more about Microsoft Fabric:

Experience the next generation in analytics

1. "ChatGPT sets record for fastest-growing user base - analyst note," Reuters.

Arun Ulagaratchagan

Corporate Vice President, Azure Data, Microsoft

Arun leads product management, engineering, and cloud operations for Azure Data, which includes databases, data integration, big data analytics, messaging, and business intelligence. The products in his teams' portfolio include Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure MySQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, Power BI, and Microsoft Fabric.


Mosaic Data Science Named Top AI & Machine Learning Company by CIO Review – Global Banking And Finance Review

The data science consultancy is honored to have received this recognition for the second consecutive year.

Leesburg, VA, May 14, 2024: Mosaic Data Science, a trailblazer in data science consulting, has been recognized as the Top AI & Machine Learning Company by CIO Review for the second consecutive year. This prestigious accolade underscores Mosaic's commitment to delivering transformative, actionable analytics solutions that empower enterprises to harness the full potential of AI, ML, and mathematical optimization for insightful decision-making.

Where technology meets human ingenuity, transformative innovations that redefine industries are born, said Chris Brinton, CEO of Mosaic Data Science.

This statement encapsulates the core philosophy of Mosaic, a premier artificial intelligence company devoted to equipping enterprises with robust, actionable analytics solutions. Mosaic's team of data scientists is renowned for its deep domain expertise, positioning the company as a leader in superior, scalable solutions that drive significant digital transformations. Its unique blend of data engineering, statistical analytics, and problem-solving prowess ensures that strategic business visions and digital aspirations are realized. Through a well-established customer engagement model, Mosaic has carved a niche for itself, consistently delivering a competitive advantage in our tech-driven era.

"We transcend typical AI/ML boundaries by delivering solutions rooted in real-world applications that foster growth, enhance operations, and produce measurable outcomes," highlights Brinton. "Our mission is to simplify data science, making these powerful technologies accessible and effective for both burgeoning startups and established enterprises looking to expand their capabilities."

Mosaic champions adopting bespoke AI/ML tools tailored to the nuances of client needs, be it enhancing existing teams or managing entire project lifecycles. Its offerings have evolved to include innovative solutions such as the Neural Search Engine, Mosaic.deploy, and Data-Driven Decision Dynamics.

"Named the top insight engine of 2024 by CIO Review, our Neural Search engine transcends traditional text-matching limitations," said Mike Shumpert, VP of Data Science. "As businesses increasingly embrace GenAI and Large Language Models (LLMs), the strategic advantage lies not just in using these technologies but in expertly tuning them to specific needs. With our pioneering work in Reader/Retrieval Architectures, Neural Search helps unlock significant business value and empower clients with actionable insights for strategic decision-making."

These services and tools ensure that clients find optimal engagement paths tailored to their specific needs, thereby maintaining pace with technological advances and securing industry leadership. Mosaics solutions are particularly noted for integrating custom predictive and prescriptive analytics that seamlessly align with client data and business workflows, enhancing strategic outcomes with precision and expertise.

"Mosaic.deploy enables clients to deploy AI/ML models efficiently while remaining sustainable and ethical," said Drew Clancy, VP of Sales & Marketing. "This ensures long-term success, with an eye toward achieving explainable AI to help customers leverage these technologies for a sustainable future."

Mosaic's expertise extends beyond implementing AI/ML solutions; it guides organizations throughout the adoption lifecycle. From scoping and executing AI roadmaps to building a sustainable MLOps pipeline, it offers guidance that ensures seamless integration and impactful results.

"Each project requires custom AI/ML tuning to guarantee ideal outcomes," said Chris Provan, Managing Director of Data Science. "Our methodologies, designed to make data science techniques understandable and actionable, and our expertise in the niche lead to the best outcomes, transforming challenges into growth opportunities."

Mosaic's dedication to ethical AI practices is further demonstrated through its partnership with Epstein Becker Green. Mosaic offers explainable AI and bias auditing services to help de-risk clients' AI plans, ensuring that crucial information is ethical and reliable for better decision-making and compliance with industry standards. This partnership is ideal for evaluating AI lifecycles for potential risks, improving governance, and offering holistic solutions for fair AI decisions.

Over the last decade, Mosaic has achieved over 350 successful AI/ML deployments and participated in countless analytics and optimization projects across dozens of industries, proving that it isn't simply participating in the AI revolution; it is leading it. This recognition as the top AI & ML company of 2024 by CIO Review affirms Mosaic's role as a key innovator in the AI and machine learning space, continually pushing the envelope in analytics development and reinforcing its leadership in continuous innovation.

About Mosaic Data Science

Mosaic Data Science is a leading technology and business innovator known for its expertise in AI and ML. With services ranging from Neural Search Engines to AI/ML roadmaps, Mosaic excels in crafting cutting-edge solutions that propel client business goals. It is the Top AI & Machine Learning Company of 2024.

About CIO Review

CIO Review is a leading technology magazine that bridges the gap between enterprise IT vendors and buyers. As a knowledge network, CIO Review offers a range of in-depth CIO/CXO articles, whitepapers, and research studies on the latest trends in technology.


TECH TUESDAY: Assessing the Present and Future of AI in Markets – Traders Magazine

TECH TUESDAY is a weekly content series covering all aspects of capital markets technology. TECH TUESDAY is produced in collaboration with Nasdaq.

Artificial intelligence (AI), specifically generative AI, has perhaps been the hottest emerging technology topic in markets of late. Traders Magazine caught up with Mike O'Rourke, Head of AI and Emerging Technology at Nasdaq, to learn more about the current AI landscape and how it will evolve.

Tell us about your background and your current role at Nasdaq.

I've been with Nasdaq for 25 years, the last 10 of which have been primarily focused on emerging technologies: AI, data science, and our move to the cloud.

Back in 2016, I ran our data business technology; we were doing a lot of data science, but there was no formal data science program at Nasdaq. So we built out a whole career track as well as a center of excellence for AI because, at the time, Brad Peterson, Chief Technology and Information Officer at Nasdaq, and I anticipated that this was going to be very important. We wanted to start building up talent, a skill set, and prowess in AI.

When did Nasdaq start using AI, and what were the early use cases?

The early projects at that time focused on using machine learning and AI language models to understand data and make new alternative data sets. How can we use AI to process unstructured data and convert it into structured data and knowledge so people can make better investments and trading decisions? We also used machine learning models to better understand trading activity and behavior. Why am I not getting the fill rate that I want? Why did this order not execute? How are my trading patterns compared to other firms? Things like that.

How has Nasdaq's cloud journey and partnership with Amazon Web Services (AWS) facilitated AI adoption?

AWS has been a wonderful partner. The cloud is really where innovations happen first, and Nasdaq took the stance that we needed to be early adopters by moving our data and systems there. It was about being agile and able to scale more easily.

Our investment in the cloud is paying off: the new emerging AI technologies are there, and because our data is there too, we can readily implement these new models on top of our data. We've invested a lot in having really good data systems, and to have good AI, you need sound data, which we have in the cloud.

AI is one of our strategic pillars across the company; we want to incorporate it into all our products and services, and the cloud is a significant enabler of this.

What current AI use cases are you most excited about?

I'd be remiss if I didn't talk about Dynamic Midpoint Extended Life Order (M-ELO), which is part of our dynamic markets program. We believe we can provide better service to our clients by having order types that can dynamically adapt to market conditions. It's still early days for Dynamic M-ELO, but it's been quite successful: we're seeing increased order flow and higher hit rates, and we're seeing no degradation in markouts. So clients are clearly benefiting from the service.

Another part of dynamic markets is our strike optimization program, where we use machine learning models to figure out which strikes we should have listed in the options market. About 1.3 million strikes can be listed in the options market, so manually determining what strikes should be listed is difficult. We think this is a perfect use case where machines can do it better than humans.

We have several other items in the research phase, such as order types and other solutions within the dynamic markets program. We believe these two first launches are just the beginning of how AI will transform markets.

Generative AI has been a very hot topic. Do any AI use cases at Nasdaq utilize generative AI?

Broadly speaking, if there's an investigative component anywhere within a solution set, generative AI can be very valuable.

As for specific applications, our Verafin and market surveillance solutions are rolling out automated investigators that use generative AI to make investigating illicit activities more effortless. We recently launched a feature in BoardVantage that summarizes documents, which can make reviewing lengthy board documents easier.

Beyond those examples, there are a host of new product enhancements that use generative AI. Really, anywhere there's a user interface where people are trying to answer questions, you're probably going to see some sort of generative chatbot interface in the near future.

What will the AI adoption weve been talking about mean for Nasdaq and the markets in the future?

Information can be acted upon much more quickly because generative systems can summarize data more quickly.

I'm dating myself here, but when I was a kid, if I wanted to research something, I had to go to the library; it would take me quite a long time to find the answer. With the advent of the internet, I didn't have to drive to the library anymore. And then when mobile came out, I didn't even have to go home to my computer. Timeframes for information access have gotten shorter and shorter, and generative AI will further compress that time.

The upshot for markets is that information will be acted upon much more quickly. This is why we think dynamic market solutions are so important.

What about generative AI raises caution flags?

Generative AI is an incredibly exciting area, but there are risks, and people need to take those risks seriously.

How are AI models built, how are they trained and where are they hosted? At Nasdaq, we focus on having excellent AI governance, where anything that goes out is looked at from a compliance, regulatory and technology perspective to ensure its safe, secure and scalable for clients. No AI solution goes out without a rigorous governance process to ensure that the models are effective and can be trusted.

Explainability is also critical. When we came to market with some of our dynamic market solutions, like Dynamic M-ELO, we were very transparent; in fact, we published a whole whitepaper about it. We covered questions like what features are going into the model, how the model behaves and what type of information it will use to make decisions.

Having that transparency and explainability is essential, and its something we value at Nasdaq.

Transparency is a core value for Nasdaq and a key part of what makes modern financial markets work so well. When it comes to generative AI at Nasdaq, transparency is equally important. Being able to explain both where information comes from and why its relevant is critical to ensuring generative AI is trustworthy and effective. Were excited about what transformation opportunities generative AI can provide, and we look forward to continuing this journey of discovery.

Creating tomorrow's markets today. Find out more about Nasdaq's offerings to drive your business forward here.
