LEESBURG, Va., July 31, 2023 (Newswire.com) - Mosaic Data Science has joined EBG Advisors, the affiliate consultancy of national law firm Epstein Becker Green, to support the advisory network's nationwide algorithmic bias auditing and risk management services through Mosaic's award-winning artificial intelligence (AI) capabilities.
Built in alignment with the AI Risk Management Framework developed by the National Institute of Standards and Technology (NIST), EBG Advisors' algorithmic bias auditing and risk management initiative unites data and social scientists to identify and advise on processes that can help reduce bias in AI without being cumbersome or blocking progress.
Mosaic is tasked with critically evaluating the AI lifecycle for potential risk and abuse, helping clients of Epstein Becker Green and EBG Advisors understand how AI-based technology can be applied to their businesses while also identifying possible harmful outcomes.
"Integrated quality assurance and model monitoring provide peace of mind, and empirical evidence that deployed models or algorithms continue to be in alignment with relevant regulations and organizational values while still empowering data scientists to innovate and increase the value that models provide," said Michael Shumpert, Managing Director of Mosaic Data Science.
To successfully audit for algorithmic bias within AI models, Mosaic analyzes model or algorithm inputs and outputs under various scenarios, as well as potential data sets, the code used to train the models, and the trained model code. Mosaic's data scientists do this in the context of existing or proposed regulations or legislation that dictate relevant ethical standards for an algorithm and use case, such as non-discrimination based on race or gender.
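As a simplified illustration of the kind of input/output analysis described above, and not a representation of Mosaic's actual tooling, the sketch below computes group selection rates and a disparate impact ratio for a binary classifier's outputs against a hypothetical protected attribute such as gender. All names and data are illustrative assumptions.

```python
import pandas as pd

def selection_rate_by_group(predictions: pd.Series, protected: pd.Series) -> pd.Series:
    """Share of positive model decisions within each protected-attribute group."""
    return predictions.groupby(protected).mean()

def disparate_impact_ratio(predictions: pd.Series, protected: pd.Series) -> float:
    """Ratio of the lowest to the highest group selection rate.
    Values well below 1.0 (for example, under the common 0.8 'four-fifths'
    guideline) can flag potential adverse impact that warrants deeper review."""
    rates = selection_rate_by_group(predictions, protected)
    return rates.min() / rates.max()

# Hypothetical example: hiring decisions scored by a model, grouped by gender.
df = pd.DataFrame({
    "hired":  [1, 0, 1, 1, 0, 1, 0, 0, 1, 0],
    "gender": ["F", "F", "M", "M", "F", "M", "F", "M", "M", "F"],
})
print(selection_rate_by_group(df["hired"], df["gender"]))
print(f"Disparate impact ratio: {disparate_impact_ratio(df['hired'], df['gender']):.2f}")
```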
Next, Mosaic develops analyses and experiments to ensure compliance with the identified standards. If the statistical evidence suggests that an algorithm does not comply with the standards, Mosaic will help identify algorithm adjustments to ensure compliance. Every model that is moved to production is evaluated, preventing problematic models from being released. Post-deployment model monitoring is also conducted to ensure that issues with models or changes in the data fed into them are identified and handled appropriately without potentially costly delays.
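Post-deployment monitoring of the kind described above can be sketched, purely as an illustration of one common drift-detection approach rather than Mosaic's proprietary process, by comparing the live distribution of a model input against a training-time reference. The data and threshold below are assumptions for demonstration only.

```python
import numpy as np
from scipy.stats import ks_2samp

def feature_drift_alert(reference: np.ndarray, live: np.ndarray, alpha: float = 0.01) -> bool:
    """Two-sample Kolmogorov-Smirnov test: flag drift when the live
    distribution of a feature differs significantly from the training reference."""
    statistic, p_value = ks_2samp(reference, live)
    return p_value < alpha

# Hypothetical example: incoming applicant incomes have shifted since training.
rng = np.random.default_rng(0)
reference = rng.normal(loc=50_000, scale=10_000, size=5_000)  # training-time data
live = rng.normal(loc=58_000, scale=10_000, size=1_000)       # recent production data
if feature_drift_alert(reference, live):
    print("Data drift detected: retraining or review may be needed.")
```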
"Federal and state regulators are actively using existing statutes and regulations to threaten industry with enforcement if models cause harm, especially in such areas as consumer products and services, employment, and healthcare," said Bradley Merrill Thompson, Member of the Firm at Epstein Becker Green and Chief Data Scientist at EBG Advisors. "Further, as new regulatory compliance standards for AI continue to emerge, those companies that have invested in their algorithmic quality assurance will be the most prepared to verify the quality of their algorithms and models and mitigate regulatory risks."
Contact Information:
Drew Clancy
VP of Marketing and Sales
[email protected]
(410) 458-7674
Original Source: Mosaic Data Science Provides Cutting-Edge Support for National Law Firm's Expansion of National Algorithmic Bias Auditing and Risk Management Services