Bias in artificial intelligence impacts drug development: AiCure

Artificial intelligence and other forms of advanced data analysis are frequently viewed as impartial and free of bias. In practice, however, the people who set up these processes can inadvertently introduce bias, skewing the results.

Rich Christie, chief medical officer of AiCure, recently connected with Outsourcing-Pharma to talk about the problems that bias in AI can lead to in trials and development.

OSP: Could you please share some perspective on bias around clinical trial data? Specifically, how can AI and other advanced analytical tools (largely seen as objective) be slanted toward one group and/or against another?

RC: Bias can arise in AI when the data used in initial training sets aren’t completely representative of the diversity present in subsequent patient populations. After all, the strength and generalizability of AI models are reflective of the underlying training data sets, which is why consistently evaluating the quality and quantity of data sets is critical.
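One practical step implied here is checking whether a training set's demographic mix matches the patient population a model will eventually be applied to. The following is a minimal sketch of such an audit; the column names, subgroup categories, and reference proportions are hypothetical and not drawn from AiCure's pipeline.

```python
# Minimal sketch (hypothetical data): compare a training set's demographic
# mix against reference proportions for the target patient population.
import pandas as pd

# Hypothetical training-set demographics, one row per enrolled subject.
train = pd.DataFrame({
    "age_group": ["18-40", "18-40", "41-65", "18-40", "65+", "41-65"],
    "sex":       ["F",     "M",     "M",     "F",     "F",   "M"],
})

# Hypothetical reference proportions for the intended patient population
# (e.g., taken from epidemiological data).
population = {"18-40": 0.30, "41-65": 0.45, "65+": 0.25}

observed = train["age_group"].value_counts(normalize=True)
for group, expected in population.items():
    actual = observed.get(group, 0.0)
    gap = actual - expected
    print(f"{group:>6}: train={actual:.2f}  population={expected:.2f}  gap={gap:+.2f}")
# Large negative gaps flag subgroups the model will rarely see during
# training, one common source of the bias described above.
```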

For example, the data sets that many computer vision tools were trained on in the past lacked variability and diversity along a number of dimensions, often because they were built from computer programmer volunteers, a group historically limited in diversity. This means that when visual tools are used with patients from diverse backgrounds, the inherent bias can not only disrupt research operations but also affect patient outcomes, ultimately undermining a study's ability to contribute meaningful data on the safety and efficacy of drugs.
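One way to surface this kind of downstream impact is to compare a model's performance across patient subgroups. The sketch below uses hypothetical predictions, labels, and group assignments purely to illustrate the per-group check; it does not represent any specific vendor's method.

```python
# Minimal sketch (hypothetical data): check whether a trained classifier
# performs equally well across patient subgroups.
import numpy as np
from sklearn.metrics import accuracy_score

# Hypothetical per-patient ground truth, predictions, and subgroup labels.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0, 0, 0])
group  = np.array(["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"])

for g in np.unique(group):
    mask = group == g
    acc = accuracy_score(y_true[mask], y_pred[mask])
    print(f"group {g}: accuracy = {acc:.2f} (n={mask.sum()})")
# A large accuracy gap between groups is a signal that the training data
# may not have adequately represented one of them.
```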

https://www.outsourcing-pharma.com/Article/2021/12/20/Bias-in-artificial-intelligence-impacts-drug-development-AiCure