Digital Paper Trail

Eliminating Bias in Medicine: Could AI Minimize Human Error in Diagnosis?


Research Question: AI's capability to manage broad amounts of information in a brief time span offers promising prospects for medical commercialization, but flaws in its visual recognition and imaging, a result of limited data pools, raise concerns about the ethical implications of software with measurable errors in certain populations.

Keywords: machine learning in medicine, AI and medical imaging, racial bias in medical imaging, socialized racism in AI

POV #1: AI as a tool to customize treatment and limit medical bias in diagnosis
1. Louis A. Penner, John F. Dovidio, Tessa V. West, Samuel L. Gaertner, Terrance L. Albrecht,
Rhonda K. Dailey, Tsveti Markova. Aversive racism and medical interactions with Black
patients: A field study, Journal of Experimental Social Psychology, Volume 46, Issue 2,
2010, Pages 436-440, ISSN 0022-1031, https://doi.org/10.1016/j.jesp.2009.11.004.

2. Annie M. Wu & Lucy Q. Shen (2023) Racial Disparities Affecting Black Patients in
Glaucoma Diagnosis and Management, Seminars in Ophthalmology, DOI:
10.1080/08820538.2023.2168489 

3. Hoffman KM, Trawalter S, Axt JR, Oliver MN. Racial bias in pain assessment and
treatment recommendations, and false beliefs about biological differences between blacks
and whites. Proceedings of the National Academy of Sciences. 2016;113(16):4296-4301.
doi:10.1073/pnas.1516047113

4. Ramesh AN, Kambhampati C, Monson JR, Drew PJ. Artificial intelligence in medicine.
Ann R Coll Surg Engl. 2004 Sep;86(5):334-8. doi: 10.1308/147870804290.
- The ability to achieve human-level performance in cognitive tasks later became popular as the 'Turing test'.
- ANNs (artificial neural networks) are computational analytical tools inspired by the biological nervous system; a minimal sketch follows this entry.
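To make that definition concrete, here is a minimal sketch of a tiny feedforward ANN in Python/NumPy. It is illustrative only: the layer sizes, random weights, and example input are hypothetical and are not drawn from the cited review.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Smooth activation, loosely analogous to a neuron's firing response.
    return 1.0 / (1.0 + np.exp(-x))

# Weights and biases connecting 3 input features -> 4 hidden units -> 1 output.
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def forward(x):
    # Each layer applies a weighted sum followed by a nonlinearity,
    # (very loosely) mimicking signal passing between biological neurons.
    hidden = sigmoid(x @ W1 + b1)
    return sigmoid(hidden @ W2 + b2)

# A single hypothetical 3-feature input; the output lies in (0, 1) and can be
# read as a probability-like score.
print(forward(np.array([[0.2, 0.8, 0.5]])))
```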

5. Soomro TA, Zheng L, Afifi AJ, Ali A, Yin M, Gao J. Artificial intelligence (AI) for
medical imaging to combat coronavirus disease (COVID-19): a detailed review with
direction for future research. Artificial Intelligence Review. 2022;55(2):1409-1439.
doi:10.1007/s10462-021-09985-z
- AI as an imaging tool to diagnose COVID-19.

6. Satcher D, Drew CR. Does Race Interfere With the Doctor-Patient Relationship? JAMA.
1973;223(13):1498–1499. doi:10.1001/jama.1973.03220130044013
- Many white physicians with whom I trained preferred black patients because they believed the black patient was less likely to be critical and to express dissatisfaction or to question procedures. Most white physicians interpreted the master-servant relationship as a good doctor-patient relationship.
7. Tang X. The role of artificial intelligence in medical imaging research. BJR Open. 2019
Nov 28;2(1):20190031. doi: 10.1259/bjro.20190031. PMID: 33178962; PMCID: PMC7594889.
- In radiation oncology, AI has been applied to the different image modalities used at different stages of treatment, such as tumor delineation and treatment assessment.
- At the policy level, there are increasing concerns about patient privacy. Patient-related health information is protected by tight privacy policies, which limits cross-institution image sharing.

8. Tang X, Wang B, Rong Y. Artificial intelligence will reduce the need for clinical medical
physicists. J Appl Clin Med Phys. 2018 Jan;19(1):6-9. doi: 10.1002/acm2.12244. PMID:
29333732; PMCID: PMC5768036.
- Already, an estimated 20–40 million US jobs are in peril from developments in AI and its related technology, amounting to 15%–30% of the US labor force.
- A significant benefit of AI is increased time for interacting with patients, as AI will reduce the routine clinical workload of medical physicists.

9. Schork NJ. Artificial Intelligence and Personalized Medicine. Cancer Treat Res.
2019;178:265-283. doi: 10.1007/978-3-030-16391-4_11. PMID: 31209850; PMCID: PMC7580505.
- The transmission of information from one component to another, or the transitions from one component to another, is of crucial importance (e.g., consider that a diagnostic would not be particularly useful if it did not help a physician choose an appropriate course of action).
10. Pifer, R. (2019, September 25). AI just as effective as clinicians in diagnostics, study
suggests. Healthcare Dive. Retrieved March 16, 2023, from
https://www.healthcaredive.com/news/ai-just-as-effective-as-clinicians-in-diagnostics-study-suggests/563605/#:~:text=However%2C%20AI%20didn't%20outperform,clinicians'%2086%25%20accuracy%20rate.
- Researchers found deep learning algorithms correctly detected disease in 87% of cases, compared to clinicians' 86% accuracy rate. AI and healthcare professionals had similar rates of identifying healthy medical images, at 93% and 91% accuracy, respectively. (A small worked example of how such rates are computed follows.)
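As a worked illustration of what those percentages measure (the counts below are hypothetical, chosen only so the arithmetic matches the figures quoted above; they are not the study's data), sensitivity and specificity can be computed from a confusion matrix:

```python
# Hypothetical confusion-matrix counts for an image classifier (illustrative only).
true_positive = 87    # diseased images correctly flagged as diseased
false_negative = 13   # diseased images missed
true_negative = 93    # healthy images correctly cleared
false_positive = 7    # healthy images wrongly flagged as diseased

# Sensitivity: share of diseased cases detected (the "87% of cases" figure).
sensitivity = true_positive / (true_positive + false_negative)

# Specificity: share of healthy images correctly identified (the 93% figure).
specificity = true_negative / (true_negative + false_positive)

print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
# -> sensitivity = 87%, specificity = 93%
```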

POV #2: AI was socialized with racist medical misinformation, which decreases its effectiveness for patients of color.
1. Fosch-Villaronga E, Drukarch H, Khanna P, Verhoef T, Custers B. Accounting for
diversity in AI for medicine. Computer Law & Security Review. 2022;47. doi:10.1016/j.clsr.2022.105735
2. Matthew DeCamp, Charlotta Lindvall, Latent bias and the implementation of artificial
intelligence in medicine, Journal of the American Medical Informatics Association,
Volume 27, Issue 12, December 2020, Pages 2020–2023, https://doi.org/10.1093/jamia/ocaa094
- An AI algorithm trained to operate fairly in one context could learn from disparities in care in a different context and start to produce biased results; or the algorithm might simply learn from pervasive, ongoing, and uncorrected biases in the broader healthcare system that lead to disparate care and outcomes.

3. Zalnieriute M, Cutts T. How AI and new technologies reinforce systemic racism. ohchr.org.
https://www.ohchr.org/sites/default/files/documents/hrbodies/hrcouncil/advisorycommittee/study-advancement-racial-justice/2022-10-26/HRC-Adv-comm-Racial-Justice-zalnieriute-cutts.pdf. Accessed March 6, 2023.
- AI is well suited to handling repetitive work processes and managing large amounts of data, and it can provide another layer of decision support to mitigate errors.
- Most algorithms deployed in the healthcare context do not consider these aspects and do not account for bias detection. Missing these dimensions in algorithms used in medicine is a huge point of concern, as neglecting these aspects will inevitably produce far-from-optimal results and generate errors that may lead to misdiagnosis and potential discrimination.

4. Racial bias in health care artificial intelligence. NIHCM. (2021, September 30).
Retrieved March 10, 2023, from https://nihcm.org/publications/artificial-intelligences-
racial-bias-in-health-care
- Algorithmic predictions (43%) accounted for 4.7x more of the racial disparities in pain relative to standard measures of severity graded by radiologists (9%).

5. Sveen, W., Dewan, M., & Dexheimer, J. W. (2022). The risk of coding racism into
pediatric sepsis care: The necessity of antiracism in machine learning. The Journal of
Pediatrics, 247, 129–132. https://doi.org/10.1016/j.jpeds.2022.04.024
- A retrospective analysis of 9816 children with severe sepsis indicated that Black children had an OR for death of 1.37 (95% CI, 1.19-1.58) compared with White children. The aOR improved to 1.19 (95% CI, 1.02-1.38) when a multivariate analysis accounted for sex, age, income by zip code, hospital size and region, and chronic conditions, but the inequality remains. (A worked example of an unadjusted odds-ratio calculation follows this entry.)
- Previous examples of machine learning have demonstrated that attempted "race-neutral" or "color-blind" models still result in racial inequalities.
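A minimal sketch of how an unadjusted odds ratio and its 95% confidence interval are computed. The 2x2 counts below are invented purely for illustration and are not the cohort from the cited study; the adjusted OR additionally requires a multivariate model, which is not reproduced here.

```python
import math

# Hypothetical 2x2 counts: deaths and survivors in two groups (illustrative only).
deaths_black, survivors_black = 120, 880
deaths_white, survivors_white = 400, 3600

# Odds of death in each group, and the odds ratio comparing them.
odds_black = deaths_black / survivors_black
odds_white = deaths_white / survivors_white
odds_ratio = odds_black / odds_white

# Approximate 95% CI using the standard error of the log odds ratio.
se_log_or = math.sqrt(1 / deaths_black + 1 / survivors_black
                      + 1 / deaths_white + 1 / survivors_white)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({lower:.2f}, {upper:.2f})")
```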

6. Cohen FSM, Brass G, Kirmayer LJ. Decolonizing health care: Challenges of cultural and
epistemic pluralism in medical decision making with Indigenous communities. Bioethics.
2021;35(8):767-778. doi:10.1111/bioe.12946
- Indigenous communities are hesitant to pursue treatment due to a racist past.
7. Hostetter, M., & Klein, S. (2021, January 14). Understanding and ameliorating medical
mistrust among Black Americans. Commonwealth Fund. Retrieved March 10, 2023, from
https://www.commonwealthfund.org/publications/newsletter-article/2021/jan/medical-
mistrust-among-black-americans
- Black Americans show hesitancy toward the COVID-19 vaccine even though they are among the most at risk.

8. Grant, C. (2023, February 24). Algorithms are making decisions about health care, which
may only worsen medical racism. American Civil Liberties Union. Retrieved
March 10, 2023, from https://www.aclu.org/news/privacy-technology/algorithms-in-health-
care-may-worsen-medical-racism
- Black patients had to be deemed much sicker than white patients to be recommended for the same care. This happened because the algorithm had been trained on past data on health care spending, which reflects a history in which Black patients had less to spend on their health care compared to white patients, due to longstanding wealth and income disparities. (A synthetic sketch of this proxy-label effect follows this entry.)
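To illustrate the mechanism described above with synthetic numbers (a hedged sketch: the data, the 0.7 spending factor, and the top-20% referral cutoff are all invented and do not come from the audited algorithm), a risk score that tracks historical spending rather than true need refers equally sick patients at very different rates:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Synthetic patients: two groups with the SAME distribution of true health need.
group = rng.integers(0, 2, size=n)            # 0 = group A, 1 = group B
need = rng.normal(5.0, 1.0, size=n)           # true illness burden, identical by design

# Historical spending, the label a cost-prediction model learns: group B spends
# less at the same level of need (a stand-in for unequal access), plus noise.
spending = need * np.where(group == 1, 0.7, 1.0) + rng.normal(0.0, 0.3, size=n)

# The deployed "risk score" tracks predicted spending; here we use spending itself
# as an idealized stand-in for a model that predicts it perfectly.
risk_score = spending
cutoff = np.quantile(risk_score, 0.8)         # top 20% referred for extra care

# Among patients with the same high level of true need, group B is referred far
# less often, i.e., B patients must be sicker to receive the same recommendation.
high_need = need > 6.0
for g, name in [(0, "group A"), (1, "group B")]:
    referred = risk_score[high_need & (group == g)] > cutoff
    print(f"{name}: referral rate among equally high-need patients = {referred.mean():.2f}")
```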

9. Davis, N. (2021, November 9). AI skin cancer diagnoses risk being less accurate for dark
skin – study. The Guardian. Retrieved March 16, 2023, from
https://www.theguardian.com/society/2021/nov/09/ai-skin-cancer-diagnoses-risk-being-
less-accurate-for-dark-skin-study
- Few of the 21 datasets recorded the ethnicity or skin type of the individuals photographed; the team noted this means it is unclear how generalisable algorithms based on them would be.
- The team found just 2,436 of a total of 106,950 images within the 21 databases had skin type recorded. Of these, only 10 images were from people recorded as having brown skin and one was from an individual recorded as having dark brown or black skin. (A hypothetical metadata-audit sketch follows this entry.)
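A short sketch of the kind of metadata audit such a review performs. Everything here is hypothetical: the table, its column names, and the Fitzpatrick-style labels are invented for illustration and are not the 21 datasets examined in the study.

```python
import pandas as pd

# Invented metadata for a skin-lesion image collection (illustrative only).
meta = pd.DataFrame({
    "image_id": range(8),
    "skin_type": ["II", "I", None, "III", None, "II", "V", None],
})

# How many images record a skin type at all?
recorded = meta["skin_type"].notna()
print(f"skin type recorded for {recorded.sum()} of {len(meta)} images")

# Distribution across recorded categories, to surface how few images come from
# people with brown (Fitzpatrick V) or dark brown/black (VI) skin.
print(meta.loc[recorded, "skin_type"].value_counts())
```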

10. Wen D, Khan SM, Ji Xu A, et al. Characteristics of publicly available skin cancer image
datasets: A systematic review. The Lancet Digital Health. 2022;4(1). doi:10.1016/s2589-7500(21)00252-1
- In a Chinese study, an image classifier algorithm trained and validated predominantly on images of East Asian skin underperformed on skin lesion images of White patients from the USA, suggesting that such models reflect the localized populations and medical data on which they were built.

POV #3: AI could be implemented, but only with an accurate and reviewed data pool.
1. Artificial Intelligence in medicine. IBM. https://www.ibm.com/topics/artificial-
intelligence-medicine. Accessed March 6, 2023.

2. Myers A. The Future of Artificial Intelligence in Medicine and Imaging. Stanford HAI.
https://hai.stanford.edu/news/future-artificial-intelligence-medicine-and-imaging.
Published April 1, 2020. Accessed March 8, 2023.
- Democratize the data pool.

3. Cheuk T. Can AI be racist? Color evasiveness in the application of machine learning to
science assessments. Science Education. 2021;105(5):825-836. doi:10.1002/sce.21671
- Ethical challenges in teaching ML and AI.

4. Seneor, A. (2022, October 24). Open source data science: How to reduce bias in AI.
World Economic Forum. Retrieved March 10, 2023, from
https://www.weforum.org/agenda/2022/10/open-source-data-science-bias-more-ethical-ai-technology/#:~:text=Bias%20in%20AI%20is%20when,biological%20sex%2C%20nationality%20or%20age.
- Facial recognition and racial biases in health care are two classic examples. The data used to train these systems lack examples of people with dark skin color; therefore, they do a poor job of recognizing people of color.

5. Moorley, C., Ferrante, J., Jennings, K., & Dangerfield, A. (2020). Decolonizing care of
black, Asian and minority ethnic patients in the critical care environment: A practical
guide. Nursing in Critical Care, 25(5), 324–326. https://doi.org/10.1111/nicc.12537
- During pandemic situations, people of ethnic minority backgrounds suffer higher infection rates and exacerbation of symptoms of comorbidities, which can lead to critical care admissions and can result in more deaths than in the general population.

6. Wu D, Chen S, Zhang Y, Zhang H, Wang Q, Li J, Fu Y, Wang S, Yang H, Du H, Zhu H,
Pan H, Shen Z. Facial Recognition Intensity in Disease Diagnosis Using Automatic
Facial Recognition. Journal of Personalized Medicine. 2021;11(11):1172.
https://doi.org/10.3390/jpm11111172
- Diagnostic accuracy of facial recognition for Turner syndrome tended to be lower than that for Down syndrome, although a larger sample size helped to improve it.

7. Teal, C.R., Gill, A.C., Green, A.R. and Crandall, S. (2012), Helping medical learners
recognise and manage unconscious bias toward certain patient groups. Medical
Education, 46: 80-88. https://doi.org/10.1111/j.1365-2923.2011.04101.x
- Researchers have explored the role of unconscious cognition in two areas: clinical reasoning, particularly with respect to diagnosis, and perceptions of patients and their influence on subsequent interactions and decisions.
- Unconscious bias in clinical care is activated when a doctor automatically (without thinking) classifies a patient as a member of a group and applies stereotypical characterizations of the group, whether positive or negative, to the individual.
- The authors suggest that multiple and diverse educational experiences are necessary to progress through the developmental stages and integrate awareness of unconscious bias (UB) into regular practice in a mindful way. Learners should be given opportunities to become aware of their own biases while educators directly explicate the concepts of implicit versus explicit biases.

- Data pool: a term used for multiple databases made available, through instances, to a range of users, offering specific in-depth big-data space and tools. "Pool" refers to a body of water that is small and deep.
- Data lake: a term used for big-data storage. A data lake is a virtual repository for relational and non-relational databases that allows advanced correlation and analytics for modern problem solving. A lake is a large body of water that is deep and wide.


8. How 'AI hesitancy' is hindering healthcare. HealthManagement.
https://healthmanagement.org/c/it/news/how-ai-hesitancy-is-hindering-healthcare.
Published September 3, 2019. Accessed March 18, 2023.
- In the UK, one report says that AI and automation are likely to translate into annual savings of £12.5 billion for the NHS.

9. Ahmet Baki Kocaballi, Enrico Coiera, Huong Ly Tong, Sarah J White, Juan C Quiroz,
Fahimeh Rezazadegan, Simon Willcock, Liliana Laranjo, A network model of activities
in primary care consultations, Journal of the American Medical Informatics Association,
Volume 26, Issue 10, October 2019, Pages 1074–1082, https://doi.org/10.1093/jamia/ocz046
- The first technical challenge arises from the difference between dictated speech, which is structured by the human speaker for a specific documentation purpose, and clinical conversation, which arises between humans to establish common ground, glean information, and solve problems.
