Sunday, Nov 12, 2023
8:30 AM - 12:00 PM CST
Mitigating AI Risk through Ethical Data Science
SOLD OUT
Artificial Intelligence (AI) has the potential to improve healthcare outcomes, yet AI technologies can also present risks. Bias in AI, defined as "unfair systematic error" or, more specifically, as "the inclination or prejudice of a decision made by an AI system which is for or against one person or group, especially in a way considered to be unfair," is a topic of vital importance in both the informatics and ethics communities. This workshop will focus on several major facets of mitigating AI risks. A panel discussion will feature presentations from experts in the legal, machine learning, statistics, and medical domains. A poster session with lightning talks will further inform the topic. Subgroup discussions will add attendee input on salient issues of law, policy, clinical practice, data management, and AI transparency. A concluding session will summarize workshop takeaways and chart a path forward.
Speaker(s):
Qing Zeng, PhD, George Washington University
Melissa M. Goldstein, JD, George Washington University
Joseph Goulet, PhD, VA Connecticut/Yale University
Ravi Parikh, MD, Perelman School of Medicine, University of Pennsylvania
T. Elizabeth Workman, PhD, The George Washington University
Bruce Bray, MD, University of Utah
Yijun Shao, PhD, George Washington University
Fairness and Elimination of Bias
Ethical, Legal, Regulatory, and Social Issues
Machine Learning
Workshop - Collaborative
Location: Jefferson Ballroom
Session Code: W29
Session Credits: 3.00