
Wk 10. Validating Machine Learning Models

Lecture Date: October 26, 2020 - Monday
Lecturer: Dr. Noah Goodall

This week, we will feature one short lecture by Dr. Noah Goodall and an asynchronous video assignment.

Dr. Noah Goodall’s lecture is entitled “Algorithmic Risk Management in Automated Driving”. Here is the description of his lecture: Driving requires a series of subtle risk management decisions. When these decisions rely on algorithms, the results can have massive safety implications when multiplied by the three trillion miles driven in the United States each year. As an example, a vehicle programmed to minimize its risk exposure may position itself laterally closer to a small car on its left and away from a large truck on its right, transferring risk to the small car without consent. This talk discusses the technical and ethical challenges of perception and risk management for automated driving systems, as well as strategies to reduce bias.
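The lateral-positioning example can be sketched in code. The following is an illustrative toy model, not from the lecture: the ego vehicle picks a lateral offset between two neighbors to minimize its own risk, modeled here as a severity weight divided by the lateral gap to each neighbor. The function name, severity weights, and distances are all hypothetical.

```python
def own_risk(offset, lane_half_width=1.8, severity_left=1.0, severity_right=4.0):
    """Ego vehicle's risk exposure at a lateral offset in metres (+ = right).

    Hypothetical model: risk from each side is severity / gap, with the
    large truck on the right weighted more heavily than the small car
    on the left.
    """
    gap_left = lane_half_width + offset   # distance to the small car (left)
    gap_right = lane_half_width - offset  # distance to the large truck (right)
    return severity_left / gap_left + severity_right / gap_right

# Search candidate offsets: a purely self-interested policy drifts left,
# toward the small car, shifting risk onto it without its consent.
candidates = [i / 100 for i in range(-120, 121)]
best = min(candidates, key=own_risk)
```

Under these assumed weights the minimizer sits left of center, reproducing the risk-transfer behavior described above; the ethical question is precisely whether the objective should account for the risk imposed on others, not only the ego vehicle's own exposure.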

The asynchronous video assignment features Dr. Abigail Z. Jacobs discussing fairness in computational systems, the potential harms such systems can cause, and ways to mitigate those harms. The video continues with Dr. Su Lin Blodgett’s presentation on measuring bias in natural language processing models. Access the video here.


On Blackboard

Assigned material:

  • Chouldechova, A., & Roth, A. (2020). A snapshot of the frontiers of fairness in machine learning. Communications of the ACM, 63(5), 82-89.
  • Robinson, M. C., & Glen, R. C. (2020). Validating the validation: reanalyzing a large-scale comparison of deep learning and machine learning models for bioactivity prediction. Journal of Computer-Aided Molecular Design, 1-14.
  • Fuhl, W., Rong, Y., Motz, T., Scheidt, M., Hartel, A., Koch, A., & Kasneci, E. (2020). Explainable Online Validation of Machine Learning Models for Practical Applications. arXiv preprint arXiv:2010.00821.


Copyright © Hamdi Kavak. CSI 709/CSS 739 - Verification and Validation of Models.