Founding the Foundry - Algorithmic Accountability: Designing for Safety

College of Science

Speaker

Professor Ben Shneiderman

Speaker's Biography

Ben Shneiderman is a Distinguished University Professor in the Department of Computer Science, Founding Director (1983-2000) of the Human-Computer Interaction Laboratory, and a Member of the UM Institute for Advanced Computer Studies at the University of Maryland. He is a Fellow of the AAAS, ACM, IEEE, and NAI, and a Member of the National Academy of Engineering, in recognition of his pioneering contributions to human-computer interaction and information visualization. His contributions include the direct manipulation concept, clickable highlighted web-links, touchscreen keyboards, dynamic query sliders, development of treemaps, novel network visualizations for NodeXL, and temporal event sequence analysis for electronic health records.

Shneiderman is the lead author of Designing the User Interface: Strategies for Effective Human-Computer Interaction (6th ed., 2016). He co-authored Readings in Information Visualization: Using Vision to Think with Stu Card and Jock Mackinlay, and Analyzing Social Media Networks with NodeXL with Derek Hansen and Marc Smith. Shneiderman's book The New ABCs of Research: Achieving Breakthrough Collaborations (Oxford, April 2016) has an accompanying short book, Rock the Research: Your Guidebook to Accelerating Campus Discovery and Innovation (2018).

From: 24 Jul 2018, noon
To: 24 Jul 2018, 1:30 p.m.
Location: Mall Room, Taliesin Create, Swansea University, Singleton Park, SA2 8PZ, Swansea

Join us for the latest in the Founding the Foundry lecture series as we welcome Ben Shneiderman and Jennifer Preece to the Computational Foundry for a two-part Founding the Foundry lecture.

Ben will give a talk entitled Algorithmic Accountability: Designing for Safety on Tuesday 24th July at 12:30pm (lunch from 12:00pm) in the Taliesin Arts Centre Mall Room.

Vital services, such as communications, financial trading, healthcare, and transportation, depend on sophisticated algorithms, some relying on unpredictable artificial intelligence techniques, such as deep learning, that are increasingly embedded in complex software systems. As high-speed trading, medical devices, and autonomous aircraft become more widely deployed, stronger checks become necessary to prevent failures. Design strategies that promote human-centered systems, which are comprehensible, predictable, and controllable, can increase safety and make failure investigations more effective. Social strategies that support human-centered independent oversight during planning, continuous monitoring during operation, and retrospective analysis following failures can play a powerful role in making systems more reliable and trustworthy. Clarifying responsibility for failures stimulates improved design thinking.


Contact: Julia Harrison (Email: j.f.harrison@swansea.ac.uk) - Telephone: 01792604347


Event created by: j.f.harrison