Founding the Foundry - Algorithmic Accountability: Designing for Safety

College of Science

Speaker

Professor Ben Shneiderman

Speaker's Biography

Ben Shneiderman is a Distinguished University Professor in the Department of Computer Science, Founding Director (1983-2000) of the Human-Computer Interaction Laboratory, and a Member of the Institute for Advanced Computer Studies at the University of Maryland. He is a Fellow of the AAAS, ACM, IEEE, and NAI, and a Member of the National Academy of Engineering, in recognition of his pioneering contributions to human-computer interaction and information visualization. His contributions include the concept of direct manipulation, clickable highlighted links, touchscreen keyboards, dynamic query sliders, the development of treemaps, novel network visualizations for NodeXL, and temporal event sequence analysis for electronic health records.

Shneiderman is the lead author of Designing the User Interface: Strategies for Effective Human-Computer Interaction (6th edition, 2016). He co-authored Readings in Information Visualization: Using Vision to Think with Stu Card and Jock Mackinlay, and Analyzing Social Media Networks with NodeXL with Derek Hansen and Marc Smith. Shneiderman's book The New ABCs of Research: Achieving Breakthrough Collaborations (Oxford, 2016) has a short companion volume, Rock the Research: Your Guidebook to Accelerating Campus Discovery and Innovation (2018).

From: July 24, 2018, noon
To: July 27, 2018, 1:30 p.m.
Location: Mall Room, Taliesin Create, Swansea University, Singleton Park, SA2 8PZ, Swansea

Join us for the latest in the Founding the Foundry lecture series as we welcome Ben Shneiderman and Jennifer Preece to the Computational Foundry for a two-part Founding the Foundry lecture.

Ben will be giving a talk on Tuesday 24th July, 12:30pm (lunch from 12:00pm), in the Taliesin Arts Centre Mall Room, entitled Algorithmic Accountability: Designing for Safety.

Vital services, such as communications, financial trading, healthcare, and transportation, depend on sophisticated algorithms, some relying on unpredictable artificial intelligence techniques, such as deep learning, that are increasingly embedded in complex software systems. As high-speed trading, medical devices, and autonomous aircraft become more widely implemented, stronger checks become necessary to prevent failures. Design strategies that promote human-centered systems, which are comprehensible, predictable, and controllable, can increase safety and make failure investigations more effective. Social strategies that support human-centered independent oversight during planning, continuous monitoring during operation, and retrospective analyses following failures can play a powerful role in making systems more reliable and trustworthy. Clarifying responsibility for failures stimulates improved design thinking.


Contact: Julia Harrison (Email: j.f.harrison@swansea.ac.uk) - Telephone: 01792604347


Event created by: j.f.harrison
