
Algorithms aren't neutral: Making UX inclusive

How our biases are reflected in the software we build, with practical examples, and how to combat them.

We all have biases, and the tech we build is a reflection of us, so we need to be aware of the exclusionary effects of our development and UX decisions. This talk is about how those biases are reflected in the software we build, with practical examples, and how to combat them. Software can exclude those who need our products the most. We will learn to: (1) recognize exclusionary design and development patterns; (2) identify the decision points that lead to them, the information we ask of our users, and how we use that information; and (3) combat bias, and recognize when to leave the product-building to someone else.

Friday, 2018-10-05 @ 11:35
> Skill level: elementary
> Duration: 25 min




Ivana McConnell

I was born to a civil engineer-turned-translator and a UN Logistics Officer-turned-software engineer. I live in Vancouver now, working remotely for Customer.io as a Senior UI/UX designer. The first web problem I solved was the difference between a local and a public URL; these days I think a lot about the intersections of technology, identity, and ethics, and how they affect the products we build. I've previously worked as an ice hockey referee, a rock climbing instructor, and a video game tester.
