IRMAC General Meeting



How to Keep Your AI (and Your Organization) Out of Jail

Artificial intelligence brings tremendous opportunities, but it also has a dark side: machine bias. More businesses are using AI to support decision making, and unchecked AI biases lead to biased products and services that may discriminate against your customers or employees in ways you are not even aware of.

Biases are bad for business: they lead to missed opportunities, damage consumer confidence, create reputational risk, and invite fines and other regulatory action. AI biases, therefore, are an organizational risk; and if you think your organization is safe because you are buying AI/ML as a service, you are wrong! Your organization, not the vendor, is ultimately accountable for the AI-enabled product or service you deliver.

To limit that risk, this talk will explain what AI biases are, how they emerge, and what you should be asking your AI vendors or AI team.

Speaker

Natalia Modjeska is Director, Research & Advisory, at Info-Tech Research Group. She writes, speaks and advises IT organizations around the world on topics related to AI, Machine Learning, analytics, and governance. She is currently developing an AI governance framework for Info-Tech members to ensure they build ethical, responsible and trustworthy AI.

Natalia has 15+ years of experience in developing, selling, and implementing analytical solutions. Her diverse career spans from R&D and product management to sales, consulting and program management. Prior to Info-Tech, she led an enterprise data and analytics program at a global luxury hospitality brand. Natalia’s journey into analytics and AI started in the late 1990s with a PhD in AI (Natural Language Processing) at the University of Edinburgh in Scotland. She also holds an Executive MBA from the Ivey School of Business (Western University).


Meeting: Wednesday, October 21, 2020

5:00 p.m. start; expected finish 6:00 p.m.

On the day of the presentation, go to THIS link
