Ana Arriola's talk at the recent Future of the Future seminar – about intersectionality, surveillance capitalism and the risks of AI – might not have been the stuff of your Dad's (or Steve Ballmer's) Microsoft, but it was actually a great reflection of the way Microsoft thinks now. In recent years, the company has devoted attention and resources to contemplating both the power of its technologies and ways to ensure they help rather than harm.

Most notably, there's FATE – for Fairness, Accountability, Transparency and Ethics – a Microsoft Research group set up to "study the complex social implications" of AI and related technologies and match those against the lessons of history. This year, FATE published a thoughtful paper on designing AI so it works for people with disabilities, and another on fairness in machine learning systems, which observes bluntly that the problem starts with how the datasets used to train ML systems are curated. The same paper points out that AI design teams often don't know their systems are biased until they're publicly deployed and, to quote one software engineer, "someone raises hell online."

There's also the company's advisory board, AI, Ethics and Effects in Engineering and Research (Aether), which last year published a set of six principles for any work on facial recognition – and whose advice has apparently already led Microsoft to turn down significant AI product sales over ethics concerns. The company also publishes a general set of ethical principles for AI. And Arriola – whose full job title is General Manager & Partner, AI + Research & Search – has established another group within the company, called ETCH (Ethics, Transparency, Culture and Humanity).

It's evident that Microsoft takes this stuff seriously – and that it's about more than simply aiming for diversity in recruitment. "So much more," Arriola told me a couple of days before her talk at the seminar. "Diversity and inclusion just means making sure that there's safety and security within any given organisation, but it's really about global intersectionality."