
Artificial Intelligence (AI) empowers people, but the problem is that it can be used in both good and bad faith, said Neil Thompson, director of the FutureTech Research Project at MIT's Computer Science and Artificial Intelligence Lab, at the NDTV World Summit 2025.
Earlier, Thompson had cited examples of AI errors in everyday applications that had fatal or harmful consequences. Later, in a fireside chat with NDTV's Shiv Aroor, he was asked what he considered the extreme scenarios AI could bring about.
"For bad scenarios, imagine a business where a dissatisfied client floods every place with bad reviews. Imagine a situation where your emails are flooded with things that are maybe real or not real, and it's a horror to distinguish," answered Thompson. "As tools get more powerful, the idea of harnessing what computers can do in a huge number of ways becomes a challenge."
Thompson said AI is going to make people more productive. But anyone worried about a particular job being automated should understand the gap between how quickly AI capabilities are developing and how quickly they can actually be deployed, he explained, calling this the "AI last mile costs."
The researcher stressed that using AI systems for idea generation is fantastic, but cautioned that these systems make errors that can be exploited and lead to unexpected harms. "AI isn't 100% accurate, and it can have very serious consequences."
Thompson was also asked whether moral hesitation would itself become unethical if machines could make faster combat decisions and save, or take, lives more efficiently.
"I definitely think we are running into this problem. When you have a competitive system—two sides in a war, or competitors in a market—they are going to start racing. That leads people to want to escalate capabilities. If the human in the loop is slow, one of the things you can do is relinquish that oversight, and that can create real challenges. I think it's very important to ask: is there a way to control it?" he said.
