Mrinank Sharma, a senior AI safety researcher at US-based artificial intelligence company Anthropic, has resigned from his role, issuing a stark warning that the world is heading towards danger if powerful technologies continue to grow without strong ethical grounding.
In a widely shared resignation note posted online, Sharma said humanity is facing several interconnected crises at the same time, from environmental stress and social unrest to rapid technological change. Artificial intelligence, he warned, could intensify these challenges if its development is not guided by wisdom, restraint, and clear human values.
Sharma headed Anthropic’s safeguards research team, where he worked on reducing risks associated with advanced AI systems. His work included studying how AI could be misused, such as assisting harmful biological research or influencing human behaviour at scale. Despite these efforts, Sharma said it was often difficult to ensure that ethical principles consistently shaped real-world decisions in high-pressure technology environments.
Without directly accusing the company of wrongdoing, Sharma wrote that aligning actions with values is far harder in practice than it appears on paper. He suggested that the broader tech ecosystem tends to prioritise speed, competition, and capability over reflection and long-term responsibility.
His resignation has sparked fresh debate across the technology sector, where concerns are growing that AI development is moving faster than society’s ability to understand and manage its consequences. Sharma’s departure adds to a list of researchers and engineers who have raised alarms about whether current safeguards are enough.
The announcement surprised many in the tech community, with Sharma saying he is stepping away from AI research altogether and turning to poetry and creative writing. He said this shift would allow him to explore deeper questions about meaning, responsibility, and humanity's future in a more honest and personal way.