Confronting the CEO of the AI company that impersonated me - Decoder with Nilay Patel Recap
Podcast: Decoder with Nilay Patel
Published: 2026-03-23
Duration: 4553
Guests: Shishir Mehrotra
What Happened
Shishir Mehrotra, CEO of Superhuman (the company formerly known as Grammarly), faced tough questions from Nilay Patel about the controversial launch of Grammarly's Expert Review feature, which let users receive writing suggestions from AI clones of named experts, including Patel, without those experts' consent. The unauthorized use of their names drew outrage from journalists and prompted legal challenges.
Mehrotra explained that the feature was built by a small team and was ultimately removed not because of the lawsuit but because it didn't align with Superhuman's strategic goals. Despite the controversy, he defended the feature as a standard attribution practice, even though it generated edits that did not reflect the experts' actual work.
The discussion touched on broader issues within the AI industry, such as the collapsing distinctions between copyright, trademarks, and likeness laws. Mehrotra suggested expanding copyright laws to include likeness to better protect creators and address the extractive nature of AI tools that often use creators' work without compensation.
Mehrotra expressed optimism about AI's potential to create jobs and enhance human capabilities, arguing that AI tools, like Grammarly, aim to augment rather than replace human skills. He mentioned the development of Superhuman Go, a network of AI assistants designed to provide consistent experiences across platforms.
The episode also explored historical legal precedents, such as Google's book-scanning litigation and YouTube's Content ID system, which have shaped the legal challenges now facing the AI space. Mehrotra discussed subscription models as one way creators might adapt to AI's impact on the value of their content.
Mehrotra acknowledged the public's negative perception of AI, noting that it polls worse than ICE and only slightly better than the Democratic Party. He believes the industry has failed to effectively communicate AI's benefits, leaving fears about job displacement and other harms to dominate the conversation.
The conversation concluded with Mehrotra's predictions for the future: deeper integration of AI into software development, and an emphasis on network effects and ecosystem integration as the keys to building successful AI applications.
Key Insights
- Superhuman's Expert Review feature used AI clones of experts without consent, leading to legal challenges and public outrage. The feature was removed due to strategic misalignment, not because of these controversies.
- Shishir Mehrotra noted that AI tools like Grammarly are designed to augment human skills rather than replace them, aiming to make users 'superhuman' in their work.
- Historical legal battles, such as Google's book-scanning litigation and YouTube's Content ID controversies, set precedents that influence current AI-related legal issues, particularly around copyright and likeness laws.
- Mehrotra suggested that the AI industry needs to better communicate its benefits to the public, as current perceptions are highly negative, often associated with job displacement fears.