Laura Hamady, Shelby Dolen, Marlaina Pinto, Susan Fletcher
Session Time: Thursday, May 7, 2026: 11:30 AM-12:30 PM
When an AI-driven decision is wrong or discriminatory, who is legally responsible — the developer, the deployer, or both? With courts and regulators taking divergent approaches, organizations need practical ways to allocate and manage this risk. This session will move beyond doctrine and into the operational realities facing technology vendors, enterprise customers, and their counsel. Drawing on our experience designing and implementing AI governance and compliance programs, we will unpack what organizations need to consider when structuring the developer–deployer relationship—contractually, technically, and programmatically—to manage risk in a regulatory environment where the rules are still being written.
We will cover:
- Contractual Allocation of Risk: How vendor agreements assign responsibility for AI outcomes (indemnities, limitations of liability, performance commitments) and where those terms diverge from emerging legal expectations.
- Documentation and Impact Assessments: What AI documentation actually looks like, how to align EU AI Act readiness with U.S. and global obligations, and how deployers can conduct meaningful impact assessments for third-party tools.
- Configuration and Control: How liability shifts based on who configures and “owns” key use decisions — including scenarios where AI “recommendations” effectively become the decision.
- Technical Governance and Bias Testing: What developers can realistically provide on discrimination and performance risk, what bias testing looks like in practice, and who owns ongoing monitoring.
- Program Design Under Uncertainty: What a defensible AI governance program looks like today, including integrating AI impact and risk assessments into existing privacy frameworks (PIAs, DPIAs, TIAs, vendor management) and applying emerging “reasonable care” standards.
This session is designed for product leaders, in-house counsel, privacy and data protection officers, and compliance professionals who influence the design, procurement, or deployment of enterprise AI tools and need to operationalize AI governance inside complex organizations.
Laura Hamady, Partner, Troutman Pepper Locke
Shelby Dolen, Attorney, Troutman Pepper Locke
Marlaina Pinto, Data Privacy Attorney, Troutman Pepper Locke
Susan Fletcher, Chief Privacy Officer & Deputy General Counsel, Precisely
Reading Materials: