An example of how contract playbooks designed by expert lawyers can be applied to commercial contracts using GenAI.
As the large language model (LLM) revolution makes its way into new startups and applications, would-be customers are asking: How is my private data protected? Can commercial AI providers use my input data to train their models? The emergence of retrieval-augmented generation (RAG), which injects private data into prompts, has made this an urgent issue, particularly for companies building applications on top of LLMs, since they often use those services on behalf of their customers.
The big worry is that intellectual property, trade secrets, or other sensitive or private information is unknowingly leaked to the public via an LLM’s future output for a different customer. Here is a sample nightmare scenario: Continue reading Fine print face-off: which top large language models provide the best data protection terms?