High Learning Rate

The Concern of Privacy with LLMs (part 1)
This one might save you a lot of time and money...

Louis-François Bouchard, Francois Huppe-Marcoux, and Omar Solano
Jun 20, 2024


Good morning everyone! This one might save you a lot of time and money.

Today, we're writing about an important topic: privacy concerns with large language models (LLMs). We see many clients adopt overkill solutions out of privacy concerns.

In this first part, we explore this complex issue, focusing on the challenges and trade-offs between convenience and privacy with LLMs, to help you decide which avenue is best for you.

When dealing with traditional software, privacy concerns often revolve around data storage, transmission, and access control. We implement encryption, set up secure databases, and carefully manage user permissions. However, the world of LLMs introduces a new layer of complexity to privacy considerations when you want the best results possible and cannot necessarily do that on your own, locally.

And, by the way, we are not talking about ChatGPT. ChatGPT is a consumer interface built on top of LLMs; it is not what you use to build products or tools. Here, we are talking about LLMs accessed through APIs to build the powerful products and chatbots our users want.

Let’s go through the five options one can consider (plus a bonus):

  1. Private endpoints of the best LLMs (such as Azure OpenAI Service)

  2. Using APIs with simple user prevention

  3. Using APIs with Anonymized Data

  4. Having your own model

  5. Reconsideration + alternatives

  6. Bonus…

Here’s a quick teaser of what we will discuss…

APIs (Third-Party LLM Services)

1 - Using Private Endpoints for the Best LLMs

This is ideal for those who want to use the best LLMs but want a major cloud provider's privacy guarantee and features. For example, Microsoft offers dedicated private endpoints for OpenAI models through the Azure OpenAI Service, tailored for enterprise customers and partners.

If your organization already uses Azure, this is a great option. You can set up Azure Virtual Networks (VNets) to keep your data isolated from the public Internet. This setup is ideal if you need to meet regulatory and compliance requirements for data privacy.

The cost per input/output token matches OpenAI's standard rates. To use this service, you need to submit a form and await approval. Note that some users have reported waiting a few weeks for approval.
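To make the difference concrete, here is a minimal sketch (standard-library Python only) of how the request shape differs between the standard OpenAI API and an Azure OpenAI deployment. The resource name, deployment name, and `api-version` below are hypothetical placeholders, and in practice you would use the official `openai` SDK's Azure client rather than hand-building requests; this only illustrates the endpoint structure.

```python
# Sketch: request shape for standard OpenAI vs. Azure OpenAI Service.
# All concrete values (resource, deployment, api-version) are placeholders.

def openai_request(api_key: str) -> dict:
    """Standard OpenAI endpoint: one fixed URL, model chosen in the body."""
    return {
        "url": "https://api.openai.com/v1/chat/completions",
        "headers": {"Authorization": f"Bearer {api_key}"},
        "body": {"model": "gpt-4o", "messages": []},
    }

def azure_openai_request(resource: str, deployment: str,
                         api_version: str, api_key: str) -> dict:
    """Azure OpenAI: the URL is scoped to *your* resource and deployment,
    and authentication uses an `api-key` header instead of a Bearer token."""
    return {
        "url": (f"https://{resource}.openai.azure.com/openai/"
                f"deployments/{deployment}/chat/completions"
                f"?api-version={api_version}"),
        "headers": {"api-key": api_key},
        "body": {"messages": []},  # the model is fixed by the deployment
    }
```

The key point for privacy is that the Azure URL targets a resource you own: with a VNet private endpoint configured, that hostname resolves to a private IP inside your network, so requests never traverse the public internet.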
