
Local AI: Your Data Stays With You

Install and use AI on your workstation, without sending data to the cloud. Ollama, AnythingLLM, RAG on your documents. Digital sovereignty starts here.

1 day (7 hours) · Professionals handling sensitive data: regulated professions, finance, legal, HR, healthcare

What you'll be able to do

Understand how a local LLM works and its privacy advantages

Use Ollama to interact with different AI models locally

Choose the right model for each type of task

Use an AI agent to interact with local documents via RAG

Apply prompting best practices for relevant results

Program

Session 1

Why local AI + setup

  • Cloud vs local: what goes online vs what stays with you
  • Installing Ollama and downloading models
  • Installed models: Mistral (French), Llama 3.1 8B (fast)
  • First test: asking a question to a local model
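For reference, the Session 1 setup boils down to a few Ollama commands (the model names match those listed above; exact tags may vary depending on the Ollama library version):

```shell
# Download the two course models from the Ollama library
ollama pull mistral        # Mistral 7B, strong in French
ollama pull llama3.1:8b    # Llama 3.1 8B, fast general-purpose model

# First test: ask a question, entirely on your machine
ollama run mistral "Summarize the GDPR in three sentences."

# List the models installed locally
ollama list
```

Nothing leaves the workstation: the models are downloaded once, then all inference runs offline.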
Session 2

AnythingLLM: AI with a real interface

  • Installing and configuring AnythingLLM
  • Creating your first workspace
  • Asking business questions in the interface
  • Comparing models: Mistral vs Llama
Session 3

Writing effective prompts

  • The winning structure: context → role → task → format → constraints
  • Before/after examples on your use cases
  • Configuring a system prompt in AnythingLLM
  • Writing 5 optimized prompts for your recurring tasks
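As an illustration of the structure above, a prompt following context → role → task → format → constraints might look like this (the scenario is invented for the example):

```
Context: Here is a client engagement letter (attached to this workspace).
Role: You are a paralegal assistant.
Task: Extract the parties, the effective date, and the termination clauses.
Format: A bulleted list, one bullet per item, quoting the source passage.
Constraints: Answer in French; if an item is missing, say "not found" rather than guessing.
```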
Session 4

Working on your documents with RAG

  • RAG: AI reads your documents and answers with sources
  • Drag and drop a full folder into a workspace
  • Querying documents: extraction, summary, cross-referencing
  • Producing a summary note from multiple documents
Session 5

Organizing daily use

  • Creating workspaces per client folder
  • Decision tree: local vs cloud by sensitivity
  • Daily launch procedure
  • Demo of advanced tools: Claude Code + Ollama, Claude Cowork

Practical info

Duration

1 day (7 hours)

Target audience

Professionals handling sensitive data: regulated professions, finance, legal, HR, healthcare

Prerequisites

Regular use of a computer (Mac or PC). No programming skills required.

Group size

1 to 4 people

Pedagogy

100% hands-on, on the trainee's workstation. The trainer installs and configures; the trainee learns to use the tools. Exercises use real or fictional documents adapted to the trainee's role.

Assessment

Initial skills assessment, practical exercises, and an end-of-training knowledge evaluation.

Trainer

Colombani.ai, AI consultant and trainer.

Access delay

2 weeks minimum between enrollment and training start.

Pricing

Open enrollment

€700 / person / day

In-company

On request

Each program is adapted to your situation; contact us for a personalized quote.

Accessibility

Program accessible to people with disabilities. Contact the disability officer in advance to discuss accommodations.

Ulysse Trin — [email protected] — 06 58 58 37 11

Post-training support

Installed and configured environment: Ollama + models + AnythingLLM
Personalized getting started guide (PDF)
AnythingLLM workspace configured with adapted system prompt
Training completion certificate

Frequently asked questions

Do I need a powerful computer to run local AI?

A recent computer with 8 GB of RAM is enough for lightweight models (Mistral 7B, Llama 3.1 8B). For larger models, 16 GB is recommended. We check compatibility beforehand.
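As a rough back-of-the-envelope check (assuming 4-bit quantization, Ollama's usual default, and about 20% overhead for the runtime and context window; both figures are approximations, not measurements), you can estimate the memory a model needs from its parameter count:

```python
# Rough RAM estimate for a quantized local model.
# Assumptions (approximate): 4-bit weights, ~20% runtime/context overhead.
def estimated_ram_gb(params_billions, bits_per_weight=4, overhead=1.2):
    bytes_for_weights = params_billions * 1e9 * bits_per_weight / 8
    return bytes_for_weights * overhead / 1e9

print(round(estimated_ram_gb(7), 1))   # Mistral 7B: about 4.2 GB
print(round(estimated_ram_gb(8), 1))   # Llama 3.1 8B: about 4.8 GB
```

Both course models therefore fit comfortably in 8 GB of RAM, which is consistent with the guidance above.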

Does my data really stay on my machine?

Yes, 100%. Ollama and AnythingLLM run entirely locally. No data is sent over the internet. That's the whole point of local AI.

Is local AI as powerful as ChatGPT?

For most business tasks (summarization, extraction, writing), local models give excellent results. The training teaches you to choose the right model for each task.

Request the full program

Bespoke program, tailored to your industry. First call is free.
