Understanding Ollama Model Storage and Portability with the Model Manager
by Eric Hammond
Ollama is a powerful tool for running large language models locally, but understanding how it stores models and making them portable can be challenging. Let's explore how Ollama handles model storage and how the Model Manager utility helps with portability. There are both CLI and GUI versions of the Ollama Model Manager.
How Ollama Stores Models
Ollama uses a consistent storage structure across platforms (macOS, Linux, and Windows):
Base Directory:
macOS/Linux: ~/.ollama
Windows: %USERPROFILE%\.ollama
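The code shown later calls a get_ollama_dir() helper that resolves this base directory but is not included in the listings. A minimal cross-platform version might look like the following (a sketch, not necessarily the Model Manager's exact implementation):

```python
import os
import platform


def get_ollama_dir():
    """Return the base Ollama directory for the current platform."""
    if platform.system() == "Windows":
        # %USERPROFILE%\.ollama on Windows
        profile = os.environ.get("USERPROFILE", os.path.expanduser("~"))
        return os.path.join(profile, ".ollama")
    # ~/.ollama on macOS and Linux
    return os.path.expanduser("~/.ollama")
```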
Key Directories and Files:
models/: Contains model files and manifests
manifests/: Stores model configuration details
blobs/: Stores the actual model weight files as SHA256-hashed chunks
Model Structure:
Each model has a manifest JSON file in manifests/
The manifest references model weight files stored in blobs/
The manifest contains metadata such as the model name, parameters, and a list of blob references
The critical insight is that Ollama splits models into chunks (blobs) and references them by their SHA256 hash. This allows for efficient storage since identical chunks are never duplicated.
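That idea can be sketched in a few lines of Python: identical content always hashes to the same digest, so a store keyed by digest keeps only one copy. (The sha256- prefix mirrors how Ollama names blob files on disk, but the snippet itself is illustrative, not Ollama's code.)

```python
import hashlib


def blob_digest(data: bytes) -> str:
    """Name a chunk by the SHA256 hash of its content."""
    return "sha256-" + hashlib.sha256(data).hexdigest()


chunk_a = b"identical model weights"
chunk_b = b"identical model weights"

# Both chunks hash to the same name, so a digest-keyed store holds one copy.
store = {blob_digest(chunk_a): chunk_a, blob_digest(chunk_b): chunk_b}
print(len(store))  # 1
```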
Introducing Ollama Model Manager
The Ollama Model Manager is a utility that helps manage and make Ollama models portable. It provides several key features:
Model Discovery: Automatically finds all installed models
Export Capability: Packages models for transport to other systems
Import Capability: Installs exported models on new systems
Space Management: Shows model sizes and helps reclaim disk space
Let's see how the Model Manager works.
How the Model Manager Makes Models Portable
Finding and Analyzing Models
The Model Manager first discovers all installed models by parsing the Ollama manifest files:
import json
import os

def find_ollama_models():
    """Find all Ollama models in the manifest directory"""
    ollama_dir = get_ollama_dir()
    manifest_dir = os.path.join(ollama_dir, "manifests")
    models = []

    if not os.path.exists(manifest_dir):
        return models

    for file in os.listdir(manifest_dir):
        if file.endswith(".json"):
            model_path = os.path.join(manifest_dir, file)
            with open(model_path, "r") as f:
                manifest = json.load(f)

            # Parse the manifest to extract model information
            model_info = {
                "name": manifest.get("name", "unknown"),
                "manifest": file,
                "size": calculate_model_size(manifest),
                "blob_count": len(manifest.get("blobs", [])),
                "path": model_path,
                "manifest_data": manifest,
            }
            models.append(model_info)

    return models
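find_ollama_models() relies on a calculate_model_size() helper that the listing doesn't show. Assuming each blob entry in the manifest carries a size field in bytes (an assumption about the manifest layout, consistent with the structure used above), a minimal version would be:

```python
def calculate_model_size(manifest):
    """Sum the sizes of all blobs referenced by a manifest (in bytes)."""
    # Assumes each blob entry has a "size" field; missing sizes count as 0.
    return sum(blob.get("size", 0) for blob in manifest.get("blobs", []))
```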
Export Process
When exporting a model, the Model Manager:
Creates an archive directory
Copies the model manifest
Copies all unique blob files referenced by the manifest
Creates a compressed archive (.tar.gz) containing everything needed
import os
import shutil
import tarfile
import tempfile

def export_model(model_info, output_dir=None):
    """Export an Ollama model to a portable archive"""
    model_name = model_info["name"]
    manifest_data = model_info["manifest_data"]
    ollama_dir = get_ollama_dir()

    if output_dir is None:
        output_dir = os.getcwd()

    # Create a temporary directory for the export
    with tempfile.TemporaryDirectory() as temp_dir:
        # Create directory structure
        os.makedirs(os.path.join(temp_dir, "manifests"), exist_ok=True)
        os.makedirs(os.path.join(temp_dir, "blobs"), exist_ok=True)

        # Copy the manifest
        manifest_src = model_info["path"]
        manifest_dst = os.path.join(temp_dir, "manifests",
                                    os.path.basename(manifest_src))
        shutil.copy2(manifest_src, manifest_dst)

        # Copy all blob files referenced by the manifest
        blobs_dir = os.path.join(ollama_dir, "blobs")
        for blob in manifest_data.get("blobs", []):
            digest = blob.get("digest", "")
            if digest and os.path.exists(os.path.join(blobs_dir, digest)):
                blob_src = os.path.join(blobs_dir, digest)
                blob_dst = os.path.join(temp_dir, "blobs", digest)
                shutil.copy2(blob_src, blob_dst)

        # Create the tar.gz archive
        archive_name = f"{model_name.replace(':', '_')}_model"
        archive_path = os.path.join(output_dir, f"{archive_name}.tar.gz")
        with tarfile.open(archive_path, "w:gz") as tar:
            tar.add(temp_dir, arcname=archive_name)

    return archive_path
Import Process
For importing models, the Model Manager:
Extracts the archive
Verifies the model structure
Copies the manifests and blobs to the Ollama directory
Handles existing files to avoid duplicates
import os
import shutil
import tarfile
import tempfile

def import_model(archive_path):
    """Import an Ollama model from a portable archive"""
    ollama_dir = get_ollama_dir()

    # Create a temporary directory for extraction
    with tempfile.TemporaryDirectory() as temp_dir:
        # Extract the archive
        with tarfile.open(archive_path, "r:gz") as tar:
            tar.extractall(path=temp_dir)

        # Find the extracted model directory
        model_dirs = [d for d in os.listdir(temp_dir)
                      if os.path.isdir(os.path.join(temp_dir, d))]
        if not model_dirs:
            raise ValueError("Invalid model archive: no model directory found")
        model_dir = os.path.join(temp_dir, model_dirs[0])

        # Verify directory structure
        if not all(os.path.exists(os.path.join(model_dir, d))
                   for d in ["manifests", "blobs"]):
            raise ValueError("Invalid model archive: missing required directories")

        # Import manifests
        manifest_files = []
        for manifest in os.listdir(os.path.join(model_dir, "manifests")):
            src = os.path.join(model_dir, "manifests", manifest)
            dst = os.path.join(ollama_dir, "manifests", manifest)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            if os.path.exists(dst):
                # The manifest already exists; skip it for now, but you could
                # add logic here to rename, compare, or prompt the user
                pass
            else:
                shutil.copy2(src, dst)
                manifest_files.append(manifest)

        # Import blobs, skipping any that already exist
        blob_dir = os.path.join(model_dir, "blobs")
        for blob in os.listdir(blob_dir):
            src = os.path.join(blob_dir, blob)
            dst = os.path.join(ollama_dir, "blobs", blob)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            if not os.path.exists(dst):
                shutil.copy2(src, dst)

    return manifest_files
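Before importing an archive that came from elsewhere, it can be worth peeking inside it first. A small companion helper (illustrative, not part of the Model Manager itself) might look like:

```python
import tarfile


def list_archive(archive_path):
    """List the files inside an exported model archive without extracting it."""
    with tarfile.open(archive_path, "r:gz") as tar:
        return sorted(m.name for m in tar.getmembers() if m.isfile())
```

Checking that the listing contains both manifests/ and blobs/ entries before calling import_model() catches truncated or mispackaged archives early.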
Benefits of the Model Manager
Efficient Storage: By understanding Ollama's blob-based storage, the Model Manager ensures it only copies what's needed.
Model Portability: Makes it easy to move models between machines without re-downloading large files.
Backup and Restore: Provides a way to back up your models before system changes or reinstallation.
Offline Installation: Enables model sharing in environments without internet access.
Conclusion
Ollama's model storage system is elegant and efficient, using content-addressed storage to minimize duplication. However, it wasn't designed with easy portability in mind. The Model Manager bridges this gap by providing a simple interface for exporting and importing models.
By understanding how Ollama stores models and using the Model Manager, you can easily:
Move models between different computers
Back up your collection of fine-tuned models
Share models with colleagues without re-downloading
Manage your local model collection more effectively
This approach saves bandwidth, time, and storage space while making your AI workflows more portable and resilient.
Find the source for the Model Manager on GitHub: