Diffstat (limited to 'docs')
-rw-r--r--  docs/cache_nbrun  1
-rwxr-xr-x  docs/nb_build  15
-rwxr-xr-x  docs/nb_run_conv  82
-rw-r--r--  docs/source/auto_examples/auto_examples_jupyter.zip  bin  51426 -> 67774 bytes
-rw-r--r--  docs/source/auto_examples/auto_examples_python.zip  bin  35678 -> 44112 bytes
-rw-r--r--  docs/source/auto_examples/demo_OT_1D_test.ipynb  54
-rw-r--r--  docs/source/auto_examples/demo_OT_1D_test.py  71
-rw-r--r--  docs/source/auto_examples/demo_OT_1D_test.rst  99
-rw-r--r--  docs/source/auto_examples/demo_OT_2D_sampleslarge.ipynb  54
-rw-r--r--  docs/source/auto_examples/demo_OT_2D_sampleslarge.py  78
-rw-r--r--  docs/source/auto_examples/demo_OT_2D_sampleslarge.rst  106
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_001.png  bin  52753 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_002.png  bin  87798 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_003.png  bin  167396 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_004.png  bin  82929 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_classes_001.png  bin  53561 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_classes_004.png  bin  193523 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_color_images_001.png  bin  237854 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_color_images_002.png  bin  472911 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_001.png  bin  44168 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_002.png  bin  111565 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_color_images_001.png  bin  237854 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_color_images_002.png  bin  429859 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_1D_001.png  bin  27639 -> 21303 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_1D_002.png  bin  25126 -> 21334 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_1D_003.png  bin  19634 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_1D_004.png  bin  21449 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_1D_005.png  bin  0 -> 16995 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_1D_007.png  bin  0 -> 18923 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_001.png  bin  22199 -> 22153 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_002.png  bin  21036 -> 21589 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_003.png  bin  9632 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_004.png  bin  91630 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_005.png  bin  9495 -> 9645 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_006.png  bin  23476 -> 91095 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_009.png  bin  0 -> 13987 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_010.png  bin  0 -> 109742 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_001.png  bin  22451 -> 11710 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_002.png  bin  32795 -> 17184 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_003.png  bin  38958 -> 38780 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_004.png  bin  17324 -> 11710 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_005.png  bin  28210 -> 38780 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_006.png  bin  77009 -> 38780 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_007.png  bin  0 -> 14117 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_008.png  bin  0 -> 18696 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_009.png  bin  0 -> 21300 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_011.png  bin  0 -> 21300 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_WDA_001.png  bin  42791 -> 54285 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_WDA_003.png  bin  0 -> 86366 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_001.png  bin  35837 -> 20512 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_002.png  bin  59327 -> 41555 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_003.png  bin  132247 -> 41555 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_004.png  bin  125411 -> 105696 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_005.png  bin  0 -> 108687 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_006.png  bin  0 -> 105696 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_compute_emd_001.png  bin  146617 -> 162612 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_compute_emd_003.png  bin  0 -> 29285 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_compute_emd_004.png (renamed from docs/source/auto_examples/images/sphx_glr_plot_compute_emd_002.png)  bin  38746 -> 38746 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_003.png  bin  20684 -> 16995 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_004.png  bin  21750 -> 18588 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_005.png  bin  22971 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_006.png  bin  0 -> 19189 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_008.png  bin  0 -> 20440 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_classes_001.png  bin  0 -> 51418 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_classes_003.png  bin  0 -> 199721 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_001.png  bin  0 -> 144945 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_003.png  bin  0 -> 50403 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_005.png  bin  0 -> 234386 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_d2_001.png  bin  0 -> 135206 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_d2_003.png  bin  0 -> 241976 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_d2_006.png  bin  0 -> 108946 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_001.png  bin  0 -> 35881 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_003.png  bin  0 -> 73815 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_001.png  bin  0 -> 165589 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_003.png  bin  0 -> 80727 bytes
-rw-r--r--  docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_004.png  bin  0 -> 541463 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_2D_thumb.png  bin  34799 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_classes_thumb.png  bin  34581 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_color_images_thumb.png  bin  52919 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_mapping_color_images_thumb.png  bin  52919 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_mapping_thumb.png  bin  26370 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_1D_thumb.png  bin  21175 -> 18222 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_2D_samples_thumb.png  bin  23897 -> 24711 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_L1_vs_L2_thumb.png  bin  22735 -> 10935 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_conv_thumb.png  bin  2894 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_WDA_thumb.png  bin  66426 -> 87479 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_barycenter_1D_thumb.png  bin  19862 -> 16522 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_compute_emd_thumb.png  bin  72940 -> 80806 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_optim_OTreg_thumb.png  bin  21750 -> 3101 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_classes_thumb.png  bin  0 -> 30678 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_color_images_thumb.png  bin  0 -> 51088 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_d2_thumb.png  bin  0 -> 53014 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_mapping_colors_images_thumb.png  bin  0 -> 58321 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_mapping_thumb.png  bin  0 -> 18473 bytes
-rw-r--r--  docs/source/auto_examples/images/thumb/sphx_glr_test_OT_2D_samples_stabilized_thumb.png  bin  3101 -> 0 bytes
-rw-r--r--  docs/source/auto_examples/index.rst  102
-rw-r--r--  docs/source/auto_examples/plot_OTDA_2D.ipynb  54
-rw-r--r--  docs/source/auto_examples/plot_OTDA_2D.py  120
-rw-r--r--  docs/source/auto_examples/plot_OTDA_2D.rst  175
-rw-r--r--  docs/source/auto_examples/plot_OTDA_classes.ipynb  54
-rw-r--r--  docs/source/auto_examples/plot_OTDA_classes.py  112
-rw-r--r--  docs/source/auto_examples/plot_OTDA_classes.rst  190
-rw-r--r--  docs/source/auto_examples/plot_OTDA_color_images.ipynb  54
-rw-r--r--  docs/source/auto_examples/plot_OTDA_color_images.py  145
-rw-r--r--  docs/source/auto_examples/plot_OTDA_color_images.rst  191
-rw-r--r--  docs/source/auto_examples/plot_OTDA_mapping.ipynb  54
-rw-r--r--  docs/source/auto_examples/plot_OTDA_mapping.py  110
-rw-r--r--  docs/source/auto_examples/plot_OTDA_mapping.rst  186
-rw-r--r--  docs/source/auto_examples/plot_OTDA_mapping_color_images.ipynb  54
-rw-r--r--  docs/source/auto_examples/plot_OTDA_mapping_color_images.py  158
-rw-r--r--  docs/source/auto_examples/plot_OTDA_mapping_color_images.rst  246
-rw-r--r--  docs/source/auto_examples/plot_OT_1D.ipynb  76
-rw-r--r--  docs/source/auto_examples/plot_OT_1D.py  65
-rw-r--r--  docs/source/auto_examples/plot_OT_1D.rst  175
-rw-r--r--  docs/source/auto_examples/plot_OT_2D_samples.ipynb  76
-rw-r--r--  docs/source/auto_examples/plot_OT_2D_samples.py  75
-rw-r--r--  docs/source/auto_examples/plot_OT_2D_samples.rst  174
-rw-r--r--  docs/source/auto_examples/plot_OT_L1_vs_L2.ipynb  76
-rw-r--r--  docs/source/auto_examples/plot_OT_L1_vs_L2.py  285
-rw-r--r--  docs/source/auto_examples/plot_OT_L1_vs_L2.rst  352
-rw-r--r--  docs/source/auto_examples/plot_OT_conv.ipynb  54
-rw-r--r--  docs/source/auto_examples/plot_OT_conv.py  200
-rw-r--r--  docs/source/auto_examples/plot_OT_conv.rst  241
-rw-r--r--  docs/source/auto_examples/plot_WDA.ipynb  94
-rw-r--r--  docs/source/auto_examples/plot_WDA.py  122
-rw-r--r--  docs/source/auto_examples/plot_WDA.rst  232
-rw-r--r--  docs/source/auto_examples/plot_barycenter_1D.ipynb  76
-rw-r--r--  docs/source/auto_examples/plot_barycenter_1D.py  136
-rw-r--r--  docs/source/auto_examples/plot_barycenter_1D.rst  213
-rw-r--r--  docs/source/auto_examples/plot_compute_emd.ipynb  76
-rw-r--r--  docs/source/auto_examples/plot_compute_emd.py  92
-rw-r--r--  docs/source/auto_examples/plot_compute_emd.rst  155
-rw-r--r--  docs/source/auto_examples/plot_optim_OTreg.ipynb  94
-rw-r--r--  docs/source/auto_examples/plot_optim_OTreg.py  111
-rw-r--r--  docs/source/auto_examples/plot_optim_OTreg.rst  242
-rw-r--r--  docs/source/auto_examples/plot_otda_classes.ipynb  126
-rw-r--r--  docs/source/auto_examples/plot_otda_classes.py  150
-rw-r--r--  docs/source/auto_examples/plot_otda_classes.rst  258
-rw-r--r--  docs/source/auto_examples/plot_otda_color_images.ipynb  144
-rw-r--r--  docs/source/auto_examples/plot_otda_color_images.py  165
-rw-r--r--  docs/source/auto_examples/plot_otda_color_images.rst  257
-rw-r--r--  docs/source/auto_examples/plot_otda_d2.ipynb  144
-rw-r--r--  docs/source/auto_examples/plot_otda_d2.py  172
-rw-r--r--  docs/source/auto_examples/plot_otda_d2.rst  264
-rw-r--r--  docs/source/auto_examples/plot_otda_mapping.ipynb  126
-rw-r--r--  docs/source/auto_examples/plot_otda_mapping.py  125
-rw-r--r--  docs/source/auto_examples/plot_otda_mapping.rst  228
-rw-r--r--  docs/source/auto_examples/plot_otda_mapping_colors_images.ipynb  144
-rw-r--r--  docs/source/auto_examples/plot_otda_mapping_colors_images.py  174
-rw-r--r--  docs/source/auto_examples/plot_otda_mapping_colors_images.rst  305
-rw-r--r--  docs/source/auto_examples/searchindex  bin  0 -> 1892352 bytes
-rw-r--r--  docs/source/conf.py  12
-rw-r--r--  docs/source/examples.rst  39
-rw-r--r--  docs/source/readme.rst  25
154 files changed, 5178 insertions(+), 3737 deletions(-)
diff --git a/docs/cache_nbrun b/docs/cache_nbrun
new file mode 100644
index 0000000..1510b2b
--- /dev/null
+++ b/docs/cache_nbrun
@@ -0,0 +1 @@
+{"plot_otda_mapping_colors_images.ipynb": "4f0587a00a3c082799a75a0ed36e9ce1", "plot_optim_OTreg.ipynb": "71d3c106b3f395a6b1001078a6ca6f8d", "plot_barycenter_1D.ipynb": "6fd8167f98816dc832fe0c58b1d5527b", "plot_WDA.ipynb": "27f8de4c6d7db46497076523673eedfb", "plot_OT_L1_vs_L2.ipynb": "e15219bf651a7e39e7c5c3934069894c", "plot_otda_color_images.ipynb": "d047d635f4987c81072383241590e21f", "plot_otda_classes.ipynb": "44bb8cd93317b5d342cd62e26d9bbe60", "plot_otda_d2.ipynb": "8ac4fd2ff899df0858ce1e5fead37f33", "plot_otda_mapping.ipynb": "d335a15af828aaa3439a1c67570d79d6", "plot_compute_emd.ipynb": "bd95981189df6adcb113d9b360ead734", "plot_OT_1D.ipynb": "e44c83f6112388ae18657cb0ad76d0e9", "plot_OT_2D_samples.ipynb": "3f125714daa35ff3cfe5dae1f71265c4"} \ No newline at end of file
diff --git a/docs/nb_build b/docs/nb_build
new file mode 100755
index 0000000..6abc6cf
--- /dev/null
+++ b/docs/nb_build
@@ -0,0 +1,15 @@
+#!/bin/bash
+
+
+# remove comment
+sed -i "s/#'sphinx\_gallery/'sphinx\_gallery/" source/conf.py
+sed -i "s/sys.modules.update/#sys.modules.update/" source/conf.py
+
+make html
+
+# put comment again
+sed -i "s/'sphinx\_gallery/#'sphinx\_gallery/" source/conf.py
+sed -i "s/#sys.modules.update/sys.modules.update/" source/conf.py
+
+#rsync --out-format="%n" --update source/auto_examples/*.ipynb ../notebooks2
+./nb_run_conv
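The nb_build script above works by temporarily uncommenting the sphinx_gallery extension in source/conf.py with sed, building, then re-commenting it. A minimal standalone sketch of that comment-toggle trick (using a hypothetical demo_conf.py rather than the real conf.py, and GNU sed's -i flag):

```shell
# Start from a conf file where the gallery extension is commented out
# (illustrative content; the real conf.py line looks similar).
printf "#'sphinx_gallery.gen_gallery',\n" > demo_conf.py

# Enable: strip the leading '#' in place, as nb_build does before "make html"
sed -i "s/#'sphinx_gallery/'sphinx_gallery/" demo_conf.py
grep -q "^'sphinx_gallery" demo_conf.py && echo "enabled"

# Disable: put the comment back, as nb_build does after the build
sed -i "s/'sphinx_gallery/#'sphinx_gallery/" demo_conf.py
grep -q "^#'sphinx_gallery" demo_conf.py && echo "disabled"
```

Note this round-trip only works because the two substitutions are exact inverses; an unrelated line containing 'sphinx_gallery would be commented out too.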
diff --git a/docs/nb_run_conv b/docs/nb_run_conv
new file mode 100755
index 0000000..ad5e432
--- /dev/null
+++ b/docs/nb_run_conv
@@ -0,0 +1,82 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+"""
+
+Convert sphinx gallery notebook from empty to image filled
+
+Created on Fri Sep 1 16:43:45 2017
+
+@author: rflamary
+"""
+
+import sys
+import json
+import glob
+import hashlib
+import subprocess
+
+import os
+
+cache_file='cache_nbrun'
+
+path_doc='source/auto_examples/'
+path_nb='../notebooks/'
+
+def load_json(fname):
+ try:
+ f=open(fname)
+ nb=json.load(f)
+ f.close()
+ except (OSError, IOError) :
+ nb={}
+ return nb
+
+def save_json(fname,nb):
+ f=open(fname,'w')
+ f.write(json.dumps(nb))
+ f.close()
+
+
+def md5(fname):
+ hash_md5 = hashlib.md5()
+ with open(fname, "rb") as f:
+ for chunk in iter(lambda: f.read(4096), b""):
+ hash_md5.update(chunk)
+ return hash_md5.hexdigest()
+
+def to_update(fname,cache):
+ if fname in cache:
+ if md5(path_doc+fname)==cache[fname]:
+ res=False
+ else:
+ res=True
+ else:
+ res=True
+
+ return res
+
+def update(fname,cache):
+
+ # jupyter nbconvert --to notebook --execute mynotebook.ipynb --output target
+ subprocess.check_call(['cp',path_doc+fname,path_nb])
+ print(' '.join(['jupyter','nbconvert','--to','notebook','--ExecutePreprocessor.timeout=600','--execute',path_nb+fname,'--inplace']))
+ subprocess.check_call(['jupyter','nbconvert','--to','notebook','--ExecutePreprocessor.timeout=600','--execute',path_nb+fname,'--inplace'])
+ cache[fname]=md5(path_doc+fname)
+
+
+
+cache=load_json(cache_file)
+
+lst_file=glob.glob(path_doc+'*.ipynb')
+
+lst_file=[os.path.basename(name) for name in lst_file]
+
+for fname in lst_file:
+ if to_update(fname,cache):
+ print('Updating file: {}'.format(fname))
+ update(fname,cache)
+ save_json(cache_file,cache)
+
+
+
+
diff --git a/docs/source/auto_examples/auto_examples_jupyter.zip b/docs/source/auto_examples/auto_examples_jupyter.zip
index 7c3de28..fc1d4de 100644
--- a/docs/source/auto_examples/auto_examples_jupyter.zip
+++ b/docs/source/auto_examples/auto_examples_jupyter.zip
Binary files differ
diff --git a/docs/source/auto_examples/auto_examples_python.zip b/docs/source/auto_examples/auto_examples_python.zip
index 97377e1..282bc58 100644
--- a/docs/source/auto_examples/auto_examples_python.zip
+++ b/docs/source/auto_examples/auto_examples_python.zip
Binary files differ
diff --git a/docs/source/auto_examples/demo_OT_1D_test.ipynb b/docs/source/auto_examples/demo_OT_1D_test.ipynb
deleted file mode 100644
index 87317ea..0000000
--- a/docs/source/auto_examples/demo_OT_1D_test.ipynb
+++ /dev/null
@@ -1,54 +0,0 @@
-{
- "nbformat_minor": 0,
- "nbformat": 4,
- "cells": [
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "%matplotlib inline"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- },
- {
- "source": [
- "\nDemo for 1D optimal transport\n\n@author: rflamary\n\n"
- ],
- "cell_type": "markdown",
- "metadata": {}
- },
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\nfrom ot.datasets import get_1D_gauss as gauss\n\n\n#%% parameters\n\nn=100 # nb bins\n\n# bin positions\nx=np.arange(n,dtype=np.float64)\n\n# Gaussian distributions\na=gauss(n,m=n*.2,s=5) # m= mean, s= std\nb=gauss(n,m=n*.6,s=10)\n\n# loss matrix\nM=ot.dist(x.reshape((n,1)),x.reshape((n,1)))\nM/=M.max()\n\n#%% plot the distributions\n\npl.figure(1)\npl.plot(x,a,'b',label='Source distribution')\npl.plot(x,b,'r',label='Target distribution')\npl.legend()\n\n#%% plot distributions and loss matrix\n\npl.figure(2)\not.plot.plot1D_mat(a,b,M,'Cost matrix M')\n\n#%% EMD\n\nG0=ot.emd(a,b,M)\n\npl.figure(3)\not.plot.plot1D_mat(a,b,G0,'OT matrix G0')\n\n#%% Sinkhorn\n\nlambd=1e-3\nGs=ot.sinkhorn(a,b,M,lambd,verbose=True)\n\npl.figure(4)\not.plot.plot1D_mat(a,b,Gs,'OT matrix Sinkhorn')\n\n#%% Sinkhorn\n\nlambd=1e-4\nGss,log=ot.bregman.sinkhorn_stabilized(a,b,M,lambd,verbose=True,log=True)\nGss2,log2=ot.bregman.sinkhorn_stabilized(a,b,M,lambd,verbose=True,log=True,warmstart=log['warmstart'])\n\npl.figure(5)\not.plot.plot1D_mat(a,b,Gss,'OT matrix Sinkhorn stabilized')\n\n#%% Sinkhorn\n\nlambd=1e-11\nGss=ot.bregman.sinkhorn_epsilon_scaling(a,b,M,lambd,verbose=True)\n\npl.figure(5)\not.plot.plot1D_mat(a,b,Gss,'OT matrix Sinkhorn stabilized')"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 2",
- "name": "python2",
- "language": "python"
- },
- "language_info": {
- "mimetype": "text/x-python",
- "nbconvert_exporter": "python",
- "name": "python",
- "file_extension": ".py",
- "version": "2.7.12",
- "pygments_lexer": "ipython2",
- "codemirror_mode": {
- "version": 2,
- "name": "ipython"
- }
- }
- }
-} \ No newline at end of file
diff --git a/docs/source/auto_examples/demo_OT_1D_test.py b/docs/source/auto_examples/demo_OT_1D_test.py
deleted file mode 100644
index 9edc377..0000000
--- a/docs/source/auto_examples/demo_OT_1D_test.py
+++ /dev/null
@@ -1,71 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-Demo for 1D optimal transport
-
-@author: rflamary
-"""
-
-import numpy as np
-import matplotlib.pylab as pl
-import ot
-from ot.datasets import get_1D_gauss as gauss
-
-
-#%% parameters
-
-n=100 # nb bins
-
-# bin positions
-x=np.arange(n,dtype=np.float64)
-
-# Gaussian distributions
-a=gauss(n,m=n*.2,s=5) # m= mean, s= std
-b=gauss(n,m=n*.6,s=10)
-
-# loss matrix
-M=ot.dist(x.reshape((n,1)),x.reshape((n,1)))
-M/=M.max()
-
-#%% plot the distributions
-
-pl.figure(1)
-pl.plot(x,a,'b',label='Source distribution')
-pl.plot(x,b,'r',label='Target distribution')
-pl.legend()
-
-#%% plot distributions and loss matrix
-
-pl.figure(2)
-ot.plot.plot1D_mat(a,b,M,'Cost matrix M')
-
-#%% EMD
-
-G0=ot.emd(a,b,M)
-
-pl.figure(3)
-ot.plot.plot1D_mat(a,b,G0,'OT matrix G0')
-
-#%% Sinkhorn
-
-lambd=1e-3
-Gs=ot.sinkhorn(a,b,M,lambd,verbose=True)
-
-pl.figure(4)
-ot.plot.plot1D_mat(a,b,Gs,'OT matrix Sinkhorn')
-
-#%% Sinkhorn
-
-lambd=1e-4
-Gss,log=ot.bregman.sinkhorn_stabilized(a,b,M,lambd,verbose=True,log=True)
-Gss2,log2=ot.bregman.sinkhorn_stabilized(a,b,M,lambd,verbose=True,log=True,warmstart=log['warmstart'])
-
-pl.figure(5)
-ot.plot.plot1D_mat(a,b,Gss,'OT matrix Sinkhorn stabilized')
-
-#%% Sinkhorn
-
-lambd=1e-11
-Gss=ot.bregman.sinkhorn_epsilon_scaling(a,b,M,lambd,verbose=True)
-
-pl.figure(5)
-ot.plot.plot1D_mat(a,b,Gss,'OT matrix Sinkhorn stabilized')
diff --git a/docs/source/auto_examples/demo_OT_1D_test.rst b/docs/source/auto_examples/demo_OT_1D_test.rst
deleted file mode 100644
index aebeb1d..0000000
--- a/docs/source/auto_examples/demo_OT_1D_test.rst
+++ /dev/null
@@ -1,99 +0,0 @@
-
-
-.. _sphx_glr_auto_examples_demo_OT_1D_test.py:
-
-
-Demo for 1D optimal transport
-
-@author: rflamary
-
-
-
-.. code-block:: python
-
-
- import numpy as np
- import matplotlib.pylab as pl
- import ot
- from ot.datasets import get_1D_gauss as gauss
-
-
- #%% parameters
-
- n=100 # nb bins
-
- # bin positions
- x=np.arange(n,dtype=np.float64)
-
- # Gaussian distributions
- a=gauss(n,m=n*.2,s=5) # m= mean, s= std
- b=gauss(n,m=n*.6,s=10)
-
- # loss matrix
- M=ot.dist(x.reshape((n,1)),x.reshape((n,1)))
- M/=M.max()
-
- #%% plot the distributions
-
- pl.figure(1)
- pl.plot(x,a,'b',label='Source distribution')
- pl.plot(x,b,'r',label='Target distribution')
- pl.legend()
-
- #%% plot distributions and loss matrix
-
- pl.figure(2)
- ot.plot.plot1D_mat(a,b,M,'Cost matrix M')
-
- #%% EMD
-
- G0=ot.emd(a,b,M)
-
- pl.figure(3)
- ot.plot.plot1D_mat(a,b,G0,'OT matrix G0')
-
- #%% Sinkhorn
-
- lambd=1e-3
- Gs=ot.sinkhorn(a,b,M,lambd,verbose=True)
-
- pl.figure(4)
- ot.plot.plot1D_mat(a,b,Gs,'OT matrix Sinkhorn')
-
- #%% Sinkhorn
-
- lambd=1e-4
- Gss,log=ot.bregman.sinkhorn_stabilized(a,b,M,lambd,verbose=True,log=True)
- Gss2,log2=ot.bregman.sinkhorn_stabilized(a,b,M,lambd,verbose=True,log=True,warmstart=log['warmstart'])
-
- pl.figure(5)
- ot.plot.plot1D_mat(a,b,Gss,'OT matrix Sinkhorn stabilized')
-
- #%% Sinkhorn
-
- lambd=1e-11
- Gss=ot.bregman.sinkhorn_epsilon_scaling(a,b,M,lambd,verbose=True)
-
- pl.figure(5)
- ot.plot.plot1D_mat(a,b,Gss,'OT matrix Sinkhorn stabilized')
-
-**Total running time of the script:** ( 0 minutes 0.000 seconds)
-
-
-
-.. container:: sphx-glr-footer
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Python source code: demo_OT_1D_test.py <demo_OT_1D_test.py>`
-
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Jupyter notebook: demo_OT_1D_test.ipynb <demo_OT_1D_test.ipynb>`
-
-.. rst-class:: sphx-glr-signature
-
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/demo_OT_2D_sampleslarge.ipynb b/docs/source/auto_examples/demo_OT_2D_sampleslarge.ipynb
deleted file mode 100644
index 584a936..0000000
--- a/docs/source/auto_examples/demo_OT_2D_sampleslarge.ipynb
+++ /dev/null
@@ -1,54 +0,0 @@
-{
- "nbformat_minor": 0,
- "nbformat": 4,
- "cells": [
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "%matplotlib inline"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- },
- {
- "source": [
- "\nDemo for 2D Optimal transport between empirical distributions\n\n@author: rflamary\n\n"
- ],
- "cell_type": "markdown",
- "metadata": {}
- },
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\n\n#%% parameters and data generation\n\nn=5000 # nb samples\n\nmu_s=np.array([0,0])\ncov_s=np.array([[1,0],[0,1]])\n\nmu_t=np.array([4,4])\ncov_t=np.array([[1,-.8],[-.8,1]])\n\nxs=ot.datasets.get_2D_samples_gauss(n,mu_s,cov_s)\nxt=ot.datasets.get_2D_samples_gauss(n,mu_t,cov_t)\n\na,b = ot.unif(n),ot.unif(n) # uniform distribution on samples\n\n# loss matrix\nM=ot.dist(xs,xt)\nM/=M.max()\n\n#%% plot samples\n\n#pl.figure(1)\n#pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')\n#pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')\n#pl.legend(loc=0)\n#pl.title('Source and traget distributions')\n#\n#pl.figure(2)\n#pl.imshow(M,interpolation='nearest')\n#pl.title('Cost matrix M')\n#\n\n#%% EMD\n\nG0=ot.emd(a,b,M)\n\n#pl.figure(3)\n#pl.imshow(G0,interpolation='nearest')\n#pl.title('OT matrix G0')\n#\n#pl.figure(4)\n#ot.plot.plot2D_samples_mat(xs,xt,G0,c=[.5,.5,1])\n#pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')\n#pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')\n#pl.legend(loc=0)\n#pl.title('OT matrix with samples')\n\n\n#%% sinkhorn\n\n# reg term\nlambd=5e-3\n\nGs=ot.sinkhorn(a,b,M,lambd)\n\n#pl.figure(5)\n#pl.imshow(Gs,interpolation='nearest')\n#pl.title('OT matrix sinkhorn')\n#\n#pl.figure(6)\n#ot.plot.plot2D_samples_mat(xs,xt,Gs,color=[.5,.5,1])\n#pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')\n#pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')\n#pl.legend(loc=0)\n#pl.title('OT matrix Sinkhorn with samples')\n#"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 2",
- "name": "python2",
- "language": "python"
- },
- "language_info": {
- "mimetype": "text/x-python",
- "nbconvert_exporter": "python",
- "name": "python",
- "file_extension": ".py",
- "version": "2.7.12",
- "pygments_lexer": "ipython2",
- "codemirror_mode": {
- "version": 2,
- "name": "ipython"
- }
- }
- }
-} \ No newline at end of file
diff --git a/docs/source/auto_examples/demo_OT_2D_sampleslarge.py b/docs/source/auto_examples/demo_OT_2D_sampleslarge.py
deleted file mode 100644
index ee3e8f7..0000000
--- a/docs/source/auto_examples/demo_OT_2D_sampleslarge.py
+++ /dev/null
@@ -1,78 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-Demo for 2D Optimal transport between empirical distributions
-
-@author: rflamary
-"""
-
-import numpy as np
-import matplotlib.pylab as pl
-import ot
-
-#%% parameters and data generation
-
-n=5000 # nb samples
-
-mu_s=np.array([0,0])
-cov_s=np.array([[1,0],[0,1]])
-
-mu_t=np.array([4,4])
-cov_t=np.array([[1,-.8],[-.8,1]])
-
-xs=ot.datasets.get_2D_samples_gauss(n,mu_s,cov_s)
-xt=ot.datasets.get_2D_samples_gauss(n,mu_t,cov_t)
-
-a,b = ot.unif(n),ot.unif(n) # uniform distribution on samples
-
-# loss matrix
-M=ot.dist(xs,xt)
-M/=M.max()
-
-#%% plot samples
-
-#pl.figure(1)
-#pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
-#pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
-#pl.legend(loc=0)
-#pl.title('Source and traget distributions')
-#
-#pl.figure(2)
-#pl.imshow(M,interpolation='nearest')
-#pl.title('Cost matrix M')
-#
-
-#%% EMD
-
-G0=ot.emd(a,b,M)
-
-#pl.figure(3)
-#pl.imshow(G0,interpolation='nearest')
-#pl.title('OT matrix G0')
-#
-#pl.figure(4)
-#ot.plot.plot2D_samples_mat(xs,xt,G0,c=[.5,.5,1])
-#pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
-#pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
-#pl.legend(loc=0)
-#pl.title('OT matrix with samples')
-
-
-#%% sinkhorn
-
-# reg term
-lambd=5e-3
-
-Gs=ot.sinkhorn(a,b,M,lambd)
-
-#pl.figure(5)
-#pl.imshow(Gs,interpolation='nearest')
-#pl.title('OT matrix sinkhorn')
-#
-#pl.figure(6)
-#ot.plot.plot2D_samples_mat(xs,xt,Gs,color=[.5,.5,1])
-#pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
-#pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
-#pl.legend(loc=0)
-#pl.title('OT matrix Sinkhorn with samples')
-#
-
diff --git a/docs/source/auto_examples/demo_OT_2D_sampleslarge.rst b/docs/source/auto_examples/demo_OT_2D_sampleslarge.rst
deleted file mode 100644
index f5dbb0d..0000000
--- a/docs/source/auto_examples/demo_OT_2D_sampleslarge.rst
+++ /dev/null
@@ -1,106 +0,0 @@
-
-
-.. _sphx_glr_auto_examples_demo_OT_2D_sampleslarge.py:
-
-
-Demo for 2D Optimal transport between empirical distributions
-
-@author: rflamary
-
-
-
-.. code-block:: python
-
-
- import numpy as np
- import matplotlib.pylab as pl
- import ot
-
- #%% parameters and data generation
-
- n=5000 # nb samples
-
- mu_s=np.array([0,0])
- cov_s=np.array([[1,0],[0,1]])
-
- mu_t=np.array([4,4])
- cov_t=np.array([[1,-.8],[-.8,1]])
-
- xs=ot.datasets.get_2D_samples_gauss(n,mu_s,cov_s)
- xt=ot.datasets.get_2D_samples_gauss(n,mu_t,cov_t)
-
- a,b = ot.unif(n),ot.unif(n) # uniform distribution on samples
-
- # loss matrix
- M=ot.dist(xs,xt)
- M/=M.max()
-
- #%% plot samples
-
- #pl.figure(1)
- #pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- #pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- #pl.legend(loc=0)
- #pl.title('Source and traget distributions')
- #
- #pl.figure(2)
- #pl.imshow(M,interpolation='nearest')
- #pl.title('Cost matrix M')
- #
-
- #%% EMD
-
- G0=ot.emd(a,b,M)
-
- #pl.figure(3)
- #pl.imshow(G0,interpolation='nearest')
- #pl.title('OT matrix G0')
- #
- #pl.figure(4)
- #ot.plot.plot2D_samples_mat(xs,xt,G0,c=[.5,.5,1])
- #pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- #pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- #pl.legend(loc=0)
- #pl.title('OT matrix with samples')
-
-
- #%% sinkhorn
-
- # reg term
- lambd=5e-3
-
- Gs=ot.sinkhorn(a,b,M,lambd)
-
- #pl.figure(5)
- #pl.imshow(Gs,interpolation='nearest')
- #pl.title('OT matrix sinkhorn')
- #
- #pl.figure(6)
- #ot.plot.plot2D_samples_mat(xs,xt,Gs,color=[.5,.5,1])
- #pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- #pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- #pl.legend(loc=0)
- #pl.title('OT matrix Sinkhorn with samples')
- #
-
-
-**Total running time of the script:** ( 0 minutes 0.000 seconds)
-
-
-
-.. container:: sphx-glr-footer
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Python source code: demo_OT_2D_sampleslarge.py <demo_OT_2D_sampleslarge.py>`
-
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Jupyter notebook: demo_OT_2D_sampleslarge.ipynb <demo_OT_2D_sampleslarge.ipynb>`
-
-.. rst-class:: sphx-glr-signature
-
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_001.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_001.png
deleted file mode 100644
index 7de2b45..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_001.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_002.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_002.png
deleted file mode 100644
index dc34efd..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_002.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_003.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_003.png
deleted file mode 100644
index fbd72d5..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_003.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_004.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_004.png
deleted file mode 100644
index 227812d..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_2D_004.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_classes_001.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_classes_001.png
deleted file mode 100644
index 2bf4015..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_classes_001.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_classes_004.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_classes_004.png
deleted file mode 100644
index c1fbf57..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_classes_004.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_color_images_001.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_color_images_001.png
deleted file mode 100644
index 36bc769..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_color_images_001.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_color_images_002.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_color_images_002.png
deleted file mode 100644
index 307e384..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_color_images_002.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_001.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_001.png
deleted file mode 100644
index 8c700ee..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_001.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_002.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_002.png
deleted file mode 100644
index 792b404..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_002.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_color_images_001.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_color_images_001.png
deleted file mode 100644
index 36bc769..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_color_images_001.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_color_images_002.png b/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_color_images_002.png
deleted file mode 100644
index 008bf15..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OTDA_mapping_color_images_002.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_001.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_001.png
index da42bc1..e11f5b9 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_001.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_002.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_002.png
index 1f98598..fcab0bd 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_002.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_002.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_003.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_003.png
deleted file mode 100644
index 9e893d6..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_003.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_004.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_004.png
deleted file mode 100644
index 3bc248b..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_004.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_005.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_005.png
new file mode 100644
index 0000000..a75e649
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_005.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_007.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_007.png
new file mode 100644
index 0000000..96b42cd
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_1D_007.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_001.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_001.png
index e023ab4..ba50e23 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_001.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_002.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_002.png
index dda21d4..19978ff 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_002.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_002.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_003.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_003.png
deleted file mode 100644
index f0967fb..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_003.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_004.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_004.png
deleted file mode 100644
index 809c8fc..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_004.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_005.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_005.png
index 887bdde..aed13b2 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_005.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_005.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_006.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_006.png
index 783c594..8ea40f1 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_006.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_006.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_009.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_009.png
new file mode 100644
index 0000000..404e9d8
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_009.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_010.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_010.png
new file mode 100644
index 0000000..56b79cf
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_2D_samples_010.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_001.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_001.png
index b159a6a..6a21f35 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_001.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_002.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_002.png
index 9f8e882..79e4710 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_002.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_002.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_003.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_003.png
index 33058fc..4860d96 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_003.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_003.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_004.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_004.png
index 9848bcb..6a21f35 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_004.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_004.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_005.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_005.png
index 6616d3c..4860d96 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_005.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_005.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_006.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_006.png
index 8575d93..4860d96 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_006.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_006.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_007.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_007.png
new file mode 100644
index 0000000..22dba2b
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_007.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_008.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_008.png
new file mode 100644
index 0000000..5dbf96b
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_008.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_009.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_009.png
new file mode 100644
index 0000000..e1e9ba8
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_009.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_011.png b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_011.png
new file mode 100644
index 0000000..e1e9ba8
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_011.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_WDA_001.png b/docs/source/auto_examples/images/sphx_glr_plot_WDA_001.png
index 250155d..f724332 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_WDA_001.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_WDA_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_WDA_003.png b/docs/source/auto_examples/images/sphx_glr_plot_WDA_003.png
new file mode 100644
index 0000000..b231020
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_WDA_003.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_001.png b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_001.png
index be71674..3454396 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_001.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_002.png b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_002.png
index f62240b..3b23af5 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_002.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_002.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_003.png b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_003.png
index 11f08b2..3b23af5 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_003.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_003.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_004.png b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_004.png
index b4e8f71..2e29ff9 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_004.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_004.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_005.png b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_005.png
new file mode 100644
index 0000000..eac9230
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_005.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_006.png b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_006.png
new file mode 100644
index 0000000..2e29ff9
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_barycenter_1D_006.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_001.png b/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_001.png
index 4917903..9cf84c6 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_001.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_003.png b/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_003.png
new file mode 100644
index 0000000..eb4bd0d
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_003.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_002.png b/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_004.png
index 7c06255..7c06255 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_002.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_compute_emd_004.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_003.png b/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_003.png
index 7ffcc14..a75e649 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_003.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_003.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_004.png b/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_004.png
index 2a72060..7afdb53 100644
--- a/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_004.png
+++ b/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_004.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_005.png b/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_005.png
deleted file mode 100644
index e70a6de..0000000
--- a/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_005.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_006.png b/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_006.png
new file mode 100644
index 0000000..0af4542
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_006.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_008.png b/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_008.png
new file mode 100644
index 0000000..8a4882a
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_optim_OTreg_008.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_classes_001.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_classes_001.png
new file mode 100644
index 0000000..bedc950
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_classes_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_classes_003.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_classes_003.png
new file mode 100644
index 0000000..8e3ccad
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_classes_003.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_001.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_001.png
new file mode 100644
index 0000000..2d851c7
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_003.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_003.png
new file mode 100644
index 0000000..a1d99ab
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_003.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_005.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_005.png
new file mode 100644
index 0000000..f76619b
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_color_images_005.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_d2_001.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_d2_001.png
new file mode 100644
index 0000000..3d6a740
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_d2_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_d2_003.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_d2_003.png
new file mode 100644
index 0000000..aa16585
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_d2_003.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_d2_006.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_d2_006.png
new file mode 100644
index 0000000..5dc3ba7
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_d2_006.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_001.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_001.png
new file mode 100644
index 0000000..4239465
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_003.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_003.png
new file mode 100644
index 0000000..620105e
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_003.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_001.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_001.png
new file mode 100644
index 0000000..1182082
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_001.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_003.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_003.png
new file mode 100644
index 0000000..cc2e4cd
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_003.png
Binary files differ
diff --git a/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_004.png b/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_004.png
new file mode 100644
index 0000000..7a68343
--- /dev/null
+++ b/docs/source/auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_004.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_2D_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_2D_thumb.png
deleted file mode 100644
index d15269d..0000000
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_2D_thumb.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_classes_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_classes_thumb.png
deleted file mode 100644
index 5863d02..0000000
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_classes_thumb.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_color_images_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_color_images_thumb.png
deleted file mode 100644
index 5bb43c4..0000000
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_color_images_thumb.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_mapping_color_images_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_mapping_color_images_thumb.png
deleted file mode 100644
index 5bb43c4..0000000
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_mapping_color_images_thumb.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_mapping_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_mapping_thumb.png
deleted file mode 100644
index c3d9a65..0000000
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OTDA_mapping_thumb.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_1D_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_1D_thumb.png
index 15c9825..a3b7039 100644
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_1D_thumb.png
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_1D_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_2D_samples_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_2D_samples_thumb.png
index bac78f0..dbb5cfd 100644
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_2D_samples_thumb.png
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_2D_samples_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_L1_vs_L2_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_L1_vs_L2_thumb.png
index c67e8aa..95588f5 100644
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_L1_vs_L2_thumb.png
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_L1_vs_L2_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_conv_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_conv_thumb.png
deleted file mode 100644
index 3015582..0000000
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_OT_conv_thumb.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_WDA_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_WDA_thumb.png
index 84759e8..f55490c 100644
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_WDA_thumb.png
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_WDA_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_barycenter_1D_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_barycenter_1D_thumb.png
index 86ff19f..d8cdccb 100644
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_barycenter_1D_thumb.png
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_barycenter_1D_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_compute_emd_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_compute_emd_thumb.png
index 67d2ca1..898cd72 100644
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_compute_emd_thumb.png
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_compute_emd_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_optim_OTreg_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_optim_OTreg_thumb.png
index 2a72060..cbc8e0f 100644
--- a/docs/source/auto_examples/images/thumb/sphx_glr_plot_optim_OTreg_thumb.png
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_optim_OTreg_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_classes_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_classes_thumb.png
new file mode 100644
index 0000000..a72fe37
--- /dev/null
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_classes_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_color_images_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_color_images_thumb.png
new file mode 100644
index 0000000..16b7572
--- /dev/null
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_color_images_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_d2_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_d2_thumb.png
new file mode 100644
index 0000000..cddf768
--- /dev/null
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_d2_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_mapping_colors_images_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_mapping_colors_images_thumb.png
new file mode 100644
index 0000000..9666955
--- /dev/null
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_mapping_colors_images_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_mapping_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_mapping_thumb.png
new file mode 100644
index 0000000..959cc44
--- /dev/null
+++ b/docs/source/auto_examples/images/thumb/sphx_glr_plot_otda_mapping_thumb.png
Binary files differ
diff --git a/docs/source/auto_examples/images/thumb/sphx_glr_test_OT_2D_samples_stabilized_thumb.png b/docs/source/auto_examples/images/thumb/sphx_glr_test_OT_2D_samples_stabilized_thumb.png
deleted file mode 100644
index cbc8e0f..0000000
--- a/docs/source/auto_examples/images/thumb/sphx_glr_test_OT_2D_samples_stabilized_thumb.png
+++ /dev/null
Binary files differ
diff --git a/docs/source/auto_examples/index.rst b/docs/source/auto_examples/index.rst
index 1695300..5d7a53d 100644
--- a/docs/source/auto_examples/index.rst
+++ b/docs/source/auto_examples/index.rst
@@ -1,9 +1,15 @@
+:orphan:
+
POT Examples
============
+This is a gallery of all the POT example files.
+
+
+
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip="@author: rflamary ">
+ <div class="sphx-glr-thumbcontainer" tooltip="This example illustrates the computation of EMD and Sinkhorn transport plans and their visualiz...">
.. only:: html
@@ -23,13 +29,13 @@ POT Examples
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip="@author: rflamary ">
+ <div class="sphx-glr-thumbcontainer" tooltip="Illustrates the use of the generic solver for regularized OT with user-designed regularization ...">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_WDA_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_optim_OTreg_thumb.png
- :ref:`sphx_glr_auto_examples_plot_WDA.py`
+ :ref:`sphx_glr_auto_examples_plot_optim_OTreg.py`
.. raw:: html
@@ -39,17 +45,17 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_WDA
+ /auto_examples/plot_optim_OTreg
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip=" ">
+ <div class="sphx-glr-thumbcontainer" tooltip="Illustration of 2D optimal transport between distributions that are weighted sums of Diracs. The...">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_optim_OTreg_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_OT_2D_samples_thumb.png
- :ref:`sphx_glr_auto_examples_plot_optim_OTreg.py`
+ :ref:`sphx_glr_auto_examples_plot_OT_2D_samples.py`
.. raw:: html
@@ -59,17 +65,17 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_optim_OTreg
+ /auto_examples/plot_OT_2D_samples
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip="@author: rflamary ">
+ <div class="sphx-glr-thumbcontainer" tooltip="Shows how to compute multiple EMD and Sinkhorn with two different ground metrics and plot their ...">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_OT_2D_samples_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_compute_emd_thumb.png
- :ref:`sphx_glr_auto_examples_plot_OT_2D_samples.py`
+ :ref:`sphx_glr_auto_examples_plot_compute_emd.py`
.. raw:: html
@@ -79,17 +85,17 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_OT_2D_samples
+ /auto_examples/plot_compute_emd
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip="@author: rflamary ">
+ <div class="sphx-glr-thumbcontainer" tooltip="This example illustrates the use of WDA as proposed in [11].">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_compute_emd_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_WDA_thumb.png
- :ref:`sphx_glr_auto_examples_plot_compute_emd.py`
+ :ref:`sphx_glr_auto_examples_plot_WDA.py`
.. raw:: html
@@ -99,17 +105,17 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_compute_emd
+ /auto_examples/plot_WDA
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip="[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized discrete optima...">
+ <div class="sphx-glr-thumbcontainer" tooltip="This example presents a way of transferring colors between two images with Optimal Transport as ...">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_OTDA_color_images_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_otda_color_images_thumb.png
- :ref:`sphx_glr_auto_examples_plot_OTDA_color_images.py`
+ :ref:`sphx_glr_auto_examples_plot_otda_color_images.py`
.. raw:: html
@@ -119,17 +125,17 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_OTDA_color_images
+ /auto_examples/plot_otda_color_images
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip="">
+ <div class="sphx-glr-thumbcontainer" tooltip="This example illustrates the computation of regularized Wasserstein Barycenter as proposed in [...">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_OTDA_classes_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_barycenter_1D_thumb.png
- :ref:`sphx_glr_auto_examples_plot_OTDA_classes.py`
+ :ref:`sphx_glr_auto_examples_plot_barycenter_1D.py`
.. raw:: html
@@ -139,17 +145,17 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_OTDA_classes
+ /auto_examples/plot_barycenter_1D
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip="">
+ <div class="sphx-glr-thumbcontainer" tooltip="OT for domain adaptation with image color adaptation [6] with mapping estimation [8].">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_OTDA_2D_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_otda_mapping_colors_images_thumb.png
- :ref:`sphx_glr_auto_examples_plot_OTDA_2D.py`
+ :ref:`sphx_glr_auto_examples_plot_otda_mapping_colors_images.py`
.. raw:: html
@@ -159,17 +165,17 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_OTDA_2D
+ /auto_examples/plot_otda_mapping_colors_images
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip="Stole the figure idea from Fig. 1 and 2 in https://arxiv.org/pdf/1706.07650.pdf">
+ <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to use MappingTransport to estimate at the same time both the couplin...">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_OT_L1_vs_L2_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_otda_mapping_thumb.png
- :ref:`sphx_glr_auto_examples_plot_OT_L1_vs_L2.py`
+ :ref:`sphx_glr_auto_examples_plot_otda_mapping.py`
.. raw:: html
@@ -179,17 +185,17 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_OT_L1_vs_L2
+ /auto_examples/plot_otda_mapping
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip=" @author: rflamary ">
+ <div class="sphx-glr-thumbcontainer" tooltip="This example introduces a domain adaptation problem in a 2D setting and the 4 OTDA approaches currently...">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_barycenter_1D_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_otda_classes_thumb.png
- :ref:`sphx_glr_auto_examples_plot_barycenter_1D.py`
+ :ref:`sphx_glr_auto_examples_plot_otda_classes.py`
.. raw:: html
@@ -199,17 +205,17 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_barycenter_1D
+ /auto_examples/plot_otda_classes
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip="[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized discrete op...">
+ <div class="sphx-glr-thumbcontainer" tooltip="This example introduces a domain adaptation in a 2D setting. It explicits the problem of domain...">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_OTDA_mapping_color_images_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_otda_d2_thumb.png
- :ref:`sphx_glr_auto_examples_plot_OTDA_mapping_color_images.py`
+ :ref:`sphx_glr_auto_examples_plot_otda_d2.py`
.. raw:: html
@@ -219,17 +225,17 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_OTDA_mapping_color_images
+ /auto_examples/plot_otda_d2
.. raw:: html
- <div class="sphx-glr-thumbcontainer" tooltip="[8] M. Perrot, N. Courty, R. Flamary, A. Habrard, "Mapping estimation for discrete optimal ...">
+ <div class="sphx-glr-thumbcontainer" tooltip="2D OT on empirical distributions with different ground metrics.">
.. only:: html
- .. figure:: /auto_examples/images/thumb/sphx_glr_plot_OTDA_mapping_thumb.png
+ .. figure:: /auto_examples/images/thumb/sphx_glr_plot_OT_L1_vs_L2_thumb.png
- :ref:`sphx_glr_auto_examples_plot_OTDA_mapping.py`
+ :ref:`sphx_glr_auto_examples_plot_OT_L1_vs_L2.py`
.. raw:: html
@@ -239,7 +245,7 @@ POT Examples
.. toctree::
:hidden:
- /auto_examples/plot_OTDA_mapping
+ /auto_examples/plot_OT_L1_vs_L2
.. raw:: html
<div style='clear:both'></div>
@@ -251,14 +257,14 @@ POT Examples
.. container:: sphx-glr-download
- :download:`Download all examples in Python source code: auto_examples_python.zip </auto_examples/auto_examples_python.zip>`
+ :download:`Download all examples in Python source code: auto_examples_python.zip <//home/rflamary/PYTHON/POT/docs/source/auto_examples/auto_examples_python.zip>`
.. container:: sphx-glr-download
- :download:`Download all examples in Jupyter notebooks: auto_examples_jupyter.zip </auto_examples/auto_examples_jupyter.zip>`
+ :download:`Download all examples in Jupyter notebooks: auto_examples_jupyter.zip <//home/rflamary/PYTHON/POT/docs/source/auto_examples/auto_examples_jupyter.zip>`
.. rst-class:: sphx-glr-signature
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_OTDA_2D.ipynb b/docs/source/auto_examples/plot_OTDA_2D.ipynb
deleted file mode 100644
index 2ffb256..0000000
--- a/docs/source/auto_examples/plot_OTDA_2D.ipynb
+++ /dev/null
@@ -1,54 +0,0 @@
-{
- "nbformat_minor": 0,
- "nbformat": 4,
- "cells": [
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "%matplotlib inline"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- },
- {
- "source": [
- "\n# OT for empirical distributions\n\n\n\n"
- ],
- "cell_type": "markdown",
- "metadata": {}
- },
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\n\n\n\n#%% parameters\n\nn=150 # nb bins\n\nxs,ys=ot.datasets.get_data_classif('3gauss',n)\nxt,yt=ot.datasets.get_data_classif('3gauss2',n)\n\na,b = ot.unif(n),ot.unif(n)\n# loss matrix\nM=ot.dist(xs,xt)\n#M/=M.max()\n\n#%% plot samples\n\npl.figure(1)\n\npl.subplot(2,2,1)\npl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')\npl.legend(loc=0)\npl.title('Source distributions')\n\npl.subplot(2,2,2)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')\npl.legend(loc=0)\npl.title('target distributions')\n\npl.figure(2)\npl.imshow(M,interpolation='nearest')\npl.title('Cost matrix M')\n\n\n#%% OT estimation\n\n# EMD\nG0=ot.emd(a,b,M)\n\n# sinkhorn\nlambd=1e-1\nGs=ot.sinkhorn(a,b,M,lambd)\n\n\n# Group lasso regularization\nreg=1e-1\neta=1e0\nGg=ot.da.sinkhorn_lpl1_mm(a,ys.astype(np.int),b,M,reg,eta)\n\n\n#%% visu matrices\n\npl.figure(3)\n\npl.subplot(2,3,1)\npl.imshow(G0,interpolation='nearest')\npl.title('OT matrix ')\n\npl.subplot(2,3,2)\npl.imshow(Gs,interpolation='nearest')\npl.title('OT matrix Sinkhorn')\n\npl.subplot(2,3,3)\npl.imshow(Gg,interpolation='nearest')\npl.title('OT matrix Group lasso')\n\npl.subplot(2,3,4)\not.plot.plot2D_samples_mat(xs,xt,G0,c=[.5,.5,1])\npl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')\n\n\npl.subplot(2,3,5)\not.plot.plot2D_samples_mat(xs,xt,Gs,c=[.5,.5,1])\npl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')\n\npl.subplot(2,3,6)\not.plot.plot2D_samples_mat(xs,xt,Gg,c=[.5,.5,1])\npl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')\n\n#%% sample interpolation\n\nxst0=n*G0.dot(xt)\nxsts=n*Gs.dot(xt)\nxstg=n*Gg.dot(xt)\n\npl.figure(4)\npl.subplot(2,3,1)\n\n\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.5)\npl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='Transp samples',s=30)\npl.title('Interp samples')\npl.legend(loc=0)\n\npl.subplot(2,3,2)\n\n\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.5)\npl.scatter(xsts[:,0],xsts[:,1],c=ys,marker='+',label='Transp samples',s=30)\npl.title('Interp samples Sinkhorn')\n\npl.subplot(2,3,3)\n\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.5)\npl.scatter(xstg[:,0],xstg[:,1],c=ys,marker='+',label='Transp samples',s=30)\npl.title('Interp samples Grouplasso')"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 2",
- "name": "python2",
- "language": "python"
- },
- "language_info": {
- "mimetype": "text/x-python",
- "nbconvert_exporter": "python",
- "name": "python",
- "file_extension": ".py",
- "version": "2.7.12",
- "pygments_lexer": "ipython2",
- "codemirror_mode": {
- "version": 2,
- "name": "ipython"
- }
- }
- }
-} \ No newline at end of file
diff --git a/docs/source/auto_examples/plot_OTDA_2D.py b/docs/source/auto_examples/plot_OTDA_2D.py
deleted file mode 100644
index a1fb804..0000000
--- a/docs/source/auto_examples/plot_OTDA_2D.py
+++ /dev/null
@@ -1,120 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-==============================
-OT for empirical distributions
-==============================
-
-"""
-
-import numpy as np
-import matplotlib.pylab as pl
-import ot
-
-
-
-#%% parameters
-
-n=150 # nb bins
-
-xs,ys=ot.datasets.get_data_classif('3gauss',n)
-xt,yt=ot.datasets.get_data_classif('3gauss2',n)
-
-a,b = ot.unif(n),ot.unif(n)
-# loss matrix
-M=ot.dist(xs,xt)
-#M/=M.max()
-
-#%% plot samples
-
-pl.figure(1)
-
-pl.subplot(2,2,1)
-pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
-pl.legend(loc=0)
-pl.title('Source distributions')
-
-pl.subplot(2,2,2)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
-pl.legend(loc=0)
-pl.title('target distributions')
-
-pl.figure(2)
-pl.imshow(M,interpolation='nearest')
-pl.title('Cost matrix M')
-
-
-#%% OT estimation
-
-# EMD
-G0=ot.emd(a,b,M)
-
-# sinkhorn
-lambd=1e-1
-Gs=ot.sinkhorn(a,b,M,lambd)
-
-
-# Group lasso regularization
-reg=1e-1
-eta=1e0
-Gg=ot.da.sinkhorn_lpl1_mm(a,ys.astype(np.int),b,M,reg,eta)
-
-
-#%% visu matrices
-
-pl.figure(3)
-
-pl.subplot(2,3,1)
-pl.imshow(G0,interpolation='nearest')
-pl.title('OT matrix ')
-
-pl.subplot(2,3,2)
-pl.imshow(Gs,interpolation='nearest')
-pl.title('OT matrix Sinkhorn')
-
-pl.subplot(2,3,3)
-pl.imshow(Gg,interpolation='nearest')
-pl.title('OT matrix Group lasso')
-
-pl.subplot(2,3,4)
-ot.plot.plot2D_samples_mat(xs,xt,G0,c=[.5,.5,1])
-pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
-
-
-pl.subplot(2,3,5)
-ot.plot.plot2D_samples_mat(xs,xt,Gs,c=[.5,.5,1])
-pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
-
-pl.subplot(2,3,6)
-ot.plot.plot2D_samples_mat(xs,xt,Gg,c=[.5,.5,1])
-pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
-
-#%% sample interpolation
-
-xst0=n*G0.dot(xt)
-xsts=n*Gs.dot(xt)
-xstg=n*Gg.dot(xt)
-
-pl.figure(4)
-pl.subplot(2,3,1)
-
-
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.5)
-pl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='Transp samples',s=30)
-pl.title('Interp samples')
-pl.legend(loc=0)
-
-pl.subplot(2,3,2)
-
-
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.5)
-pl.scatter(xsts[:,0],xsts[:,1],c=ys,marker='+',label='Transp samples',s=30)
-pl.title('Interp samples Sinkhorn')
-
-pl.subplot(2,3,3)
-
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.5)
-pl.scatter(xstg[:,0],xstg[:,1],c=ys,marker='+',label='Transp samples',s=30)
-pl.title('Interp samples Grouplasso') \ No newline at end of file
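The removed example above calls `ot.emd` and `ot.sinkhorn(a, b, M, lambd)` without showing what the latter computes. A minimal pure-Python sketch of the Sinkhorn scaling iterations follows; `sinkhorn_sketch` is a hypothetical helper written for illustration, not the POT API — the real `ot.sinkhorn` is vectorized NumPy with numerical stabilization and stopping criteria.

```python
import math

def sinkhorn_sketch(a, b, M, reg, n_iter=200):
    """Entropic OT plan via alternating row/column marginal scaling."""
    n, m = len(a), len(b)
    # Gibbs kernel: entry-wise exp(-M/reg)
    K = [[math.exp(-M[i][j] / reg) for j in range(m)] for i in range(n)]
    u = [1.0] * n
    v = [1.0] * m
    for _ in range(n_iter):
        # scale rows so row sums of the plan match a
        for i in range(n):
            u[i] = a[i] / sum(K[i][j] * v[j] for j in range(m))
        # scale columns so column sums of the plan match b
        for j in range(m):
            v[j] = b[j] / sum(K[i][j] * u[i] for i in range(n))
    # plan G_ij = u_i * K_ij * v_j
    return [[u[i] * K[i][j] * v[j] for j in range(m)] for i in range(n)]

# toy symmetric problem: two source and two target points
a = [0.5, 0.5]
b = [0.5, 0.5]
M = [[0.0, 1.0], [1.0, 0.0]]
G = sinkhorn_sketch(a, b, M, reg=1e-1)
```

On this toy cost the plan concentrates mass on the cheap diagonal pairings while its row and column sums match `a` and `b`, which is exactly the constraint structure of the `Gs` matrix plotted in the removed example.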
diff --git a/docs/source/auto_examples/plot_OTDA_2D.rst b/docs/source/auto_examples/plot_OTDA_2D.rst
deleted file mode 100644
index b535bb0..0000000
--- a/docs/source/auto_examples/plot_OTDA_2D.rst
+++ /dev/null
@@ -1,175 +0,0 @@
-
-
-.. _sphx_glr_auto_examples_plot_OTDA_2D.py:
-
-
-==============================
-OT for empirical distributions
-==============================
-
-
-
-
-
-.. rst-class:: sphx-glr-horizontal
-
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_2D_001.png
- :scale: 47
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_2D_002.png
- :scale: 47
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_2D_003.png
- :scale: 47
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_2D_004.png
- :scale: 47
-
-
-
-
-
-.. code-block:: python
-
-
- import numpy as np
- import matplotlib.pylab as pl
- import ot
-
-
-
- #%% parameters
-
- n=150 # nb bins
-
- xs,ys=ot.datasets.get_data_classif('3gauss',n)
- xt,yt=ot.datasets.get_data_classif('3gauss2',n)
-
- a,b = ot.unif(n),ot.unif(n)
- # loss matrix
- M=ot.dist(xs,xt)
- #M/=M.max()
-
- #%% plot samples
-
- pl.figure(1)
-
- pl.subplot(2,2,1)
- pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
- pl.legend(loc=0)
- pl.title('Source distributions')
-
- pl.subplot(2,2,2)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
- pl.legend(loc=0)
- pl.title('target distributions')
-
- pl.figure(2)
- pl.imshow(M,interpolation='nearest')
- pl.title('Cost matrix M')
-
-
- #%% OT estimation
-
- # EMD
- G0=ot.emd(a,b,M)
-
- # sinkhorn
- lambd=1e-1
- Gs=ot.sinkhorn(a,b,M,lambd)
-
-
- # Group lasso regularization
- reg=1e-1
- eta=1e0
- Gg=ot.da.sinkhorn_lpl1_mm(a,ys.astype(np.int),b,M,reg,eta)
-
-
- #%% visu matrices
-
- pl.figure(3)
-
- pl.subplot(2,3,1)
- pl.imshow(G0,interpolation='nearest')
- pl.title('OT matrix ')
-
- pl.subplot(2,3,2)
- pl.imshow(Gs,interpolation='nearest')
- pl.title('OT matrix Sinkhorn')
-
- pl.subplot(2,3,3)
- pl.imshow(Gg,interpolation='nearest')
- pl.title('OT matrix Group lasso')
-
- pl.subplot(2,3,4)
- ot.plot.plot2D_samples_mat(xs,xt,G0,c=[.5,.5,1])
- pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
-
-
- pl.subplot(2,3,5)
- ot.plot.plot2D_samples_mat(xs,xt,Gs,c=[.5,.5,1])
- pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
-
- pl.subplot(2,3,6)
- ot.plot.plot2D_samples_mat(xs,xt,Gg,c=[.5,.5,1])
- pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
-
- #%% sample interpolation
-
- xst0=n*G0.dot(xt)
- xsts=n*Gs.dot(xt)
- xstg=n*Gg.dot(xt)
-
- pl.figure(4)
- pl.subplot(2,3,1)
-
-
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.5)
- pl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='Transp samples',s=30)
- pl.title('Interp samples')
- pl.legend(loc=0)
-
- pl.subplot(2,3,2)
-
-
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.5)
- pl.scatter(xsts[:,0],xsts[:,1],c=ys,marker='+',label='Transp samples',s=30)
- pl.title('Interp samples Sinkhorn')
-
- pl.subplot(2,3,3)
-
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.5)
- pl.scatter(xstg[:,0],xstg[:,1],c=ys,marker='+',label='Transp samples',s=30)
- pl.title('Interp samples Grouplasso')
-**Total running time of the script:** ( 0 minutes 17.372 seconds)
-
-
-
-.. container:: sphx-glr-footer
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Python source code: plot_OTDA_2D.py <plot_OTDA_2D.py>`
-
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Jupyter notebook: plot_OTDA_2D.ipynb <plot_OTDA_2D.ipynb>`
-
-.. rst-class:: sphx-glr-signature
-
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_OTDA_classes.ipynb b/docs/source/auto_examples/plot_OTDA_classes.ipynb
deleted file mode 100644
index d9fcb87..0000000
--- a/docs/source/auto_examples/plot_OTDA_classes.ipynb
+++ /dev/null
@@ -1,54 +0,0 @@
-{
- "nbformat_minor": 0,
- "nbformat": 4,
- "cells": [
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "%matplotlib inline"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- },
- {
- "source": [
- "\n# OT for domain adaptation\n\n\n\n"
- ],
- "cell_type": "markdown",
- "metadata": {}
- },
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "import matplotlib.pylab as pl\nimport ot\n\n\n\n\n#%% parameters\n\nn=150 # nb samples in source and target datasets\n\nxs,ys=ot.datasets.get_data_classif('3gauss',n)\nxt,yt=ot.datasets.get_data_classif('3gauss2',n)\n\n\n\n\n#%% plot samples\n\npl.figure(1)\n\npl.subplot(2,2,1)\npl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')\npl.legend(loc=0)\npl.title('Source distributions')\n\npl.subplot(2,2,2)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')\npl.legend(loc=0)\npl.title('target distributions')\n\n\n#%% OT estimation\n\n# LP problem\nda_emd=ot.da.OTDA() # init class\nda_emd.fit(xs,xt) # fit distributions\nxst0=da_emd.interp() # interpolation of source samples\n\n\n# sinkhorn regularization\nlambd=1e-1\nda_entrop=ot.da.OTDA_sinkhorn()\nda_entrop.fit(xs,xt,reg=lambd)\nxsts=da_entrop.interp()\n\n# non-convex Group lasso regularization\nreg=1e-1\neta=1e0\nda_lpl1=ot.da.OTDA_lpl1()\nda_lpl1.fit(xs,ys,xt,reg=reg,eta=eta)\nxstg=da_lpl1.interp()\n\n\n# True Group lasso regularization\nreg=1e-1\neta=2e0\nda_l1l2=ot.da.OTDA_l1l2()\nda_l1l2.fit(xs,ys,xt,reg=reg,eta=eta,numItermax=20,verbose=True)\nxstgl=da_l1l2.interp()\n\n\n#%% plot interpolated source samples\npl.figure(4,(15,8))\n\nparam_img={'interpolation':'nearest','cmap':'jet'}\n\npl.subplot(2,4,1)\npl.imshow(da_emd.G,**param_img)\npl.title('OT matrix')\n\n\npl.subplot(2,4,2)\npl.imshow(da_entrop.G,**param_img)\npl.title('OT matrix sinkhorn')\n\npl.subplot(2,4,3)\npl.imshow(da_lpl1.G,**param_img)\npl.title('OT matrix non-convex Group Lasso')\n\npl.subplot(2,4,4)\npl.imshow(da_l1l2.G,**param_img)\npl.title('OT matrix Group Lasso')\n\n\npl.subplot(2,4,5)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)\npl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='Transp samples',s=30)\npl.title('Interp samples')\npl.legend(loc=0)\n\npl.subplot(2,4,6)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)\npl.scatter(xsts[:,0],xsts[:,1],c=ys,marker='+',label='Transp samples',s=30)\npl.title('Interp samples Sinkhorn')\n\npl.subplot(2,4,7)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)\npl.scatter(xstg[:,0],xstg[:,1],c=ys,marker='+',label='Transp samples',s=30)\npl.title('Interp samples non-convex Group Lasso')\n\npl.subplot(2,4,8)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)\npl.scatter(xstgl[:,0],xstgl[:,1],c=ys,marker='+',label='Transp samples',s=30)\npl.title('Interp samples Group Lasso')"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 2",
- "name": "python2",
- "language": "python"
- },
- "language_info": {
- "mimetype": "text/x-python",
- "nbconvert_exporter": "python",
- "name": "python",
- "file_extension": ".py",
- "version": "2.7.12",
- "pygments_lexer": "ipython2",
- "codemirror_mode": {
- "version": 2,
- "name": "ipython"
- }
- }
- }
-} \ No newline at end of file
diff --git a/docs/source/auto_examples/plot_OTDA_classes.py b/docs/source/auto_examples/plot_OTDA_classes.py
deleted file mode 100644
index 089b45b..0000000
--- a/docs/source/auto_examples/plot_OTDA_classes.py
+++ /dev/null
@@ -1,112 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-========================
-OT for domain adaptation
-========================
-
-"""
-
-import matplotlib.pylab as pl
-import ot
-
-
-
-
-#%% parameters
-
-n=150 # nb samples in source and target datasets
-
-xs,ys=ot.datasets.get_data_classif('3gauss',n)
-xt,yt=ot.datasets.get_data_classif('3gauss2',n)
-
-
-
-
-#%% plot samples
-
-pl.figure(1)
-
-pl.subplot(2,2,1)
-pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
-pl.legend(loc=0)
-pl.title('Source distributions')
-
-pl.subplot(2,2,2)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
-pl.legend(loc=0)
-pl.title('target distributions')
-
-
-#%% OT estimation
-
-# LP problem
-da_emd=ot.da.OTDA() # init class
-da_emd.fit(xs,xt) # fit distributions
-xst0=da_emd.interp() # interpolation of source samples
-
-
-# sinkhorn regularization
-lambd=1e-1
-da_entrop=ot.da.OTDA_sinkhorn()
-da_entrop.fit(xs,xt,reg=lambd)
-xsts=da_entrop.interp()
-
-# non-convex Group lasso regularization
-reg=1e-1
-eta=1e0
-da_lpl1=ot.da.OTDA_lpl1()
-da_lpl1.fit(xs,ys,xt,reg=reg,eta=eta)
-xstg=da_lpl1.interp()
-
-
-# True Group lasso regularization
-reg=1e-1
-eta=2e0
-da_l1l2=ot.da.OTDA_l1l2()
-da_l1l2.fit(xs,ys,xt,reg=reg,eta=eta,numItermax=20,verbose=True)
-xstgl=da_l1l2.interp()
-
-
-#%% plot interpolated source samples
-pl.figure(4,(15,8))
-
-param_img={'interpolation':'nearest','cmap':'jet'}
-
-pl.subplot(2,4,1)
-pl.imshow(da_emd.G,**param_img)
-pl.title('OT matrix')
-
-
-pl.subplot(2,4,2)
-pl.imshow(da_entrop.G,**param_img)
-pl.title('OT matrix sinkhorn')
-
-pl.subplot(2,4,3)
-pl.imshow(da_lpl1.G,**param_img)
-pl.title('OT matrix non-convex Group Lasso')
-
-pl.subplot(2,4,4)
-pl.imshow(da_l1l2.G,**param_img)
-pl.title('OT matrix Group Lasso')
-
-
-pl.subplot(2,4,5)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)
-pl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='Transp samples',s=30)
-pl.title('Interp samples')
-pl.legend(loc=0)
-
-pl.subplot(2,4,6)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)
-pl.scatter(xsts[:,0],xsts[:,1],c=ys,marker='+',label='Transp samples',s=30)
-pl.title('Interp samples Sinkhorn')
-
-pl.subplot(2,4,7)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)
-pl.scatter(xstg[:,0],xstg[:,1],c=ys,marker='+',label='Transp samples',s=30)
-pl.title('Interp samples non-convex Group Lasso')
-
-pl.subplot(2,4,8)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)
-pl.scatter(xstgl[:,0],xstgl[:,1],c=ys,marker='+',label='Transp samples',s=30)
-pl.title('Interp samples Group Lasso') \ No newline at end of file
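The `interp()` calls in the removed class-based example (and the `n*G.dot(xt)` lines in the previous one) both compute the barycentric projection of each source sample through the transport plan. A pure-Python sketch of that mapping — `barycentric_map` is an illustrative helper, not a POT function:

```python
def barycentric_map(G, xt):
    """Map each source sample to the G-weighted barycenter of targets xt.

    G is an n x m transport plan (nested lists); xt is a list of m points,
    each a tuple of coordinates. Dividing by the row sum normalizes the
    weights, which is what multiplying by n does for uniform weights 1/n.
    """
    mapped = []
    for row in G:
        w = sum(row)  # row marginal a_i
        mapped.append(tuple(
            sum(g * p[k] for g, p in zip(row, xt)) / w
            for k in range(len(xt[0]))))
    return mapped

# toy plan sending source 0 entirely to target 1 and source 1 to target 0
G = [[0.0, 0.5], [0.5, 0.0]]
xt = [(0.0, 0.0), (2.0, 2.0)]
xst = barycentric_map(G, xt)
```

With a permutation-like plan each mapped source sample lands exactly on its matched target, which is why the "Interp samples" scatter plots in these examples overlay the target clouds.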
diff --git a/docs/source/auto_examples/plot_OTDA_classes.rst b/docs/source/auto_examples/plot_OTDA_classes.rst
deleted file mode 100644
index 097e9fc..0000000
--- a/docs/source/auto_examples/plot_OTDA_classes.rst
+++ /dev/null
@@ -1,190 +0,0 @@
-
-
-.. _sphx_glr_auto_examples_plot_OTDA_classes.py:
-
-
-========================
-OT for domain adaptation
-========================
-
-
-
-
-
-.. rst-class:: sphx-glr-horizontal
-
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_classes_001.png
- :scale: 47
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_classes_004.png
- :scale: 47
-
-
-.. rst-class:: sphx-glr-script-out
-
- Out::
-
- It. |Loss |Delta loss
- --------------------------------
- 0|9.171271e+00|0.000000e+00
- 1|2.133783e+00|-3.298127e+00
- 2|1.895941e+00|-1.254484e-01
- 3|1.844628e+00|-2.781709e-02
- 4|1.824983e+00|-1.076467e-02
- 5|1.815453e+00|-5.249337e-03
- 6|1.808104e+00|-4.064733e-03
- 7|1.803558e+00|-2.520475e-03
- 8|1.801061e+00|-1.386155e-03
- 9|1.799391e+00|-9.279565e-04
- 10|1.797176e+00|-1.232778e-03
- 11|1.795465e+00|-9.529479e-04
- 12|1.795316e+00|-8.322362e-05
- 13|1.794523e+00|-4.418932e-04
- 14|1.794444e+00|-4.390599e-05
- 15|1.794395e+00|-2.710318e-05
- 16|1.793713e+00|-3.804028e-04
- 17|1.793110e+00|-3.359479e-04
- 18|1.792829e+00|-1.569563e-04
- 19|1.792621e+00|-1.159469e-04
- It. |Loss |Delta loss
- --------------------------------
- 20|1.791334e+00|-7.187689e-04
-
-
-
-
-|
-
-
-.. code-block:: python
-
-
- import matplotlib.pylab as pl
- import ot
-
-
-
-
- #%% parameters
-
- n=150 # nb samples in source and target datasets
-
- xs,ys=ot.datasets.get_data_classif('3gauss',n)
- xt,yt=ot.datasets.get_data_classif('3gauss2',n)
-
-
-
-
- #%% plot samples
-
- pl.figure(1)
-
- pl.subplot(2,2,1)
- pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
- pl.legend(loc=0)
- pl.title('Source distributions')
-
- pl.subplot(2,2,2)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
- pl.legend(loc=0)
- pl.title('target distributions')
-
-
- #%% OT estimation
-
- # LP problem
- da_emd=ot.da.OTDA() # init class
- da_emd.fit(xs,xt) # fit distributions
- xst0=da_emd.interp() # interpolation of source samples
-
-
- # sinkhorn regularization
- lambd=1e-1
- da_entrop=ot.da.OTDA_sinkhorn()
- da_entrop.fit(xs,xt,reg=lambd)
- xsts=da_entrop.interp()
-
- # non-convex Group lasso regularization
- reg=1e-1
- eta=1e0
- da_lpl1=ot.da.OTDA_lpl1()
- da_lpl1.fit(xs,ys,xt,reg=reg,eta=eta)
- xstg=da_lpl1.interp()
-
-
- # True Group lasso regularization
- reg=1e-1
- eta=2e0
- da_l1l2=ot.da.OTDA_l1l2()
- da_l1l2.fit(xs,ys,xt,reg=reg,eta=eta,numItermax=20,verbose=True)
- xstgl=da_l1l2.interp()
-
-
- #%% plot interpolated source samples
- pl.figure(4,(15,8))
-
- param_img={'interpolation':'nearest','cmap':'jet'}
-
- pl.subplot(2,4,1)
- pl.imshow(da_emd.G,**param_img)
- pl.title('OT matrix')
-
-
- pl.subplot(2,4,2)
- pl.imshow(da_entrop.G,**param_img)
- pl.title('OT matrix sinkhorn')
-
- pl.subplot(2,4,3)
- pl.imshow(da_lpl1.G,**param_img)
- pl.title('OT matrix non-convex Group Lasso')
-
- pl.subplot(2,4,4)
- pl.imshow(da_l1l2.G,**param_img)
- pl.title('OT matrix Group Lasso')
-
-
- pl.subplot(2,4,5)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)
- pl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='Transp samples',s=30)
- pl.title('Interp samples')
- pl.legend(loc=0)
-
- pl.subplot(2,4,6)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)
- pl.scatter(xsts[:,0],xsts[:,1],c=ys,marker='+',label='Transp samples',s=30)
- pl.title('Interp samples Sinkhorn')
-
- pl.subplot(2,4,7)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)
- pl.scatter(xstg[:,0],xstg[:,1],c=ys,marker='+',label='Transp samples',s=30)
- pl.title('Interp samples non-convex Group Lasso')
-
- pl.subplot(2,4,8)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=0.3)
- pl.scatter(xstgl[:,0],xstgl[:,1],c=ys,marker='+',label='Transp samples',s=30)
- pl.title('Interp samples Group Lasso')
-**Total running time of the script:** ( 0 minutes 2.225 seconds)
-
-
-
-.. container:: sphx-glr-footer
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Python source code: plot_OTDA_classes.py <plot_OTDA_classes.py>`
-
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Jupyter notebook: plot_OTDA_classes.ipynb <plot_OTDA_classes.ipynb>`
-
-.. rst-class:: sphx-glr-signature
-
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_OTDA_color_images.ipynb b/docs/source/auto_examples/plot_OTDA_color_images.ipynb
deleted file mode 100644
index d174828..0000000
--- a/docs/source/auto_examples/plot_OTDA_color_images.ipynb
+++ /dev/null
@@ -1,54 +0,0 @@
-{
- "nbformat_minor": 0,
- "nbformat": 4,
- "cells": [
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "%matplotlib inline"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- },
- {
- "source": [
- "\n========================================================\nOT for domain adaptation with image color adaptation [6]\n========================================================\n\n[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized discrete optimal transport. SIAM Journal on Imaging Sciences, 7(3), 1853-1882.\n\n"
- ],
- "cell_type": "markdown",
- "metadata": {}
- },
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "import numpy as np\nimport scipy.ndimage as spi\nimport matplotlib.pylab as pl\nimport ot\n\n\n#%% Loading images\n\nI1=spi.imread('../data/ocean_day.jpg').astype(np.float64)/256\nI2=spi.imread('../data/ocean_sunset.jpg').astype(np.float64)/256\n\n#%% Plot images\n\npl.figure(1)\n\npl.subplot(1,2,1)\npl.imshow(I1)\npl.title('Image 1')\n\npl.subplot(1,2,2)\npl.imshow(I2)\npl.title('Image 2')\n\npl.show()\n\n#%% Image conversion and dataset generation\n\ndef im2mat(I):\n    \"\"\"Converts and image to matrix (one pixel per line)\"\"\"\n    return I.reshape((I.shape[0]*I.shape[1],I.shape[2]))\n\ndef mat2im(X,shape):\n    \"\"\"Converts back a matrix to an image\"\"\"\n    return X.reshape(shape)\n\nX1=im2mat(I1)\nX2=im2mat(I2)\n\n# training samples\nnb=1000\nidx1=np.random.randint(X1.shape[0],size=(nb,))\nidx2=np.random.randint(X2.shape[0],size=(nb,))\n\nxs=X1[idx1,:]\nxt=X2[idx2,:]\n\n#%% Plot image distributions\n\n\npl.figure(2,(10,5))\n\npl.subplot(1,2,1)\npl.scatter(xs[:,0],xs[:,2],c=xs)\npl.axis([0,1,0,1])\npl.xlabel('Red')\npl.ylabel('Blue')\npl.title('Image 1')\n\npl.subplot(1,2,2)\n#pl.imshow(I2)\npl.scatter(xt[:,0],xt[:,2],c=xt)\npl.axis([0,1,0,1])\npl.xlabel('Red')\npl.ylabel('Blue')\npl.title('Image 2')\n\npl.show()\n\n\n\n#%% domain adaptation between images\n\n# LP problem\nda_emd=ot.da.OTDA()     # init class\nda_emd.fit(xs,xt)       # fit distributions\n\n\n# sinkhorn regularization\nlambd=1e-1\nda_entrop=ot.da.OTDA_sinkhorn()\nda_entrop.fit(xs,xt,reg=lambd)\n\n\n\n#%% prediction between images (using out of sample prediction as in [6])\n\nX1t=da_emd.predict(X1)\nX2t=da_emd.predict(X2,-1)\n\n\nX1te=da_entrop.predict(X1)\nX2te=da_entrop.predict(X2,-1)\n\n\ndef minmax(I):\n    return np.minimum(np.maximum(I,0),1)\n\nI1t=minmax(mat2im(X1t,I1.shape))\nI2t=minmax(mat2im(X2t,I2.shape))\n\nI1te=minmax(mat2im(X1te,I1.shape))\nI2te=minmax(mat2im(X2te,I2.shape))\n\n#%% plot all images\n\npl.figure(2,(10,8))\n\npl.subplot(2,3,1)\n\npl.imshow(I1)\npl.title('Image 1')\n\npl.subplot(2,3,2)\npl.imshow(I1t)\npl.title('Image 1 Adapt')\n\n\npl.subplot(2,3,3)\npl.imshow(I1te)\npl.title('Image 1 Adapt (reg)')\n\npl.subplot(2,3,4)\n\npl.imshow(I2)\npl.title('Image 2')\n\npl.subplot(2,3,5)\npl.imshow(I2t)\npl.title('Image 2 Adapt')\n\n\npl.subplot(2,3,6)\npl.imshow(I2te)\npl.title('Image 2 Adapt (reg)')\n\npl.show()"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 2",
- "name": "python2",
- "language": "python"
- },
- "language_info": {
- "mimetype": "text/x-python",
- "nbconvert_exporter": "python",
- "name": "python",
- "file_extension": ".py",
- "version": "2.7.12",
- "pygments_lexer": "ipython2",
- "codemirror_mode": {
- "version": 2,
- "name": "ipython"
- }
- }
- }
-} \ No newline at end of file
diff --git a/docs/source/auto_examples/plot_OTDA_color_images.py b/docs/source/auto_examples/plot_OTDA_color_images.py
deleted file mode 100644
index 68eee44..0000000
--- a/docs/source/auto_examples/plot_OTDA_color_images.py
+++ /dev/null
@@ -1,145 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-========================================================
-OT for domain adaptation with image color adaptation [6]
-========================================================
-
-[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized discrete optimal transport. SIAM Journal on Imaging Sciences, 7(3), 1853-1882.
-"""
-
-import numpy as np
-import scipy.ndimage as spi
-import matplotlib.pylab as pl
-import ot
-
-
-#%% Loading images
-
-I1=spi.imread('../data/ocean_day.jpg').astype(np.float64)/256
-I2=spi.imread('../data/ocean_sunset.jpg').astype(np.float64)/256
-
-#%% Plot images
-
-pl.figure(1)
-
-pl.subplot(1,2,1)
-pl.imshow(I1)
-pl.title('Image 1')
-
-pl.subplot(1,2,2)
-pl.imshow(I2)
-pl.title('Image 2')
-
-pl.show()
-
-#%% Image conversion and dataset generation
-
-def im2mat(I):
- """Converts and image to matrix (one pixel per line)"""
- return I.reshape((I.shape[0]*I.shape[1],I.shape[2]))
-
-def mat2im(X,shape):
- """Converts back a matrix to an image"""
- return X.reshape(shape)
-
-X1=im2mat(I1)
-X2=im2mat(I2)
-
-# training samples
-nb=1000
-idx1=np.random.randint(X1.shape[0],size=(nb,))
-idx2=np.random.randint(X2.shape[0],size=(nb,))
-
-xs=X1[idx1,:]
-xt=X2[idx2,:]
-
-#%% Plot image distributions
-
-
-pl.figure(2,(10,5))
-
-pl.subplot(1,2,1)
-pl.scatter(xs[:,0],xs[:,2],c=xs)
-pl.axis([0,1,0,1])
-pl.xlabel('Red')
-pl.ylabel('Blue')
-pl.title('Image 1')
-
-pl.subplot(1,2,2)
-#pl.imshow(I2)
-pl.scatter(xt[:,0],xt[:,2],c=xt)
-pl.axis([0,1,0,1])
-pl.xlabel('Red')
-pl.ylabel('Blue')
-pl.title('Image 2')
-
-pl.show()
-
-
-
-#%% domain adaptation between images
-
-# LP problem
-da_emd=ot.da.OTDA() # init class
-da_emd.fit(xs,xt) # fit distributions
-
-
-# sinkhorn regularization
-lambd=1e-1
-da_entrop=ot.da.OTDA_sinkhorn()
-da_entrop.fit(xs,xt,reg=lambd)
-
-
-
-#%% prediction between images (using out of sample prediction as in [6])
-
-X1t=da_emd.predict(X1)
-X2t=da_emd.predict(X2,-1)
-
-
-X1te=da_entrop.predict(X1)
-X2te=da_entrop.predict(X2,-1)
-
-
-def minmax(I):
- return np.minimum(np.maximum(I,0),1)
-
-I1t=minmax(mat2im(X1t,I1.shape))
-I2t=minmax(mat2im(X2t,I2.shape))
-
-I1te=minmax(mat2im(X1te,I1.shape))
-I2te=minmax(mat2im(X2te,I2.shape))
-
-#%% plot all images
-
-pl.figure(2,(10,8))
-
-pl.subplot(2,3,1)
-
-pl.imshow(I1)
-pl.title('Image 1')
-
-pl.subplot(2,3,2)
-pl.imshow(I1t)
-pl.title('Image 1 Adapt')
-
-
-pl.subplot(2,3,3)
-pl.imshow(I1te)
-pl.title('Image 1 Adapt (reg)')
-
-pl.subplot(2,3,4)
-
-pl.imshow(I2)
-pl.title('Image 2')
-
-pl.subplot(2,3,5)
-pl.imshow(I2t)
-pl.title('Image 2 Adapt')
-
-
-pl.subplot(2,3,6)
-pl.imshow(I2te)
-pl.title('Image 2 Adapt (reg)')
-
-pl.show()
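The removed color-adaptation example relies on three tiny helpers: `im2mat` flattens an H x W x C image into a pixel matrix, `mat2im` inverts that, and `minmax` clips channel values back into [0, 1] after transport. A list-based, NumPy-free sketch of the same round trip (the `*_sketch` names are illustrative, not POT functions):

```python
def im2mat_sketch(I):
    """Flatten a nested-list H x W x C image: one pixel (C-list) per row."""
    return [px for row in I for px in row]

def mat2im_sketch(X, h, w):
    """Inverse of im2mat_sketch for an h x w image."""
    return [X[r * w:(r + 1) * w] for r in range(h)]

def minmax_sketch(x):
    """Clip one channel value into the valid [0, 1] range."""
    return min(max(x, 0.0), 1.0)

I = [[[0.1, 0.2, 0.3], [1.2, -0.1, 0.5]]]  # 1 x 2 RGB image, some values out of range
X = im2mat_sketch(I)
I_back = mat2im_sketch(X, 1, 2)
clipped = [[minmax_sketch(c) for c in px] for px in X]
```

Clipping matters because the transported pixel values produced by `predict` can fall slightly outside [0, 1], which `pl.imshow` would otherwise render incorrectly.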
diff --git a/docs/source/auto_examples/plot_OTDA_color_images.rst b/docs/source/auto_examples/plot_OTDA_color_images.rst
deleted file mode 100644
index a982a90..0000000
--- a/docs/source/auto_examples/plot_OTDA_color_images.rst
+++ /dev/null
@@ -1,191 +0,0 @@
-
-
-.. _sphx_glr_auto_examples_plot_OTDA_color_images.py:
-
-
-========================================================
-OT for domain adaptation with image color adaptation [6]
-========================================================
-
-[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized discrete optimal transport. SIAM Journal on Imaging Sciences, 7(3), 1853-1882.
-
-
-
-
-.. rst-class:: sphx-glr-horizontal
-
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_color_images_001.png
- :scale: 47
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_color_images_002.png
- :scale: 47
-
-
-
-
-
-.. code-block:: python
-
-
- import numpy as np
- import scipy.ndimage as spi
- import matplotlib.pylab as pl
- import ot
-
-
- #%% Loading images
-
- I1=spi.imread('../data/ocean_day.jpg').astype(np.float64)/256
- I2=spi.imread('../data/ocean_sunset.jpg').astype(np.float64)/256
-
- #%% Plot images
-
- pl.figure(1)
-
- pl.subplot(1,2,1)
- pl.imshow(I1)
- pl.title('Image 1')
-
- pl.subplot(1,2,2)
- pl.imshow(I2)
- pl.title('Image 2')
-
- pl.show()
-
- #%% Image conversion and dataset generation
-
- def im2mat(I):
-        """Converts an image to a matrix (one pixel per line)"""
- return I.reshape((I.shape[0]*I.shape[1],I.shape[2]))
-
- def mat2im(X,shape):
-        """Converts a matrix back to an image"""
- return X.reshape(shape)
-
- X1=im2mat(I1)
- X2=im2mat(I2)
-
- # training samples
- nb=1000
- idx1=np.random.randint(X1.shape[0],size=(nb,))
- idx2=np.random.randint(X2.shape[0],size=(nb,))
-
- xs=X1[idx1,:]
- xt=X2[idx2,:]
-
- #%% Plot image distributions
-
-
- pl.figure(2,(10,5))
-
- pl.subplot(1,2,1)
- pl.scatter(xs[:,0],xs[:,2],c=xs)
- pl.axis([0,1,0,1])
- pl.xlabel('Red')
- pl.ylabel('Blue')
- pl.title('Image 1')
-
- pl.subplot(1,2,2)
- #pl.imshow(I2)
- pl.scatter(xt[:,0],xt[:,2],c=xt)
- pl.axis([0,1,0,1])
- pl.xlabel('Red')
- pl.ylabel('Blue')
- pl.title('Image 2')
-
- pl.show()
-
-
-
- #%% domain adaptation between images
-
- # LP problem
- da_emd=ot.da.OTDA() # init class
- da_emd.fit(xs,xt) # fit distributions
-
-
- # sinkhorn regularization
- lambd=1e-1
- da_entrop=ot.da.OTDA_sinkhorn()
- da_entrop.fit(xs,xt,reg=lambd)
-
-
-
- #%% prediction between images (using out of sample prediction as in [6])
-
- X1t=da_emd.predict(X1)
- X2t=da_emd.predict(X2,-1)
-
-
- X1te=da_entrop.predict(X1)
- X2te=da_entrop.predict(X2,-1)
-
-
- def minmax(I):
- return np.minimum(np.maximum(I,0),1)
-
- I1t=minmax(mat2im(X1t,I1.shape))
- I2t=minmax(mat2im(X2t,I2.shape))
-
- I1te=minmax(mat2im(X1te,I1.shape))
- I2te=minmax(mat2im(X2te,I2.shape))
-
- #%% plot all images
-
- pl.figure(2,(10,8))
-
- pl.subplot(2,3,1)
-
- pl.imshow(I1)
- pl.title('Image 1')
-
- pl.subplot(2,3,2)
- pl.imshow(I1t)
- pl.title('Image 1 Adapt')
-
-
- pl.subplot(2,3,3)
- pl.imshow(I1te)
- pl.title('Image 1 Adapt (reg)')
-
- pl.subplot(2,3,4)
-
- pl.imshow(I2)
- pl.title('Image 2')
-
- pl.subplot(2,3,5)
- pl.imshow(I2t)
- pl.title('Image 2 Adapt')
-
-
- pl.subplot(2,3,6)
- pl.imshow(I2te)
- pl.title('Image 2 Adapt (reg)')
-
- pl.show()
-
-**Total running time of the script:** ( 0 minutes 24.815 seconds)
-
-
-
-.. container:: sphx-glr-footer
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Python source code: plot_OTDA_color_images.py <plot_OTDA_color_images.py>`
-
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Jupyter notebook: plot_OTDA_color_images.ipynb <plot_OTDA_color_images.ipynb>`
-
-.. rst-class:: sphx-glr-signature
-
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
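The color-adaptation examples above all lean on the same three helpers (`im2mat`, `mat2im`, `minmax`). A minimal, self-contained sketch of that pixel-matrix round trip in plain NumPy (the toy image and the `np.clip` shorthand are illustrative, not part of the original example):

```python
import numpy as np


def im2mat(img):
    # Flatten an (H, W, C) image into an (H*W, C) matrix, one pixel per row.
    return img.reshape((img.shape[0] * img.shape[1], img.shape[2]))


def mat2im(mat, shape):
    # Reshape an (H*W, C) pixel matrix back into an image of the given shape.
    return mat.reshape(shape)


def minmax(img):
    # Clip channel values into [0, 1]; np.clip is equivalent to the
    # np.minimum(np.maximum(...)) pattern used in the example.
    return np.clip(img, 0.0, 1.0)


# Toy 2x2 RGB "image" with values deliberately outside [0, 1].
rng = np.random.RandomState(42)
I = rng.uniform(-0.2, 1.2, size=(2, 2, 3))

X = im2mat(I)                        # shape (4, 3): one pixel per row
I_back = minmax(mat2im(X, I.shape))  # round trip, then clip for display
```

After adaptation, the same `mat2im`/`minmax` pair turns the transported pixel matrix back into a displayable image.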
diff --git a/docs/source/auto_examples/plot_OTDA_mapping.ipynb b/docs/source/auto_examples/plot_OTDA_mapping.ipynb
deleted file mode 100644
index ec405af..0000000
--- a/docs/source/auto_examples/plot_OTDA_mapping.ipynb
+++ /dev/null
@@ -1,54 +0,0 @@
-{
- "nbformat_minor": 0,
- "nbformat": 4,
- "cells": [
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "%matplotlib inline"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- },
- {
- "source": [
- "\n===============================================\nOT mapping estimation for domain adaptation [8]\n===============================================\n\n[8] M. Perrot, N. Courty, R. Flamary, A. Habrard, \"Mapping estimation for\n discrete optimal transport\", Neural Information Processing Systems (NIPS), 2016.\n\n"
- ],
- "cell_type": "markdown",
- "metadata": {}
- },
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\n\n\n\n#%% dataset generation\n\nnp.random.seed(0) # makes example reproducible\n\nn=100 # nb samples in source and target datasets\ntheta=2*np.pi/20\nnz=0.1\nxs,ys=ot.datasets.get_data_classif('gaussrot',n,nz=nz)\nxt,yt=ot.datasets.get_data_classif('gaussrot',n,theta=theta,nz=nz)\n\n# one of the target mode changes its variance (no linear mapping)\nxt[yt==2]*=3\nxt=xt+4\n\n\n#%% plot samples\n\npl.figure(1,(8,5))\npl.clf()\n\npl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')\n\npl.legend(loc=0)\npl.title('Source and target distributions')\n\n\n\n#%% OT linear mapping estimation\n\neta=1e-8 # quadratic regularization for regression\nmu=1e0 # weight of the OT linear term\nbias=True # estimate a bias\n\not_mapping=ot.da.OTDA_mapping_linear()\not_mapping.fit(xs,xt,mu=mu,eta=eta,bias=bias,numItermax = 20,verbose=True)\n\nxst=ot_mapping.predict(xs) # use the estimated mapping\nxst0=ot_mapping.interp() # use barycentric mapping\n\n\npl.figure(2,(10,7))\npl.clf()\npl.subplot(2,2,1)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.3)\npl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='barycentric mapping')\npl.title(\"barycentric mapping\")\n\npl.subplot(2,2,2)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.3)\npl.scatter(xst[:,0],xst[:,1],c=ys,marker='+',label='Learned mapping')\npl.title(\"Learned mapping\")\n\n\n\n#%% Kernel mapping estimation\n\neta=1e-5 # quadratic regularization for regression\nmu=1e-1 # weight of the OT linear term\nbias=True # estimate a bias\nsigma=1 # sigma bandwidth fot gaussian kernel\n\n\not_mapping_kernel=ot.da.OTDA_mapping_kernel()\not_mapping_kernel.fit(xs,xt,mu=mu,eta=eta,sigma=sigma,bias=bias,numItermax = 10,verbose=True)\n\nxst_kernel=ot_mapping_kernel.predict(xs) # use the estimated mapping\nxst0_kernel=ot_mapping_kernel.interp() # use 
barycentric mapping\n\n\n#%% Plotting the mapped samples\n\npl.figure(2,(10,7))\npl.clf()\npl.subplot(2,2,1)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)\npl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='Mapped source samples')\npl.title(\"Bary. mapping (linear)\")\npl.legend(loc=0)\n\npl.subplot(2,2,2)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)\npl.scatter(xst[:,0],xst[:,1],c=ys,marker='+',label='Learned mapping')\npl.title(\"Estim. mapping (linear)\")\n\npl.subplot(2,2,3)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)\npl.scatter(xst0_kernel[:,0],xst0_kernel[:,1],c=ys,marker='+',label='barycentric mapping')\npl.title(\"Bary. mapping (kernel)\")\n\npl.subplot(2,2,4)\npl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)\npl.scatter(xst_kernel[:,0],xst_kernel[:,1],c=ys,marker='+',label='Learned mapping')\npl.title(\"Estim. mapping (kernel)\")"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 2",
- "name": "python2",
- "language": "python"
- },
- "language_info": {
- "mimetype": "text/x-python",
- "nbconvert_exporter": "python",
- "name": "python",
- "file_extension": ".py",
- "version": "2.7.12",
- "pygments_lexer": "ipython2",
- "codemirror_mode": {
- "version": 2,
- "name": "ipython"
- }
- }
- }
-} \ No newline at end of file
diff --git a/docs/source/auto_examples/plot_OTDA_mapping.py b/docs/source/auto_examples/plot_OTDA_mapping.py
deleted file mode 100644
index 78b57e7..0000000
--- a/docs/source/auto_examples/plot_OTDA_mapping.py
+++ /dev/null
@@ -1,110 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-===============================================
-OT mapping estimation for domain adaptation [8]
-===============================================
-
-[8] M. Perrot, N. Courty, R. Flamary, A. Habrard, "Mapping estimation for
- discrete optimal transport", Neural Information Processing Systems (NIPS), 2016.
-"""
-
-import numpy as np
-import matplotlib.pylab as pl
-import ot
-
-
-
-#%% dataset generation
-
-np.random.seed(0) # makes example reproducible
-
-n=100 # nb samples in source and target datasets
-theta=2*np.pi/20
-nz=0.1
-xs,ys=ot.datasets.get_data_classif('gaussrot',n,nz=nz)
-xt,yt=ot.datasets.get_data_classif('gaussrot',n,theta=theta,nz=nz)
-
-# one of the target modes changes its variance (no linear mapping)
-xt[yt==2]*=3
-xt=xt+4
-
-
-#%% plot samples
-
-pl.figure(1,(8,5))
-pl.clf()
-
-pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
-
-pl.legend(loc=0)
-pl.title('Source and target distributions')
-
-
-
-#%% OT linear mapping estimation
-
-eta=1e-8 # quadratic regularization for regression
-mu=1e0 # weight of the OT linear term
-bias=True # estimate a bias
-
-ot_mapping=ot.da.OTDA_mapping_linear()
-ot_mapping.fit(xs,xt,mu=mu,eta=eta,bias=bias,numItermax = 20,verbose=True)
-
-xst=ot_mapping.predict(xs) # use the estimated mapping
-xst0=ot_mapping.interp() # use barycentric mapping
-
-
-pl.figure(2,(10,7))
-pl.clf()
-pl.subplot(2,2,1)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.3)
-pl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='barycentric mapping')
-pl.title("barycentric mapping")
-
-pl.subplot(2,2,2)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.3)
-pl.scatter(xst[:,0],xst[:,1],c=ys,marker='+',label='Learned mapping')
-pl.title("Learned mapping")
-
-
-
-#%% Kernel mapping estimation
-
-eta=1e-5 # quadratic regularization for regression
-mu=1e-1 # weight of the OT linear term
-bias=True # estimate a bias
-sigma=1   # sigma bandwidth for the Gaussian kernel
-
-
-ot_mapping_kernel=ot.da.OTDA_mapping_kernel()
-ot_mapping_kernel.fit(xs,xt,mu=mu,eta=eta,sigma=sigma,bias=bias,numItermax = 10,verbose=True)
-
-xst_kernel=ot_mapping_kernel.predict(xs) # use the estimated mapping
-xst0_kernel=ot_mapping_kernel.interp() # use barycentric mapping
-
-
-#%% Plotting the mapped samples
-
-pl.figure(2,(10,7))
-pl.clf()
-pl.subplot(2,2,1)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)
-pl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='Mapped source samples')
-pl.title("Bary. mapping (linear)")
-pl.legend(loc=0)
-
-pl.subplot(2,2,2)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)
-pl.scatter(xst[:,0],xst[:,1],c=ys,marker='+',label='Learned mapping')
-pl.title("Estim. mapping (linear)")
-
-pl.subplot(2,2,3)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)
-pl.scatter(xst0_kernel[:,0],xst0_kernel[:,1],c=ys,marker='+',label='barycentric mapping')
-pl.title("Bary. mapping (kernel)")
-
-pl.subplot(2,2,4)
-pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)
-pl.scatter(xst_kernel[:,0],xst_kernel[:,1],c=ys,marker='+',label='Learned mapping')
-pl.title("Estim. mapping (kernel)")
diff --git a/docs/source/auto_examples/plot_OTDA_mapping.rst b/docs/source/auto_examples/plot_OTDA_mapping.rst
deleted file mode 100644
index 18da90d..0000000
--- a/docs/source/auto_examples/plot_OTDA_mapping.rst
+++ /dev/null
@@ -1,186 +0,0 @@
-
-
-.. _sphx_glr_auto_examples_plot_OTDA_mapping.py:
-
-
-===============================================
-OT mapping estimation for domain adaptation [8]
-===============================================
-
-[8] M. Perrot, N. Courty, R. Flamary, A. Habrard, "Mapping estimation for
- discrete optimal transport", Neural Information Processing Systems (NIPS), 2016.
-
-
-
-
-.. rst-class:: sphx-glr-horizontal
-
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_mapping_001.png
- :scale: 47
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_mapping_002.png
- :scale: 47
-
-
-.. rst-class:: sphx-glr-script-out
-
- Out::
-
- It. |Loss |Delta loss
- --------------------------------
- 0|4.009366e+03|0.000000e+00
- 1|3.999933e+03|-2.352753e-03
- 2|3.999520e+03|-1.031984e-04
- 3|3.999362e+03|-3.936391e-05
- 4|3.999281e+03|-2.032868e-05
- 5|3.999238e+03|-1.083083e-05
- 6|3.999229e+03|-2.125291e-06
- It. |Loss |Delta loss
- --------------------------------
- 0|4.026841e+02|0.000000e+00
- 1|3.990791e+02|-8.952439e-03
- 2|3.987954e+02|-7.107124e-04
- 3|3.986554e+02|-3.512453e-04
- 4|3.985721e+02|-2.087997e-04
- 5|3.985141e+02|-1.456184e-04
- 6|3.984729e+02|-1.034624e-04
- 7|3.984435e+02|-7.366943e-05
- 8|3.984199e+02|-5.922497e-05
- 9|3.984016e+02|-4.593063e-05
- 10|3.983867e+02|-3.733061e-05
-
-
-
-
-|
-
-
-.. code-block:: python
-
-
- import numpy as np
- import matplotlib.pylab as pl
- import ot
-
-
-
- #%% dataset generation
-
- np.random.seed(0) # makes example reproducible
-
- n=100 # nb samples in source and target datasets
- theta=2*np.pi/20
- nz=0.1
- xs,ys=ot.datasets.get_data_classif('gaussrot',n,nz=nz)
- xt,yt=ot.datasets.get_data_classif('gaussrot',n,theta=theta,nz=nz)
-
-    # one of the target modes changes its variance (no linear mapping)
- xt[yt==2]*=3
- xt=xt+4
-
-
- #%% plot samples
-
- pl.figure(1,(8,5))
- pl.clf()
-
- pl.scatter(xs[:,0],xs[:,1],c=ys,marker='+',label='Source samples')
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples')
-
- pl.legend(loc=0)
- pl.title('Source and target distributions')
-
-
-
- #%% OT linear mapping estimation
-
- eta=1e-8 # quadratic regularization for regression
- mu=1e0 # weight of the OT linear term
- bias=True # estimate a bias
-
- ot_mapping=ot.da.OTDA_mapping_linear()
- ot_mapping.fit(xs,xt,mu=mu,eta=eta,bias=bias,numItermax = 20,verbose=True)
-
- xst=ot_mapping.predict(xs) # use the estimated mapping
- xst0=ot_mapping.interp() # use barycentric mapping
-
-
- pl.figure(2,(10,7))
- pl.clf()
- pl.subplot(2,2,1)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.3)
- pl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='barycentric mapping')
- pl.title("barycentric mapping")
-
- pl.subplot(2,2,2)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.3)
- pl.scatter(xst[:,0],xst[:,1],c=ys,marker='+',label='Learned mapping')
- pl.title("Learned mapping")
-
-
-
- #%% Kernel mapping estimation
-
- eta=1e-5 # quadratic regularization for regression
- mu=1e-1 # weight of the OT linear term
- bias=True # estimate a bias
-    sigma=1   # sigma bandwidth for the Gaussian kernel
-
-
- ot_mapping_kernel=ot.da.OTDA_mapping_kernel()
- ot_mapping_kernel.fit(xs,xt,mu=mu,eta=eta,sigma=sigma,bias=bias,numItermax = 10,verbose=True)
-
- xst_kernel=ot_mapping_kernel.predict(xs) # use the estimated mapping
- xst0_kernel=ot_mapping_kernel.interp() # use barycentric mapping
-
-
- #%% Plotting the mapped samples
-
- pl.figure(2,(10,7))
- pl.clf()
- pl.subplot(2,2,1)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)
- pl.scatter(xst0[:,0],xst0[:,1],c=ys,marker='+',label='Mapped source samples')
- pl.title("Bary. mapping (linear)")
- pl.legend(loc=0)
-
- pl.subplot(2,2,2)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)
- pl.scatter(xst[:,0],xst[:,1],c=ys,marker='+',label='Learned mapping')
- pl.title("Estim. mapping (linear)")
-
- pl.subplot(2,2,3)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)
- pl.scatter(xst0_kernel[:,0],xst0_kernel[:,1],c=ys,marker='+',label='barycentric mapping')
- pl.title("Bary. mapping (kernel)")
-
- pl.subplot(2,2,4)
- pl.scatter(xt[:,0],xt[:,1],c=yt,marker='o',label='Target samples',alpha=.2)
- pl.scatter(xst_kernel[:,0],xst_kernel[:,1],c=ys,marker='+',label='Learned mapping')
- pl.title("Estim. mapping (kernel)")
-
-**Total running time of the script:** ( 0 minutes 0.882 seconds)
-
-
-
-.. container:: sphx-glr-footer
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Python source code: plot_OTDA_mapping.py <plot_OTDA_mapping.py>`
-
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Jupyter notebook: plot_OTDA_mapping.ipynb <plot_OTDA_mapping.ipynb>`
-
-.. rst-class:: sphx-glr-signature
-
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
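In the mapping example, `ot_mapping.interp()` returns the barycentric mapping of the source samples. Stripped of the OT solver, that mapping is just a coupling-weighted average of target points; the sketch below fabricates a uniform coupling purely to illustrate the formula (in the example the coupling comes from the fitted OT problem):

```python
import numpy as np

# Toy stand-ins for the xs/xt point clouds of the example.
rng = np.random.RandomState(0)
xs = rng.randn(5, 2)
xt = rng.randn(7, 2) + 4

# A valid coupling G between source and target samples. Here it is simply
# uniform; in the example it is the transport plan estimated by OT.
G = np.full((5, 7), 1.0 / (5 * 7))

# Barycentric mapping: each source point is sent to the G-weighted average
# of the target points, i.e. xst = diag(1 / G.sum(1)) @ G @ xt.
xst = (G @ xt) / G.sum(axis=1, keepdims=True)
```

With a uniform coupling every source point lands on the target mean; a sharper coupling moves each source point toward its own matched target samples.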
diff --git a/docs/source/auto_examples/plot_OTDA_mapping_color_images.ipynb b/docs/source/auto_examples/plot_OTDA_mapping_color_images.ipynb
deleted file mode 100644
index 1136cc3..0000000
--- a/docs/source/auto_examples/plot_OTDA_mapping_color_images.ipynb
+++ /dev/null
@@ -1,54 +0,0 @@
-{
- "nbformat_minor": 0,
- "nbformat": 4,
- "cells": [
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "%matplotlib inline"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- },
- {
- "source": [
- "\n====================================================================================\nOT for domain adaptation with image color adaptation [6] with mapping estimation [8]\n====================================================================================\n\n[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized\n discrete optimal transport. SIAM Journal on Imaging Sciences, 7(3), 1853-1882.\n[8] M. Perrot, N. Courty, R. Flamary, A. Habrard, \"Mapping estimation for\n discrete optimal transport\", Neural Information Processing Systems (NIPS), 2016.\n\n\n"
- ],
- "cell_type": "markdown",
- "metadata": {}
- },
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "import numpy as np\nimport scipy.ndimage as spi\nimport matplotlib.pylab as pl\nimport ot\n\n\n#%% Loading images\n\nI1=spi.imread('../data/ocean_day.jpg').astype(np.float64)/256\nI2=spi.imread('../data/ocean_sunset.jpg').astype(np.float64)/256\n\n#%% Plot images\n\npl.figure(1)\n\npl.subplot(1,2,1)\npl.imshow(I1)\npl.title('Image 1')\n\npl.subplot(1,2,2)\npl.imshow(I2)\npl.title('Image 2')\n\npl.show()\n\n#%% Image conversion and dataset generation\n\ndef im2mat(I):\n \"\"\"Converts and image to matrix (one pixel per line)\"\"\"\n return I.reshape((I.shape[0]*I.shape[1],I.shape[2]))\n\ndef mat2im(X,shape):\n \"\"\"Converts back a matrix to an image\"\"\"\n return X.reshape(shape)\n\nX1=im2mat(I1)\nX2=im2mat(I2)\n\n# training samples\nnb=1000\nidx1=np.random.randint(X1.shape[0],size=(nb,))\nidx2=np.random.randint(X2.shape[0],size=(nb,))\n\nxs=X1[idx1,:]\nxt=X2[idx2,:]\n\n#%% Plot image distributions\n\n\npl.figure(2,(10,5))\n\npl.subplot(1,2,1)\npl.scatter(xs[:,0],xs[:,2],c=xs)\npl.axis([0,1,0,1])\npl.xlabel('Red')\npl.ylabel('Blue')\npl.title('Image 1')\n\npl.subplot(1,2,2)\n#pl.imshow(I2)\npl.scatter(xt[:,0],xt[:,2],c=xt)\npl.axis([0,1,0,1])\npl.xlabel('Red')\npl.ylabel('Blue')\npl.title('Image 2')\n\npl.show()\n\n\n\n#%% domain adaptation between images\ndef minmax(I):\n return np.minimum(np.maximum(I,0),1)\n# LP problem\nda_emd=ot.da.OTDA() # init class\nda_emd.fit(xs,xt) # fit distributions\n\nX1t=da_emd.predict(X1) # out of sample\nI1t=minmax(mat2im(X1t,I1.shape))\n\n# sinkhorn regularization\nlambd=1e-1\nda_entrop=ot.da.OTDA_sinkhorn()\nda_entrop.fit(xs,xt,reg=lambd)\n\nX1te=da_entrop.predict(X1)\nI1te=minmax(mat2im(X1te,I1.shape))\n\n# linear mapping estimation\neta=1e-8 # quadratic regularization for regression\nmu=1e0 # weight of the OT linear term\nbias=True # estimate a bias\n\not_mapping=ot.da.OTDA_mapping_linear()\not_mapping.fit(xs,xt,mu=mu,eta=eta,bias=bias,numItermax = 20,verbose=True)\n\nX1tl=ot_mapping.predict(X1) # use the estimated 
mapping\nI1tl=minmax(mat2im(X1tl,I1.shape))\n\n# nonlinear mapping estimation\neta=1e-2 # quadratic regularization for regression\nmu=1e0 # weight of the OT linear term\nbias=False # estimate a bias\nsigma=1 # sigma bandwidth fot gaussian kernel\n\n\not_mapping_kernel=ot.da.OTDA_mapping_kernel()\not_mapping_kernel.fit(xs,xt,mu=mu,eta=eta,sigma=sigma,bias=bias,numItermax = 10,verbose=True)\n\nX1tn=ot_mapping_kernel.predict(X1) # use the estimated mapping\nI1tn=minmax(mat2im(X1tn,I1.shape))\n#%% plot images\n\n\npl.figure(2,(10,8))\n\npl.subplot(2,3,1)\n\npl.imshow(I1)\npl.title('Im. 1')\n\npl.subplot(2,3,2)\n\npl.imshow(I2)\npl.title('Im. 2')\n\n\npl.subplot(2,3,3)\npl.imshow(I1t)\npl.title('Im. 1 Interp LP')\n\npl.subplot(2,3,4)\npl.imshow(I1te)\npl.title('Im. 1 Interp Entrop')\n\n\npl.subplot(2,3,5)\npl.imshow(I1tl)\npl.title('Im. 1 Linear mapping')\n\npl.subplot(2,3,6)\npl.imshow(I1tn)\npl.title('Im. 1 nonlinear mapping')\n\npl.show()"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 2",
- "name": "python2",
- "language": "python"
- },
- "language_info": {
- "mimetype": "text/x-python",
- "nbconvert_exporter": "python",
- "name": "python",
- "file_extension": ".py",
- "version": "2.7.12",
- "pygments_lexer": "ipython2",
- "codemirror_mode": {
- "version": 2,
- "name": "ipython"
- }
- }
- }
-} \ No newline at end of file
diff --git a/docs/source/auto_examples/plot_OTDA_mapping_color_images.py b/docs/source/auto_examples/plot_OTDA_mapping_color_images.py
deleted file mode 100644
index f07dc6c..0000000
--- a/docs/source/auto_examples/plot_OTDA_mapping_color_images.py
+++ /dev/null
@@ -1,158 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-====================================================================================
-OT for domain adaptation with image color adaptation [6] with mapping estimation [8]
-====================================================================================
-
-[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized
- discrete optimal transport. SIAM Journal on Imaging Sciences, 7(3), 1853-1882.
-[8] M. Perrot, N. Courty, R. Flamary, A. Habrard, "Mapping estimation for
- discrete optimal transport", Neural Information Processing Systems (NIPS), 2016.
-
-"""
-
-import numpy as np
-import scipy.ndimage as spi
-import matplotlib.pylab as pl
-import ot
-
-
-#%% Loading images
-
-I1=spi.imread('../data/ocean_day.jpg').astype(np.float64)/256
-I2=spi.imread('../data/ocean_sunset.jpg').astype(np.float64)/256
-
-#%% Plot images
-
-pl.figure(1)
-
-pl.subplot(1,2,1)
-pl.imshow(I1)
-pl.title('Image 1')
-
-pl.subplot(1,2,2)
-pl.imshow(I2)
-pl.title('Image 2')
-
-pl.show()
-
-#%% Image conversion and dataset generation
-
-def im2mat(I):
-    """Converts an image to a matrix (one pixel per line)"""
- return I.reshape((I.shape[0]*I.shape[1],I.shape[2]))
-
-def mat2im(X,shape):
-    """Converts a matrix back to an image"""
- return X.reshape(shape)
-
-X1=im2mat(I1)
-X2=im2mat(I2)
-
-# training samples
-nb=1000
-idx1=np.random.randint(X1.shape[0],size=(nb,))
-idx2=np.random.randint(X2.shape[0],size=(nb,))
-
-xs=X1[idx1,:]
-xt=X2[idx2,:]
-
-#%% Plot image distributions
-
-
-pl.figure(2,(10,5))
-
-pl.subplot(1,2,1)
-pl.scatter(xs[:,0],xs[:,2],c=xs)
-pl.axis([0,1,0,1])
-pl.xlabel('Red')
-pl.ylabel('Blue')
-pl.title('Image 1')
-
-pl.subplot(1,2,2)
-#pl.imshow(I2)
-pl.scatter(xt[:,0],xt[:,2],c=xt)
-pl.axis([0,1,0,1])
-pl.xlabel('Red')
-pl.ylabel('Blue')
-pl.title('Image 2')
-
-pl.show()
-
-
-
-#%% domain adaptation between images
-def minmax(I):
- return np.minimum(np.maximum(I,0),1)
-# LP problem
-da_emd=ot.da.OTDA() # init class
-da_emd.fit(xs,xt) # fit distributions
-
-X1t=da_emd.predict(X1) # out of sample
-I1t=minmax(mat2im(X1t,I1.shape))
-
-# sinkhorn regularization
-lambd=1e-1
-da_entrop=ot.da.OTDA_sinkhorn()
-da_entrop.fit(xs,xt,reg=lambd)
-
-X1te=da_entrop.predict(X1)
-I1te=minmax(mat2im(X1te,I1.shape))
-
-# linear mapping estimation
-eta=1e-8 # quadratic regularization for regression
-mu=1e0 # weight of the OT linear term
-bias=True # estimate a bias
-
-ot_mapping=ot.da.OTDA_mapping_linear()
-ot_mapping.fit(xs,xt,mu=mu,eta=eta,bias=bias,numItermax = 20,verbose=True)
-
-X1tl=ot_mapping.predict(X1) # use the estimated mapping
-I1tl=minmax(mat2im(X1tl,I1.shape))
-
-# nonlinear mapping estimation
-eta=1e-2 # quadratic regularization for regression
-mu=1e0 # weight of the OT linear term
-bias=False # do not estimate a bias
-sigma=1   # sigma bandwidth for the Gaussian kernel
-
-
-ot_mapping_kernel=ot.da.OTDA_mapping_kernel()
-ot_mapping_kernel.fit(xs,xt,mu=mu,eta=eta,sigma=sigma,bias=bias,numItermax = 10,verbose=True)
-
-X1tn=ot_mapping_kernel.predict(X1) # use the estimated mapping
-I1tn=minmax(mat2im(X1tn,I1.shape))
-#%% plot images
-
-
-pl.figure(2,(10,8))
-
-pl.subplot(2,3,1)
-
-pl.imshow(I1)
-pl.title('Im. 1')
-
-pl.subplot(2,3,2)
-
-pl.imshow(I2)
-pl.title('Im. 2')
-
-
-pl.subplot(2,3,3)
-pl.imshow(I1t)
-pl.title('Im. 1 Interp LP')
-
-pl.subplot(2,3,4)
-pl.imshow(I1te)
-pl.title('Im. 1 Interp Entrop')
-
-
-pl.subplot(2,3,5)
-pl.imshow(I1tl)
-pl.title('Im. 1 Linear mapping')
-
-pl.subplot(2,3,6)
-pl.imshow(I1tn)
-pl.title('Im. 1 nonlinear mapping')
-
-pl.show()
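The nonlinear variant above maps pixels through a Gaussian kernel with bandwidth `sigma`. POT's internal normalization may differ, but a standard RBF kernel matrix looks like this (a sketch, not POT's actual implementation):

```python
import numpy as np


def gauss_kernel(X, Y, sigma=1.0):
    # RBF kernel K[i, j] = exp(-||X_i - Y_j||^2 / (2 * sigma**2)).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


rng = np.random.RandomState(0)
X = rng.randn(4, 3)  # e.g. 4 RGB pixels
K = gauss_kernel(X, X, sigma=1.0)
```

A larger `sigma` makes the kernel flatter (more pixels influence each other), while a small `sigma` makes the learned mapping more local.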
diff --git a/docs/source/auto_examples/plot_OTDA_mapping_color_images.rst b/docs/source/auto_examples/plot_OTDA_mapping_color_images.rst
deleted file mode 100644
index 60be3a4..0000000
--- a/docs/source/auto_examples/plot_OTDA_mapping_color_images.rst
+++ /dev/null
@@ -1,246 +0,0 @@
-
-
-.. _sphx_glr_auto_examples_plot_OTDA_mapping_color_images.py:
-
-
-====================================================================================
-OT for domain adaptation with image color adaptation [6] with mapping estimation [8]
-====================================================================================
-
-[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized
- discrete optimal transport. SIAM Journal on Imaging Sciences, 7(3), 1853-1882.
-[8] M. Perrot, N. Courty, R. Flamary, A. Habrard, "Mapping estimation for
- discrete optimal transport", Neural Information Processing Systems (NIPS), 2016.
-
-
-
-
-
-.. rst-class:: sphx-glr-horizontal
-
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_mapping_color_images_001.png
- :scale: 47
-
- *
-
- .. image:: /auto_examples/images/sphx_glr_plot_OTDA_mapping_color_images_002.png
- :scale: 47
-
-
-.. rst-class:: sphx-glr-script-out
-
- Out::
-
- It. |Loss |Delta loss
- --------------------------------
- 0|3.624802e+02|0.000000e+00
- 1|3.547180e+02|-2.141395e-02
- 2|3.545494e+02|-4.753955e-04
- 3|3.544646e+02|-2.391784e-04
- 4|3.544126e+02|-1.466280e-04
- 5|3.543775e+02|-9.921805e-05
- 6|3.543518e+02|-7.245828e-05
- 7|3.543323e+02|-5.491924e-05
- 8|3.543170e+02|-4.342401e-05
- 9|3.543046e+02|-3.472174e-05
- 10|3.542945e+02|-2.878681e-05
- 11|3.542859e+02|-2.417065e-05
- 12|3.542786e+02|-2.058131e-05
- 13|3.542723e+02|-1.768262e-05
- 14|3.542668e+02|-1.551616e-05
- 15|3.542620e+02|-1.371909e-05
- 16|3.542577e+02|-1.213326e-05
- 17|3.542538e+02|-1.085481e-05
- 18|3.542531e+02|-1.996006e-06
- It. |Loss |Delta loss
- --------------------------------
- 0|3.555768e+02|0.000000e+00
- 1|3.510071e+02|-1.285164e-02
- 2|3.509110e+02|-2.736701e-04
- 3|3.508748e+02|-1.031476e-04
- 4|3.508506e+02|-6.910585e-05
- 5|3.508330e+02|-5.014608e-05
- 6|3.508195e+02|-3.839166e-05
- 7|3.508090e+02|-3.004218e-05
- 8|3.508005e+02|-2.417627e-05
- 9|3.507935e+02|-2.004621e-05
- 10|3.507876e+02|-1.681731e-05
-
-
-
-
-|
-
-
-.. code-block:: python
-
-
- import numpy as np
- import scipy.ndimage as spi
- import matplotlib.pylab as pl
- import ot
-
-
- #%% Loading images
-
- I1=spi.imread('../data/ocean_day.jpg').astype(np.float64)/256
- I2=spi.imread('../data/ocean_sunset.jpg').astype(np.float64)/256
-
- #%% Plot images
-
- pl.figure(1)
-
- pl.subplot(1,2,1)
- pl.imshow(I1)
- pl.title('Image 1')
-
- pl.subplot(1,2,2)
- pl.imshow(I2)
- pl.title('Image 2')
-
- pl.show()
-
- #%% Image conversion and dataset generation
-
- def im2mat(I):
-        """Converts an image to a matrix (one pixel per line)"""
- return I.reshape((I.shape[0]*I.shape[1],I.shape[2]))
-
- def mat2im(X,shape):
-        """Converts a matrix back to an image"""
- return X.reshape(shape)
-
- X1=im2mat(I1)
- X2=im2mat(I2)
-
- # training samples
- nb=1000
- idx1=np.random.randint(X1.shape[0],size=(nb,))
- idx2=np.random.randint(X2.shape[0],size=(nb,))
-
- xs=X1[idx1,:]
- xt=X2[idx2,:]
-
- #%% Plot image distributions
-
-
- pl.figure(2,(10,5))
-
- pl.subplot(1,2,1)
- pl.scatter(xs[:,0],xs[:,2],c=xs)
- pl.axis([0,1,0,1])
- pl.xlabel('Red')
- pl.ylabel('Blue')
- pl.title('Image 1')
-
- pl.subplot(1,2,2)
- #pl.imshow(I2)
- pl.scatter(xt[:,0],xt[:,2],c=xt)
- pl.axis([0,1,0,1])
- pl.xlabel('Red')
- pl.ylabel('Blue')
- pl.title('Image 2')
-
- pl.show()
-
-
-
- #%% domain adaptation between images
- def minmax(I):
- return np.minimum(np.maximum(I,0),1)
- # LP problem
- da_emd=ot.da.OTDA() # init class
- da_emd.fit(xs,xt) # fit distributions
-
- X1t=da_emd.predict(X1) # out of sample
- I1t=minmax(mat2im(X1t,I1.shape))
-
- # sinkhorn regularization
- lambd=1e-1
- da_entrop=ot.da.OTDA_sinkhorn()
- da_entrop.fit(xs,xt,reg=lambd)
-
- X1te=da_entrop.predict(X1)
- I1te=minmax(mat2im(X1te,I1.shape))
-
- # linear mapping estimation
- eta=1e-8 # quadratic regularization for regression
- mu=1e0 # weight of the OT linear term
- bias=True # estimate a bias
-
- ot_mapping=ot.da.OTDA_mapping_linear()
- ot_mapping.fit(xs,xt,mu=mu,eta=eta,bias=bias,numItermax = 20,verbose=True)
-
- X1tl=ot_mapping.predict(X1) # use the estimated mapping
- I1tl=minmax(mat2im(X1tl,I1.shape))
-
- # nonlinear mapping estimation
- eta=1e-2 # quadratic regularization for regression
- mu=1e0 # weight of the OT linear term
-    bias=False # do not estimate a bias
-    sigma=1   # sigma bandwidth for the Gaussian kernel
-
-
- ot_mapping_kernel=ot.da.OTDA_mapping_kernel()
- ot_mapping_kernel.fit(xs,xt,mu=mu,eta=eta,sigma=sigma,bias=bias,numItermax = 10,verbose=True)
-
- X1tn=ot_mapping_kernel.predict(X1) # use the estimated mapping
- I1tn=minmax(mat2im(X1tn,I1.shape))
- #%% plot images
-
-
- pl.figure(2,(10,8))
-
- pl.subplot(2,3,1)
-
- pl.imshow(I1)
- pl.title('Im. 1')
-
- pl.subplot(2,3,2)
-
- pl.imshow(I2)
- pl.title('Im. 2')
-
-
- pl.subplot(2,3,3)
- pl.imshow(I1t)
- pl.title('Im. 1 Interp LP')
-
- pl.subplot(2,3,4)
- pl.imshow(I1te)
- pl.title('Im. 1 Interp Entrop')
-
-
- pl.subplot(2,3,5)
- pl.imshow(I1tl)
- pl.title('Im. 1 Linear mapping')
-
- pl.subplot(2,3,6)
- pl.imshow(I1tn)
- pl.title('Im. 1 nonlinear mapping')
-
- pl.show()
-
-**Total running time of the script:** ( 1 minutes 59.537 seconds)
-
-
-
-.. container:: sphx-glr-footer
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Python source code: plot_OTDA_mapping_color_images.py <plot_OTDA_mapping_color_images.py>`
-
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Jupyter notebook: plot_OTDA_mapping_color_images.ipynb <plot_OTDA_mapping_color_images.ipynb>`
-
-.. rst-class:: sphx-glr-signature
-
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
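Both the `OTDA_sinkhorn` examples above and the `ot.sinkhorn` call in the 1D example rely on entropic regularization solved by Sinkhorn's matrix scaling. A minimal NumPy version of the iteration follows; this is a pedagogical sketch with a fixed iteration count, not POT's implementation, which also handles convergence checks and numerical stabilization:

```python
import numpy as np


def sinkhorn(a, b, M, reg, n_iter=500):
    # Entropic OT by Sinkhorn scaling: find u, v such that
    # diag(u) @ K @ diag(v) has marginals a and b, with K = exp(-M / reg).
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]  # transport plan


# Two uniform histograms on a 1D grid with squared-distance cost.
n = 20
x = np.arange(n, dtype=np.float64)
a = np.full(n, 1.0 / n)
b = np.full(n, 1.0 / n)
M = (x[:, None] - x[None, :]) ** 2
M /= M.max()

G = sinkhorn(a, b, M, reg=1e-1)
```

Smaller `reg` yields a plan closer to the unregularized EMD solution, at the price of more iterations and more delicate numerics.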
diff --git a/docs/source/auto_examples/plot_OT_1D.ipynb b/docs/source/auto_examples/plot_OT_1D.ipynb
index 8715b97..26748c2 100644
--- a/docs/source/auto_examples/plot_OT_1D.ipynb
+++ b/docs/source/auto_examples/plot_OT_1D.ipynb
@@ -15,7 +15,7 @@
},
{
"source": [
- "\n# 1D optimal transport\n\n\n@author: rflamary\n\n"
+ "\n# 1D optimal transport\n\n\nThis example illustrates the computation of EMD and Sinkhorn transport plans\nand their visualization.\n\n\n"
],
"cell_type": "markdown",
"metadata": {}
@@ -24,7 +24,79 @@
"execution_count": null,
"cell_type": "code",
"source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\nfrom ot.datasets import get_1D_gauss as gauss\n\n\n#%% parameters\n\nn=100 # nb bins\n\n# bin positions\nx=np.arange(n,dtype=np.float64)\n\n# Gaussian distributions\na=gauss(n,m=20,s=5) # m= mean, s= std\nb=gauss(n,m=60,s=10)\n\n# loss matrix\nM=ot.dist(x.reshape((n,1)),x.reshape((n,1)))\nM/=M.max()\n\n#%% plot the distributions\n\npl.figure(1)\npl.plot(x,a,'b',label='Source distribution')\npl.plot(x,b,'r',label='Target distribution')\npl.legend()\n\n#%% plot distributions and loss matrix\n\npl.figure(2)\not.plot.plot1D_mat(a,b,M,'Cost matrix M')\n\n#%% EMD\n\nG0=ot.emd(a,b,M)\n\npl.figure(3)\not.plot.plot1D_mat(a,b,G0,'OT matrix G0')\n\n#%% Sinkhorn\n\nlambd=1e-3\nGs=ot.sinkhorn(a,b,M,lambd,verbose=True)\n\npl.figure(4)\not.plot.plot1D_mat(a,b,Gs,'OT matrix Sinkhorn')"
+ "# Author: Remi Flamary <remi.flamary@unice.fr>\n#\n# License: MIT License\n\nimport numpy as np\nimport matplotlib.pylab as pl\nimport ot\nfrom ot.datasets import get_1D_gauss as gauss"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% parameters\n\nn = 100 # nb bins\n\n# bin positions\nx = np.arange(n, dtype=np.float64)\n\n# Gaussian distributions\na = gauss(n, m=20, s=5) # m= mean, s= std\nb = gauss(n, m=60, s=10)\n\n# loss matrix\nM = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)))\nM /= M.max()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot distributions and loss matrix\n----------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% plot the distributions\n\npl.figure(1, figsize=(6.4, 3))\npl.plot(x, a, 'b', label='Source distribution')\npl.plot(x, b, 'r', label='Target distribution')\npl.legend()\n\n#%% plot distributions and loss matrix\n\npl.figure(2, figsize=(5, 5))\not.plot.plot1D_mat(a, b, M, 'Cost matrix M')"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Solve EMD\n---------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% EMD\n\nG0 = ot.emd(a, b, M)\n\npl.figure(3, figsize=(5, 5))\not.plot.plot1D_mat(a, b, G0, 'OT matrix G0')"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Solve Sinkhorn\n--------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% Sinkhorn\n\nlambd = 1e-3\nGs = ot.sinkhorn(a, b, M, lambd, verbose=True)\n\npl.figure(4, figsize=(5, 5))\not.plot.plot1D_mat(a, b, Gs, 'OT matrix Sinkhorn')\n\npl.show()"
],
"outputs": [],
"metadata": {
diff --git a/docs/source/auto_examples/plot_OT_1D.py b/docs/source/auto_examples/plot_OT_1D.py
index 6661aa3..719058f 100644
--- a/docs/source/auto_examples/plot_OT_1D.py
+++ b/docs/source/auto_examples/plot_OT_1D.py
@@ -4,53 +4,80 @@
1D optimal transport
====================
-@author: rflamary
+This example illustrates the computation of EMD and Sinkhorn transport plans
+and their visualization.
+
"""
+# Author: Remi Flamary <remi.flamary@unice.fr>
+#
+# License: MIT License
+
import numpy as np
import matplotlib.pylab as pl
import ot
from ot.datasets import get_1D_gauss as gauss
+##############################################################################
+# Generate data
+# -------------
+
#%% parameters
-n=100 # nb bins
+n = 100 # nb bins
# bin positions
-x=np.arange(n,dtype=np.float64)
+x = np.arange(n, dtype=np.float64)
# Gaussian distributions
-a=gauss(n,m=20,s=5) # m= mean, s= std
-b=gauss(n,m=60,s=10)
+a = gauss(n, m=20, s=5) # m= mean, s= std
+b = gauss(n, m=60, s=10)
# loss matrix
-M=ot.dist(x.reshape((n,1)),x.reshape((n,1)))
-M/=M.max()
+M = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)))
+M /= M.max()
+
+
+##############################################################################
+# Plot distributions and loss matrix
+# ----------------------------------
#%% plot the distributions
-pl.figure(1)
-pl.plot(x,a,'b',label='Source distribution')
-pl.plot(x,b,'r',label='Target distribution')
+pl.figure(1, figsize=(6.4, 3))
+pl.plot(x, a, 'b', label='Source distribution')
+pl.plot(x, b, 'r', label='Target distribution')
pl.legend()
#%% plot distributions and loss matrix
-pl.figure(2)
-ot.plot.plot1D_mat(a,b,M,'Cost matrix M')
+pl.figure(2, figsize=(5, 5))
+ot.plot.plot1D_mat(a, b, M, 'Cost matrix M')
+
+##############################################################################
+# Solve EMD
+# ---------
+
#%% EMD
-G0=ot.emd(a,b,M)
+G0 = ot.emd(a, b, M)
+
+pl.figure(3, figsize=(5, 5))
+ot.plot.plot1D_mat(a, b, G0, 'OT matrix G0')
+
+##############################################################################
+# Solve Sinkhorn
+# --------------
-pl.figure(3)
-ot.plot.plot1D_mat(a,b,G0,'OT matrix G0')
#%% Sinkhorn
-lambd=1e-3
-Gs=ot.sinkhorn(a,b,M,lambd,verbose=True)
+lambd = 1e-3
+Gs = ot.sinkhorn(a, b, M, lambd, verbose=True)
+
+pl.figure(4, figsize=(5, 5))
+ot.plot.plot1D_mat(a, b, Gs, 'OT matrix Sinkhorn')
-pl.figure(4)
-ot.plot.plot1D_mat(a,b,Gs,'OT matrix Sinkhorn')
+pl.show()
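The updated example above calls `ot.sinkhorn` from POT. For readers without POT installed, the alternating-scaling update that Sinkhorn's method performs can be sketched in plain NumPy — an illustrative re-implementation, not the library routine, using a larger regularization (`reg=1e-2`) than the example's `1e-3` because this naive version has none of the stabilization a very small `reg` requires:

```python
import numpy as np

def sinkhorn_sketch(a, b, M, reg, n_iter=1000):
    """Entropic OT plan via alternating marginal scalings (Sinkhorn-Knopp)."""
    K = np.exp(-M / reg)          # Gibbs kernel of the cost matrix
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        v = b / (K.T @ u)         # rescale columns toward marginal b
        u = a / (K @ v)           # rescale rows toward marginal a
    return u[:, None] * K * v[None, :]

# Same setup as the example: two Gaussian histograms on 100 bins,
# squared-distance cost normalized to [0, 1].
n = 100
x = np.arange(n, dtype=np.float64)
a = np.exp(-(x - 20) ** 2 / (2 * 5 ** 2)); a /= a.sum()
b = np.exp(-(x - 60) ** 2 / (2 * 10 ** 2)); b /= b.sum()
M = (x[:, None] - x[None, :]) ** 2
M /= M.max()

Gs = sinkhorn_sketch(a, b, M, reg=1e-2)
```

After convergence both marginals of `Gs` match `a` and `b`, which is the fixed point the iteration targets.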
diff --git a/docs/source/auto_examples/plot_OT_1D.rst b/docs/source/auto_examples/plot_OT_1D.rst
index 44b715b..b91916e 100644
--- a/docs/source/auto_examples/plot_OT_1D.rst
+++ b/docs/source/auto_examples/plot_OT_1D.rst
@@ -7,7 +7,80 @@
1D optimal transport
====================
-@author: rflamary
+This example illustrates the computation of EMD and Sinkhorn transport plans
+and their visualization.
+
+
+
+
+.. code-block:: python
+
+
+ # Author: Remi Flamary <remi.flamary@unice.fr>
+ #
+ # License: MIT License
+
+ import numpy as np
+ import matplotlib.pylab as pl
+ import ot
+ from ot.datasets import get_1D_gauss as gauss
+
+
+
+
+
+
+
+Generate data
+-------------
+
+
+
+.. code-block:: python
+
+
+
+ #%% parameters
+
+ n = 100 # nb bins
+
+ # bin positions
+ x = np.arange(n, dtype=np.float64)
+
+ # Gaussian distributions
+ a = gauss(n, m=20, s=5) # m= mean, s= std
+ b = gauss(n, m=60, s=10)
+
+ # loss matrix
+ M = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)))
+ M /= M.max()
+
+
+
+
+
+
+
+
+Plot distributions and loss matrix
+----------------------------------
+
+
+
+.. code-block:: python
+
+
+ #%% plot the distributions
+
+ pl.figure(1, figsize=(6.4, 3))
+ pl.plot(x, a, 'b', label='Source distribution')
+ pl.plot(x, b, 'r', label='Target distribution')
+ pl.legend()
+
+ #%% plot distributions and loss matrix
+
+ pl.figure(2, figsize=(5, 5))
+ ot.plot.plot1D_mat(a, b, M, 'Cost matrix M')
@@ -25,94 +98,80 @@
.. image:: /auto_examples/images/sphx_glr_plot_OT_1D_002.png
:scale: 47
- *
- .. image:: /auto_examples/images/sphx_glr_plot_OT_1D_003.png
- :scale: 47
- *
- .. image:: /auto_examples/images/sphx_glr_plot_OT_1D_004.png
- :scale: 47
+Solve EMD
+---------
-.. rst-class:: sphx-glr-script-out
- Out::
+.. code-block:: python
- It. |Err
- -------------------
- 0|8.187970e-02|
- 10|3.460174e-02|
- 20|6.633335e-03|
- 30|9.797798e-04|
- 40|1.389606e-04|
- 50|1.959016e-05|
- 60|2.759079e-06|
- 70|3.885166e-07|
- 80|5.470605e-08|
- 90|7.702918e-09|
- 100|1.084609e-09|
- 110|1.527180e-10|
+ #%% EMD
+ G0 = ot.emd(a, b, M)
-|
+ pl.figure(3, figsize=(5, 5))
+ ot.plot.plot1D_mat(a, b, G0, 'OT matrix G0')
-.. code-block:: python
- import numpy as np
- import matplotlib.pylab as pl
- import ot
- from ot.datasets import get_1D_gauss as gauss
+.. image:: /auto_examples/images/sphx_glr_plot_OT_1D_005.png
+ :align: center
- #%% parameters
- n=100 # nb bins
- # bin positions
- x=np.arange(n,dtype=np.float64)
+Solve Sinkhorn
+--------------
- # Gaussian distributions
- a=gauss(n,m=20,s=5) # m= mean, s= std
- b=gauss(n,m=60,s=10)
- # loss matrix
- M=ot.dist(x.reshape((n,1)),x.reshape((n,1)))
- M/=M.max()
- #%% plot the distributions
+.. code-block:: python
- pl.figure(1)
- pl.plot(x,a,'b',label='Source distribution')
- pl.plot(x,b,'r',label='Target distribution')
- pl.legend()
- #%% plot distributions and loss matrix
- pl.figure(2)
- ot.plot.plot1D_mat(a,b,M,'Cost matrix M')
+ #%% Sinkhorn
- #%% EMD
+ lambd = 1e-3
+ Gs = ot.sinkhorn(a, b, M, lambd, verbose=True)
- G0=ot.emd(a,b,M)
+ pl.figure(4, figsize=(5, 5))
+ ot.plot.plot1D_mat(a, b, Gs, 'OT matrix Sinkhorn')
- pl.figure(3)
- ot.plot.plot1D_mat(a,b,G0,'OT matrix G0')
+ pl.show()
- #%% Sinkhorn
- lambd=1e-3
- Gs=ot.sinkhorn(a,b,M,lambd,verbose=True)
- pl.figure(4)
- ot.plot.plot1D_mat(a,b,Gs,'OT matrix Sinkhorn')
+.. image:: /auto_examples/images/sphx_glr_plot_OT_1D_007.png
+ :align: center
+
+
+.. rst-class:: sphx-glr-script-out
+
+ Out::
+
+ It. |Err
+ -------------------
+ 0|8.187970e-02|
+ 10|3.460174e-02|
+ 20|6.633335e-03|
+ 30|9.797798e-04|
+ 40|1.389606e-04|
+ 50|1.959016e-05|
+ 60|2.759079e-06|
+ 70|3.885166e-07|
+ 80|5.470605e-08|
+ 90|7.702918e-09|
+ 100|1.084609e-09|
+ 110|1.527180e-10|
+
-**Total running time of the script:** ( 0 minutes 0.674 seconds)
+**Total running time of the script:** ( 0 minutes 0.748 seconds)
@@ -131,4 +190,4 @@
.. rst-class:: sphx-glr-signature
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
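The 1D setting of this example also admits a well-known shortcut: for distributions on the line with the absolute-difference cost |x - y| (not the squared cost the example uses), the EMD value equals the L1 distance between the two cumulative distribution functions, so no solver is needed. A numpy-only sketch on the same pair of Gaussian histograms; `emd_1d` is a hypothetical helper name, not a POT function:

```python
import numpy as np

def emd_1d(x, a, b):
    # Closed-form W1 on a shared sorted grid: integrate |CDF_a - CDF_b|.
    cdf_gap = np.abs(np.cumsum(a) - np.cumsum(b))
    dx = np.diff(x, append=x[-1])   # right-side bin widths; last width is 0
    return np.sum(cdf_gap * dx)

x = np.arange(100, dtype=np.float64)
a = np.exp(-(x - 20) ** 2 / (2 * 5 ** 2)); a /= a.sum()    # mean 20, std 5
b = np.exp(-(x - 60) ** 2 / (2 * 10 ** 2)); b /= b.sum()   # mean 60, std 10

d = emd_1d(x, a, b)   # close to 40, the distance between the two means
```

For two Gaussians whose quantile functions do not cross, W1 reduces to the difference of the means, which is a quick sanity check on the result.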
diff --git a/docs/source/auto_examples/plot_OT_2D_samples.ipynb b/docs/source/auto_examples/plot_OT_2D_samples.ipynb
index fad0467..41a37f3 100644
--- a/docs/source/auto_examples/plot_OT_2D_samples.ipynb
+++ b/docs/source/auto_examples/plot_OT_2D_samples.ipynb
@@ -15,7 +15,7 @@
},
{
"source": [
- "\n# 2D Optimal transport between empirical distributions\n\n\n@author: rflamary\n\n"
+ "\n# 2D Optimal transport between empirical distributions\n\n\nIllustration of 2D optimal transport between distributions that are weighted\nsums of Diracs. The OT matrix is plotted with the samples.\n\n\n"
],
"cell_type": "markdown",
"metadata": {}
@@ -24,7 +24,79 @@
"execution_count": null,
"cell_type": "code",
"source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\n\n#%% parameters and data generation\n\nn=50 # nb samples\n\nmu_s=np.array([0,0])\ncov_s=np.array([[1,0],[0,1]])\n\nmu_t=np.array([4,4])\ncov_t=np.array([[1,-.8],[-.8,1]])\n\nxs=ot.datasets.get_2D_samples_gauss(n,mu_s,cov_s)\nxt=ot.datasets.get_2D_samples_gauss(n,mu_t,cov_t)\n\na,b = ot.unif(n),ot.unif(n) # uniform distribution on samples\n\n# loss matrix\nM=ot.dist(xs,xt)\nM/=M.max()\n\n#%% plot samples\n\npl.figure(1)\npl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')\npl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')\npl.legend(loc=0)\npl.title('Source and traget distributions')\n\npl.figure(2)\npl.imshow(M,interpolation='nearest')\npl.title('Cost matrix M')\n\n\n#%% EMD\n\nG0=ot.emd(a,b,M)\n\npl.figure(3)\npl.imshow(G0,interpolation='nearest')\npl.title('OT matrix G0')\n\npl.figure(4)\not.plot.plot2D_samples_mat(xs,xt,G0,c=[.5,.5,1])\npl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')\npl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')\npl.legend(loc=0)\npl.title('OT matrix with samples')\n\n\n#%% sinkhorn\n\n# reg term\nlambd=5e-4\n\nGs=ot.sinkhorn(a,b,M,lambd)\n\npl.figure(5)\npl.imshow(Gs,interpolation='nearest')\npl.title('OT matrix sinkhorn')\n\npl.figure(6)\not.plot.plot2D_samples_mat(xs,xt,Gs,color=[.5,.5,1])\npl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')\npl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')\npl.legend(loc=0)\npl.title('OT matrix Sinkhorn with samples')"
+ "# Author: Remi Flamary <remi.flamary@unice.fr>\n#\n# License: MIT License\n\nimport numpy as np\nimport matplotlib.pylab as pl\nimport ot"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% parameters and data generation\n\nn = 50 # nb samples\n\nmu_s = np.array([0, 0])\ncov_s = np.array([[1, 0], [0, 1]])\n\nmu_t = np.array([4, 4])\ncov_t = np.array([[1, -.8], [-.8, 1]])\n\nxs = ot.datasets.get_2D_samples_gauss(n, mu_s, cov_s)\nxt = ot.datasets.get_2D_samples_gauss(n, mu_t, cov_t)\n\na, b = np.ones((n,)) / n, np.ones((n,)) / n # uniform distribution on samples\n\n# loss matrix\nM = ot.dist(xs, xt)\nM /= M.max()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot data\n---------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% plot samples\n\npl.figure(1)\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.legend(loc=0)\npl.title('Source and target distributions')\n\npl.figure(2)\npl.imshow(M, interpolation='nearest')\npl.title('Cost matrix M')"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Compute EMD\n-----------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% EMD\n\nG0 = ot.emd(a, b, M)\n\npl.figure(3)\npl.imshow(G0, interpolation='nearest')\npl.title('OT matrix G0')\n\npl.figure(4)\not.plot.plot2D_samples_mat(xs, xt, G0, c=[.5, .5, 1])\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.legend(loc=0)\npl.title('OT matrix with samples')"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Compute Sinkhorn\n----------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% sinkhorn\n\n# reg term\nlambd = 1e-3\n\nGs = ot.sinkhorn(a, b, M, lambd)\n\npl.figure(5)\npl.imshow(Gs, interpolation='nearest')\npl.title('OT matrix sinkhorn')\n\npl.figure(6)\not.plot.plot2D_samples_mat(xs, xt, Gs, color=[.5, .5, 1])\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.legend(loc=0)\npl.title('OT matrix Sinkhorn with samples')\n\npl.show()"
],
"outputs": [],
"metadata": {
diff --git a/docs/source/auto_examples/plot_OT_2D_samples.py b/docs/source/auto_examples/plot_OT_2D_samples.py
index edfb781..9818ec5 100644
--- a/docs/source/auto_examples/plot_OT_2D_samples.py
+++ b/docs/source/auto_examples/plot_OT_2D_samples.py
@@ -4,75 +4,98 @@
2D Optimal transport between empirical distributions
====================================================
-@author: rflamary
+Illustration of 2D optimal transport between distributions that are weighted
+sums of Diracs. The OT matrix is plotted with the samples.
+
"""
+# Author: Remi Flamary <remi.flamary@unice.fr>
+#
+# License: MIT License
+
import numpy as np
import matplotlib.pylab as pl
import ot
+##############################################################################
+# Generate data
+# -------------
+
#%% parameters and data generation
-n=50 # nb samples
+n = 50 # nb samples
-mu_s=np.array([0,0])
-cov_s=np.array([[1,0],[0,1]])
+mu_s = np.array([0, 0])
+cov_s = np.array([[1, 0], [0, 1]])
-mu_t=np.array([4,4])
-cov_t=np.array([[1,-.8],[-.8,1]])
+mu_t = np.array([4, 4])
+cov_t = np.array([[1, -.8], [-.8, 1]])
-xs=ot.datasets.get_2D_samples_gauss(n,mu_s,cov_s)
-xt=ot.datasets.get_2D_samples_gauss(n,mu_t,cov_t)
+xs = ot.datasets.get_2D_samples_gauss(n, mu_s, cov_s)
+xt = ot.datasets.get_2D_samples_gauss(n, mu_t, cov_t)
-a,b = ot.unif(n),ot.unif(n) # uniform distribution on samples
+a, b = np.ones((n,)) / n, np.ones((n,)) / n # uniform distribution on samples
# loss matrix
-M=ot.dist(xs,xt)
-M/=M.max()
+M = ot.dist(xs, xt)
+M /= M.max()
+
+##############################################################################
+# Plot data
+# ---------
#%% plot samples
pl.figure(1)
-pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
-pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
pl.legend(loc=0)
-pl.title('Source and traget distributions')
+pl.title('Source and target distributions')
pl.figure(2)
-pl.imshow(M,interpolation='nearest')
+pl.imshow(M, interpolation='nearest')
pl.title('Cost matrix M')
+##############################################################################
+# Compute EMD
+# -----------
#%% EMD
-G0=ot.emd(a,b,M)
+G0 = ot.emd(a, b, M)
pl.figure(3)
-pl.imshow(G0,interpolation='nearest')
+pl.imshow(G0, interpolation='nearest')
pl.title('OT matrix G0')
pl.figure(4)
-ot.plot.plot2D_samples_mat(xs,xt,G0,c=[.5,.5,1])
-pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
-pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
+ot.plot.plot2D_samples_mat(xs, xt, G0, c=[.5, .5, 1])
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
pl.legend(loc=0)
pl.title('OT matrix with samples')
+##############################################################################
+# Compute Sinkhorn
+# ----------------
+
#%% sinkhorn
# reg term
-lambd=5e-4
+lambd = 1e-3
-Gs=ot.sinkhorn(a,b,M,lambd)
+Gs = ot.sinkhorn(a, b, M, lambd)
pl.figure(5)
-pl.imshow(Gs,interpolation='nearest')
+pl.imshow(Gs, interpolation='nearest')
pl.title('OT matrix sinkhorn')
pl.figure(6)
-ot.plot.plot2D_samples_mat(xs,xt,Gs,color=[.5,.5,1])
-pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
-pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
+ot.plot.plot2D_samples_mat(xs, xt, Gs, color=[.5, .5, 1])
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
pl.legend(loc=0)
pl.title('OT matrix Sinkhorn with samples')
+
+pl.show()
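The diff above swaps `ot.unif(n)` for `np.ones((n,)) / n`; the two are the same uniform weight vector. The cost-matrix step can likewise be written with broadcasting alone. A numpy-only sketch of that preprocessing, with random points standing in for the example's Gaussian samples and squared Euclidean assumed as the cost (as in the example's `ot.dist` call):

```python
import numpy as np

rng = np.random.RandomState(0)
n = 50
xs = rng.randn(n, 2)                          # stand-in source samples
xt = rng.randn(n, 2) + np.array([4.0, 4.0])   # stand-in target samples

a, b = np.ones((n,)) / n, np.ones((n,)) / n   # uniform weights on samples

# Squared Euclidean cost via the identity ||x - y||^2 = ||x||^2 + ||y||^2 - 2<x, y>
Msq = (xs ** 2).sum(1)[:, None] + (xt ** 2).sum(1)[None, :] - 2 * xs @ xt.T
Msq = np.maximum(Msq, 0)    # clip tiny negative round-off
M = Msq / Msq.max()         # normalize to [0, 1], as the example does
```

Normalizing by `M.max()` only rescales the problem; the optimal transport plan is unchanged, which is why the examples do it freely before calling the solvers.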
diff --git a/docs/source/auto_examples/plot_OT_2D_samples.rst b/docs/source/auto_examples/plot_OT_2D_samples.rst
index e05e591..0ad9cf0 100644
--- a/docs/source/auto_examples/plot_OT_2D_samples.rst
+++ b/docs/source/auto_examples/plot_OT_2D_samples.rst
@@ -7,131 +7,191 @@
2D Optimal transport between empirical distributions
====================================================
-@author: rflamary
+Illustration of 2D optimal transport between distributions that are weighted
+sums of Diracs. The OT matrix is plotted with the samples.
-.. rst-class:: sphx-glr-horizontal
+.. code-block:: python
- *
+ # Author: Remi Flamary <remi.flamary@unice.fr>
+ #
+ # License: MIT License
- .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_001.png
- :scale: 47
+ import numpy as np
+ import matplotlib.pylab as pl
+ import ot
- *
- .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_002.png
- :scale: 47
- *
- .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_003.png
- :scale: 47
- *
- .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_004.png
- :scale: 47
- *
+Generate data
+-------------
- .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_005.png
- :scale: 47
- *
- .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_006.png
- :scale: 47
+.. code-block:: python
-.. rst-class:: sphx-glr-script-out
+ #%% parameters and data generation
- Out::
+ n = 50 # nb samples
- ('Warning: numerical errors at iteration', 0)
+ mu_s = np.array([0, 0])
+ cov_s = np.array([[1, 0], [0, 1]])
+ mu_t = np.array([4, 4])
+ cov_t = np.array([[1, -.8], [-.8, 1]])
+ xs = ot.datasets.get_2D_samples_gauss(n, mu_s, cov_s)
+ xt = ot.datasets.get_2D_samples_gauss(n, mu_t, cov_t)
+ a, b = np.ones((n,)) / n, np.ones((n,)) / n # uniform distribution on samples
-|
+ # loss matrix
+ M = ot.dist(xs, xt)
+ M /= M.max()
-.. code-block:: python
- import numpy as np
- import matplotlib.pylab as pl
- import ot
- #%% parameters and data generation
- n=50 # nb samples
- mu_s=np.array([0,0])
- cov_s=np.array([[1,0],[0,1]])
+Plot data
+---------
- mu_t=np.array([4,4])
- cov_t=np.array([[1,-.8],[-.8,1]])
- xs=ot.datasets.get_2D_samples_gauss(n,mu_s,cov_s)
- xt=ot.datasets.get_2D_samples_gauss(n,mu_t,cov_t)
- a,b = ot.unif(n),ot.unif(n) # uniform distribution on samples
+.. code-block:: python
- # loss matrix
- M=ot.dist(xs,xt)
- M/=M.max()
#%% plot samples
pl.figure(1)
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
pl.legend(loc=0)
- pl.title('Source and traget distributions')
+ pl.title('Source and target distributions')
pl.figure(2)
- pl.imshow(M,interpolation='nearest')
+ pl.imshow(M, interpolation='nearest')
pl.title('Cost matrix M')
+
+
+.. rst-class:: sphx-glr-horizontal
+
+
+ *
+
+ .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_001.png
+ :scale: 47
+
+ *
+
+ .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_002.png
+ :scale: 47
+
+
+
+
+Compute EMD
+-----------
+
+
+
+.. code-block:: python
+
+
#%% EMD
- G0=ot.emd(a,b,M)
+ G0 = ot.emd(a, b, M)
pl.figure(3)
- pl.imshow(G0,interpolation='nearest')
+ pl.imshow(G0, interpolation='nearest')
pl.title('OT matrix G0')
pl.figure(4)
- ot.plot.plot2D_samples_mat(xs,xt,G0,c=[.5,.5,1])
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
+ ot.plot.plot2D_samples_mat(xs, xt, G0, c=[.5, .5, 1])
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
pl.legend(loc=0)
pl.title('OT matrix with samples')
+
+
+
+.. rst-class:: sphx-glr-horizontal
+
+
+ *
+
+ .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_005.png
+ :scale: 47
+
+ *
+
+ .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_006.png
+ :scale: 47
+
+
+
+
+Compute Sinkhorn
+----------------
+
+
+
+.. code-block:: python
+
+
#%% sinkhorn
# reg term
- lambd=5e-4
+ lambd = 1e-3
- Gs=ot.sinkhorn(a,b,M,lambd)
+ Gs = ot.sinkhorn(a, b, M, lambd)
pl.figure(5)
- pl.imshow(Gs,interpolation='nearest')
+ pl.imshow(Gs, interpolation='nearest')
pl.title('OT matrix sinkhorn')
pl.figure(6)
- ot.plot.plot2D_samples_mat(xs,xt,Gs,color=[.5,.5,1])
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
+ ot.plot.plot2D_samples_mat(xs, xt, Gs, color=[.5, .5, 1])
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
pl.legend(loc=0)
pl.title('OT matrix Sinkhorn with samples')
-**Total running time of the script:** ( 0 minutes 0.623 seconds)
+ pl.show()
+
+
+
+.. rst-class:: sphx-glr-horizontal
+
+
+ *
+
+ .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_009.png
+ :scale: 47
+
+ *
+
+ .. image:: /auto_examples/images/sphx_glr_plot_OT_2D_samples_010.png
+ :scale: 47
+
+
+
+
+**Total running time of the script:** ( 0 minutes 1.743 seconds)
@@ -150,4 +210,4 @@
.. rst-class:: sphx-glr-signature
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_OT_L1_vs_L2.ipynb b/docs/source/auto_examples/plot_OT_L1_vs_L2.ipynb
index 46283ac..2b9a364 100644
--- a/docs/source/auto_examples/plot_OT_L1_vs_L2.ipynb
+++ b/docs/source/auto_examples/plot_OT_L1_vs_L2.ipynb
@@ -15,7 +15,7 @@
},
{
"source": [
- "\n# 2D Optimal transport for different metrics\n\n\nStole the figure idea from Fig. 1 and 2 in \nhttps://arxiv.org/pdf/1706.07650.pdf\n\n\n@author: rflamary\n\n"
+ "\n# 2D Optimal transport for different metrics\n\n\n2D OT on empirical distributions with different ground metrics.\n\nStole the figure idea from Fig. 1 and 2 in\nhttps://arxiv.org/pdf/1706.07650.pdf\n\n\n\n"
],
"cell_type": "markdown",
"metadata": {}
@@ -24,7 +24,79 @@
"execution_count": null,
"cell_type": "code",
"source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\n\n#%% parameters and data generation\n\nfor data in range(2):\n\n    if data:\n        n=20 # nb samples\n        xs=np.zeros((n,2))\n        xs[:,0]=np.arange(n)+1\n        xs[:,1]=(np.arange(n)+1)*-0.001 # to make it strictly convex...\n        \n        xt=np.zeros((n,2))\n        xt[:,1]=np.arange(n)+1\n    else:\n        \n        n=50 # nb samples\n        xtot=np.zeros((n+1,2))\n        xtot[:,0]=np.cos((np.arange(n+1)+1.0)*0.9/(n+2)*2*np.pi)\n        xtot[:,1]=np.sin((np.arange(n+1)+1.0)*0.9/(n+2)*2*np.pi)\n        \n        xs=xtot[:n,:]\n        xt=xtot[1:,:]\n        \n        \n        \n    a,b = ot.unif(n),ot.unif(n) # uniform distribution on samples\n    \n    # loss matrix\n    M1=ot.dist(xs,xt,metric='euclidean')\n    M1/=M1.max()\n    \n    # loss matrix\n    M2=ot.dist(xs,xt,metric='sqeuclidean')\n    M2/=M2.max()\n    \n    # loss matrix\n    Mp=np.sqrt(ot.dist(xs,xt,metric='euclidean'))\n    Mp/=Mp.max()\n    \n    #%% plot samples\n    \n    pl.figure(1+3*data)\n    pl.clf()\n    pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')\n    pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')\n    pl.axis('equal')\n    pl.title('Source and traget distributions')\n    \n    pl.figure(2+3*data,(15,5))\n    pl.subplot(1,3,1)\n    pl.imshow(M1,interpolation='nearest')\n    pl.title('Eucidean cost')\n    pl.subplot(1,3,2)\n    pl.imshow(M2,interpolation='nearest')\n    pl.title('Squared Euclidean cost')\n    \n    pl.subplot(1,3,3)\n    pl.imshow(Mp,interpolation='nearest')\n    pl.title('Sqrt Euclidean cost')\n    #%% EMD\n    \n    G1=ot.emd(a,b,M1)\n    G2=ot.emd(a,b,M2)\n    Gp=ot.emd(a,b,Mp)\n    \n    pl.figure(3+3*data,(15,5))\n    \n    pl.subplot(1,3,1)\n    ot.plot.plot2D_samples_mat(xs,xt,G1,c=[.5,.5,1])\n    pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')\n    pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')\n    pl.axis('equal')\n    #pl.legend(loc=0)\n    pl.title('OT Euclidean')\n    \n    pl.subplot(1,3,2)\n    \n    ot.plot.plot2D_samples_mat(xs,xt,G2,c=[.5,.5,1])\n    pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')\n    pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')\n    pl.axis('equal')\n    #pl.legend(loc=0)\n    pl.title('OT squared Euclidean')\n    \n    pl.subplot(1,3,3)\n    \n    ot.plot.plot2D_samples_mat(xs,xt,Gp,c=[.5,.5,1])\n    pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')\n    pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')\n    pl.axis('equal')\n    #pl.legend(loc=0)\n    pl.title('OT sqrt Euclidean')"
+ "# Author: Remi Flamary <remi.flamary@unice.fr>\n#\n# License: MIT License\n\nimport numpy as np\nimport matplotlib.pylab as pl\nimport ot"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Dataset 1 : uniform sampling\n----------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "n = 20  # nb samples\nxs = np.zeros((n, 2))\nxs[:, 0] = np.arange(n) + 1\nxs[:, 1] = (np.arange(n) + 1) * -0.001  # to make it strictly convex...\n\nxt = np.zeros((n, 2))\nxt[:, 1] = np.arange(n) + 1\n\na, b = ot.unif(n), ot.unif(n)  # uniform distribution on samples\n\n# loss matrix\nM1 = ot.dist(xs, xt, metric='euclidean')\nM1 /= M1.max()\n\n# loss matrix\nM2 = ot.dist(xs, xt, metric='sqeuclidean')\nM2 /= M2.max()\n\n# loss matrix\nMp = np.sqrt(ot.dist(xs, xt, metric='euclidean'))\nMp /= Mp.max()\n\n# Data\npl.figure(1, figsize=(7, 3))\npl.clf()\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.axis('equal')\npl.title('Source and target distributions')\n\n\n# Cost matrices\npl.figure(2, figsize=(7, 3))\n\npl.subplot(1, 3, 1)\npl.imshow(M1, interpolation='nearest')\npl.title('Euclidean cost')\n\npl.subplot(1, 3, 2)\npl.imshow(M2, interpolation='nearest')\npl.title('Squared Euclidean cost')\n\npl.subplot(1, 3, 3)\npl.imshow(Mp, interpolation='nearest')\npl.title('Sqrt Euclidean cost')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Dataset 1 : Plot OT Matrices\n----------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% EMD\nG1 = ot.emd(a, b, M1)\nG2 = ot.emd(a, b, M2)\nGp = ot.emd(a, b, Mp)\n\n# OT matrices\npl.figure(3, figsize=(7, 3))\n\npl.subplot(1, 3, 1)\not.plot.plot2D_samples_mat(xs, xt, G1, c=[.5, .5, 1])\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.axis('equal')\n# pl.legend(loc=0)\npl.title('OT Euclidean')\n\npl.subplot(1, 3, 2)\not.plot.plot2D_samples_mat(xs, xt, G2, c=[.5, .5, 1])\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.axis('equal')\n# pl.legend(loc=0)\npl.title('OT squared Euclidean')\n\npl.subplot(1, 3, 3)\not.plot.plot2D_samples_mat(xs, xt, Gp, c=[.5, .5, 1])\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.axis('equal')\n# pl.legend(loc=0)\npl.title('OT sqrt Euclidean')\npl.tight_layout()\n\npl.show()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Dataset 2 : Partial circle\n--------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "n = 50  # nb samples\nxtot = np.zeros((n + 1, 2))\nxtot[:, 0] = np.cos(\n    (np.arange(n + 1) + 1.0) * 0.9 / (n + 2) * 2 * np.pi)\nxtot[:, 1] = np.sin(\n    (np.arange(n + 1) + 1.0) * 0.9 / (n + 2) * 2 * np.pi)\n\nxs = xtot[:n, :]\nxt = xtot[1:, :]\n\na, b = ot.unif(n), ot.unif(n)  # uniform distribution on samples\n\n# loss matrix\nM1 = ot.dist(xs, xt, metric='euclidean')\nM1 /= M1.max()\n\n# loss matrix\nM2 = ot.dist(xs, xt, metric='sqeuclidean')\nM2 /= M2.max()\n\n# loss matrix\nMp = np.sqrt(ot.dist(xs, xt, metric='euclidean'))\nMp /= Mp.max()\n\n\n# Data\npl.figure(4, figsize=(7, 3))\npl.clf()\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.axis('equal')\npl.title('Source and target distributions')\n\n\n# Cost matrices\npl.figure(5, figsize=(7, 3))\n\npl.subplot(1, 3, 1)\npl.imshow(M1, interpolation='nearest')\npl.title('Euclidean cost')\n\npl.subplot(1, 3, 2)\npl.imshow(M2, interpolation='nearest')\npl.title('Squared Euclidean cost')\n\npl.subplot(1, 3, 3)\npl.imshow(Mp, interpolation='nearest')\npl.title('Sqrt Euclidean cost')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Dataset 2 : Plot OT Matrices\n-----------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% EMD\nG1 = ot.emd(a, b, M1)\nG2 = ot.emd(a, b, M2)\nGp = ot.emd(a, b, Mp)\n\n# OT matrices\npl.figure(6, figsize=(7, 3))\n\npl.subplot(1, 3, 1)\not.plot.plot2D_samples_mat(xs, xt, G1, c=[.5, .5, 1])\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.axis('equal')\n# pl.legend(loc=0)\npl.title('OT Euclidean')\n\npl.subplot(1, 3, 2)\not.plot.plot2D_samples_mat(xs, xt, G2, c=[.5, .5, 1])\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.axis('equal')\n# pl.legend(loc=0)\npl.title('OT squared Euclidean')\n\npl.subplot(1, 3, 3)\not.plot.plot2D_samples_mat(xs, xt, Gp, c=[.5, .5, 1])\npl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')\npl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')\npl.axis('equal')\n# pl.legend(loc=0)\npl.title('OT sqrt Euclidean')\npl.tight_layout()\n\npl.show()"
],
"outputs": [],
"metadata": {
diff --git a/docs/source/auto_examples/plot_OT_L1_vs_L2.py b/docs/source/auto_examples/plot_OT_L1_vs_L2.py
index 9bb92fe..090e809 100644
--- a/docs/source/auto_examples/plot_OT_L1_vs_L2.py
+++ b/docs/source/auto_examples/plot_OT_L1_vs_L2.py
@@ -4,105 +4,204 @@
2D Optimal transport for different metrics
==========================================
-Stole the figure idea from Fig. 1 and 2 in
+2D OT on empirical distributions with different ground metrics.
+
+Stole the figure idea from Fig. 1 and 2 in
https://arxiv.org/pdf/1706.07650.pdf
-@author: rflamary
"""
+# Author: Remi Flamary <remi.flamary@unice.fr>
+#
+# License: MIT License
+
import numpy as np
import matplotlib.pylab as pl
import ot
-#%% parameters and data generation
-
-for data in range(2):
-
- if data:
- n=20 # nb samples
- xs=np.zeros((n,2))
- xs[:,0]=np.arange(n)+1
- xs[:,1]=(np.arange(n)+1)*-0.001 # to make it strictly convex...
-
- xt=np.zeros((n,2))
- xt[:,1]=np.arange(n)+1
- else:
-
- n=50 # nb samples
- xtot=np.zeros((n+1,2))
- xtot[:,0]=np.cos((np.arange(n+1)+1.0)*0.9/(n+2)*2*np.pi)
- xtot[:,1]=np.sin((np.arange(n+1)+1.0)*0.9/(n+2)*2*np.pi)
-
- xs=xtot[:n,:]
- xt=xtot[1:,:]
-
-
-
- a,b = ot.unif(n),ot.unif(n) # uniform distribution on samples
-
- # loss matrix
- M1=ot.dist(xs,xt,metric='euclidean')
- M1/=M1.max()
-
- # loss matrix
- M2=ot.dist(xs,xt,metric='sqeuclidean')
- M2/=M2.max()
-
- # loss matrix
- Mp=np.sqrt(ot.dist(xs,xt,metric='euclidean'))
- Mp/=Mp.max()
-
- #%% plot samples
-
- pl.figure(1+3*data)
- pl.clf()
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- pl.axis('equal')
- pl.title('Source and traget distributions')
-
- pl.figure(2+3*data,(15,5))
- pl.subplot(1,3,1)
- pl.imshow(M1,interpolation='nearest')
- pl.title('Eucidean cost')
- pl.subplot(1,3,2)
- pl.imshow(M2,interpolation='nearest')
- pl.title('Squared Euclidean cost')
-
- pl.subplot(1,3,3)
- pl.imshow(Mp,interpolation='nearest')
- pl.title('Sqrt Euclidean cost')
- #%% EMD
-
- G1=ot.emd(a,b,M1)
- G2=ot.emd(a,b,M2)
- Gp=ot.emd(a,b,Mp)
-
- pl.figure(3+3*data,(15,5))
-
- pl.subplot(1,3,1)
- ot.plot.plot2D_samples_mat(xs,xt,G1,c=[.5,.5,1])
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- pl.axis('equal')
- #pl.legend(loc=0)
- pl.title('OT Euclidean')
-
- pl.subplot(1,3,2)
-
- ot.plot.plot2D_samples_mat(xs,xt,G2,c=[.5,.5,1])
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- pl.axis('equal')
- #pl.legend(loc=0)
- pl.title('OT squared Euclidean')
-
- pl.subplot(1,3,3)
-
- ot.plot.plot2D_samples_mat(xs,xt,Gp,c=[.5,.5,1])
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- pl.axis('equal')
- #pl.legend(loc=0)
- pl.title('OT sqrt Euclidean')
+##############################################################################
+# Dataset 1 : uniform sampling
+# ----------------------------
+
+n = 20 # nb samples
+xs = np.zeros((n, 2))
+xs[:, 0] = np.arange(n) + 1
+xs[:, 1] = (np.arange(n) + 1) * -0.001 # to make it strictly convex...
+
+xt = np.zeros((n, 2))
+xt[:, 1] = np.arange(n) + 1
+
+a, b = ot.unif(n), ot.unif(n) # uniform distribution on samples
+
+# loss matrix
+M1 = ot.dist(xs, xt, metric='euclidean')
+M1 /= M1.max()
+
+# loss matrix
+M2 = ot.dist(xs, xt, metric='sqeuclidean')
+M2 /= M2.max()
+
+# loss matrix
+Mp = np.sqrt(ot.dist(xs, xt, metric='euclidean'))
+Mp /= Mp.max()
+
+# Data
+pl.figure(1, figsize=(7, 3))
+pl.clf()
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+pl.axis('equal')
+pl.title('Source and target distributions')
+
+
+# Cost matrices
+pl.figure(2, figsize=(7, 3))
+
+pl.subplot(1, 3, 1)
+pl.imshow(M1, interpolation='nearest')
+pl.title('Euclidean cost')
+
+pl.subplot(1, 3, 2)
+pl.imshow(M2, interpolation='nearest')
+pl.title('Squared Euclidean cost')
+
+pl.subplot(1, 3, 3)
+pl.imshow(Mp, interpolation='nearest')
+pl.title('Sqrt Euclidean cost')
+pl.tight_layout()
+
+##############################################################################
+# Dataset 1 : Plot OT Matrices
+# ----------------------------
+
+
+#%% EMD
+G1 = ot.emd(a, b, M1)
+G2 = ot.emd(a, b, M2)
+Gp = ot.emd(a, b, Mp)
+
+# OT matrices
+pl.figure(3, figsize=(7, 3))
+
+pl.subplot(1, 3, 1)
+ot.plot.plot2D_samples_mat(xs, xt, G1, c=[.5, .5, 1])
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+pl.axis('equal')
+# pl.legend(loc=0)
+pl.title('OT Euclidean')
+
+pl.subplot(1, 3, 2)
+ot.plot.plot2D_samples_mat(xs, xt, G2, c=[.5, .5, 1])
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+pl.axis('equal')
+# pl.legend(loc=0)
+pl.title('OT squared Euclidean')
+
+pl.subplot(1, 3, 3)
+ot.plot.plot2D_samples_mat(xs, xt, Gp, c=[.5, .5, 1])
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+pl.axis('equal')
+# pl.legend(loc=0)
+pl.title('OT sqrt Euclidean')
+pl.tight_layout()
+
+pl.show()
+
+
+##############################################################################
+# Dataset 2 : Partial circle
+# --------------------------
+
+n = 50 # nb samples
+xtot = np.zeros((n + 1, 2))
+xtot[:, 0] = np.cos(
+ (np.arange(n + 1) + 1.0) * 0.9 / (n + 2) * 2 * np.pi)
+xtot[:, 1] = np.sin(
+ (np.arange(n + 1) + 1.0) * 0.9 / (n + 2) * 2 * np.pi)
+
+xs = xtot[:n, :]
+xt = xtot[1:, :]
+
+a, b = ot.unif(n), ot.unif(n) # uniform distribution on samples
+
+# loss matrix
+M1 = ot.dist(xs, xt, metric='euclidean')
+M1 /= M1.max()
+
+# loss matrix
+M2 = ot.dist(xs, xt, metric='sqeuclidean')
+M2 /= M2.max()
+
+# loss matrix
+Mp = np.sqrt(ot.dist(xs, xt, metric='euclidean'))
+Mp /= Mp.max()
+
+
+# Data
+pl.figure(4, figsize=(7, 3))
+pl.clf()
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+pl.axis('equal')
+pl.title('Source and target distributions')
+
+
+# Cost matrices
+pl.figure(5, figsize=(7, 3))
+
+pl.subplot(1, 3, 1)
+pl.imshow(M1, interpolation='nearest')
+pl.title('Euclidean cost')
+
+pl.subplot(1, 3, 2)
+pl.imshow(M2, interpolation='nearest')
+pl.title('Squared Euclidean cost')
+
+pl.subplot(1, 3, 3)
+pl.imshow(Mp, interpolation='nearest')
+pl.title('Sqrt Euclidean cost')
+pl.tight_layout()
+
+##############################################################################
+# Dataset 2 : Plot OT Matrices
+# -----------------------------
+
+
+#%% EMD
+G1 = ot.emd(a, b, M1)
+G2 = ot.emd(a, b, M2)
+Gp = ot.emd(a, b, Mp)
+
+# OT matrices
+pl.figure(6, figsize=(7, 3))
+
+pl.subplot(1, 3, 1)
+ot.plot.plot2D_samples_mat(xs, xt, G1, c=[.5, .5, 1])
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+pl.axis('equal')
+# pl.legend(loc=0)
+pl.title('OT Euclidean')
+
+pl.subplot(1, 3, 2)
+ot.plot.plot2D_samples_mat(xs, xt, G2, c=[.5, .5, 1])
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+pl.axis('equal')
+# pl.legend(loc=0)
+pl.title('OT squared Euclidean')
+
+pl.subplot(1, 3, 3)
+ot.plot.plot2D_samples_mat(xs, xt, Gp, c=[.5, .5, 1])
+pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+pl.axis('equal')
+# pl.legend(loc=0)
+pl.title('OT sqrt Euclidean')
+pl.tight_layout()
+
+pl.show()
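The point of the script above is that the optimal coupling changes with the ground metric. This can be checked without POT or the plotting machinery: with uniform weights on both sides, the EMD optimum is attained at a permutation matrix (a vertex of the Birkhoff polytope), so for a handful of points the optimal plan can be found by brute force. A NumPy-only sketch on a shrunken version of dataset 1 (the helper `emd_perm` is illustrative, not part of POT):

```python
import itertools

import numpy as np


def emd_perm(M):
    """Brute-force EMD for uniform weights: search all permutations."""
    n = M.shape[0]
    best, best_cost = None, np.inf
    for perm in itertools.permutations(range(n)):
        cost = M[np.arange(n), perm].sum() / n
        if cost < best_cost:
            best, best_cost = np.array(perm), cost
    return best, best_cost


n = 5
xs = np.stack((np.arange(n) + 1.0, -(np.arange(n) + 1.0) * 1e-3), axis=1)
xt = np.stack((np.zeros(n), np.arange(n) + 1.0), axis=1)

M1 = np.linalg.norm(xs[:, None, :] - xt[None, :, :], axis=2)  # Euclidean
M2 = M1 ** 2                                                  # squared Euclidean

perm1, cost1 = emd_perm(M1 / M1.max())
perm2, cost2 = emd_perm(M2 / M2.max())
print(perm1, perm2)  # optimal assignments for the two ground metrics
```

For larger `n` this factorial search is of course hopeless, which is exactly what `ot.emd` (a network-simplex solver) is for.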
diff --git a/docs/source/auto_examples/plot_OT_L1_vs_L2.rst b/docs/source/auto_examples/plot_OT_L1_vs_L2.rst
index 4e94bef..f97b373 100644
--- a/docs/source/auto_examples/plot_OT_L1_vs_L2.rst
+++ b/docs/source/auto_examples/plot_OT_L1_vs_L2.rst
@@ -7,11 +7,86 @@
2D Optimal transport for different metrics
==========================================
-Stole the figure idea from Fig. 1 and 2 in
+2D OT on empirical distributions with different ground metrics.
+
+Stole the figure idea from Fig. 1 and 2 in
https://arxiv.org/pdf/1706.07650.pdf
-@author: rflamary
+
+
+
+.. code-block:: python
+
+
+ # Author: Remi Flamary <remi.flamary@unice.fr>
+ #
+ # License: MIT License
+
+ import numpy as np
+ import matplotlib.pylab as pl
+ import ot
+
+
+
+
+
+
+
+Dataset 1 : uniform sampling
+----------------------------
+
+
+
+.. code-block:: python
+
+
+ n = 20 # nb samples
+ xs = np.zeros((n, 2))
+ xs[:, 0] = np.arange(n) + 1
+ xs[:, 1] = (np.arange(n) + 1) * -0.001 # to make it strictly convex...
+
+ xt = np.zeros((n, 2))
+ xt[:, 1] = np.arange(n) + 1
+
+ a, b = ot.unif(n), ot.unif(n) # uniform distribution on samples
+
+ # loss matrix
+ M1 = ot.dist(xs, xt, metric='euclidean')
+ M1 /= M1.max()
+
+ # loss matrix
+ M2 = ot.dist(xs, xt, metric='sqeuclidean')
+ M2 /= M2.max()
+
+ # loss matrix
+ Mp = np.sqrt(ot.dist(xs, xt, metric='euclidean'))
+ Mp /= Mp.max()
+
+ # Data
+ pl.figure(1, figsize=(7, 3))
+ pl.clf()
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+ pl.axis('equal')
+    pl.title('Source and target distributions')
+
+
+ # Cost matrices
+ pl.figure(2, figsize=(7, 3))
+
+ pl.subplot(1, 3, 1)
+ pl.imshow(M1, interpolation='nearest')
+ pl.title('Euclidean cost')
+
+ pl.subplot(1, 3, 2)
+ pl.imshow(M2, interpolation='nearest')
+ pl.title('Squared Euclidean cost')
+
+ pl.subplot(1, 3, 3)
+ pl.imshow(Mp, interpolation='nearest')
+ pl.title('Sqrt Euclidean cost')
+ pl.tight_layout()
@@ -29,130 +104,193 @@ https://arxiv.org/pdf/1706.07650.pdf
.. image:: /auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_002.png
:scale: 47
- *
- .. image:: /auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_003.png
- :scale: 47
- *
- .. image:: /auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_004.png
- :scale: 47
+Dataset 1 : Plot OT Matrices
+----------------------------
+
+
+
+.. code-block:: python
+
+
+
+ #%% EMD
+ G1 = ot.emd(a, b, M1)
+ G2 = ot.emd(a, b, M2)
+ Gp = ot.emd(a, b, Mp)
+
+ # OT matrices
+ pl.figure(3, figsize=(7, 3))
+
+ pl.subplot(1, 3, 1)
+ ot.plot.plot2D_samples_mat(xs, xt, G1, c=[.5, .5, 1])
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+ pl.axis('equal')
+ # pl.legend(loc=0)
+ pl.title('OT Euclidean')
+
+ pl.subplot(1, 3, 2)
+ ot.plot.plot2D_samples_mat(xs, xt, G2, c=[.5, .5, 1])
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+ pl.axis('equal')
+ # pl.legend(loc=0)
+ pl.title('OT squared Euclidean')
+
+ pl.subplot(1, 3, 3)
+ ot.plot.plot2D_samples_mat(xs, xt, Gp, c=[.5, .5, 1])
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+ pl.axis('equal')
+ # pl.legend(loc=0)
+ pl.title('OT sqrt Euclidean')
+ pl.tight_layout()
+
+ pl.show()
+
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_005.png
+ :align: center
+
+
+
+
+Dataset 2 : Partial circle
+--------------------------
+
+
+
+.. code-block:: python
+
+
+ n = 50 # nb samples
+ xtot = np.zeros((n + 1, 2))
+ xtot[:, 0] = np.cos(
+ (np.arange(n + 1) + 1.0) * 0.9 / (n + 2) * 2 * np.pi)
+ xtot[:, 1] = np.sin(
+ (np.arange(n + 1) + 1.0) * 0.9 / (n + 2) * 2 * np.pi)
+
+ xs = xtot[:n, :]
+ xt = xtot[1:, :]
+
+ a, b = ot.unif(n), ot.unif(n) # uniform distribution on samples
+
+ # loss matrix
+ M1 = ot.dist(xs, xt, metric='euclidean')
+ M1 /= M1.max()
+
+ # loss matrix
+ M2 = ot.dist(xs, xt, metric='sqeuclidean')
+ M2 /= M2.max()
+
+ # loss matrix
+ Mp = np.sqrt(ot.dist(xs, xt, metric='euclidean'))
+ Mp /= Mp.max()
+
+
+ # Data
+ pl.figure(4, figsize=(7, 3))
+ pl.clf()
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+ pl.axis('equal')
+    pl.title('Source and target distributions')
+
+
+ # Cost matrices
+ pl.figure(5, figsize=(7, 3))
+
+ pl.subplot(1, 3, 1)
+ pl.imshow(M1, interpolation='nearest')
+ pl.title('Euclidean cost')
+
+ pl.subplot(1, 3, 2)
+ pl.imshow(M2, interpolation='nearest')
+ pl.title('Squared Euclidean cost')
+
+ pl.subplot(1, 3, 3)
+ pl.imshow(Mp, interpolation='nearest')
+ pl.title('Sqrt Euclidean cost')
+ pl.tight_layout()
+
+
+
+
+.. rst-class:: sphx-glr-horizontal
+
*
- .. image:: /auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_005.png
+ .. image:: /auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_007.png
:scale: 47
*
- .. image:: /auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_006.png
+ .. image:: /auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_008.png
:scale: 47
+Dataset 2 : Plot OT Matrices
+-----------------------------
+
+
.. code-block:: python
- import numpy as np
- import matplotlib.pylab as pl
- import ot
- #%% parameters and data generation
-
- for data in range(2):
-
- if data:
- n=20 # nb samples
- xs=np.zeros((n,2))
- xs[:,0]=np.arange(n)+1
- xs[:,1]=(np.arange(n)+1)*-0.001 # to make it strictly convex...
-
- xt=np.zeros((n,2))
- xt[:,1]=np.arange(n)+1
- else:
-
- n=50 # nb samples
- xtot=np.zeros((n+1,2))
- xtot[:,0]=np.cos((np.arange(n+1)+1.0)*0.9/(n+2)*2*np.pi)
- xtot[:,1]=np.sin((np.arange(n+1)+1.0)*0.9/(n+2)*2*np.pi)
-
- xs=xtot[:n,:]
- xt=xtot[1:,:]
-
-
-
- a,b = ot.unif(n),ot.unif(n) # uniform distribution on samples
-
- # loss matrix
- M1=ot.dist(xs,xt,metric='euclidean')
- M1/=M1.max()
-
- # loss matrix
- M2=ot.dist(xs,xt,metric='sqeuclidean')
- M2/=M2.max()
-
- # loss matrix
- Mp=np.sqrt(ot.dist(xs,xt,metric='euclidean'))
- Mp/=Mp.max()
-
- #%% plot samples
-
- pl.figure(1+3*data)
- pl.clf()
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- pl.axis('equal')
- pl.title('Source and traget distributions')
-
- pl.figure(2+3*data,(15,5))
- pl.subplot(1,3,1)
- pl.imshow(M1,interpolation='nearest')
- pl.title('Eucidean cost')
- pl.subplot(1,3,2)
- pl.imshow(M2,interpolation='nearest')
- pl.title('Squared Euclidean cost')
-
- pl.subplot(1,3,3)
- pl.imshow(Mp,interpolation='nearest')
- pl.title('Sqrt Euclidean cost')
- #%% EMD
-
- G1=ot.emd(a,b,M1)
- G2=ot.emd(a,b,M2)
- Gp=ot.emd(a,b,Mp)
-
- pl.figure(3+3*data,(15,5))
-
- pl.subplot(1,3,1)
- ot.plot.plot2D_samples_mat(xs,xt,G1,c=[.5,.5,1])
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- pl.axis('equal')
- #pl.legend(loc=0)
- pl.title('OT Euclidean')
-
- pl.subplot(1,3,2)
-
- ot.plot.plot2D_samples_mat(xs,xt,G2,c=[.5,.5,1])
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- pl.axis('equal')
- #pl.legend(loc=0)
- pl.title('OT squared Euclidean')
-
- pl.subplot(1,3,3)
-
- ot.plot.plot2D_samples_mat(xs,xt,Gp,c=[.5,.5,1])
- pl.plot(xs[:,0],xs[:,1],'+b',label='Source samples')
- pl.plot(xt[:,0],xt[:,1],'xr',label='Target samples')
- pl.axis('equal')
- #pl.legend(loc=0)
- pl.title('OT sqrt Euclidean')
-
-**Total running time of the script:** ( 0 minutes 1.417 seconds)
+ #%% EMD
+ G1 = ot.emd(a, b, M1)
+ G2 = ot.emd(a, b, M2)
+ Gp = ot.emd(a, b, Mp)
+
+ # OT matrices
+ pl.figure(6, figsize=(7, 3))
+
+ pl.subplot(1, 3, 1)
+ ot.plot.plot2D_samples_mat(xs, xt, G1, c=[.5, .5, 1])
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+ pl.axis('equal')
+ # pl.legend(loc=0)
+ pl.title('OT Euclidean')
+
+ pl.subplot(1, 3, 2)
+ ot.plot.plot2D_samples_mat(xs, xt, G2, c=[.5, .5, 1])
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+ pl.axis('equal')
+ # pl.legend(loc=0)
+ pl.title('OT squared Euclidean')
+
+ pl.subplot(1, 3, 3)
+ ot.plot.plot2D_samples_mat(xs, xt, Gp, c=[.5, .5, 1])
+ pl.plot(xs[:, 0], xs[:, 1], '+b', label='Source samples')
+ pl.plot(xt[:, 0], xt[:, 1], 'xr', label='Target samples')
+ pl.axis('equal')
+ # pl.legend(loc=0)
+ pl.title('OT sqrt Euclidean')
+ pl.tight_layout()
+
+ pl.show()
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_OT_L1_vs_L2_011.png
+ :align: center
+
+
+
+
+**Total running time of the script:** ( 0 minutes 1.134 seconds)
@@ -171,4 +309,4 @@ https://arxiv.org/pdf/1706.07650.pdf
.. rst-class:: sphx-glr-signature
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
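For readers who want to sanity-check `ot.emd` against a generic solver: the EMD between discrete distributions is a linear program, minimizing `<G, M>` subject to `G @ 1 = a`, `G.T @ 1 = b`, `G >= 0`. A didactic sketch using `scipy.optimize.linprog` (assuming SciPy is available; this is a reference reimplementation, not how POT solves it):

```python
import numpy as np
from scipy.optimize import linprog


def emd_lp(a, b, M):
    """Solve the exact OT linear program with a generic LP solver."""
    n, m = M.shape
    # Row-marginal constraints: sum_j G[i, j] = a[i]
    A_rows = np.kron(np.eye(n), np.ones(m))
    # Column-marginal constraints: sum_i G[i, j] = b[j]
    A_cols = np.kron(np.ones(n), np.eye(m))
    A_eq = np.vstack([A_rows, A_cols])
    b_eq = np.concatenate([a, b])
    res = linprog(M.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return res.x.reshape(n, m)


rng = np.random.RandomState(0)
n = 4
xs, xt = rng.randn(n, 2), rng.randn(n, 2)
a = b = np.full(n, 1.0 / n)
M = np.linalg.norm(xs[:, None, :] - xt[None, :, :], axis=2)
G = emd_lp(a, b, M)
```

The LP has `n * m` variables, so this only scales to small problems; it is useful mainly as a correctness oracle.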
diff --git a/docs/source/auto_examples/plot_OT_conv.ipynb b/docs/source/auto_examples/plot_OT_conv.ipynb
deleted file mode 100644
index 7fc4af0..0000000
--- a/docs/source/auto_examples/plot_OT_conv.ipynb
+++ /dev/null
@@ -1,54 +0,0 @@
-{
- "nbformat_minor": 0,
- "nbformat": 4,
- "cells": [
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "%matplotlib inline"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- },
- {
- "source": [
- "\n# 1D Wasserstein barycenter demo\n\n\n\n@author: rflamary\n\n"
- ],
- "cell_type": "markdown",
- "metadata": {}
- },
- {
- "execution_count": null,
- "cell_type": "code",
- "source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\nfrom mpl_toolkits.mplot3d import Axes3D #necessary for 3d plot even if not used\nimport scipy as sp\nimport scipy.signal as sps\n#%% parameters\n\nn=10 # nb bins\n\n# bin positions\nx=np.arange(n,dtype=np.float64)\n\nxx,yy=np.meshgrid(x,x)\n\n\nxpos=np.hstack((xx.reshape(-1,1),yy.reshape(-1,1)))\n\nM=ot.dist(xpos)\n\n\nI0=((xx-5)**2+(yy-5)**2<3**2)*1.0\nI1=((xx-7)**2+(yy-7)**2<3**2)*1.0\n\nI0/=I0.sum()\nI1/=I1.sum()\n\ni0=I0.ravel()\ni1=I1.ravel()\n\nM=M[i0>0,:][:,i1>0].copy()\ni0=i0[i0>0]\ni1=i1[i1>0]\nItot=np.concatenate((I0[:,:,np.newaxis],I1[:,:,np.newaxis]),2)\n\n\n#%% plot the distributions\n\npl.figure(1)\npl.subplot(2,2,1)\npl.imshow(I0)\npl.subplot(2,2,2)\npl.imshow(I1)\n\n\n#%% barycenter computation\n\nalpha=0.5 # 0<=alpha<=1\nweights=np.array([1-alpha,alpha])\n\n\ndef conv2(I,k):\n return sp.ndimage.convolve1d(sp.ndimage.convolve1d(I,k,axis=1),k,axis=0)\n\ndef conv2n(I,k):\n res=np.zeros_like(I)\n for i in range(I.shape[2]):\n res[:,:,i]=conv2(I[:,:,i],k)\n return res\n\n\ndef get_1Dkernel(reg,thr=1e-16,wmax=1024):\n w=max(min(wmax,2*int((-np.log(thr)*reg)**(.5))),3)\n x=np.arange(w,dtype=np.float64)\n return np.exp(-((x-w/2)**2)/reg)\n \nthr=1e-16\nreg=1e0\n\nk=get_1Dkernel(reg)\npl.figure(2)\npl.plot(k)\n\nI05=conv2(I0,k)\n\npl.figure(1)\npl.subplot(2,2,1)\npl.imshow(I0)\npl.subplot(2,2,2)\npl.imshow(I05)\n\n#%%\n\nG=ot.emd(i0,i1,M)\nr0=np.sum(M*G)\n\nreg=1e-1\nGs=ot.bregman.sinkhorn_knopp(i0,i1,M,reg=reg)\nrs=np.sum(M*Gs)\n\n#%%\n\ndef mylog(u):\n tmp=np.log(u)\n tmp[np.isnan(tmp)]=0\n return tmp\n\ndef sinkhorn_conv(a,b, reg, numItermax = 1000, stopThr=1e-9, verbose=False, log=False,**kwargs):\n\n\n a=np.asarray(a,dtype=np.float64)\n b=np.asarray(b,dtype=np.float64)\n \n \n if len(b.shape)>2:\n nbb=b.shape[2]\n a=a[:,:,np.newaxis]\n else:\n nbb=0\n \n\n if log:\n log={'err':[]}\n\n # we assume that no distances are null except those of the diagonal of distances\n if nbb:\n u = 
np.ones((a.shape[0],a.shape[1],nbb))/(np.prod(a.shape[:2]))\n v = np.ones((a.shape[0],a.shape[1],nbb))/(np.prod(b.shape[:2]))\n a0=1.0/(np.prod(b.shape[:2]))\n else:\n u = np.ones((a.shape[0],a.shape[1]))/(np.prod(a.shape[:2]))\n v = np.ones((a.shape[0],a.shape[1]))/(np.prod(b.shape[:2]))\n a0=1.0/(np.prod(b.shape[:2]))\n \n \n k=get_1Dkernel(reg)\n \n if nbb:\n K=lambda I: conv2n(I,k)\n else:\n K=lambda I: conv2(I,k)\n\n cpt = 0\n err=1\n while (err>stopThr and cpt<numItermax):\n uprev = u\n vprev = v\n \n v = np.divide(b, K(u))\n u = np.divide(a, K(v))\n\n if (np.any(np.isnan(u)) or np.any(np.isnan(v)) \n or np.any(np.isinf(u)) or np.any(np.isinf(v))):\n # we have reached the machine precision\n # come back to previous solution and quit loop\n print('Warning: numerical errors at iteration', cpt)\n u = uprev\n v = vprev\n break\n if cpt%10==0:\n # we can speed up the process by checking for the error only all the 10th iterations\n\n err = np.sum((u-uprev)**2)/np.sum((u)**2)+np.sum((v-vprev)**2)/np.sum((v)**2)\n\n if log:\n log['err'].append(err)\n\n if verbose:\n if cpt%200 ==0:\n print('{:5s}|{:12s}'.format('It.','Err')+'\\n'+'-'*19)\n print('{:5d}|{:8e}|'.format(cpt,err))\n cpt = cpt +1\n if log:\n log['u']=u\n log['v']=v\n \n if nbb: #return only loss \n res=np.zeros((nbb))\n for i in range(nbb):\n res[i]=np.sum(u[:,i].reshape((-1,1))*K*v[:,i].reshape((1,-1))*M)\n if log:\n return res,log\n else:\n return res \n \n else: # return OT matrix\n res=reg*a0*np.sum(a*mylog(u+(u==0))+b*mylog(v+(v==0)))\n if log:\n \n return res,log\n else:\n return res\n\nreg=1e0\nr,log=sinkhorn_conv(I0,I1,reg,verbose=True,log=True)\na=I0\nb=I1\nu=log['u']\nv=log['v']\n#%% barycenter interpolation"
- ],
- "outputs": [],
- "metadata": {
- "collapsed": false
- }
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 2",
- "name": "python2",
- "language": "python"
- },
- "language_info": {
- "mimetype": "text/x-python",
- "nbconvert_exporter": "python",
- "name": "python",
- "file_extension": ".py",
- "version": "2.7.12",
- "pygments_lexer": "ipython2",
- "codemirror_mode": {
- "version": 2,
- "name": "ipython"
- }
- }
- }
-} \ No newline at end of file
diff --git a/docs/source/auto_examples/plot_OT_conv.py b/docs/source/auto_examples/plot_OT_conv.py
deleted file mode 100644
index a86e7a2..0000000
--- a/docs/source/auto_examples/plot_OT_conv.py
+++ /dev/null
@@ -1,200 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-==============================
-1D Wasserstein barycenter demo
-==============================
-
-
-@author: rflamary
-"""
-
-import numpy as np
-import matplotlib.pylab as pl
-import ot
-from mpl_toolkits.mplot3d import Axes3D #necessary for 3d plot even if not used
-import scipy as sp
-import scipy.signal as sps
-#%% parameters
-
-n=10 # nb bins
-
-# bin positions
-x=np.arange(n,dtype=np.float64)
-
-xx,yy=np.meshgrid(x,x)
-
-
-xpos=np.hstack((xx.reshape(-1,1),yy.reshape(-1,1)))
-
-M=ot.dist(xpos)
-
-
-I0=((xx-5)**2+(yy-5)**2<3**2)*1.0
-I1=((xx-7)**2+(yy-7)**2<3**2)*1.0
-
-I0/=I0.sum()
-I1/=I1.sum()
-
-i0=I0.ravel()
-i1=I1.ravel()
-
-M=M[i0>0,:][:,i1>0].copy()
-i0=i0[i0>0]
-i1=i1[i1>0]
-Itot=np.concatenate((I0[:,:,np.newaxis],I1[:,:,np.newaxis]),2)
-
-
-#%% plot the distributions
-
-pl.figure(1)
-pl.subplot(2,2,1)
-pl.imshow(I0)
-pl.subplot(2,2,2)
-pl.imshow(I1)
-
-
-#%% barycenter computation
-
-alpha=0.5 # 0<=alpha<=1
-weights=np.array([1-alpha,alpha])
-
-
-def conv2(I,k):
- return sp.ndimage.convolve1d(sp.ndimage.convolve1d(I,k,axis=1),k,axis=0)
-
-def conv2n(I,k):
- res=np.zeros_like(I)
- for i in range(I.shape[2]):
- res[:,:,i]=conv2(I[:,:,i],k)
- return res
-
-
-def get_1Dkernel(reg,thr=1e-16,wmax=1024):
- w=max(min(wmax,2*int((-np.log(thr)*reg)**(.5))),3)
- x=np.arange(w,dtype=np.float64)
- return np.exp(-((x-w/2)**2)/reg)
-
-thr=1e-16
-reg=1e0
-
-k=get_1Dkernel(reg)
-pl.figure(2)
-pl.plot(k)
-
-I05=conv2(I0,k)
-
-pl.figure(1)
-pl.subplot(2,2,1)
-pl.imshow(I0)
-pl.subplot(2,2,2)
-pl.imshow(I05)
-
-#%%
-
-G=ot.emd(i0,i1,M)
-r0=np.sum(M*G)
-
-reg=1e-1
-Gs=ot.bregman.sinkhorn_knopp(i0,i1,M,reg=reg)
-rs=np.sum(M*Gs)
-
-#%%
-
-def mylog(u):
- tmp=np.log(u)
- tmp[np.isnan(tmp)]=0
- return tmp
-
-def sinkhorn_conv(a,b, reg, numItermax = 1000, stopThr=1e-9, verbose=False, log=False,**kwargs):
-
-
- a=np.asarray(a,dtype=np.float64)
- b=np.asarray(b,dtype=np.float64)
-
-
- if len(b.shape)>2:
- nbb=b.shape[2]
- a=a[:,:,np.newaxis]
- else:
- nbb=0
-
-
- if log:
- log={'err':[]}
-
- # we assume that no distances are null except those of the diagonal of distances
- if nbb:
- u = np.ones((a.shape[0],a.shape[1],nbb))/(np.prod(a.shape[:2]))
- v = np.ones((a.shape[0],a.shape[1],nbb))/(np.prod(b.shape[:2]))
- a0=1.0/(np.prod(b.shape[:2]))
- else:
- u = np.ones((a.shape[0],a.shape[1]))/(np.prod(a.shape[:2]))
- v = np.ones((a.shape[0],a.shape[1]))/(np.prod(b.shape[:2]))
- a0=1.0/(np.prod(b.shape[:2]))
-
-
- k=get_1Dkernel(reg)
-
- if nbb:
- K=lambda I: conv2n(I,k)
- else:
- K=lambda I: conv2(I,k)
-
- cpt = 0
- err=1
- while (err>stopThr and cpt<numItermax):
- uprev = u
- vprev = v
-
- v = np.divide(b, K(u))
- u = np.divide(a, K(v))
-
- if (np.any(np.isnan(u)) or np.any(np.isnan(v))
- or np.any(np.isinf(u)) or np.any(np.isinf(v))):
- # we have reached the machine precision
- # come back to previous solution and quit loop
- print('Warning: numerical errors at iteration', cpt)
- u = uprev
- v = vprev
- break
- if cpt%10==0:
- # we can speed up the process by checking for the error only all the 10th iterations
-
- err = np.sum((u-uprev)**2)/np.sum((u)**2)+np.sum((v-vprev)**2)/np.sum((v)**2)
-
- if log:
- log['err'].append(err)
-
- if verbose:
- if cpt%200 ==0:
- print('{:5s}|{:12s}'.format('It.','Err')+'\n'+'-'*19)
- print('{:5d}|{:8e}|'.format(cpt,err))
- cpt = cpt +1
- if log:
- log['u']=u
- log['v']=v
-
- if nbb: #return only loss
- res=np.zeros((nbb))
- for i in range(nbb):
- res[i]=np.sum(u[:,i].reshape((-1,1))*K*v[:,i].reshape((1,-1))*M)
- if log:
- return res,log
- else:
- return res
-
- else: # return OT matrix
- res=reg*a0*np.sum(a*mylog(u+(u==0))+b*mylog(v+(v==0)))
- if log:
-
- return res,log
- else:
- return res
-
-reg=1e0
-r,log=sinkhorn_conv(I0,I1,reg,verbose=True,log=True)
-a=I0
-b=I1
-u=log['u']
-v=log['v']
-#%% barycenter interpolation
diff --git a/docs/source/auto_examples/plot_OT_conv.rst b/docs/source/auto_examples/plot_OT_conv.rst
deleted file mode 100644
index 039bbdb..0000000
--- a/docs/source/auto_examples/plot_OT_conv.rst
+++ /dev/null
@@ -1,241 +0,0 @@
-
-
-.. _sphx_glr_auto_examples_plot_OT_conv.py:
-
-
-==============================
-1D Wasserstein barycenter demo
-==============================
-
-
-@author: rflamary
-
-
-
-
-.. code-block:: pytb
-
- Traceback (most recent call last):
- File "/home/rflamary/.local/lib/python2.7/site-packages/sphinx_gallery/gen_rst.py", line 518, in execute_code_block
- exec(code_block, example_globals)
- File "<string>", line 86, in <module>
- TypeError: unsupported operand type(s) for *: 'float' and 'Mock'
-
-
-
-
-
-.. code-block:: python
-
-
- import numpy as np
- import matplotlib.pylab as pl
- import ot
- from mpl_toolkits.mplot3d import Axes3D #necessary for 3d plot even if not used
- import scipy as sp
- import scipy.signal as sps
- #%% parameters
-
- n=10 # nb bins
-
- # bin positions
- x=np.arange(n,dtype=np.float64)
-
- xx,yy=np.meshgrid(x,x)
-
-
- xpos=np.hstack((xx.reshape(-1,1),yy.reshape(-1,1)))
-
- M=ot.dist(xpos)
-
-
- I0=((xx-5)**2+(yy-5)**2<3**2)*1.0
- I1=((xx-7)**2+(yy-7)**2<3**2)*1.0
-
- I0/=I0.sum()
- I1/=I1.sum()
-
- i0=I0.ravel()
- i1=I1.ravel()
-
- M=M[i0>0,:][:,i1>0].copy()
- i0=i0[i0>0]
- i1=i1[i1>0]
- Itot=np.concatenate((I0[:,:,np.newaxis],I1[:,:,np.newaxis]),2)
-
-
- #%% plot the distributions
-
- pl.figure(1)
- pl.subplot(2,2,1)
- pl.imshow(I0)
- pl.subplot(2,2,2)
- pl.imshow(I1)
-
-
- #%% barycenter computation
-
- alpha=0.5 # 0<=alpha<=1
- weights=np.array([1-alpha,alpha])
-
-
- def conv2(I,k):
- return sp.ndimage.convolve1d(sp.ndimage.convolve1d(I,k,axis=1),k,axis=0)
-
- def conv2n(I,k):
- res=np.zeros_like(I)
- for i in range(I.shape[2]):
- res[:,:,i]=conv2(I[:,:,i],k)
- return res
-
-
- def get_1Dkernel(reg,thr=1e-16,wmax=1024):
- w=max(min(wmax,2*int((-np.log(thr)*reg)**(.5))),3)
- x=np.arange(w,dtype=np.float64)
- return np.exp(-((x-w/2)**2)/reg)
-
- thr=1e-16
- reg=1e0
-
- k=get_1Dkernel(reg)
- pl.figure(2)
- pl.plot(k)
-
- I05=conv2(I0,k)
-
- pl.figure(1)
- pl.subplot(2,2,1)
- pl.imshow(I0)
- pl.subplot(2,2,2)
- pl.imshow(I05)
-
- #%%
-
- G=ot.emd(i0,i1,M)
- r0=np.sum(M*G)
-
- reg=1e-1
- Gs=ot.bregman.sinkhorn_knopp(i0,i1,M,reg=reg)
- rs=np.sum(M*Gs)
-
- #%%
-
- def mylog(u):
- tmp=np.log(u)
- tmp[np.isnan(tmp)]=0
- return tmp
-
- def sinkhorn_conv(a,b, reg, numItermax = 1000, stopThr=1e-9, verbose=False, log=False,**kwargs):
-
-
- a=np.asarray(a,dtype=np.float64)
- b=np.asarray(b,dtype=np.float64)
-
-
- if len(b.shape)>2:
- nbb=b.shape[2]
- a=a[:,:,np.newaxis]
- else:
- nbb=0
-
-
- if log:
- log={'err':[]}
-
- # we assume that no distances are null except those of the diagonal of distances
- if nbb:
- u = np.ones((a.shape[0],a.shape[1],nbb))/(np.prod(a.shape[:2]))
- v = np.ones((a.shape[0],a.shape[1],nbb))/(np.prod(b.shape[:2]))
- a0=1.0/(np.prod(b.shape[:2]))
- else:
- u = np.ones((a.shape[0],a.shape[1]))/(np.prod(a.shape[:2]))
- v = np.ones((a.shape[0],a.shape[1]))/(np.prod(b.shape[:2]))
- a0=1.0/(np.prod(b.shape[:2]))
-
-
- k=get_1Dkernel(reg)
-
- if nbb:
- K=lambda I: conv2n(I,k)
- else:
- K=lambda I: conv2(I,k)
-
- cpt = 0
- err=1
- while (err>stopThr and cpt<numItermax):
- uprev = u
- vprev = v
-
- v = np.divide(b, K(u))
- u = np.divide(a, K(v))
-
- if (np.any(np.isnan(u)) or np.any(np.isnan(v))
- or np.any(np.isinf(u)) or np.any(np.isinf(v))):
- # we have reached the machine precision
- # come back to previous solution and quit loop
- print('Warning: numerical errors at iteration', cpt)
- u = uprev
- v = vprev
- break
- if cpt%10==0:
- # we can speed up the process by checking for the error only all the 10th iterations
-
- err = np.sum((u-uprev)**2)/np.sum((u)**2)+np.sum((v-vprev)**2)/np.sum((v)**2)
-
- if log:
- log['err'].append(err)
-
- if verbose:
- if cpt%200 ==0:
- print('{:5s}|{:12s}'.format('It.','Err')+'\n'+'-'*19)
- print('{:5d}|{:8e}|'.format(cpt,err))
- cpt = cpt +1
- if log:
- log['u']=u
- log['v']=v
-
- if nbb: #return only loss
- res=np.zeros((nbb))
- for i in range(nbb):
- res[i]=np.sum(u[:,i].reshape((-1,1))*K*v[:,i].reshape((1,-1))*M)
- if log:
- return res,log
- else:
- return res
-
- else: # return OT matrix
- res=reg*a0*np.sum(a*mylog(u+(u==0))+b*mylog(v+(v==0)))
- if log:
-
- return res,log
- else:
- return res
-
- reg=1e0
- r,log=sinkhorn_conv(I0,I1,reg,verbose=True,log=True)
- a=I0
- b=I1
- u=log['u']
- v=log['v']
- #%% barycenter interpolation
-
-**Total running time of the script:** ( 0 minutes 0.000 seconds)
-
-
-
-.. container:: sphx-glr-footer
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Python source code: plot_OT_conv.py <plot_OT_conv.py>`
-
-
-
- .. container:: sphx-glr-download
-
- :download:`Download Jupyter notebook: plot_OT_conv.ipynb <plot_OT_conv.ipynb>`
-
-.. rst-class:: sphx-glr-signature
-
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_WDA.ipynb b/docs/source/auto_examples/plot_WDA.ipynb
index 408a605..1661c53 100644
--- a/docs/source/auto_examples/plot_WDA.ipynb
+++ b/docs/source/auto_examples/plot_WDA.ipynb
@@ -15,7 +15,7 @@
},
{
"source": [
- "\n# Wasserstein Discriminant Analysis\n\n\n@author: rflamary\n\n"
+            "\n# Wasserstein Discriminant Analysis\n\n\nThis example illustrates the use of WDA as proposed in [11].\n\n\n[11] Flamary, R., Cuturi, M., Courty, N., & Rakotomamonjy, A. (2016).\nWasserstein Discriminant Analysis.\n\n\n"
],
"cell_type": "markdown",
"metadata": {}
@@ -24,7 +24,97 @@
"execution_count": null,
"cell_type": "code",
"source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\nfrom ot.datasets import get_1D_gauss as gauss\nfrom ot.dr import wda\n\n\n#%% parameters\n\nn=1000 # nb samples in source and target datasets\nnz=0.2\nxs,ys=ot.datasets.get_data_classif('3gauss',n,nz)\nxt,yt=ot.datasets.get_data_classif('3gauss',n,nz)\n\nnbnoise=8\n\nxs=np.hstack((xs,np.random.randn(n,nbnoise)))\nxt=np.hstack((xt,np.random.randn(n,nbnoise)))\n\n#%% plot samples\n\npl.figure(1)\n\n\npl.scatter(xt[:,0],xt[:,1],c=ys,marker='+',label='Source samples')\npl.legend(loc=0)\npl.title('Discriminant dimensions')\n\n\n#%% plot distributions and loss matrix\np=2\nreg=1\nk=10\nmaxiter=100\n\nP,proj = wda(xs,ys,p,reg,k,maxiter=maxiter)\n\n#%% plot samples\n\nxsp=proj(xs)\nxtp=proj(xt)\n\npl.figure(1,(10,5))\n\npl.subplot(1,2,1)\npl.scatter(xsp[:,0],xsp[:,1],c=ys,marker='+',label='Projected samples')\npl.legend(loc=0)\npl.title('Projected training samples')\n\n\npl.subplot(1,2,2)\npl.scatter(xtp[:,0],xtp[:,1],c=ys,marker='+',label='Projected samples')\npl.legend(loc=0)\npl.title('Projected test samples')"
+ "# Author: Remi Flamary <remi.flamary@unice.fr>\n#\n# License: MIT License\n\nimport numpy as np\nimport matplotlib.pylab as pl\n\nfrom ot.dr import wda, fda"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% parameters\n\nn = 1000 # nb samples in source and target datasets\nnz = 0.2\n\n# generate circle dataset\nt = np.random.rand(n) * 2 * np.pi\nys = np.floor((np.arange(n) * 1.0 / n * 3)) + 1\nxs = np.concatenate(\n (np.cos(t).reshape((-1, 1)), np.sin(t).reshape((-1, 1))), 1)\nxs = xs * ys.reshape(-1, 1) + nz * np.random.randn(n, 2)\n\nt = np.random.rand(n) * 2 * np.pi\nyt = np.floor((np.arange(n) * 1.0 / n * 3)) + 1\nxt = np.concatenate(\n (np.cos(t).reshape((-1, 1)), np.sin(t).reshape((-1, 1))), 1)\nxt = xt * yt.reshape(-1, 1) + nz * np.random.randn(n, 2)\n\nnbnoise = 8\n\nxs = np.hstack((xs, np.random.randn(n, nbnoise)))\nxt = np.hstack((xt, np.random.randn(n, nbnoise)))"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot data\n---------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% plot samples\npl.figure(1, figsize=(6.4, 3.5))\n\npl.subplot(1, 2, 1)\npl.scatter(xt[:, 0], xt[:, 1], c=ys, marker='+', label='Source samples')\npl.legend(loc=0)\npl.title('Discriminant dimensions')\n\npl.subplot(1, 2, 2)\npl.scatter(xt[:, 2], xt[:, 3], c=ys, marker='+', label='Source samples')\npl.legend(loc=0)\npl.title('Other dimensions')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Compute Fisher Discriminant Analysis\n------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% Compute FDA\np = 2\n\nPfda, projfda = fda(xs, ys, p)"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Compute Wasserstein Discriminant Analysis\n-----------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% Compute WDA\np = 2\nreg = 1e0\nk = 10\nmaxiter = 100\n\nPwda, projwda = wda(xs, ys, p, reg, k, maxiter=maxiter)"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot 2D projections\n-------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% plot samples\n\nxsp = projfda(xs)\nxtp = projfda(xt)\n\nxspw = projwda(xs)\nxtpw = projwda(xt)\n\npl.figure(2)\n\npl.subplot(2, 2, 1)\npl.scatter(xsp[:, 0], xsp[:, 1], c=ys, marker='+', label='Projected samples')\npl.legend(loc=0)\npl.title('Projected training samples FDA')\n\npl.subplot(2, 2, 2)\npl.scatter(xtp[:, 0], xtp[:, 1], c=ys, marker='+', label='Projected samples')\npl.legend(loc=0)\npl.title('Projected test samples FDA')\n\npl.subplot(2, 2, 3)\npl.scatter(xspw[:, 0], xspw[:, 1], c=ys, marker='+', label='Projected samples')\npl.legend(loc=0)\npl.title('Projected training samples WDA')\n\npl.subplot(2, 2, 4)\npl.scatter(xtpw[:, 0], xtpw[:, 1], c=ys, marker='+', label='Projected samples')\npl.legend(loc=0)\npl.title('Projected test samples WDA')\npl.tight_layout()\n\npl.show()"
],
"outputs": [],
"metadata": {
diff --git a/docs/source/auto_examples/plot_WDA.py b/docs/source/auto_examples/plot_WDA.py
index bbe3888..93cc237 100644
--- a/docs/source/auto_examples/plot_WDA.py
+++ b/docs/source/auto_examples/plot_WDA.py
@@ -4,60 +4,124 @@
Wasserstein Discriminant Analysis
=================================
-@author: rflamary
+This example illustrates the use of WDA as proposed in [11].
+
+
+[11] Flamary, R., Cuturi, M., Courty, N., & Rakotomamonjy, A. (2016).
+Wasserstein Discriminant Analysis.
+
"""
+# Author: Remi Flamary <remi.flamary@unice.fr>
+#
+# License: MIT License
+
import numpy as np
import matplotlib.pylab as pl
-import ot
-from ot.datasets import get_1D_gauss as gauss
-from ot.dr import wda
+from ot.dr import wda, fda
+
+
+##############################################################################
+# Generate data
+# -------------
#%% parameters
-n=1000 # nb samples in source and target datasets
-nz=0.2
-xs,ys=ot.datasets.get_data_classif('3gauss',n,nz)
-xt,yt=ot.datasets.get_data_classif('3gauss',n,nz)
+n = 1000 # nb samples in source and target datasets
+nz = 0.2
-nbnoise=8
+# generate circle dataset
+t = np.random.rand(n) * 2 * np.pi
+ys = np.floor((np.arange(n) * 1.0 / n * 3)) + 1
+xs = np.concatenate(
+ (np.cos(t).reshape((-1, 1)), np.sin(t).reshape((-1, 1))), 1)
+xs = xs * ys.reshape(-1, 1) + nz * np.random.randn(n, 2)
-xs=np.hstack((xs,np.random.randn(n,nbnoise)))
-xt=np.hstack((xt,np.random.randn(n,nbnoise)))
+t = np.random.rand(n) * 2 * np.pi
+yt = np.floor((np.arange(n) * 1.0 / n * 3)) + 1
+xt = np.concatenate(
+ (np.cos(t).reshape((-1, 1)), np.sin(t).reshape((-1, 1))), 1)
+xt = xt * yt.reshape(-1, 1) + nz * np.random.randn(n, 2)
-#%% plot samples
+nbnoise = 8
+
+xs = np.hstack((xs, np.random.randn(n, nbnoise)))
+xt = np.hstack((xt, np.random.randn(n, nbnoise)))
-pl.figure(1)
+##############################################################################
+# Plot data
+# ---------
+#%% plot samples
+pl.figure(1, figsize=(6.4, 3.5))
-pl.scatter(xt[:,0],xt[:,1],c=ys,marker='+',label='Source samples')
+pl.subplot(1, 2, 1)
+pl.scatter(xt[:, 0], xt[:, 1], c=ys, marker='+', label='Source samples')
pl.legend(loc=0)
pl.title('Discriminant dimensions')
+pl.subplot(1, 2, 2)
+pl.scatter(xt[:, 2], xt[:, 3], c=ys, marker='+', label='Source samples')
+pl.legend(loc=0)
+pl.title('Other dimensions')
+pl.tight_layout()
+
+##############################################################################
+# Compute Fisher Discriminant Analysis
+# ------------------------------------
-#%% plot distributions and loss matrix
-p=2
-reg=1
-k=10
-maxiter=100
+#%% Compute FDA
+p = 2
-P,proj = wda(xs,ys,p,reg,k,maxiter=maxiter)
+Pfda, projfda = fda(xs, ys, p)
+
+##############################################################################
+# Compute Wasserstein Discriminant Analysis
+# -----------------------------------------
+
+#%% Compute WDA
+p = 2
+reg = 1e0
+k = 10
+maxiter = 100
+
+Pwda, projwda = wda(xs, ys, p, reg, k, maxiter=maxiter)
+
+
+##############################################################################
+# Plot 2D projections
+# -------------------
#%% plot samples
-xsp=proj(xs)
-xtp=proj(xt)
+xsp = projfda(xs)
+xtp = projfda(xt)
-pl.figure(1,(10,5))
+xspw = projwda(xs)
+xtpw = projwda(xt)
-pl.subplot(1,2,1)
-pl.scatter(xsp[:,0],xsp[:,1],c=ys,marker='+',label='Projected samples')
+pl.figure(2)
+
+pl.subplot(2, 2, 1)
+pl.scatter(xsp[:, 0], xsp[:, 1], c=ys, marker='+', label='Projected samples')
pl.legend(loc=0)
-pl.title('Projected training samples')
+pl.title('Projected training samples FDA')
+pl.subplot(2, 2, 2)
+pl.scatter(xtp[:, 0], xtp[:, 1], c=ys, marker='+', label='Projected samples')
+pl.legend(loc=0)
+pl.title('Projected test samples FDA')
-pl.subplot(1,2,2)
-pl.scatter(xtp[:,0],xtp[:,1],c=ys,marker='+',label='Projected samples')
+pl.subplot(2, 2, 3)
+pl.scatter(xspw[:, 0], xspw[:, 1], c=ys, marker='+', label='Projected samples')
pl.legend(loc=0)
-pl.title('Projected test samples')
+pl.title('Projected training samples WDA')
+
+pl.subplot(2, 2, 4)
+pl.scatter(xtpw[:, 0], xtpw[:, 1], c=ys, marker='+', label='Projected samples')
+pl.legend(loc=0)
+pl.title('Projected test samples WDA')
+pl.tight_layout()
+
+pl.show()
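As a side note for readers of this diff without POT installed: the `fda` step introduced above can be mimicked in plain NumPy. The sketch below is an illustrative reimplementation of classical Fisher Discriminant Analysis (top generalized eigenvectors of the between/within scatter matrices), not the actual `ot.dr.fda` code; the `fda_sketch` name and the toy two-class data are assumptions for the example.

```python
import numpy as np

def fda_sketch(X, y, p=2):
    """Minimal Fisher Discriminant Analysis: project onto the top-p
    eigenvectors of Sw^{-1} Sb. Illustrative only, not ot.dr.fda."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # small ridge on Sw for numerical stability
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw + 1e-8 * np.eye(d), Sb))
    order = np.argsort(-vals.real)[:p]
    P = vecs[:, order].real
    return P, lambda x: x @ P

# toy data: first feature is discriminant, the rest is noise
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(50, 4) + [4, 0, 0, 0], rng.randn(50, 4)])
y = np.array([0] * 50 + [1] * 50)
P, proj = fda_sketch(X, y, p=2)
```

The projection returned here plays the same role as `projfda` in the example: a linear map `x -> x @ P` onto the `p` discriminant dimensions.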
diff --git a/docs/source/auto_examples/plot_WDA.rst b/docs/source/auto_examples/plot_WDA.rst
index 540555d..64ddb47 100644
--- a/docs/source/auto_examples/plot_WDA.rst
+++ b/docs/source/auto_examples/plot_WDA.rst
@@ -7,108 +7,216 @@
Wasserstein Discriminant Analysis
=================================
-@author: rflamary
+This example illustrates the use of WDA as proposed in [11].
+[11] Flamary, R., Cuturi, M., Courty, N., & Rakotomamonjy, A. (2016).
+Wasserstein Discriminant Analysis.
-.. image:: /auto_examples/images/sphx_glr_plot_WDA_001.png
- :align: center
-.. rst-class:: sphx-glr-script-out
+.. code-block:: python
- Out::
- Compiling cost function...
- Computing gradient of cost function...
- iter cost val grad. norm
- 1 +5.2427396265941129e-01 8.16627951e-01
- 2 +1.7904850059627236e-01 1.91366819e-01
- 3 +1.6985797253002377e-01 1.70940682e-01
- 4 +1.3903474972292729e-01 1.28606342e-01
- 5 +7.4961734618782416e-02 6.41973980e-02
- 6 +7.1900245222486239e-02 4.25693592e-02
- 7 +7.0472023318269614e-02 2.34599232e-02
- 8 +6.9917568641317152e-02 5.66542766e-03
- 9 +6.9885086242452696e-02 4.05756115e-04
- 10 +6.9884967432653489e-02 2.16836017e-04
- 11 +6.9884923649884148e-02 5.74961622e-05
- 12 +6.9884921818258436e-02 3.83257203e-05
- 13 +6.9884920459612282e-02 9.97486224e-06
- 14 +6.9884920414414409e-02 7.33567875e-06
- 15 +6.9884920388431387e-02 5.23889187e-06
- 16 +6.9884920385183902e-02 4.91959084e-06
- 17 +6.9884920373983223e-02 3.56451669e-06
- 18 +6.9884920369701245e-02 2.88858709e-06
- 19 +6.9884920361621208e-02 1.82294279e-07
- Terminated - min grad norm reached after 19 iterations, 9.65 seconds.
+ # Author: Remi Flamary <remi.flamary@unice.fr>
+ #
+ # License: MIT License
+ import numpy as np
+ import matplotlib.pylab as pl
+ from ot.dr import wda, fda
-|
-.. code-block:: python
- import numpy as np
- import matplotlib.pylab as pl
- import ot
- from ot.datasets import get_1D_gauss as gauss
- from ot.dr import wda
+
+
+Generate data
+-------------
+
+
+
+.. code-block:: python
#%% parameters
- n=1000 # nb samples in source and target datasets
- nz=0.2
- xs,ys=ot.datasets.get_data_classif('3gauss',n,nz)
- xt,yt=ot.datasets.get_data_classif('3gauss',n,nz)
+ n = 1000 # nb samples in source and target datasets
+ nz = 0.2
+
+ # generate circle dataset
+ t = np.random.rand(n) * 2 * np.pi
+ ys = np.floor((np.arange(n) * 1.0 / n * 3)) + 1
+ xs = np.concatenate(
+ (np.cos(t).reshape((-1, 1)), np.sin(t).reshape((-1, 1))), 1)
+ xs = xs * ys.reshape(-1, 1) + nz * np.random.randn(n, 2)
+
+ t = np.random.rand(n) * 2 * np.pi
+ yt = np.floor((np.arange(n) * 1.0 / n * 3)) + 1
+ xt = np.concatenate(
+ (np.cos(t).reshape((-1, 1)), np.sin(t).reshape((-1, 1))), 1)
+ xt = xt * yt.reshape(-1, 1) + nz * np.random.randn(n, 2)
+
+ nbnoise = 8
+
+ xs = np.hstack((xs, np.random.randn(n, nbnoise)))
+ xt = np.hstack((xt, np.random.randn(n, nbnoise)))
- nbnoise=8
- xs=np.hstack((xs,np.random.randn(n,nbnoise)))
- xt=np.hstack((xt,np.random.randn(n,nbnoise)))
- #%% plot samples
- pl.figure(1)
- pl.scatter(xt[:,0],xt[:,1],c=ys,marker='+',label='Source samples')
+
+Plot data
+---------
+
+
+
+.. code-block:: python
+
+
+ #%% plot samples
+ pl.figure(1, figsize=(6.4, 3.5))
+
+ pl.subplot(1, 2, 1)
+ pl.scatter(xt[:, 0], xt[:, 1], c=ys, marker='+', label='Source samples')
pl.legend(loc=0)
pl.title('Discriminant dimensions')
+ pl.subplot(1, 2, 2)
+ pl.scatter(xt[:, 2], xt[:, 3], c=ys, marker='+', label='Source samples')
+ pl.legend(loc=0)
+ pl.title('Other dimensions')
+ pl.tight_layout()
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_WDA_001.png
+ :align: center
+
+
+
+
+Compute Fisher Discriminant Analysis
+------------------------------------
+
+
+
+.. code-block:: python
- #%% plot distributions and loss matrix
- p=2
- reg=1
- k=10
- maxiter=100
- P,proj = wda(xs,ys,p,reg,k,maxiter=maxiter)
+ #%% Compute FDA
+ p = 2
+
+ Pfda, projfda = fda(xs, ys, p)
+
+
+
+
+
+
+
+Compute Wasserstein Discriminant Analysis
+-----------------------------------------
+
+
+
+.. code-block:: python
+
+
+ #%% Compute WDA
+ p = 2
+ reg = 1e0
+ k = 10
+ maxiter = 100
+
+ Pwda, projwda = wda(xs, ys, p, reg, k, maxiter=maxiter)
+
+
+
+
+
+
+.. rst-class:: sphx-glr-script-out
+
+ Out::
+
+ Compiling cost function...
+ Computing gradient of cost function...
+ iter cost val grad. norm
+ 1 +5.4993226050368416e-01 5.18285173e-01
+ 2 +3.4883000507542844e-01 1.96795818e-01
+ 3 +2.9841234004693890e-01 2.33029475e-01
+ 4 +2.3976476757548179e-01 1.38593951e-01
+ 5 +2.3614468346177828e-01 1.19615394e-01
+ 6 +2.2586536502789240e-01 4.82430685e-02
+ 7 +2.2451030967794622e-01 2.56564039e-02
+ 8 +2.2421446331083625e-01 1.47932578e-02
+ 9 +2.2407441444450052e-01 1.12040327e-03
+ 10 +2.2407365923337522e-01 3.78899763e-04
+ 11 +2.2407356874011675e-01 1.79740810e-05
+ 12 +2.2407356862959993e-01 1.25643005e-05
+ 13 +2.2407356853043561e-01 1.40415001e-06
+ 14 +2.2407356852925220e-01 3.41183585e-07
+ Terminated - min grad norm reached after 14 iterations, 6.78 seconds.
+
+
+Plot 2D projections
+-------------------
+
+
+
+.. code-block:: python
+
#%% plot samples
- xsp=proj(xs)
- xtp=proj(xt)
+ xsp = projfda(xs)
+ xtp = projfda(xt)
+
+ xspw = projwda(xs)
+ xtpw = projwda(xt)
- pl.figure(1,(10,5))
+ pl.figure(2)
- pl.subplot(1,2,1)
- pl.scatter(xsp[:,0],xsp[:,1],c=ys,marker='+',label='Projected samples')
+ pl.subplot(2, 2, 1)
+ pl.scatter(xsp[:, 0], xsp[:, 1], c=ys, marker='+', label='Projected samples')
pl.legend(loc=0)
- pl.title('Projected training samples')
+ pl.title('Projected training samples FDA')
+ pl.subplot(2, 2, 2)
+ pl.scatter(xtp[:, 0], xtp[:, 1], c=ys, marker='+', label='Projected samples')
+ pl.legend(loc=0)
+ pl.title('Projected test samples FDA')
+
+ pl.subplot(2, 2, 3)
+ pl.scatter(xspw[:, 0], xspw[:, 1], c=ys, marker='+', label='Projected samples')
+ pl.legend(loc=0)
+ pl.title('Projected training samples WDA')
- pl.subplot(1,2,2)
- pl.scatter(xtp[:,0],xtp[:,1],c=ys,marker='+',label='Projected samples')
+ pl.subplot(2, 2, 4)
+ pl.scatter(xtpw[:, 0], xtpw[:, 1], c=ys, marker='+', label='Projected samples')
pl.legend(loc=0)
- pl.title('Projected test samples')
+ pl.title('Projected test samples WDA')
+ pl.tight_layout()
+
+ pl.show()
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_WDA_003.png
+ :align: center
+
+
+
-**Total running time of the script:** ( 0 minutes 16.902 seconds)
+**Total running time of the script:** ( 0 minutes 7.637 seconds)
@@ -127,4 +235,4 @@ Wasserstein Discriminant Analysis
.. rst-class:: sphx-glr-signature
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_barycenter_1D.ipynb b/docs/source/auto_examples/plot_barycenter_1D.ipynb
index 36f3975..a19e0fd 100644
--- a/docs/source/auto_examples/plot_barycenter_1D.ipynb
+++ b/docs/source/auto_examples/plot_barycenter_1D.ipynb
@@ -15,7 +15,7 @@
},
{
"source": [
- "\n# 1D Wasserstein barycenter demo\n\n\n\n@author: rflamary\n\n"
+ "\n# 1D Wasserstein barycenter demo\n\n\nThis example illustrates the computation of regularized Wasserstein barycenter\nas proposed in [3].\n\n\n[3] Benamou, J. D., Carlier, G., Cuturi, M., Nenna, L., & Peyr\u00e9, G. (2015).\nIterative Bregman projections for regularized transportation problems\nSIAM Journal on Scientific Computing, 37(2), A1111-A1138.\n\n\n"
],
"cell_type": "markdown",
"metadata": {}
@@ -24,7 +24,79 @@
"execution_count": null,
"cell_type": "code",
"source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\nfrom mpl_toolkits.mplot3d import Axes3D #necessary for 3d plot even if not used\nfrom matplotlib.collections import PolyCollection\n\n\n#%% parameters\n\nn=100 # nb bins\n\n# bin positions\nx=np.arange(n,dtype=np.float64)\n\n# Gaussian distributions\na1=ot.datasets.get_1D_gauss(n,m=20,s=5) # m= mean, s= std\na2=ot.datasets.get_1D_gauss(n,m=60,s=8)\n\n# creating matrix A containing all distributions\nA=np.vstack((a1,a2)).T\nnbd=A.shape[1]\n\n# loss matrix + normalization\nM=ot.utils.dist0(n)\nM/=M.max()\n\n#%% plot the distributions\n\npl.figure(1)\nfor i in range(nbd):\n pl.plot(x,A[:,i])\npl.title('Distributions')\n\n#%% barycenter computation\n\nalpha=0.2 # 0<=alpha<=1\nweights=np.array([1-alpha,alpha])\n\n# l2bary\nbary_l2=A.dot(weights)\n\n# wasserstein\nreg=1e-3\nbary_wass=ot.bregman.barycenter(A,M,reg,weights)\n\npl.figure(2)\npl.clf()\npl.subplot(2,1,1)\nfor i in range(nbd):\n pl.plot(x,A[:,i])\npl.title('Distributions')\n\npl.subplot(2,1,2)\npl.plot(x,bary_l2,'r',label='l2')\npl.plot(x,bary_wass,'g',label='Wasserstein')\npl.legend()\npl.title('Barycenters')\n\n\n#%% barycenter interpolation\n\nnbalpha=11\nalphalist=np.linspace(0,1,nbalpha)\n\n\nB_l2=np.zeros((n,nbalpha))\n\nB_wass=np.copy(B_l2)\n\nfor i in range(0,nbalpha):\n alpha=alphalist[i]\n weights=np.array([1-alpha,alpha])\n B_l2[:,i]=A.dot(weights)\n B_wass[:,i]=ot.bregman.barycenter(A,M,reg,weights)\n\n#%% plot interpolation\n\npl.figure(3,(10,5))\n\n#pl.subplot(1,2,1)\ncmap=pl.cm.get_cmap('viridis')\nverts = []\nzs = alphalist\nfor i,z in enumerate(zs):\n ys = B_l2[:,i]\n verts.append(list(zip(x, ys)))\n\nax = pl.gcf().gca(projection='3d')\n\npoly = PolyCollection(verts,facecolors=[cmap(a) for a in alphalist])\npoly.set_alpha(0.7)\nax.add_collection3d(poly, zs=zs, zdir='y')\n\nax.set_xlabel('x')\nax.set_xlim3d(0, n)\nax.set_ylabel('$\\\\alpha$')\nax.set_ylim3d(0,1)\nax.set_zlabel('')\nax.set_zlim3d(0, 
B_l2.max()*1.01)\npl.title('Barycenter interpolation with l2')\n\npl.show()\n\npl.figure(4,(10,5))\n\n#pl.subplot(1,2,1)\ncmap=pl.cm.get_cmap('viridis')\nverts = []\nzs = alphalist\nfor i,z in enumerate(zs):\n ys = B_wass[:,i]\n verts.append(list(zip(x, ys)))\n\nax = pl.gcf().gca(projection='3d')\n\npoly = PolyCollection(verts,facecolors=[cmap(a) for a in alphalist])\npoly.set_alpha(0.7)\nax.add_collection3d(poly, zs=zs, zdir='y')\n\nax.set_xlabel('x')\nax.set_xlim3d(0, n)\nax.set_ylabel('$\\\\alpha$')\nax.set_ylim3d(0,1)\nax.set_zlabel('')\nax.set_zlim3d(0, B_l2.max()*1.01)\npl.title('Barycenter interpolation with Wasserstein')\n\npl.show()"
+ "# Author: Remi Flamary <remi.flamary@unice.fr>\n#\n# License: MIT License\n\nimport numpy as np\nimport matplotlib.pylab as pl\nimport ot\n# necessary for 3d plot even if not used\nfrom mpl_toolkits.mplot3d import Axes3D # noqa\nfrom matplotlib.collections import PolyCollection"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% parameters\n\nn = 100 # nb bins\n\n# bin positions\nx = np.arange(n, dtype=np.float64)\n\n# Gaussian distributions\na1 = ot.datasets.get_1D_gauss(n, m=20, s=5) # m= mean, s= std\na2 = ot.datasets.get_1D_gauss(n, m=60, s=8)\n\n# creating matrix A containing all distributions\nA = np.vstack((a1, a2)).T\nn_distributions = A.shape[1]\n\n# loss matrix + normalization\nM = ot.utils.dist0(n)\nM /= M.max()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot data\n---------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% plot the distributions\n\npl.figure(1, figsize=(6.4, 3))\nfor i in range(n_distributions):\n pl.plot(x, A[:, i])\npl.title('Distributions')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Barycenter computation\n----------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% barycenter computation\n\nalpha = 0.2 # 0<=alpha<=1\nweights = np.array([1 - alpha, alpha])\n\n# l2bary\nbary_l2 = A.dot(weights)\n\n# wasserstein\nreg = 1e-3\nbary_wass = ot.bregman.barycenter(A, M, reg, weights)\n\npl.figure(2)\npl.clf()\npl.subplot(2, 1, 1)\nfor i in range(n_distributions):\n pl.plot(x, A[:, i])\npl.title('Distributions')\n\npl.subplot(2, 1, 2)\npl.plot(x, bary_l2, 'r', label='l2')\npl.plot(x, bary_wass, 'g', label='Wasserstein')\npl.legend()\npl.title('Barycenters')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Barycentric interpolation\n-------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% barycenter interpolation\n\nn_alpha = 11\nalpha_list = np.linspace(0, 1, n_alpha)\n\n\nB_l2 = np.zeros((n, n_alpha))\n\nB_wass = np.copy(B_l2)\n\nfor i in range(0, n_alpha):\n alpha = alpha_list[i]\n weights = np.array([1 - alpha, alpha])\n B_l2[:, i] = A.dot(weights)\n B_wass[:, i] = ot.bregman.barycenter(A, M, reg, weights)\n\n#%% plot interpolation\n\npl.figure(3)\n\ncmap = pl.cm.get_cmap('viridis')\nverts = []\nzs = alpha_list\nfor i, z in enumerate(zs):\n ys = B_l2[:, i]\n verts.append(list(zip(x, ys)))\n\nax = pl.gcf().gca(projection='3d')\n\npoly = PolyCollection(verts, facecolors=[cmap(a) for a in alpha_list])\npoly.set_alpha(0.7)\nax.add_collection3d(poly, zs=zs, zdir='y')\nax.set_xlabel('x')\nax.set_xlim3d(0, n)\nax.set_ylabel('$\\\\alpha$')\nax.set_ylim3d(0, 1)\nax.set_zlabel('')\nax.set_zlim3d(0, B_l2.max() * 1.01)\npl.title('Barycenter interpolation with l2')\npl.tight_layout()\n\npl.figure(4)\ncmap = pl.cm.get_cmap('viridis')\nverts = []\nzs = alpha_list\nfor i, z in enumerate(zs):\n ys = B_wass[:, i]\n verts.append(list(zip(x, ys)))\n\nax = pl.gcf().gca(projection='3d')\n\npoly = PolyCollection(verts, facecolors=[cmap(a) for a in alpha_list])\npoly.set_alpha(0.7)\nax.add_collection3d(poly, zs=zs, zdir='y')\nax.set_xlabel('x')\nax.set_xlim3d(0, n)\nax.set_ylabel('$\\\\alpha$')\nax.set_ylim3d(0, 1)\nax.set_zlabel('')\nax.set_zlim3d(0, B_l2.max() * 1.01)\npl.title('Barycenter interpolation with Wasserstein')\npl.tight_layout()\n\npl.show()"
],
"outputs": [],
"metadata": {
diff --git a/docs/source/auto_examples/plot_barycenter_1D.py b/docs/source/auto_examples/plot_barycenter_1D.py
index 30eecbf..620936b 100644
--- a/docs/source/auto_examples/plot_barycenter_1D.py
+++ b/docs/source/auto_examples/plot_barycenter_1D.py
@@ -4,135 +4,157 @@
1D Wasserstein barycenter demo
==============================
+This example illustrates the computation of regularized Wasserstein barycenter
+as proposed in [3].
+
+
+[3] Benamou, J. D., Carlier, G., Cuturi, M., Nenna, L., & Peyré, G. (2015).
+Iterative Bregman projections for regularized transportation problems
+SIAM Journal on Scientific Computing, 37(2), A1111-A1138.
-@author: rflamary
"""
+# Author: Remi Flamary <remi.flamary@unice.fr>
+#
+# License: MIT License
+
import numpy as np
import matplotlib.pylab as pl
import ot
-from mpl_toolkits.mplot3d import Axes3D #necessary for 3d plot even if not used
+# necessary for 3d plot even if not used
+from mpl_toolkits.mplot3d import Axes3D # noqa
from matplotlib.collections import PolyCollection
+##############################################################################
+# Generate data
+# -------------
#%% parameters
-n=100 # nb bins
+n = 100 # nb bins
# bin positions
-x=np.arange(n,dtype=np.float64)
+x = np.arange(n, dtype=np.float64)
# Gaussian distributions
-a1=ot.datasets.get_1D_gauss(n,m=20,s=5) # m= mean, s= std
-a2=ot.datasets.get_1D_gauss(n,m=60,s=8)
+a1 = ot.datasets.get_1D_gauss(n, m=20, s=5) # m= mean, s= std
+a2 = ot.datasets.get_1D_gauss(n, m=60, s=8)
# creating matrix A containing all distributions
-A=np.vstack((a1,a2)).T
-nbd=A.shape[1]
+A = np.vstack((a1, a2)).T
+n_distributions = A.shape[1]
# loss matrix + normalization
-M=ot.utils.dist0(n)
-M/=M.max()
+M = ot.utils.dist0(n)
+M /= M.max()
+
+##############################################################################
+# Plot data
+# ---------
#%% plot the distributions
-pl.figure(1)
-for i in range(nbd):
- pl.plot(x,A[:,i])
+pl.figure(1, figsize=(6.4, 3))
+for i in range(n_distributions):
+ pl.plot(x, A[:, i])
pl.title('Distributions')
+pl.tight_layout()
+
+##############################################################################
+# Barycenter computation
+# ----------------------
#%% barycenter computation
-alpha=0.2 # 0<=alpha<=1
-weights=np.array([1-alpha,alpha])
+alpha = 0.2 # 0<=alpha<=1
+weights = np.array([1 - alpha, alpha])
# l2bary
-bary_l2=A.dot(weights)
+bary_l2 = A.dot(weights)
# wasserstein
-reg=1e-3
-bary_wass=ot.bregman.barycenter(A,M,reg,weights)
+reg = 1e-3
+bary_wass = ot.bregman.barycenter(A, M, reg, weights)
pl.figure(2)
pl.clf()
-pl.subplot(2,1,1)
-for i in range(nbd):
- pl.plot(x,A[:,i])
+pl.subplot(2, 1, 1)
+for i in range(n_distributions):
+ pl.plot(x, A[:, i])
pl.title('Distributions')
-pl.subplot(2,1,2)
-pl.plot(x,bary_l2,'r',label='l2')
-pl.plot(x,bary_wass,'g',label='Wasserstein')
+pl.subplot(2, 1, 2)
+pl.plot(x, bary_l2, 'r', label='l2')
+pl.plot(x, bary_wass, 'g', label='Wasserstein')
pl.legend()
pl.title('Barycenters')
+pl.tight_layout()
+##############################################################################
+# Barycentric interpolation
+# -------------------------
#%% barycenter interpolation
-nbalpha=11
-alphalist=np.linspace(0,1,nbalpha)
+n_alpha = 11
+alpha_list = np.linspace(0, 1, n_alpha)
-B_l2=np.zeros((n,nbalpha))
+B_l2 = np.zeros((n, n_alpha))
-B_wass=np.copy(B_l2)
+B_wass = np.copy(B_l2)
-for i in range(0,nbalpha):
- alpha=alphalist[i]
- weights=np.array([1-alpha,alpha])
- B_l2[:,i]=A.dot(weights)
- B_wass[:,i]=ot.bregman.barycenter(A,M,reg,weights)
+for i in range(0, n_alpha):
+ alpha = alpha_list[i]
+ weights = np.array([1 - alpha, alpha])
+ B_l2[:, i] = A.dot(weights)
+ B_wass[:, i] = ot.bregman.barycenter(A, M, reg, weights)
#%% plot interpolation
-pl.figure(3,(10,5))
+pl.figure(3)
-#pl.subplot(1,2,1)
-cmap=pl.cm.get_cmap('viridis')
+cmap = pl.cm.get_cmap('viridis')
verts = []
-zs = alphalist
-for i,z in enumerate(zs):
- ys = B_l2[:,i]
+zs = alpha_list
+for i, z in enumerate(zs):
+ ys = B_l2[:, i]
verts.append(list(zip(x, ys)))
ax = pl.gcf().gca(projection='3d')
-poly = PolyCollection(verts,facecolors=[cmap(a) for a in alphalist])
+poly = PolyCollection(verts, facecolors=[cmap(a) for a in alpha_list])
poly.set_alpha(0.7)
ax.add_collection3d(poly, zs=zs, zdir='y')
-
ax.set_xlabel('x')
ax.set_xlim3d(0, n)
ax.set_ylabel('$\\alpha$')
-ax.set_ylim3d(0,1)
+ax.set_ylim3d(0, 1)
ax.set_zlabel('')
-ax.set_zlim3d(0, B_l2.max()*1.01)
+ax.set_zlim3d(0, B_l2.max() * 1.01)
pl.title('Barycenter interpolation with l2')
+pl.tight_layout()
-pl.show()
-
-pl.figure(4,(10,5))
-
-#pl.subplot(1,2,1)
-cmap=pl.cm.get_cmap('viridis')
+pl.figure(4)
+cmap = pl.cm.get_cmap('viridis')
verts = []
-zs = alphalist
-for i,z in enumerate(zs):
- ys = B_wass[:,i]
+zs = alpha_list
+for i, z in enumerate(zs):
+ ys = B_wass[:, i]
verts.append(list(zip(x, ys)))
ax = pl.gcf().gca(projection='3d')
-poly = PolyCollection(verts,facecolors=[cmap(a) for a in alphalist])
+poly = PolyCollection(verts, facecolors=[cmap(a) for a in alpha_list])
poly.set_alpha(0.7)
ax.add_collection3d(poly, zs=zs, zdir='y')
-
ax.set_xlabel('x')
ax.set_xlim3d(0, n)
ax.set_ylabel('$\\alpha$')
-ax.set_ylim3d(0,1)
+ax.set_ylim3d(0, 1)
ax.set_zlabel('')
-ax.set_zlim3d(0, B_l2.max()*1.01)
+ax.set_zlim3d(0, B_l2.max() * 1.01)
pl.title('Barycenter interpolation with Wasserstein')
+pl.tight_layout()
-pl.show() \ No newline at end of file
+pl.show()
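The `ot.bregman.barycenter` call used throughout this file can be sketched with the iterative Bregman projections of [3] in plain NumPy. This is an illustrative reimplementation under simplifying assumptions (fixed iteration count, final renormalization), not the POT code; the `sinkhorn_barycenter` name is made up for the example.

```python
import numpy as np

def sinkhorn_barycenter(A, M, reg, weights, n_iter=500):
    """Entropic Wasserstein barycenter by iterative Bregman projections.
    A: (n, k) histograms as columns, M: (n, n) cost, weights: (k,).
    Sketch of what ot.bregman.barycenter computes."""
    K = np.exp(-M / reg)                 # Gibbs kernel
    u = np.ones_like(A)                  # one scaling vector per input
    for _ in range(n_iter):
        v = A / (K.T @ u)                # project on the input marginals
        Ku = K @ v
        # barycenter = weighted geometric mean of the K v_k
        bary = np.exp((weights * np.log(Ku)).sum(axis=1))
        u = bary[:, None] / Ku           # project on the barycenter marginal
    return bary / bary.sum()             # renormalize residual mass

# two Gaussians on a 1D grid, as in the example above
n = 60
x = np.arange(n, dtype=np.float64)
a1 = np.exp(-0.5 * ((x - 15) / 4) ** 2); a1 /= a1.sum()
a2 = np.exp(-0.5 * ((x - 45) / 4) ** 2); a2 /= a2.sum()
A = np.vstack((a1, a2)).T
M = (x[:, None] - x[None, :]) ** 2
M /= M.max()
bary = sinkhorn_barycenter(A, M, reg=1e-2, weights=np.array([0.5, 0.5]))
```

With equal weights the mass of `bary` concentrates between the two inputs, which is the displacement-interpolation behaviour the gallery figures show.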
diff --git a/docs/source/auto_examples/plot_barycenter_1D.rst b/docs/source/auto_examples/plot_barycenter_1D.rst
index 1b15c77..413fae3 100644
--- a/docs/source/auto_examples/plot_barycenter_1D.rst
+++ b/docs/source/auto_examples/plot_barycenter_1D.rst
@@ -7,171 +7,230 @@
1D Wasserstein barycenter demo
==============================
+This example illustrates the computation of regularized Wasserstein barycenter
+as proposed in [3].
-@author: rflamary
+[3] Benamou, J. D., Carlier, G., Cuturi, M., Nenna, L., & Peyré, G. (2015).
+Iterative Bregman projections for regularized transportation problems
+SIAM Journal on Scientific Computing, 37(2), A1111-A1138.
-.. rst-class:: sphx-glr-horizontal
+.. code-block:: python
- *
- .. image:: /auto_examples/images/sphx_glr_plot_barycenter_1D_001.png
- :scale: 47
+ # Author: Remi Flamary <remi.flamary@unice.fr>
+ #
+ # License: MIT License
- *
+ import numpy as np
+ import matplotlib.pylab as pl
+ import ot
+ # necessary for 3d plot even if not used
+ from mpl_toolkits.mplot3d import Axes3D # noqa
+ from matplotlib.collections import PolyCollection
- .. image:: /auto_examples/images/sphx_glr_plot_barycenter_1D_002.png
- :scale: 47
- *
- .. image:: /auto_examples/images/sphx_glr_plot_barycenter_1D_003.png
- :scale: 47
- *
- .. image:: /auto_examples/images/sphx_glr_plot_barycenter_1D_004.png
- :scale: 47
+Generate data
+-------------
.. code-block:: python
- import numpy as np
- import matplotlib.pylab as pl
- import ot
- from mpl_toolkits.mplot3d import Axes3D #necessary for 3d plot even if not used
- from matplotlib.collections import PolyCollection
-
-
#%% parameters
- n=100 # nb bins
+ n = 100 # nb bins
# bin positions
- x=np.arange(n,dtype=np.float64)
+ x = np.arange(n, dtype=np.float64)
# Gaussian distributions
- a1=ot.datasets.get_1D_gauss(n,m=20,s=5) # m= mean, s= std
- a2=ot.datasets.get_1D_gauss(n,m=60,s=8)
+ a1 = ot.datasets.get_1D_gauss(n, m=20, s=5) # m= mean, s= std
+ a2 = ot.datasets.get_1D_gauss(n, m=60, s=8)
# creating matrix A containing all distributions
- A=np.vstack((a1,a2)).T
- nbd=A.shape[1]
+ A = np.vstack((a1, a2)).T
+ n_distributions = A.shape[1]
# loss matrix + normalization
- M=ot.utils.dist0(n)
- M/=M.max()
+ M = ot.utils.dist0(n)
+ M /= M.max()
+
+
+
+
+
+
+
+Plot data
+---------
+
+
+
+.. code-block:: python
+
#%% plot the distributions
- pl.figure(1)
- for i in range(nbd):
- pl.plot(x,A[:,i])
+ pl.figure(1, figsize=(6.4, 3))
+ for i in range(n_distributions):
+ pl.plot(x, A[:, i])
pl.title('Distributions')
+ pl.tight_layout()
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_barycenter_1D_001.png
+ :align: center
+
+
+
+
+Barycenter computation
+----------------------
+
+
+
+.. code-block:: python
+
#%% barycenter computation
- alpha=0.2 # 0<=alpha<=1
- weights=np.array([1-alpha,alpha])
+ alpha = 0.2 # 0<=alpha<=1
+ weights = np.array([1 - alpha, alpha])
# l2bary
- bary_l2=A.dot(weights)
+ bary_l2 = A.dot(weights)
# wasserstein
- reg=1e-3
- bary_wass=ot.bregman.barycenter(A,M,reg,weights)
+ reg = 1e-3
+ bary_wass = ot.bregman.barycenter(A, M, reg, weights)
pl.figure(2)
pl.clf()
- pl.subplot(2,1,1)
- for i in range(nbd):
- pl.plot(x,A[:,i])
+ pl.subplot(2, 1, 1)
+ for i in range(n_distributions):
+ pl.plot(x, A[:, i])
pl.title('Distributions')
- pl.subplot(2,1,2)
- pl.plot(x,bary_l2,'r',label='l2')
- pl.plot(x,bary_wass,'g',label='Wasserstein')
+ pl.subplot(2, 1, 2)
+ pl.plot(x, bary_l2, 'r', label='l2')
+ pl.plot(x, bary_wass, 'g', label='Wasserstein')
pl.legend()
pl.title('Barycenters')
+ pl.tight_layout()
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_barycenter_1D_003.png
+ :align: center
+
+
+
+
+Barycentric interpolation
+-------------------------
+
+
+
+.. code-block:: python
#%% barycenter interpolation
- nbalpha=11
- alphalist=np.linspace(0,1,nbalpha)
+ n_alpha = 11
+ alpha_list = np.linspace(0, 1, n_alpha)
- B_l2=np.zeros((n,nbalpha))
+ B_l2 = np.zeros((n, n_alpha))
- B_wass=np.copy(B_l2)
+ B_wass = np.copy(B_l2)
- for i in range(0,nbalpha):
- alpha=alphalist[i]
- weights=np.array([1-alpha,alpha])
- B_l2[:,i]=A.dot(weights)
- B_wass[:,i]=ot.bregman.barycenter(A,M,reg,weights)
+ for i in range(0, n_alpha):
+ alpha = alpha_list[i]
+ weights = np.array([1 - alpha, alpha])
+ B_l2[:, i] = A.dot(weights)
+ B_wass[:, i] = ot.bregman.barycenter(A, M, reg, weights)
#%% plot interpolation
- pl.figure(3,(10,5))
+ pl.figure(3)
- #pl.subplot(1,2,1)
- cmap=pl.cm.get_cmap('viridis')
+ cmap = pl.cm.get_cmap('viridis')
verts = []
- zs = alphalist
- for i,z in enumerate(zs):
- ys = B_l2[:,i]
+ zs = alpha_list
+ for i, z in enumerate(zs):
+ ys = B_l2[:, i]
verts.append(list(zip(x, ys)))
ax = pl.gcf().gca(projection='3d')
- poly = PolyCollection(verts,facecolors=[cmap(a) for a in alphalist])
+ poly = PolyCollection(verts, facecolors=[cmap(a) for a in alpha_list])
poly.set_alpha(0.7)
ax.add_collection3d(poly, zs=zs, zdir='y')
-
ax.set_xlabel('x')
ax.set_xlim3d(0, n)
ax.set_ylabel('$\\alpha$')
- ax.set_ylim3d(0,1)
+ ax.set_ylim3d(0, 1)
ax.set_zlabel('')
- ax.set_zlim3d(0, B_l2.max()*1.01)
+ ax.set_zlim3d(0, B_l2.max() * 1.01)
pl.title('Barycenter interpolation with l2')
+ pl.tight_layout()
- pl.show()
-
- pl.figure(4,(10,5))
-
- #pl.subplot(1,2,1)
- cmap=pl.cm.get_cmap('viridis')
+ pl.figure(4)
+ cmap = pl.cm.get_cmap('viridis')
verts = []
- zs = alphalist
- for i,z in enumerate(zs):
- ys = B_wass[:,i]
+ zs = alpha_list
+ for i, z in enumerate(zs):
+ ys = B_wass[:, i]
verts.append(list(zip(x, ys)))
ax = pl.gcf().gca(projection='3d')
- poly = PolyCollection(verts,facecolors=[cmap(a) for a in alphalist])
+ poly = PolyCollection(verts, facecolors=[cmap(a) for a in alpha_list])
poly.set_alpha(0.7)
ax.add_collection3d(poly, zs=zs, zdir='y')
-
ax.set_xlabel('x')
ax.set_xlim3d(0, n)
ax.set_ylabel('$\\alpha$')
- ax.set_ylim3d(0,1)
+ ax.set_ylim3d(0, 1)
ax.set_zlabel('')
- ax.set_zlim3d(0, B_l2.max()*1.01)
+ ax.set_zlim3d(0, B_l2.max() * 1.01)
pl.title('Barycenter interpolation with Wasserstein')
+ pl.tight_layout()
pl.show()
-**Total running time of the script:** ( 0 minutes 2.274 seconds)
+
+
+
+.. rst-class:: sphx-glr-horizontal
+
+
+ *
+
+ .. image:: /auto_examples/images/sphx_glr_plot_barycenter_1D_005.png
+ :scale: 47
+
+ *
+
+ .. image:: /auto_examples/images/sphx_glr_plot_barycenter_1D_006.png
+ :scale: 47
+
+
+
+
+**Total running time of the script:** ( 0 minutes 0.431 seconds)
@@ -190,4 +249,4 @@
.. rst-class:: sphx-glr-signature
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_compute_emd.ipynb b/docs/source/auto_examples/plot_compute_emd.ipynb
index 4162144..b9b8bc5 100644
--- a/docs/source/auto_examples/plot_compute_emd.ipynb
+++ b/docs/source/auto_examples/plot_compute_emd.ipynb
@@ -15,7 +15,7 @@
},
{
"source": [
- "\n# 1D optimal transport\n\n\n@author: rflamary\n\n"
+ "\n# Plot multiple EMD\n\n\nShows how to compute multiple EMD and Sinkhorn with two differnt\nground metrics and plot their values for diffeent distributions.\n\n\n\n"
],
"cell_type": "markdown",
"metadata": {}
@@ -24,7 +24,79 @@
"execution_count": null,
"cell_type": "code",
"source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\nfrom ot.datasets import get_1D_gauss as gauss\n\n\n#%% parameters\n\nn=100 # nb bins\nn_target=50 # nb target distributions\n\n\n# bin positions\nx=np.arange(n,dtype=np.float64)\n\nlst_m=np.linspace(20,90,n_target)\n\n# Gaussian distributions\na=gauss(n,m=20,s=5) # m= mean, s= std\n\nB=np.zeros((n,n_target))\n\nfor i,m in enumerate(lst_m):\n B[:,i]=gauss(n,m=m,s=5)\n\n# loss matrix and normalization\nM=ot.dist(x.reshape((n,1)),x.reshape((n,1)),'euclidean')\nM/=M.max()\nM2=ot.dist(x.reshape((n,1)),x.reshape((n,1)),'sqeuclidean')\nM2/=M2.max()\n#%% plot the distributions\n\npl.figure(1)\npl.subplot(2,1,1)\npl.plot(x,a,'b',label='Source distribution')\npl.title('Source distribution')\npl.subplot(2,1,2)\npl.plot(x,B,label='Target distributions')\npl.title('Target distributions')\n\n#%% Compute and plot distributions and loss matrix\n\nd_emd=ot.emd2(a,B,M) # direct computation of EMD\nd_emd2=ot.emd2(a,B,M2) # direct computation of EMD with loss M3\n\n\npl.figure(2)\npl.plot(d_emd,label='Euclidean EMD')\npl.plot(d_emd2,label='Squared Euclidean EMD')\npl.title('EMD distances')\npl.legend()\n\n#%%\nreg=1e-2\nd_sinkhorn=ot.sinkhorn(a,B,M,reg)\nd_sinkhorn2=ot.sinkhorn(a,B,M2,reg)\n\npl.figure(2)\npl.clf()\npl.plot(d_emd,label='Euclidean EMD')\npl.plot(d_emd2,label='Squared Euclidean EMD')\npl.plot(d_sinkhorn,'+',label='Euclidean Sinkhorn')\npl.plot(d_sinkhorn2,'+',label='Squared Euclidean Sinkhorn')\npl.title('EMD distances')\npl.legend()"
+ "# Author: Remi Flamary <remi.flamary@unice.fr>\n#\n# License: MIT License\n\nimport numpy as np\nimport matplotlib.pylab as pl\nimport ot\nfrom ot.datasets import get_1D_gauss as gauss"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% parameters\n\nn = 100 # nb bins\nn_target = 50 # nb target distributions\n\n\n# bin positions\nx = np.arange(n, dtype=np.float64)\n\nlst_m = np.linspace(20, 90, n_target)\n\n# Gaussian distributions\na = gauss(n, m=20, s=5) # m= mean, s= std\n\nB = np.zeros((n, n_target))\n\nfor i, m in enumerate(lst_m):\n B[:, i] = gauss(n, m=m, s=5)\n\n# loss matrix and normalization\nM = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)), 'euclidean')\nM /= M.max()\nM2 = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)), 'sqeuclidean')\nM2 /= M2.max()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot data\n---------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% plot the distributions\n\npl.figure(1)\npl.subplot(2, 1, 1)\npl.plot(x, a, 'b', label='Source distribution')\npl.title('Source distribution')\npl.subplot(2, 1, 2)\npl.plot(x, B, label='Target distributions')\npl.title('Target distributions')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Compute EMD for the different losses\n------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% Compute and plot distributions and loss matrix\n\nd_emd = ot.emd2(a, B, M) # direct computation of EMD\nd_emd2 = ot.emd2(a, B, M2) # direct computation of EMD with loss M2\n\n\npl.figure(2)\npl.plot(d_emd, label='Euclidean EMD')\npl.plot(d_emd2, label='Squared Euclidean EMD')\npl.title('EMD distances')\npl.legend()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Compute Sinkhorn for the different losses\n-----------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%%\nreg = 1e-2\nd_sinkhorn = ot.sinkhorn2(a, B, M, reg)\nd_sinkhorn2 = ot.sinkhorn2(a, B, M2, reg)\n\npl.figure(2)\npl.clf()\npl.plot(d_emd, label='Euclidean EMD')\npl.plot(d_emd2, label='Squared Euclidean EMD')\npl.plot(d_sinkhorn, '+', label='Euclidean Sinkhorn')\npl.plot(d_sinkhorn2, '+', label='Squared Euclidean Sinkhorn')\npl.title('EMD distances')\npl.legend()\n\npl.show()"
],
"outputs": [],
"metadata": {
diff --git a/docs/source/auto_examples/plot_compute_emd.py b/docs/source/auto_examples/plot_compute_emd.py
index c7063e8..73b42c3 100644
--- a/docs/source/auto_examples/plot_compute_emd.py
+++ b/docs/source/auto_examples/plot_compute_emd.py
@@ -1,74 +1,102 @@
# -*- coding: utf-8 -*-
"""
-====================
-1D optimal transport
-====================
+=================
+Plot multiple EMD
+=================
+
+Shows how to compute multiple EMD and Sinkhorn distances with two different
+ground metrics and plot their values for different distributions.
+
-@author: rflamary
"""
+# Author: Remi Flamary <remi.flamary@unice.fr>
+#
+# License: MIT License
+
import numpy as np
import matplotlib.pylab as pl
import ot
from ot.datasets import get_1D_gauss as gauss
+##############################################################################
+# Generate data
+# -------------
+
#%% parameters
-n=100 # nb bins
-n_target=50 # nb target distributions
+n = 100 # nb bins
+n_target = 50 # nb target distributions
# bin positions
-x=np.arange(n,dtype=np.float64)
+x = np.arange(n, dtype=np.float64)
-lst_m=np.linspace(20,90,n_target)
+lst_m = np.linspace(20, 90, n_target)
# Gaussian distributions
-a=gauss(n,m=20,s=5) # m= mean, s= std
+a = gauss(n, m=20, s=5) # m= mean, s= std
-B=np.zeros((n,n_target))
+B = np.zeros((n, n_target))
-for i,m in enumerate(lst_m):
- B[:,i]=gauss(n,m=m,s=5)
+for i, m in enumerate(lst_m):
+ B[:, i] = gauss(n, m=m, s=5)
# loss matrix and normalization
-M=ot.dist(x.reshape((n,1)),x.reshape((n,1)),'euclidean')
-M/=M.max()
-M2=ot.dist(x.reshape((n,1)),x.reshape((n,1)),'sqeuclidean')
-M2/=M2.max()
+M = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)), 'euclidean')
+M /= M.max()
+M2 = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)), 'sqeuclidean')
+M2 /= M2.max()
+
+##############################################################################
+# Plot data
+# ---------
+
#%% plot the distributions
pl.figure(1)
-pl.subplot(2,1,1)
-pl.plot(x,a,'b',label='Source distribution')
+pl.subplot(2, 1, 1)
+pl.plot(x, a, 'b', label='Source distribution')
pl.title('Source distribution')
-pl.subplot(2,1,2)
-pl.plot(x,B,label='Target distributions')
+pl.subplot(2, 1, 2)
+pl.plot(x, B, label='Target distributions')
pl.title('Target distributions')
+pl.tight_layout()
+
+
+##############################################################################
+# Compute EMD for the different losses
+# ------------------------------------
#%% Compute and plot distributions and loss matrix
-d_emd=ot.emd2(a,B,M) # direct computation of EMD
-d_emd2=ot.emd2(a,B,M2) # direct computation of EMD with loss M3
+d_emd = ot.emd2(a, B, M) # direct computation of EMD
+d_emd2 = ot.emd2(a, B, M2) # direct computation of EMD with loss M2
pl.figure(2)
-pl.plot(d_emd,label='Euclidean EMD')
-pl.plot(d_emd2,label='Squared Euclidean EMD')
+pl.plot(d_emd, label='Euclidean EMD')
+pl.plot(d_emd2, label='Squared Euclidean EMD')
pl.title('EMD distances')
pl.legend()
+##############################################################################
+# Compute Sinkhorn for the different losses
+# -----------------------------------------
+
#%%
-reg=1e-2
-d_sinkhorn=ot.sinkhorn(a,B,M,reg)
-d_sinkhorn2=ot.sinkhorn(a,B,M2,reg)
+reg = 1e-2
+d_sinkhorn = ot.sinkhorn2(a, B, M, reg)
+d_sinkhorn2 = ot.sinkhorn2(a, B, M2, reg)
pl.figure(2)
pl.clf()
-pl.plot(d_emd,label='Euclidean EMD')
-pl.plot(d_emd2,label='Squared Euclidean EMD')
-pl.plot(d_sinkhorn,'+',label='Euclidean Sinkhorn')
-pl.plot(d_sinkhorn2,'+',label='Squared Euclidean Sinkhorn')
+pl.plot(d_emd, label='Euclidean EMD')
+pl.plot(d_emd2, label='Squared Euclidean EMD')
+pl.plot(d_sinkhorn, '+', label='Euclidean Sinkhorn')
+pl.plot(d_sinkhorn2, '+', label='Squared Euclidean Sinkhorn')
pl.title('EMD distances')
-pl.legend() \ No newline at end of file
+pl.legend()
+
+pl.show()
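The two curve families plotted above can be cross-checked without POT: on a regular 1D grid with the Euclidean ground metric, the EMD reduces to the L1 distance between the cumulative distribution functions, and the Sinkhorn values come from a simple matrix-scaling loop on the Gibbs kernel exp(-M/reg). A minimal sketch under those assumptions (the helpers below are illustrative, not POT API):

```python
import numpy as np

def gauss_hist(n, m, s):
    """Normalized discretized Gaussian on bins 0..n-1 (illustrative helper)."""
    x = np.arange(n, dtype=np.float64)
    h = np.exp(-(x - m) ** 2 / (2 * s ** 2))
    return h / h.sum()

def emd_1d(a, b):
    """W1 on unit-spaced bins: the L1 distance between the CDFs."""
    return np.abs(np.cumsum(a) - np.cumsum(b)).sum()

def sinkhorn_plan(a, b, M, reg, n_iter=5000):
    """Bare-bones Sinkhorn: alternate diagonal scalings of K = exp(-M/reg)."""
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)   # scale to match the target marginal
        u = a / (K @ v)     # scale to match the source marginal
    return u[:, None] * K * v[None, :]   # transport plan

n = 100
x = np.arange(n, dtype=np.float64)
a = gauss_hist(n, m=20, s=5)
b = gauss_hist(n, m=40, s=5)

w1 = emd_1d(a, b)                    # ~= |40 - 20| = 20 for translated Gaussians
M = np.abs(x[:, None] - x[None, :])  # Euclidean ground metric
M /= M.max()                         # same normalization as in the example
G = sinkhorn_plan(a, b, M, reg=0.1)
sinkhorn_cost = (G * M).sum()        # entropic surrogate of the normalized EMD
```

The row marginals of `G` match `a` exactly after the final `u` update while the column marginals converge to `b`, and `sinkhorn_cost` sits slightly above the normalized EMD `w1 / M.max()`, which is the gap between the EMD and Sinkhorn curves in the figure.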
diff --git a/docs/source/auto_examples/plot_compute_emd.rst b/docs/source/auto_examples/plot_compute_emd.rst
index 4c7445b..ce79e20 100644
--- a/docs/source/auto_examples/plot_compute_emd.rst
+++ b/docs/source/auto_examples/plot_compute_emd.rst
@@ -3,101 +3,166 @@
.. _sphx_glr_auto_examples_plot_compute_emd.py:
-====================
-1D optimal transport
-====================
+=================
+Plot multiple EMD
+=================
-@author: rflamary
+Shows how to compute multiple EMD and Sinkhorn distances with two different
+ground metrics and plot their values for different distributions.
-.. rst-class:: sphx-glr-horizontal
+.. code-block:: python
- *
- .. image:: /auto_examples/images/sphx_glr_plot_compute_emd_001.png
- :scale: 47
+ # Author: Remi Flamary <remi.flamary@unice.fr>
+ #
+ # License: MIT License
- *
+ import numpy as np
+ import matplotlib.pylab as pl
+ import ot
+ from ot.datasets import get_1D_gauss as gauss
- .. image:: /auto_examples/images/sphx_glr_plot_compute_emd_002.png
- :scale: 47
-.. code-block:: python
- import numpy as np
- import matplotlib.pylab as pl
- import ot
- from ot.datasets import get_1D_gauss as gauss
+Generate data
+-------------
+
+
+
+.. code-block:: python
#%% parameters
- n=100 # nb bins
- n_target=50 # nb target distributions
+ n = 100 # nb bins
+ n_target = 50 # nb target distributions
# bin positions
- x=np.arange(n,dtype=np.float64)
+ x = np.arange(n, dtype=np.float64)
- lst_m=np.linspace(20,90,n_target)
+ lst_m = np.linspace(20, 90, n_target)
# Gaussian distributions
- a=gauss(n,m=20,s=5) # m= mean, s= std
+ a = gauss(n, m=20, s=5) # m= mean, s= std
- B=np.zeros((n,n_target))
+ B = np.zeros((n, n_target))
- for i,m in enumerate(lst_m):
- B[:,i]=gauss(n,m=m,s=5)
+ for i, m in enumerate(lst_m):
+ B[:, i] = gauss(n, m=m, s=5)
# loss matrix and normalization
- M=ot.dist(x.reshape((n,1)),x.reshape((n,1)),'euclidean')
- M/=M.max()
- M2=ot.dist(x.reshape((n,1)),x.reshape((n,1)),'sqeuclidean')
- M2/=M2.max()
+ M = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)), 'euclidean')
+ M /= M.max()
+ M2 = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)), 'sqeuclidean')
+ M2 /= M2.max()
+
+
+
+
+
+
+
+Plot data
+---------
+
+
+
+.. code-block:: python
+
+
#%% plot the distributions
pl.figure(1)
- pl.subplot(2,1,1)
- pl.plot(x,a,'b',label='Source distribution')
+ pl.subplot(2, 1, 1)
+ pl.plot(x, a, 'b', label='Source distribution')
pl.title('Source distribution')
- pl.subplot(2,1,2)
- pl.plot(x,B,label='Target distributions')
+ pl.subplot(2, 1, 2)
+ pl.plot(x, B, label='Target distributions')
pl.title('Target distributions')
+ pl.tight_layout()
+
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_compute_emd_001.png
+ :align: center
+
+
+
+
+Compute EMD for the different losses
+------------------------------------
+
+
+
+.. code-block:: python
+
#%% Compute and plot distributions and loss matrix
- d_emd=ot.emd2(a,B,M) # direct computation of EMD
- d_emd2=ot.emd2(a,B,M2) # direct computation of EMD with loss M3
+ d_emd = ot.emd2(a, B, M) # direct computation of EMD
+ d_emd2 = ot.emd2(a, B, M2) # direct computation of EMD with loss M2
pl.figure(2)
- pl.plot(d_emd,label='Euclidean EMD')
- pl.plot(d_emd2,label='Squared Euclidean EMD')
+ pl.plot(d_emd, label='Euclidean EMD')
+ pl.plot(d_emd2, label='Squared Euclidean EMD')
pl.title('EMD distances')
pl.legend()
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_compute_emd_003.png
+ :align: center
+
+
+
+
+Compute Sinkhorn for the different losses
+-----------------------------------------
+
+
+
+.. code-block:: python
+
+
#%%
- reg=1e-2
- d_sinkhorn=ot.sinkhorn(a,B,M,reg)
- d_sinkhorn2=ot.sinkhorn(a,B,M2,reg)
+ reg = 1e-2
+ d_sinkhorn = ot.sinkhorn2(a, B, M, reg)
+ d_sinkhorn2 = ot.sinkhorn2(a, B, M2, reg)
pl.figure(2)
pl.clf()
- pl.plot(d_emd,label='Euclidean EMD')
- pl.plot(d_emd2,label='Squared Euclidean EMD')
- pl.plot(d_sinkhorn,'+',label='Euclidean Sinkhorn')
- pl.plot(d_sinkhorn2,'+',label='Squared Euclidean Sinkhorn')
+ pl.plot(d_emd, label='Euclidean EMD')
+ pl.plot(d_emd2, label='Squared Euclidean EMD')
+ pl.plot(d_sinkhorn, '+', label='Euclidean Sinkhorn')
+ pl.plot(d_sinkhorn2, '+', label='Squared Euclidean Sinkhorn')
pl.title('EMD distances')
pl.legend()
-**Total running time of the script:** ( 0 minutes 0.521 seconds)
+
+ pl.show()
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_compute_emd_004.png
+ :align: center
+
+
+
+
+**Total running time of the script:** ( 0 minutes 0.441 seconds)
@@ -116,4 +181,4 @@
.. rst-class:: sphx-glr-signature
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_optim_OTreg.ipynb b/docs/source/auto_examples/plot_optim_OTreg.ipynb
index 5ded922..333331b 100644
--- a/docs/source/auto_examples/plot_optim_OTreg.ipynb
+++ b/docs/source/auto_examples/plot_optim_OTreg.ipynb
@@ -15,7 +15,7 @@
},
{
"source": [
- "\n# Regularized OT with generic solver\n\n\n\n\n"
+ "\n# Regularized OT with generic solver\n\n\nIllustrates the use of the generic solver for regularized OT with\nuser-designed regularization term. It uses Conditional gradient as in [6] and\ngeneralized Conditional Gradient as proposed in [5][7].\n\n\n[5] N. Courty; R. Flamary; D. Tuia; A. Rakotomamonjy, Optimal Transport for\nDomain Adaptation, in IEEE Transactions on Pattern Analysis and Machine\nIntelligence , vol.PP, no.99, pp.1-1.\n\n[6] Ferradans, S., Papadakis, N., Peyr\u00e9, G., & Aujol, J. F. (2014).\nRegularized discrete optimal transport. SIAM Journal on Imaging Sciences,\n7(3), 1853-1882.\n\n[7] Rakotomamonjy, A., Flamary, R., & Courty, N. (2015). Generalized\nconditional gradient: analysis of convergence and applications.\narXiv preprint arXiv:1510.06567.\n\n\n\n\n"
],
"cell_type": "markdown",
"metadata": {}
@@ -24,7 +24,97 @@
"execution_count": null,
"cell_type": "code",
"source": [
- "import numpy as np\nimport matplotlib.pylab as pl\nimport ot\n\n\n\n#%% parameters\n\nn=100 # nb bins\n\n# bin positions\nx=np.arange(n,dtype=np.float64)\n\n# Gaussian distributions\na=ot.datasets.get_1D_gauss(n,m=20,s=5) # m= mean, s= std\nb=ot.datasets.get_1D_gauss(n,m=60,s=10)\n\n# loss matrix\nM=ot.dist(x.reshape((n,1)),x.reshape((n,1)))\nM/=M.max()\n\n#%% EMD\n\nG0=ot.emd(a,b,M)\n\npl.figure(3)\not.plot.plot1D_mat(a,b,G0,'OT matrix G0')\n\n#%% Example with Frobenius norm regularization\n\ndef f(G): return 0.5*np.sum(G**2)\ndef df(G): return G\n\nreg=1e-1\n\nGl2=ot.optim.cg(a,b,M,reg,f,df,verbose=True)\n\npl.figure(3)\not.plot.plot1D_mat(a,b,Gl2,'OT matrix Frob. reg')\n\n#%% Example with entropic regularization\n\ndef f(G): return np.sum(G*np.log(G))\ndef df(G): return np.log(G)+1\n\nreg=1e-3\n\nGe=ot.optim.cg(a,b,M,reg,f,df,verbose=True)\n\npl.figure(4)\not.plot.plot1D_mat(a,b,Ge,'OT matrix Entrop. reg')\n\n#%% Example with Frobenius norm + entropic regularization with gcg\n\ndef f(G): return 0.5*np.sum(G**2)\ndef df(G): return G\n\nreg1=1e-3\nreg2=1e-1\n\nGel2=ot.optim.gcg(a,b,M,reg1,reg2,f,df,verbose=True)\n\npl.figure(5)\not.plot.plot1D_mat(a,b,Gel2,'OT entropic + matrix Frob. reg')\npl.show()"
+ "import numpy as np\nimport matplotlib.pylab as pl\nimport ot"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% parameters\n\nn = 100 # nb bins\n\n# bin positions\nx = np.arange(n, dtype=np.float64)\n\n# Gaussian distributions\na = ot.datasets.get_1D_gauss(n, m=20, s=5) # m= mean, s= std\nb = ot.datasets.get_1D_gauss(n, m=60, s=10)\n\n# loss matrix\nM = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)))\nM /= M.max()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Solve EMD\n---------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% EMD\n\nG0 = ot.emd(a, b, M)\n\npl.figure(3, figsize=(5, 5))\not.plot.plot1D_mat(a, b, G0, 'OT matrix G0')"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Solve EMD with Frobenius norm regularization\n--------------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% Example with Frobenius norm regularization\n\n\ndef f(G):\n return 0.5 * np.sum(G**2)\n\n\ndef df(G):\n return G\n\n\nreg = 1e-1\n\nGl2 = ot.optim.cg(a, b, M, reg, f, df, verbose=True)\n\npl.figure(3)\not.plot.plot1D_mat(a, b, Gl2, 'OT matrix Frob. reg')"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Solve EMD with entropic regularization\n--------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% Example with entropic regularization\n\n\ndef f(G):\n return np.sum(G * np.log(G))\n\n\ndef df(G):\n return np.log(G) + 1.\n\n\nreg = 1e-3\n\nGe = ot.optim.cg(a, b, M, reg, f, df, verbose=True)\n\npl.figure(4, figsize=(5, 5))\not.plot.plot1D_mat(a, b, Ge, 'OT matrix Entrop. reg')"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Solve EMD with Frobenius norm + entropic regularization\n-------------------------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "#%% Example with Frobenius norm + entropic regularization with gcg\n\n\ndef f(G):\n return 0.5 * np.sum(G**2)\n\n\ndef df(G):\n return G\n\n\nreg1 = 1e-3\nreg2 = 1e-1\n\nGel2 = ot.optim.gcg(a, b, M, reg1, reg2, f, df, verbose=True)\n\npl.figure(5, figsize=(5, 5))\not.plot.plot1D_mat(a, b, Gel2, 'OT entropic + matrix Frob. reg')\npl.show()"
],
"outputs": [],
"metadata": {
diff --git a/docs/source/auto_examples/plot_optim_OTreg.py b/docs/source/auto_examples/plot_optim_OTreg.py
index 8abb426..e1a737e 100644
--- a/docs/source/auto_examples/plot_optim_OTreg.py
+++ b/docs/source/auto_examples/plot_optim_OTreg.py
@@ -4,6 +4,24 @@
Regularized OT with generic solver
==================================
+Illustrates the use of the generic solver for regularized OT with a
+user-designed regularization term. It uses the conditional gradient as in [6]
+and the generalized conditional gradient as proposed in [5, 7].
+
+
+[5] N. Courty; R. Flamary; D. Tuia; A. Rakotomamonjy, Optimal Transport for
+Domain Adaptation, in IEEE Transactions on Pattern Analysis and Machine
+Intelligence, vol. PP, no. 99, pp. 1-1.
+
+[6] Ferradans, S., Papadakis, N., Peyré, G., & Aujol, J. F. (2014).
+Regularized discrete optimal transport. SIAM Journal on Imaging Sciences,
+7(3), 1853-1882.
+
+[7] Rakotomamonjy, A., Flamary, R., & Courty, N. (2015). Generalized
+conditional gradient: analysis of convergence and applications.
+arXiv preprint arXiv:1510.06567.
+
+
"""
@@ -12,63 +30,100 @@ import matplotlib.pylab as pl
import ot
+##############################################################################
+# Generate data
+# -------------
#%% parameters
-n=100 # nb bins
+n = 100 # nb bins
# bin positions
-x=np.arange(n,dtype=np.float64)
+x = np.arange(n, dtype=np.float64)
# Gaussian distributions
-a=ot.datasets.get_1D_gauss(n,m=20,s=5) # m= mean, s= std
-b=ot.datasets.get_1D_gauss(n,m=60,s=10)
+a = ot.datasets.get_1D_gauss(n, m=20, s=5) # m= mean, s= std
+b = ot.datasets.get_1D_gauss(n, m=60, s=10)
# loss matrix
-M=ot.dist(x.reshape((n,1)),x.reshape((n,1)))
-M/=M.max()
+M = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)))
+M /= M.max()
+
+##############################################################################
+# Solve EMD
+# ---------
#%% EMD
-G0=ot.emd(a,b,M)
+G0 = ot.emd(a, b, M)
-pl.figure(3)
-ot.plot.plot1D_mat(a,b,G0,'OT matrix G0')
+pl.figure(3, figsize=(5, 5))
+ot.plot.plot1D_mat(a, b, G0, 'OT matrix G0')
+
+##############################################################################
+# Solve EMD with Frobenius norm regularization
+# --------------------------------------------
#%% Example with Frobenius norm regularization
-def f(G): return 0.5*np.sum(G**2)
-def df(G): return G
-reg=1e-1
+def f(G):
+ return 0.5 * np.sum(G**2)
+
+
+def df(G):
+ return G
+
-Gl2=ot.optim.cg(a,b,M,reg,f,df,verbose=True)
+reg = 1e-1
+
+Gl2 = ot.optim.cg(a, b, M, reg, f, df, verbose=True)
pl.figure(3)
-ot.plot.plot1D_mat(a,b,Gl2,'OT matrix Frob. reg')
+ot.plot.plot1D_mat(a, b, Gl2, 'OT matrix Frob. reg')
+
+##############################################################################
+# Solve EMD with entropic regularization
+# --------------------------------------
#%% Example with entropic regularization
-def f(G): return np.sum(G*np.log(G))
-def df(G): return np.log(G)+1
-reg=1e-3
+def f(G):
+ return np.sum(G * np.log(G))
+
+
+def df(G):
+ return np.log(G) + 1.
+
+
+reg = 1e-3
-Ge=ot.optim.cg(a,b,M,reg,f,df,verbose=True)
+Ge = ot.optim.cg(a, b, M, reg, f, df, verbose=True)
-pl.figure(4)
-ot.plot.plot1D_mat(a,b,Ge,'OT matrix Entrop. reg')
+pl.figure(4, figsize=(5, 5))
+ot.plot.plot1D_mat(a, b, Ge, 'OT matrix Entrop. reg')
+
+##############################################################################
+# Solve EMD with Frobenius norm + entropic regularization
+# -------------------------------------------------------
#%% Example with Frobenius norm + entropic regularization with gcg
-def f(G): return 0.5*np.sum(G**2)
-def df(G): return G
-reg1=1e-3
-reg2=1e-1
+def f(G):
+ return 0.5 * np.sum(G**2)
+
+
+def df(G):
+ return G
+
+
+reg1 = 1e-3
+reg2 = 1e-1
-Gel2=ot.optim.gcg(a,b,M,reg1,reg2,f,df,verbose=True)
+Gel2 = ot.optim.gcg(a, b, M, reg1, reg2, f, df, verbose=True)
-pl.figure(5)
-ot.plot.plot1D_mat(a,b,Gel2,'OT entropic + matrix Frob. reg')
-pl.show() \ No newline at end of file
+pl.figure(5, figsize=(5, 5))
+ot.plot.plot1D_mat(a, b, Gel2, 'OT entropic + matrix Frob. reg')
+pl.show()
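The generalized conditional gradient used for the last figure can be sketched in plain NumPy: each outer iteration linearizes the smooth regularizer f around the current plan, folds its gradient into the ground cost, and solves the resulting entropic OT subproblem with Sinkhorn, which is the scheme of [7]. This is only a sketch: the inner Sinkhorn is hand-rolled, the entropic weight is larger than in the example (reg1=5e-2 instead of 1e-3, to keep the Gibbs kernel well conditioned), and a fixed step replaces the line search that `ot.optim.gcg` performs:

```python
import numpy as np

def sinkhorn_plan(a, b, M, reg, n_iter=500):
    """Inner entropic OT solver: diagonal scalings of the Gibbs kernel."""
    K = np.exp(-(M - M.min()) / reg)    # shifting M improves numerical range
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

def gcg_sketch(a, b, M, reg1, reg2, f, df, n_outer=10, step=0.5):
    """Generalized conditional gradient with a fixed step in place of a line
    search (f is kept only for signature parity with ot.optim.gcg; only its
    gradient df enters this sketch)."""
    G = a[:, None] * b[None, :]             # independent coupling as init
    for _ in range(n_outer):
        Mi = M + reg2 * df(G)               # linearize the smooth term at G
        Gc = sinkhorn_plan(a, b, Mi, reg1)  # entropic OT subproblem
        G = G + step * (Gc - G)             # convex update stays in the polytope
    return G

def f(G):
    return 0.5 * np.sum(G ** 2)             # Frobenius-norm regularizer

def df(G):
    return G

n = 100
x = np.arange(n, dtype=np.float64)
a = np.exp(-(x - 20) ** 2 / (2 * 5 ** 2))
a /= a.sum()
b = np.exp(-(x - 60) ** 2 / (2 * 10 ** 2))
b /= b.sum()
M = (x[:, None] - x[None, :]) ** 2          # squared Euclidean ground cost
M /= M.max()

Gel2 = gcg_sketch(a, b, M, reg1=5e-2, reg2=1e-1, f=f, df=df)
```

Because every iterate is a convex combination of transport plans, `Gel2` keeps (approximately) the prescribed marginals throughout, which is the property that distinguishes this scheme from a plain projected gradient on the coupling.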
diff --git a/docs/source/auto_examples/plot_optim_OTreg.rst b/docs/source/auto_examples/plot_optim_OTreg.rst
index 70cd26c..f628024 100644
--- a/docs/source/auto_examples/plot_optim_OTreg.rst
+++ b/docs/source/auto_examples/plot_optim_OTreg.rst
@@ -7,28 +7,126 @@
Regularized OT with generic solver
==================================
+Illustrates the use of the generic solver for regularized OT with a
+user-designed regularization term. It uses the conditional gradient as in [6]
+and the generalized conditional gradient as proposed in [5, 7].
+[5] N. Courty; R. Flamary; D. Tuia; A. Rakotomamonjy, Optimal Transport for
+Domain Adaptation, in IEEE Transactions on Pattern Analysis and Machine
+Intelligence, vol. PP, no. 99, pp. 1-1.
+[6] Ferradans, S., Papadakis, N., Peyré, G., & Aujol, J. F. (2014).
+Regularized discrete optimal transport. SIAM Journal on Imaging Sciences,
+7(3), 1853-1882.
+[7] Rakotomamonjy, A., Flamary, R., & Courty, N. (2015). Generalized
+conditional gradient: analysis of convergence and applications.
+arXiv preprint arXiv:1510.06567.
-.. rst-class:: sphx-glr-horizontal
- *
- .. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_003.png
- :scale: 47
- *
- .. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_004.png
- :scale: 47
+.. code-block:: python
+
+
+ import numpy as np
+ import matplotlib.pylab as pl
+ import ot
+
+
+
+
+
+
+
+
+Generate data
+-------------
+
+
+
+.. code-block:: python
+
+
+ #%% parameters
+
+ n = 100 # nb bins
+
+ # bin positions
+ x = np.arange(n, dtype=np.float64)
+
+ # Gaussian distributions
+ a = ot.datasets.get_1D_gauss(n, m=20, s=5) # m= mean, s= std
+ b = ot.datasets.get_1D_gauss(n, m=60, s=10)
+
+ # loss matrix
+ M = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)))
+ M /= M.max()
+
+
+
+
+
+
+
+Solve EMD
+---------
+
+
+
+.. code-block:: python
+
+
+ #%% EMD
+
+ G0 = ot.emd(a, b, M)
+
+ pl.figure(3, figsize=(5, 5))
+ ot.plot.plot1D_mat(a, b, G0, 'OT matrix G0')
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_003.png
+ :align: center
+
+
+
+
+Solve EMD with Frobenius norm regularization
+--------------------------------------------
+
+
+
+.. code-block:: python
+
+
+ #%% Example with Frobenius norm regularization
+
+
+ def f(G):
+ return 0.5 * np.sum(G**2)
+
+
+ def df(G):
+ return G
+
+
+ reg = 1e-1
+
+ Gl2 = ot.optim.cg(a, b, M, reg, f, df, verbose=True)
+
+ pl.figure(3)
+ ot.plot.plot1D_mat(a, b, Gl2, 'OT matrix Frob. reg')
- *
- .. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_005.png
- :scale: 47
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_004.png
+ :align: center
.. rst-class:: sphx-glr-script-out
@@ -258,6 +356,45 @@ Regularized OT with generic solver
It. |Loss |Delta loss
--------------------------------
200|1.663543e-01|-8.737134e-08
+
+
+Solve EMD with entropic regularization
+--------------------------------------
+
+
+
+.. code-block:: python
+
+
+ #%% Example with entropic regularization
+
+
+ def f(G):
+ return np.sum(G * np.log(G))
+
+
+ def df(G):
+ return np.log(G) + 1.
+
+
+ reg = 1e-3
+
+ Ge = ot.optim.cg(a, b, M, reg, f, df, verbose=True)
+
+ pl.figure(4, figsize=(5, 5))
+ ot.plot.plot1D_mat(a, b, Ge, 'OT matrix Entrop. reg')
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_006.png
+ :align: center
+
+
+.. rst-class:: sphx-glr-script-out
+
+ Out::
+
It. |Loss |Delta loss
--------------------------------
0|1.692289e-01|0.000000e+00
@@ -481,89 +618,56 @@ Regularized OT with generic solver
It. |Loss |Delta loss
--------------------------------
200|1.607143e-01|-2.151971e-10
- It. |Loss |Delta loss
- --------------------------------
- 0|1.693084e-01|0.000000e+00
- 1|1.610121e-01|-5.152589e-02
- 2|1.609378e-01|-4.622297e-04
- 3|1.609284e-01|-5.830043e-05
- 4|1.609284e-01|-1.111580e-12
-
+Solve EMD with Frobenius norm + entropic regularization
+-------------------------------------------------------
-|
.. code-block:: python
- import numpy as np
- import matplotlib.pylab as pl
- import ot
-
-
-
- #%% parameters
-
- n=100 # nb bins
-
- # bin positions
- x=np.arange(n,dtype=np.float64)
-
- # Gaussian distributions
- a=ot.datasets.get_1D_gauss(n,m=20,s=5) # m= mean, s= std
- b=ot.datasets.get_1D_gauss(n,m=60,s=10)
-
- # loss matrix
- M=ot.dist(x.reshape((n,1)),x.reshape((n,1)))
- M/=M.max()
-
- #%% EMD
+ #%% Example with Frobenius norm + entropic regularization with gcg
- G0=ot.emd(a,b,M)
- pl.figure(3)
- ot.plot.plot1D_mat(a,b,G0,'OT matrix G0')
+ def f(G):
+ return 0.5 * np.sum(G**2)
- #%% Example with Frobenius norm regularization
- def f(G): return 0.5*np.sum(G**2)
- def df(G): return G
+ def df(G):
+ return G
- reg=1e-1
- Gl2=ot.optim.cg(a,b,M,reg,f,df,verbose=True)
+ reg1 = 1e-3
+ reg2 = 1e-1
- pl.figure(3)
- ot.plot.plot1D_mat(a,b,Gl2,'OT matrix Frob. reg')
+ Gel2 = ot.optim.gcg(a, b, M, reg1, reg2, f, df, verbose=True)
- #%% Example with entropic regularization
+ pl.figure(5, figsize=(5, 5))
+ ot.plot.plot1D_mat(a, b, Gel2, 'OT entropic + matrix Frob. reg')
+ pl.show()
- def f(G): return np.sum(G*np.log(G))
- def df(G): return np.log(G)+1
- reg=1e-3
- Ge=ot.optim.cg(a,b,M,reg,f,df,verbose=True)
+.. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_008.png
+ :align: center
- pl.figure(4)
- ot.plot.plot1D_mat(a,b,Ge,'OT matrix Entrop. reg')
- #%% Example with Frobenius norm + entropic regularization with gcg
+.. rst-class:: sphx-glr-script-out
- def f(G): return 0.5*np.sum(G**2)
- def df(G): return G
+ Out::
- reg1=1e-3
- reg2=1e-1
+ It. |Loss |Delta loss
+ --------------------------------
+ 0|1.693084e-01|0.000000e+00
+ 1|1.610121e-01|-5.152589e-02
+ 2|1.609378e-01|-4.622297e-04
+ 3|1.609284e-01|-5.830043e-05
+ 4|1.609284e-01|-1.111407e-12
- Gel2=ot.optim.gcg(a,b,M,reg1,reg2,f,df,verbose=True)
- pl.figure(5)
- ot.plot.plot1D_mat(a,b,Gel2,'OT entropic + matrix Frob. reg')
- pl.show()
-**Total running time of the script:** ( 0 minutes 2.319 seconds)
+**Total running time of the script:** ( 0 minutes 1.809 seconds)
@@ -582,4 +686,4 @@ Regularized OT with generic solver
.. rst-class:: sphx-glr-signature
- `Generated by Sphinx-Gallery <http://sphinx-gallery.readthedocs.io>`_
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_otda_classes.ipynb b/docs/source/auto_examples/plot_otda_classes.ipynb
new file mode 100644
index 0000000..6754fa5
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_classes.ipynb
@@ -0,0 +1,126 @@
+{
+ "nbformat_minor": 0,
+ "nbformat": 4,
+ "cells": [
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "%matplotlib inline"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "\n# OT for domain adaptation\n\n\nThis example introduces a domain adaptation in a 2D setting and the 4 OTDA\napproaches currently supported in POT.\n\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# Authors: Remi Flamary <remi.flamary@unice.fr>\n# Stanislas Chambon <stan.chambon@gmail.com>\n#\n# License: MIT License\n\nimport matplotlib.pylab as pl\nimport ot"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "n_source_samples = 150\nn_target_samples = 150\n\nXs, ys = ot.datasets.get_data_classif('3gauss', n_source_samples)\nXt, yt = ot.datasets.get_data_classif('3gauss2', n_target_samples)"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Instantiate the different transport algorithms and fit them\n-----------------------------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# EMD Transport\not_emd = ot.da.EMDTransport()\not_emd.fit(Xs=Xs, Xt=Xt)\n\n# Sinkhorn Transport\not_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)\not_sinkhorn.fit(Xs=Xs, Xt=Xt)\n\n# Sinkhorn Transport with Group lasso regularization\not_lpl1 = ot.da.SinkhornLpl1Transport(reg_e=1e-1, reg_cl=1e0)\not_lpl1.fit(Xs=Xs, ys=ys, Xt=Xt)\n\n# Sinkhorn Transport with Group lasso regularization l1l2\not_l1l2 = ot.da.SinkhornL1l2Transport(reg_e=1e-1, reg_cl=2e0, max_iter=20,\n verbose=True)\not_l1l2.fit(Xs=Xs, ys=ys, Xt=Xt)\n\n# transport source samples onto target samples\ntransp_Xs_emd = ot_emd.transform(Xs=Xs)\ntransp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=Xs)\ntransp_Xs_lpl1 = ot_lpl1.transform(Xs=Xs)\ntransp_Xs_l1l2 = ot_l1l2.transform(Xs=Xs)"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Fig 1 : plots source and target samples\n---------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(1, figsize=(10, 5))\npl.subplot(1, 2, 1)\npl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')\npl.xticks([])\npl.yticks([])\npl.legend(loc=0)\npl.title('Source samples')\n\npl.subplot(1, 2, 2)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')\npl.xticks([])\npl.yticks([])\npl.legend(loc=0)\npl.title('Target samples')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Fig 2 : plot optimal couplings and transported samples\n------------------------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "param_img = {'interpolation': 'nearest', 'cmap': 'spectral'}\n\npl.figure(2, figsize=(15, 8))\npl.subplot(2, 4, 1)\npl.imshow(ot_emd.coupling_, **param_img)\npl.xticks([])\npl.yticks([])\npl.title('Optimal coupling\\nEMDTransport')\n\npl.subplot(2, 4, 2)\npl.imshow(ot_sinkhorn.coupling_, **param_img)\npl.xticks([])\npl.yticks([])\npl.title('Optimal coupling\\nSinkhornTransport')\n\npl.subplot(2, 4, 3)\npl.imshow(ot_lpl1.coupling_, **param_img)\npl.xticks([])\npl.yticks([])\npl.title('Optimal coupling\\nSinkhornLpl1Transport')\n\npl.subplot(2, 4, 4)\npl.imshow(ot_l1l2.coupling_, **param_img)\npl.xticks([])\npl.yticks([])\npl.title('Optimal coupling\\nSinkhornL1l2Transport')\n\npl.subplot(2, 4, 5)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=0.3)\npl.scatter(transp_Xs_emd[:, 0], transp_Xs_emd[:, 1], c=ys,\n marker='+', label='Transp samples', s=30)\npl.xticks([])\npl.yticks([])\npl.title('Transported samples\\nEmdTransport')\npl.legend(loc=\"lower left\")\n\npl.subplot(2, 4, 6)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=0.3)\npl.scatter(transp_Xs_sinkhorn[:, 0], transp_Xs_sinkhorn[:, 1], c=ys,\n marker='+', label='Transp samples', s=30)\npl.xticks([])\npl.yticks([])\npl.title('Transported samples\\nSinkhornTransport')\n\npl.subplot(2, 4, 7)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=0.3)\npl.scatter(transp_Xs_lpl1[:, 0], transp_Xs_lpl1[:, 1], c=ys,\n marker='+', label='Transp samples', s=30)\npl.xticks([])\npl.yticks([])\npl.title('Transported samples\\nSinkhornLpl1Transport')\n\npl.subplot(2, 4, 8)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=0.3)\npl.scatter(transp_Xs_l1l2[:, 0], transp_Xs_l1l2[:, 1], c=ys,\n marker='+', label='Transp samples', s=30)\npl.xticks([])\npl.yticks([])\npl.title('Transported samples\\nSinkhornL1l2Transport')\npl.tight_layout()\n\npl.show()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 2",
+ "name": "python2",
+ "language": "python"
+ },
+ "language_info": {
+ "mimetype": "text/x-python",
+ "nbconvert_exporter": "python",
+ "name": "python",
+ "file_extension": ".py",
+ "version": "2.7.12",
+ "pygments_lexer": "ipython2",
+ "codemirror_mode": {
+ "version": 2,
+ "name": "ipython"
+ }
+ }
+ }
+} \ No newline at end of file
diff --git a/docs/source/auto_examples/plot_otda_classes.py b/docs/source/auto_examples/plot_otda_classes.py
new file mode 100644
index 0000000..b14c11a
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_classes.py
@@ -0,0 +1,150 @@
+# -*- coding: utf-8 -*-
+"""
+========================
+OT for domain adaptation
+========================
+
+This example introduces a domain adaptation problem in a 2D setting and
+the four OTDA approaches currently supported in POT.
+
+"""
+
+# Authors: Remi Flamary <remi.flamary@unice.fr>
+# Stanislas Chambon <stan.chambon@gmail.com>
+#
+# License: MIT License
+
+import matplotlib.pylab as pl
+import ot
+
+
+##############################################################################
+# Generate data
+# -------------
+
+n_source_samples = 150
+n_target_samples = 150
+
+Xs, ys = ot.datasets.get_data_classif('3gauss', n_source_samples)
+Xt, yt = ot.datasets.get_data_classif('3gauss2', n_target_samples)
+
+
+##############################################################################
+# Instantiate the different transport algorithms and fit them
+# -----------------------------------------------------------
+
+# EMD Transport
+ot_emd = ot.da.EMDTransport()
+ot_emd.fit(Xs=Xs, Xt=Xt)
+
+# Sinkhorn Transport
+ot_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)
+ot_sinkhorn.fit(Xs=Xs, Xt=Xt)
+
+# Sinkhorn Transport with Group lasso regularization
+ot_lpl1 = ot.da.SinkhornLpl1Transport(reg_e=1e-1, reg_cl=1e0)
+ot_lpl1.fit(Xs=Xs, ys=ys, Xt=Xt)
+
+# Sinkhorn Transport with Group lasso regularization l1l2
+ot_l1l2 = ot.da.SinkhornL1l2Transport(reg_e=1e-1, reg_cl=2e0, max_iter=20,
+ verbose=True)
+ot_l1l2.fit(Xs=Xs, ys=ys, Xt=Xt)
+
+# transport source samples onto target samples
+transp_Xs_emd = ot_emd.transform(Xs=Xs)
+transp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=Xs)
+transp_Xs_lpl1 = ot_lpl1.transform(Xs=Xs)
+transp_Xs_l1l2 = ot_l1l2.transform(Xs=Xs)
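The `transform` calls above perform a barycentric mapping: each source sample is sent to the average of the target samples, weighted by its row of the fitted coupling matrix. A minimal NumPy sketch of this mapping (illustrative only; `G`, `Xt` and `barycentric_map` are names chosen here, not part of the POT API):

```python
import numpy as np


def barycentric_map(G, Xt):
    """Map source samples onto the target domain via a coupling G (ns x nt).

    Each transported source point is a convex combination of the target
    points, with weights given by the row-normalized coupling.
    """
    weights = G / G.sum(axis=1, keepdims=True)  # each row now sums to 1
    return weights @ Xt


# Toy coupling: 3 source samples, 4 target samples in 2D
rng = np.random.RandomState(0)
G = rng.rand(3, 4)
Xt = rng.rand(4, 2)

transp_Xs = barycentric_map(G, Xt)
assert transp_Xs.shape == (3, 2)
```

Because each transported point is a convex combination, it always lies inside the per-coordinate range of the target samples.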
+
+
+##############################################################################
+# Fig 1 : plots source and target samples
+# ---------------------------------------
+
+pl.figure(1, figsize=(10, 5))
+pl.subplot(1, 2, 1)
+pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+pl.xticks([])
+pl.yticks([])
+pl.legend(loc=0)
+pl.title('Source samples')
+
+pl.subplot(1, 2, 2)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+pl.xticks([])
+pl.yticks([])
+pl.legend(loc=0)
+pl.title('Target samples')
+pl.tight_layout()
+
+
+##############################################################################
+# Fig 2 : plot optimal couplings and transported samples
+# ------------------------------------------------------
+
+param_img = {'interpolation': 'nearest', 'cmap': 'Spectral'}
+
+pl.figure(2, figsize=(15, 8))
+pl.subplot(2, 4, 1)
+pl.imshow(ot_emd.coupling_, **param_img)
+pl.xticks([])
+pl.yticks([])
+pl.title('Optimal coupling\nEMDTransport')
+
+pl.subplot(2, 4, 2)
+pl.imshow(ot_sinkhorn.coupling_, **param_img)
+pl.xticks([])
+pl.yticks([])
+pl.title('Optimal coupling\nSinkhornTransport')
+
+pl.subplot(2, 4, 3)
+pl.imshow(ot_lpl1.coupling_, **param_img)
+pl.xticks([])
+pl.yticks([])
+pl.title('Optimal coupling\nSinkhornLpl1Transport')
+
+pl.subplot(2, 4, 4)
+pl.imshow(ot_l1l2.coupling_, **param_img)
+pl.xticks([])
+pl.yticks([])
+pl.title('Optimal coupling\nSinkhornL1l2Transport')
+
+pl.subplot(2, 4, 5)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.3)
+pl.scatter(transp_Xs_emd[:, 0], transp_Xs_emd[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+pl.xticks([])
+pl.yticks([])
+pl.title('Transported samples\nEMDTransport')
+pl.legend(loc="lower left")
+
+pl.subplot(2, 4, 6)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.3)
+pl.scatter(transp_Xs_sinkhorn[:, 0], transp_Xs_sinkhorn[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+pl.xticks([])
+pl.yticks([])
+pl.title('Transported samples\nSinkhornTransport')
+
+pl.subplot(2, 4, 7)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.3)
+pl.scatter(transp_Xs_lpl1[:, 0], transp_Xs_lpl1[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+pl.xticks([])
+pl.yticks([])
+pl.title('Transported samples\nSinkhornLpl1Transport')
+
+pl.subplot(2, 4, 8)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.3)
+pl.scatter(transp_Xs_l1l2[:, 0], transp_Xs_l1l2[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+pl.xticks([])
+pl.yticks([])
+pl.title('Transported samples\nSinkhornL1l2Transport')
+pl.tight_layout()
+
+pl.show()
diff --git a/docs/source/auto_examples/plot_otda_classes.rst b/docs/source/auto_examples/plot_otda_classes.rst
new file mode 100644
index 0000000..f19a99f
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_classes.rst
@@ -0,0 +1,258 @@
+
+
+.. _sphx_glr_auto_examples_plot_otda_classes.py:
+
+
+========================
+OT for domain adaptation
+========================
+
+This example introduces a domain adaptation problem in a 2D setting and
+the four OTDA approaches currently supported in POT.
+
+
+
+
+.. code-block:: python
+
+
+ # Authors: Remi Flamary <remi.flamary@unice.fr>
+ # Stanislas Chambon <stan.chambon@gmail.com>
+ #
+ # License: MIT License
+
+ import matplotlib.pylab as pl
+ import ot
+
+
+
+
+
+
+
+
+Generate data
+-------------
+
+
+
+.. code-block:: python
+
+
+ n_source_samples = 150
+ n_target_samples = 150
+
+ Xs, ys = ot.datasets.get_data_classif('3gauss', n_source_samples)
+ Xt, yt = ot.datasets.get_data_classif('3gauss2', n_target_samples)
+
+
+
+
+
+
+
+
+Instantiate the different transport algorithms and fit them
+-----------------------------------------------------------
+
+
+
+.. code-block:: python
+
+
+ # EMD Transport
+ ot_emd = ot.da.EMDTransport()
+ ot_emd.fit(Xs=Xs, Xt=Xt)
+
+ # Sinkhorn Transport
+ ot_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)
+ ot_sinkhorn.fit(Xs=Xs, Xt=Xt)
+
+ # Sinkhorn Transport with Group lasso regularization
+ ot_lpl1 = ot.da.SinkhornLpl1Transport(reg_e=1e-1, reg_cl=1e0)
+ ot_lpl1.fit(Xs=Xs, ys=ys, Xt=Xt)
+
+ # Sinkhorn Transport with Group lasso regularization l1l2
+ ot_l1l2 = ot.da.SinkhornL1l2Transport(reg_e=1e-1, reg_cl=2e0, max_iter=20,
+ verbose=True)
+ ot_l1l2.fit(Xs=Xs, ys=ys, Xt=Xt)
+
+ # transport source samples onto target samples
+ transp_Xs_emd = ot_emd.transform(Xs=Xs)
+ transp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=Xs)
+ transp_Xs_lpl1 = ot_lpl1.transform(Xs=Xs)
+ transp_Xs_l1l2 = ot_l1l2.transform(Xs=Xs)
+
+
+
+
+
+
+.. rst-class:: sphx-glr-script-out
+
+ Out::
+
+ It. |Loss |Delta loss
+ --------------------------------
+ 0|9.552437e+00|0.000000e+00
+ 1|1.921833e+00|-3.970483e+00
+ 2|1.671022e+00|-1.500942e-01
+ 3|1.615147e+00|-3.459458e-02
+ 4|1.594289e+00|-1.308252e-02
+ 5|1.587287e+00|-4.411254e-03
+ 6|1.581665e+00|-3.554702e-03
+ 7|1.577022e+00|-2.943809e-03
+ 8|1.573870e+00|-2.002870e-03
+ 9|1.571645e+00|-1.415696e-03
+ 10|1.569342e+00|-1.467590e-03
+ 11|1.567863e+00|-9.432233e-04
+ 12|1.566558e+00|-8.329769e-04
+ 13|1.565414e+00|-7.311320e-04
+ 14|1.564425e+00|-6.319985e-04
+ 15|1.563955e+00|-3.007604e-04
+ 16|1.563658e+00|-1.894627e-04
+ 17|1.562886e+00|-4.941143e-04
+ 18|1.562578e+00|-1.974031e-04
+ 19|1.562445e+00|-8.468825e-05
+ It. |Loss |Delta loss
+ --------------------------------
+ 20|1.562007e+00|-2.805136e-04
+
+
+Fig 1 : plots source and target samples
+---------------------------------------
+
+
+
+.. code-block:: python
+
+
+ pl.figure(1, figsize=(10, 5))
+ pl.subplot(1, 2, 1)
+ pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+ pl.xticks([])
+ pl.yticks([])
+ pl.legend(loc=0)
+ pl.title('Source samples')
+
+ pl.subplot(1, 2, 2)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+ pl.xticks([])
+ pl.yticks([])
+ pl.legend(loc=0)
+ pl.title('Target samples')
+ pl.tight_layout()
+
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_classes_001.png
+ :align: center
+
+
+
+
+Fig 2 : plot optimal couplings and transported samples
+------------------------------------------------------
+
+
+
+.. code-block:: python
+
+
+ param_img = {'interpolation': 'nearest', 'cmap': 'Spectral'}
+
+ pl.figure(2, figsize=(15, 8))
+ pl.subplot(2, 4, 1)
+ pl.imshow(ot_emd.coupling_, **param_img)
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Optimal coupling\nEMDTransport')
+
+ pl.subplot(2, 4, 2)
+ pl.imshow(ot_sinkhorn.coupling_, **param_img)
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Optimal coupling\nSinkhornTransport')
+
+ pl.subplot(2, 4, 3)
+ pl.imshow(ot_lpl1.coupling_, **param_img)
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Optimal coupling\nSinkhornLpl1Transport')
+
+ pl.subplot(2, 4, 4)
+ pl.imshow(ot_l1l2.coupling_, **param_img)
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Optimal coupling\nSinkhornL1l2Transport')
+
+ pl.subplot(2, 4, 5)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.3)
+ pl.scatter(transp_Xs_emd[:, 0], transp_Xs_emd[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Transported samples\nEMDTransport')
+ pl.legend(loc="lower left")
+
+ pl.subplot(2, 4, 6)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.3)
+ pl.scatter(transp_Xs_sinkhorn[:, 0], transp_Xs_sinkhorn[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Transported samples\nSinkhornTransport')
+
+ pl.subplot(2, 4, 7)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.3)
+ pl.scatter(transp_Xs_lpl1[:, 0], transp_Xs_lpl1[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Transported samples\nSinkhornLpl1Transport')
+
+ pl.subplot(2, 4, 8)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.3)
+ pl.scatter(transp_Xs_l1l2[:, 0], transp_Xs_l1l2[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Transported samples\nSinkhornL1l2Transport')
+ pl.tight_layout()
+
+ pl.show()
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_classes_003.png
+ :align: center
+
+
+
+
+**Total running time of the script:** ( 0 minutes 1.596 seconds)
+
+
+
+.. container:: sphx-glr-footer
+
+
+ .. container:: sphx-glr-download
+
+ :download:`Download Python source code: plot_otda_classes.py <plot_otda_classes.py>`
+
+
+
+ .. container:: sphx-glr-download
+
+ :download:`Download Jupyter notebook: plot_otda_classes.ipynb <plot_otda_classes.ipynb>`
+
+.. rst-class:: sphx-glr-signature
+
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_otda_color_images.ipynb b/docs/source/auto_examples/plot_otda_color_images.ipynb
new file mode 100644
index 0000000..2daf406
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_color_images.ipynb
@@ -0,0 +1,144 @@
+{
+ "nbformat_minor": 0,
+ "nbformat": 4,
+ "cells": [
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "%matplotlib inline"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "\n# OT for image color adaptation\n\n\nThis example presents a way of transferring colors between two images\nwith Optimal Transport, as introduced in [6].\n\n[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014).\nRegularized discrete optimal transport.\nSIAM Journal on Imaging Sciences, 7(3), 1853-1882.\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# Authors: Remi Flamary <remi.flamary@unice.fr>\n# Stanislas Chambon <stan.chambon@gmail.com>\n#\n# License: MIT License\n\nimport numpy as np\nfrom scipy import ndimage\nimport matplotlib.pylab as pl\nimport ot\n\n\nr = np.random.RandomState(42)\n\n\ndef im2mat(I):\n \"\"\"Converts an image to a matrix (one pixel per line)\"\"\"\n return I.reshape((I.shape[0] * I.shape[1], I.shape[2]))\n\n\ndef mat2im(X, shape):\n \"\"\"Converts a matrix back to an image\"\"\"\n return X.reshape(shape)\n\n\ndef minmax(I):\n return np.clip(I, 0, 1)"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# Loading images\nI1 = ndimage.imread('../data/ocean_day.jpg').astype(np.float64) / 256\nI2 = ndimage.imread('../data/ocean_sunset.jpg').astype(np.float64) / 256\n\nX1 = im2mat(I1)\nX2 = im2mat(I2)\n\n# training samples\nnb = 1000\nidx1 = r.randint(X1.shape[0], size=(nb,))\nidx2 = r.randint(X2.shape[0], size=(nb,))\n\nXs = X1[idx1, :]\nXt = X2[idx2, :]"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot original image\n-------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(1, figsize=(6.4, 3))\n\npl.subplot(1, 2, 1)\npl.imshow(I1)\npl.axis('off')\npl.title('Image 1')\n\npl.subplot(1, 2, 2)\npl.imshow(I2)\npl.axis('off')\npl.title('Image 2')"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Scatter plot of colors\n----------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(2, figsize=(6.4, 3))\n\npl.subplot(1, 2, 1)\npl.scatter(Xs[:, 0], Xs[:, 2], c=Xs)\npl.axis([0, 1, 0, 1])\npl.xlabel('Red')\npl.ylabel('Blue')\npl.title('Image 1')\n\npl.subplot(1, 2, 2)\npl.scatter(Xt[:, 0], Xt[:, 2], c=Xt)\npl.axis([0, 1, 0, 1])\npl.xlabel('Red')\npl.ylabel('Blue')\npl.title('Image 2')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Instantiate the different transport algorithms and fit them\n-----------------------------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# EMDTransport\not_emd = ot.da.EMDTransport()\not_emd.fit(Xs=Xs, Xt=Xt)\n\n# SinkhornTransport\not_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)\not_sinkhorn.fit(Xs=Xs, Xt=Xt)\n\n# prediction between images (using out of sample prediction as in [6])\ntransp_Xs_emd = ot_emd.transform(Xs=X1)\ntransp_Xt_emd = ot_emd.inverse_transform(Xt=X2)\n\ntransp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=X1)\ntransp_Xt_sinkhorn = ot_sinkhorn.inverse_transform(Xt=X2)\n\nI1t = minmax(mat2im(transp_Xs_emd, I1.shape))\nI2t = minmax(mat2im(transp_Xt_emd, I2.shape))\n\nI1te = minmax(mat2im(transp_Xs_sinkhorn, I1.shape))\nI2te = minmax(mat2im(transp_Xt_sinkhorn, I2.shape))"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot new images\n---------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(3, figsize=(8, 4))\n\npl.subplot(2, 3, 1)\npl.imshow(I1)\npl.axis('off')\npl.title('Image 1')\n\npl.subplot(2, 3, 2)\npl.imshow(I1t)\npl.axis('off')\npl.title('Image 1 Adapt')\n\npl.subplot(2, 3, 3)\npl.imshow(I1te)\npl.axis('off')\npl.title('Image 1 Adapt (reg)')\n\npl.subplot(2, 3, 4)\npl.imshow(I2)\npl.axis('off')\npl.title('Image 2')\n\npl.subplot(2, 3, 5)\npl.imshow(I2t)\npl.axis('off')\npl.title('Image 2 Adapt')\n\npl.subplot(2, 3, 6)\npl.imshow(I2te)\npl.axis('off')\npl.title('Image 2 Adapt (reg)')\npl.tight_layout()\n\npl.show()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 2",
+ "name": "python2",
+ "language": "python"
+ },
+ "language_info": {
+ "mimetype": "text/x-python",
+ "nbconvert_exporter": "python",
+ "name": "python",
+ "file_extension": ".py",
+ "version": "2.7.12",
+ "pygments_lexer": "ipython2",
+ "codemirror_mode": {
+ "version": 2,
+ "name": "ipython"
+ }
+ }
+ }
+} \ No newline at end of file
diff --git a/docs/source/auto_examples/plot_otda_color_images.py b/docs/source/auto_examples/plot_otda_color_images.py
new file mode 100644
index 0000000..e77aec0
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_color_images.py
@@ -0,0 +1,165 @@
+# -*- coding: utf-8 -*-
+"""
+=============================
+OT for image color adaptation
+=============================
+
+This example presents a way of transferring colors between two images
+with Optimal Transport, as introduced in [6].
+
+[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014).
+Regularized discrete optimal transport.
+SIAM Journal on Imaging Sciences, 7(3), 1853-1882.
+"""
+
+# Authors: Remi Flamary <remi.flamary@unice.fr>
+# Stanislas Chambon <stan.chambon@gmail.com>
+#
+# License: MIT License
+
+import numpy as np
+from scipy import ndimage
+import matplotlib.pylab as pl
+import ot
+
+
+r = np.random.RandomState(42)
+
+
+def im2mat(I):
+ """Converts an image to a matrix (one pixel per line)"""
+ return I.reshape((I.shape[0] * I.shape[1], I.shape[2]))
+
+
+def mat2im(X, shape):
+ """Converts a matrix back to an image"""
+ return X.reshape(shape)
+
+
+def minmax(I):
+ return np.clip(I, 0, 1)
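The helpers above flatten an (h, w, c) image into an (h*w, c) matrix of pixels and back. A quick standalone round-trip check of that idea (the toy image and local redefinitions here are for illustration only):

```python
import numpy as np


def im2mat(img):
    """Reshape an (h, w, c) image into an (h*w, c) matrix, one pixel per row."""
    return img.reshape((img.shape[0] * img.shape[1], img.shape[2]))


def mat2im(X, shape):
    """Reshape an (h*w, c) pixel matrix back into an (h, w, c) image."""
    return X.reshape(shape)


img = np.random.RandomState(42).rand(4, 5, 3)  # small fake RGB image
X = im2mat(img)
assert X.shape == (20, 3)                      # 4 * 5 pixels, 3 channels
assert np.array_equal(mat2im(X, img.shape), img)  # lossless round trip
```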
+
+
+##############################################################################
+# Generate data
+# -------------
+
+# Loading images
+I1 = ndimage.imread('../data/ocean_day.jpg').astype(np.float64) / 256
+I2 = ndimage.imread('../data/ocean_sunset.jpg').astype(np.float64) / 256
+
+X1 = im2mat(I1)
+X2 = im2mat(I2)
+
+# training samples
+nb = 1000
+idx1 = r.randint(X1.shape[0], size=(nb,))
+idx2 = r.randint(X2.shape[0], size=(nb,))
+
+Xs = X1[idx1, :]
+Xt = X2[idx2, :]
+
+
+##############################################################################
+# Plot original image
+# -------------------
+
+pl.figure(1, figsize=(6.4, 3))
+
+pl.subplot(1, 2, 1)
+pl.imshow(I1)
+pl.axis('off')
+pl.title('Image 1')
+
+pl.subplot(1, 2, 2)
+pl.imshow(I2)
+pl.axis('off')
+pl.title('Image 2')
+
+
+##############################################################################
+# Scatter plot of colors
+# ----------------------
+
+pl.figure(2, figsize=(6.4, 3))
+
+pl.subplot(1, 2, 1)
+pl.scatter(Xs[:, 0], Xs[:, 2], c=Xs)
+pl.axis([0, 1, 0, 1])
+pl.xlabel('Red')
+pl.ylabel('Blue')
+pl.title('Image 1')
+
+pl.subplot(1, 2, 2)
+pl.scatter(Xt[:, 0], Xt[:, 2], c=Xt)
+pl.axis([0, 1, 0, 1])
+pl.xlabel('Red')
+pl.ylabel('Blue')
+pl.title('Image 2')
+pl.tight_layout()
+
+
+##############################################################################
+# Instantiate the different transport algorithms and fit them
+# -----------------------------------------------------------
+
+# EMDTransport
+ot_emd = ot.da.EMDTransport()
+ot_emd.fit(Xs=Xs, Xt=Xt)
+
+# SinkhornTransport
+ot_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)
+ot_sinkhorn.fit(Xs=Xs, Xt=Xt)
+
+# prediction between images (using out of sample prediction as in [6])
+transp_Xs_emd = ot_emd.transform(Xs=X1)
+transp_Xt_emd = ot_emd.inverse_transform(Xt=X2)
+
+transp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=X1)
+transp_Xt_sinkhorn = ot_sinkhorn.inverse_transform(Xt=X2)
+
+I1t = minmax(mat2im(transp_Xs_emd, I1.shape))
+I2t = minmax(mat2im(transp_Xt_emd, I2.shape))
+
+I1te = minmax(mat2im(transp_Xs_sinkhorn, I1.shape))
+I2te = minmax(mat2im(transp_Xt_sinkhorn, I2.shape))
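`SinkhornTransport` above solves an entropy-regularized OT problem. As background, here is a minimal self-contained Sinkhorn-Knopp scaling loop (a didactic sketch with uniform marginals and a fixed iteration count, not POT's actual implementation):

```python
import numpy as np


def sinkhorn(a, b, M, reg, n_iter=5000):
    """Entropic OT via Sinkhorn-Knopp scaling.

    a, b: source/target marginals (each summing to 1); M: cost matrix;
    reg: regularization strength. Returns a coupling whose marginals
    approximate a and b.
    """
    K = np.exp(-M / reg)             # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)            # scale to match column marginals
        u = a / (K @ v)              # scale to match row marginals
    return u[:, None] * K * v[None, :]


rng = np.random.RandomState(0)
Xs, Xt = rng.rand(5, 3), rng.rand(6, 3)
M = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(-1)  # squared Euclidean cost
a = np.full(5, 1 / 5.)
b = np.full(6, 1 / 6.)

G = sinkhorn(a, b, M, reg=1e-1)
assert np.allclose(G.sum(axis=1), a)
assert np.allclose(G.sum(axis=0), b, atol=1e-5)
```

The alternating row/column rescaling is what `ot.da.SinkhornTransport` iterates internally (with additional stopping criteria and stabilization).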
+
+
+##############################################################################
+# Plot new images
+# ---------------
+
+pl.figure(3, figsize=(8, 4))
+
+pl.subplot(2, 3, 1)
+pl.imshow(I1)
+pl.axis('off')
+pl.title('Image 1')
+
+pl.subplot(2, 3, 2)
+pl.imshow(I1t)
+pl.axis('off')
+pl.title('Image 1 Adapt')
+
+pl.subplot(2, 3, 3)
+pl.imshow(I1te)
+pl.axis('off')
+pl.title('Image 1 Adapt (reg)')
+
+pl.subplot(2, 3, 4)
+pl.imshow(I2)
+pl.axis('off')
+pl.title('Image 2')
+
+pl.subplot(2, 3, 5)
+pl.imshow(I2t)
+pl.axis('off')
+pl.title('Image 2 Adapt')
+
+pl.subplot(2, 3, 6)
+pl.imshow(I2te)
+pl.axis('off')
+pl.title('Image 2 Adapt (reg)')
+pl.tight_layout()
+
+pl.show()
diff --git a/docs/source/auto_examples/plot_otda_color_images.rst b/docs/source/auto_examples/plot_otda_color_images.rst
new file mode 100644
index 0000000..4772bed
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_color_images.rst
@@ -0,0 +1,257 @@
+
+
+.. _sphx_glr_auto_examples_plot_otda_color_images.py:
+
+
+=============================
+OT for image color adaptation
+=============================
+
+This example presents a way of transferring colors between two images
+with Optimal Transport, as introduced in [6].
+
+[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014).
+Regularized discrete optimal transport.
+SIAM Journal on Imaging Sciences, 7(3), 1853-1882.
+
+
+
+.. code-block:: python
+
+
+ # Authors: Remi Flamary <remi.flamary@unice.fr>
+ # Stanislas Chambon <stan.chambon@gmail.com>
+ #
+ # License: MIT License
+
+ import numpy as np
+ from scipy import ndimage
+ import matplotlib.pylab as pl
+ import ot
+
+
+ r = np.random.RandomState(42)
+
+
+ def im2mat(I):
+ """Converts an image to a matrix (one pixel per line)"""
+ return I.reshape((I.shape[0] * I.shape[1], I.shape[2]))
+
+
+ def mat2im(X, shape):
+ """Converts a matrix back to an image"""
+ return X.reshape(shape)
+
+
+ def minmax(I):
+ return np.clip(I, 0, 1)
+
+
+
+
+
+
+
+
+Generate data
+-------------
+
+
+
+.. code-block:: python
+
+
+ # Loading images
+ I1 = ndimage.imread('../data/ocean_day.jpg').astype(np.float64) / 256
+ I2 = ndimage.imread('../data/ocean_sunset.jpg').astype(np.float64) / 256
+
+ X1 = im2mat(I1)
+ X2 = im2mat(I2)
+
+ # training samples
+ nb = 1000
+ idx1 = r.randint(X1.shape[0], size=(nb,))
+ idx2 = r.randint(X2.shape[0], size=(nb,))
+
+ Xs = X1[idx1, :]
+ Xt = X2[idx2, :]
+
+
+
+
+
+
+
+
+Plot original image
+-------------------
+
+
+
+.. code-block:: python
+
+
+ pl.figure(1, figsize=(6.4, 3))
+
+ pl.subplot(1, 2, 1)
+ pl.imshow(I1)
+ pl.axis('off')
+ pl.title('Image 1')
+
+ pl.subplot(1, 2, 2)
+ pl.imshow(I2)
+ pl.axis('off')
+ pl.title('Image 2')
+
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_color_images_001.png
+ :align: center
+
+
+
+
+Scatter plot of colors
+----------------------
+
+
+
+.. code-block:: python
+
+
+ pl.figure(2, figsize=(6.4, 3))
+
+ pl.subplot(1, 2, 1)
+ pl.scatter(Xs[:, 0], Xs[:, 2], c=Xs)
+ pl.axis([0, 1, 0, 1])
+ pl.xlabel('Red')
+ pl.ylabel('Blue')
+ pl.title('Image 1')
+
+ pl.subplot(1, 2, 2)
+ pl.scatter(Xt[:, 0], Xt[:, 2], c=Xt)
+ pl.axis([0, 1, 0, 1])
+ pl.xlabel('Red')
+ pl.ylabel('Blue')
+ pl.title('Image 2')
+ pl.tight_layout()
+
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_color_images_003.png
+ :align: center
+
+
+
+
+Instantiate the different transport algorithms and fit them
+-----------------------------------------------------------
+
+
+
+.. code-block:: python
+
+
+ # EMDTransport
+ ot_emd = ot.da.EMDTransport()
+ ot_emd.fit(Xs=Xs, Xt=Xt)
+
+ # SinkhornTransport
+ ot_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)
+ ot_sinkhorn.fit(Xs=Xs, Xt=Xt)
+
+ # prediction between images (using out of sample prediction as in [6])
+ transp_Xs_emd = ot_emd.transform(Xs=X1)
+ transp_Xt_emd = ot_emd.inverse_transform(Xt=X2)
+
+ transp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=X1)
+ transp_Xt_sinkhorn = ot_sinkhorn.inverse_transform(Xt=X2)
+
+ I1t = minmax(mat2im(transp_Xs_emd, I1.shape))
+ I2t = minmax(mat2im(transp_Xt_emd, I2.shape))
+
+ I1te = minmax(mat2im(transp_Xs_sinkhorn, I1.shape))
+ I2te = minmax(mat2im(transp_Xt_sinkhorn, I2.shape))
+
+
+
+
+
+
+
+
+Plot new images
+---------------
+
+
+
+.. code-block:: python
+
+
+ pl.figure(3, figsize=(8, 4))
+
+ pl.subplot(2, 3, 1)
+ pl.imshow(I1)
+ pl.axis('off')
+ pl.title('Image 1')
+
+ pl.subplot(2, 3, 2)
+ pl.imshow(I1t)
+ pl.axis('off')
+ pl.title('Image 1 Adapt')
+
+ pl.subplot(2, 3, 3)
+ pl.imshow(I1te)
+ pl.axis('off')
+ pl.title('Image 1 Adapt (reg)')
+
+ pl.subplot(2, 3, 4)
+ pl.imshow(I2)
+ pl.axis('off')
+ pl.title('Image 2')
+
+ pl.subplot(2, 3, 5)
+ pl.imshow(I2t)
+ pl.axis('off')
+ pl.title('Image 2 Adapt')
+
+ pl.subplot(2, 3, 6)
+ pl.imshow(I2te)
+ pl.axis('off')
+ pl.title('Image 2 Adapt (reg)')
+ pl.tight_layout()
+
+ pl.show()
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_color_images_005.png
+ :align: center
+
+
+
+
+**Total running time of the script:** ( 2 minutes 24.561 seconds)
+
+
+
+.. container:: sphx-glr-footer
+
+
+ .. container:: sphx-glr-download
+
+ :download:`Download Python source code: plot_otda_color_images.py <plot_otda_color_images.py>`
+
+
+
+ .. container:: sphx-glr-download
+
+ :download:`Download Jupyter notebook: plot_otda_color_images.ipynb <plot_otda_color_images.ipynb>`
+
+.. rst-class:: sphx-glr-signature
+
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_otda_d2.ipynb b/docs/source/auto_examples/plot_otda_d2.ipynb
new file mode 100644
index 0000000..7bfcc9a
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_d2.ipynb
@@ -0,0 +1,144 @@
+{
+ "nbformat_minor": 0,
+ "nbformat": 4,
+ "cells": [
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "%matplotlib inline"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "\n# OT for domain adaptation on empirical distributions\n\n\nThis example introduces a domain adaptation problem in a 2D setting. It\nillustrates the problem of domain adaptation and introduces some optimal\ntransport approaches to solve it.\n\nQuantities such as optimal couplings, the main coupling coefficients and\ntransported samples are represented in order to give a visual understanding\nof what the transport methods are doing.\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# Authors: Remi Flamary <remi.flamary@unice.fr>\n# Stanislas Chambon <stan.chambon@gmail.com>\n#\n# License: MIT License\n\nimport matplotlib.pylab as pl\nimport ot"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "n_samples_source = 150\nn_samples_target = 150\n\nXs, ys = ot.datasets.get_data_classif('3gauss', n_samples_source)\nXt, yt = ot.datasets.get_data_classif('3gauss2', n_samples_target)\n\n# Cost matrix\nM = ot.dist(Xs, Xt, metric='sqeuclidean')"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Instantiate the different transport algorithms and fit them\n-----------------------------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# EMD Transport\not_emd = ot.da.EMDTransport()\not_emd.fit(Xs=Xs, Xt=Xt)\n\n# Sinkhorn Transport\not_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)\not_sinkhorn.fit(Xs=Xs, Xt=Xt)\n\n# Sinkhorn Transport with Group lasso regularization\not_lpl1 = ot.da.SinkhornLpl1Transport(reg_e=1e-1, reg_cl=1e0)\not_lpl1.fit(Xs=Xs, ys=ys, Xt=Xt)\n\n# transport source samples onto target samples\ntransp_Xs_emd = ot_emd.transform(Xs=Xs)\ntransp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=Xs)\ntransp_Xs_lpl1 = ot_lpl1.transform(Xs=Xs)"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Fig 1 : plots source and target samples + matrix of pairwise distance\n---------------------------------------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(1, figsize=(10, 10))\npl.subplot(2, 2, 1)\npl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')\npl.xticks([])\npl.yticks([])\npl.legend(loc=0)\npl.title('Source samples')\n\npl.subplot(2, 2, 2)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')\npl.xticks([])\npl.yticks([])\npl.legend(loc=0)\npl.title('Target samples')\n\npl.subplot(2, 2, 3)\npl.imshow(M, interpolation='nearest')\npl.xticks([])\npl.yticks([])\npl.title('Matrix of pairwise distances')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Fig 2 : plots optimal couplings for the different methods\n---------------------------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(2, figsize=(10, 6))\n\npl.subplot(2, 3, 1)\npl.imshow(ot_emd.coupling_, interpolation='nearest')\npl.xticks([])\npl.yticks([])\npl.title('Optimal coupling\\nEMDTransport')\n\npl.subplot(2, 3, 2)\npl.imshow(ot_sinkhorn.coupling_, interpolation='nearest')\npl.xticks([])\npl.yticks([])\npl.title('Optimal coupling\\nSinkhornTransport')\n\npl.subplot(2, 3, 3)\npl.imshow(ot_lpl1.coupling_, interpolation='nearest')\npl.xticks([])\npl.yticks([])\npl.title('Optimal coupling\\nSinkhornLpl1Transport')\n\npl.subplot(2, 3, 4)\not.plot.plot2D_samples_mat(Xs, Xt, ot_emd.coupling_, c=[.5, .5, 1])\npl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')\npl.xticks([])\npl.yticks([])\npl.title('Main coupling coefficients\\nEMDTransport')\n\npl.subplot(2, 3, 5)\not.plot.plot2D_samples_mat(Xs, Xt, ot_sinkhorn.coupling_, c=[.5, .5, 1])\npl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')\npl.xticks([])\npl.yticks([])\npl.title('Main coupling coefficients\\nSinkhornTransport')\n\npl.subplot(2, 3, 6)\not.plot.plot2D_samples_mat(Xs, Xt, ot_lpl1.coupling_, c=[.5, .5, 1])\npl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')\npl.xticks([])\npl.yticks([])\npl.title('Main coupling coefficients\\nSinkhornLpl1Transport')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Fig 3 : plot transported samples\n--------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# display transported samples\npl.figure(4, figsize=(10, 4))\npl.subplot(1, 3, 1)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=0.5)\npl.scatter(transp_Xs_emd[:, 0], transp_Xs_emd[:, 1], c=ys,\n marker='+', label='Transp samples', s=30)\npl.title('Transported samples\\nEmdTransport')\npl.legend(loc=0)\npl.xticks([])\npl.yticks([])\n\npl.subplot(1, 3, 2)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=0.5)\npl.scatter(transp_Xs_sinkhorn[:, 0], transp_Xs_sinkhorn[:, 1], c=ys,\n marker='+', label='Transp samples', s=30)\npl.title('Transported samples\\nSinkhornTransport')\npl.xticks([])\npl.yticks([])\n\npl.subplot(1, 3, 3)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=0.5)\npl.scatter(transp_Xs_lpl1[:, 0], transp_Xs_lpl1[:, 1], c=ys,\n marker='+', label='Transp samples', s=30)\npl.title('Transported samples\\nSinkhornLpl1Transport')\npl.xticks([])\npl.yticks([])\n\npl.tight_layout()\npl.show()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 2",
+ "name": "python2",
+ "language": "python"
+ },
+ "language_info": {
+ "mimetype": "text/x-python",
+ "nbconvert_exporter": "python",
+ "name": "python",
+ "file_extension": ".py",
+ "version": "2.7.12",
+ "pygments_lexer": "ipython2",
+ "codemirror_mode": {
+ "version": 2,
+ "name": "ipython"
+ }
+ }
+ }
+} \ No newline at end of file
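The notebook above relies on `ot.da.SinkhornTransport`, which solves an entropically regularized OT problem. As a minimal, self-contained sketch of the Sinkhorn-Knopp fixed-point iterations behind it (illustrative only: POT's own solver adds log-domain stabilization and convergence checks, and the toy histograms and cost matrix below are not taken from the example):

```python
import numpy as np

def sinkhorn_sketch(a, b, M, reg, n_iter=500):
    """Plain Sinkhorn-Knopp iterations for entropic OT.

    a, b : source/target histograms (1-D arrays summing to 1)
    M    : cost matrix of shape (len(a), len(b))
    reg  : entropic regularization strength (reg_e in POT)
    """
    K = np.exp(-M / reg)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / K.T.dot(u)        # scale columns toward marginal b
        u = a / K.dot(v)          # scale rows toward marginal a
    return u[:, None] * K * v[None, :]

# Toy 1-D problem: uniform histograms, absolute-difference cost
a = np.full(4, 1 / 4)
b = np.full(4, 1 / 4)
M = np.abs(np.subtract.outer(np.arange(4.0), np.arange(4.0)))

G = sinkhorn_sketch(a, b, M, reg=0.1)
# Row marginals are matched exactly after the final u-update
print(np.allclose(G.sum(axis=1), a))  # True
```

With a small `reg` the coupling concentrates near the unregularized optimal plan; a larger `reg` blurs it, which is the trade-off visible in the coupling plots of this example.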
diff --git a/docs/source/auto_examples/plot_otda_d2.py b/docs/source/auto_examples/plot_otda_d2.py
new file mode 100644
index 0000000..e53d7d6
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_d2.py
@@ -0,0 +1,172 @@
+# -*- coding: utf-8 -*-
+"""
+===================================================
+OT for domain adaptation on empirical distributions
+===================================================
+
+This example introduces domain adaptation in a 2D setting. It makes the
+problem of domain adaptation explicit and introduces some optimal transport
+approaches to solve it.
+
+Quantities such as optimal couplings, the largest coupling coefficients
+and transported samples are plotted in order to give a visual understanding
+of what the transport methods are doing.
+"""
+
+# Authors: Remi Flamary <remi.flamary@unice.fr>
+# Stanislas Chambon <stan.chambon@gmail.com>
+#
+# License: MIT License
+
+import matplotlib.pylab as pl
+import ot
+
+
+##############################################################################
+# generate data
+# -------------
+
+n_samples_source = 150
+n_samples_target = 150
+
+Xs, ys = ot.datasets.get_data_classif('3gauss', n_samples_source)
+Xt, yt = ot.datasets.get_data_classif('3gauss2', n_samples_target)
+
+# Cost matrix
+M = ot.dist(Xs, Xt, metric='sqeuclidean')
+
+
+##############################################################################
+# Instantiate the different transport algorithms and fit them
+# -----------------------------------------------------------
+
+# EMD Transport
+ot_emd = ot.da.EMDTransport()
+ot_emd.fit(Xs=Xs, Xt=Xt)
+
+# Sinkhorn Transport
+ot_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)
+ot_sinkhorn.fit(Xs=Xs, Xt=Xt)
+
+# Sinkhorn Transport with Group lasso regularization
+ot_lpl1 = ot.da.SinkhornLpl1Transport(reg_e=1e-1, reg_cl=1e0)
+ot_lpl1.fit(Xs=Xs, ys=ys, Xt=Xt)
+
+# transport source samples onto target samples
+transp_Xs_emd = ot_emd.transform(Xs=Xs)
+transp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=Xs)
+transp_Xs_lpl1 = ot_lpl1.transform(Xs=Xs)
+
+
+##############################################################################
+# Fig 1 : plots source and target samples + matrix of pairwise distances
+# -----------------------------------------------------------------------
+
+pl.figure(1, figsize=(10, 10))
+pl.subplot(2, 2, 1)
+pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+pl.xticks([])
+pl.yticks([])
+pl.legend(loc=0)
+pl.title('Source samples')
+
+pl.subplot(2, 2, 2)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+pl.xticks([])
+pl.yticks([])
+pl.legend(loc=0)
+pl.title('Target samples')
+
+pl.subplot(2, 2, 3)
+pl.imshow(M, interpolation='nearest')
+pl.xticks([])
+pl.yticks([])
+pl.title('Matrix of pairwise distances')
+pl.tight_layout()
+
+
+##############################################################################
+# Fig 2 : plots optimal couplings for the different methods
+# ---------------------------------------------------------
+pl.figure(2, figsize=(10, 6))
+
+pl.subplot(2, 3, 1)
+pl.imshow(ot_emd.coupling_, interpolation='nearest')
+pl.xticks([])
+pl.yticks([])
+pl.title('Optimal coupling\nEMDTransport')
+
+pl.subplot(2, 3, 2)
+pl.imshow(ot_sinkhorn.coupling_, interpolation='nearest')
+pl.xticks([])
+pl.yticks([])
+pl.title('Optimal coupling\nSinkhornTransport')
+
+pl.subplot(2, 3, 3)
+pl.imshow(ot_lpl1.coupling_, interpolation='nearest')
+pl.xticks([])
+pl.yticks([])
+pl.title('Optimal coupling\nSinkhornLpl1Transport')
+
+pl.subplot(2, 3, 4)
+ot.plot.plot2D_samples_mat(Xs, Xt, ot_emd.coupling_, c=[.5, .5, 1])
+pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+pl.xticks([])
+pl.yticks([])
+pl.title('Main coupling coefficients\nEMDTransport')
+
+pl.subplot(2, 3, 5)
+ot.plot.plot2D_samples_mat(Xs, Xt, ot_sinkhorn.coupling_, c=[.5, .5, 1])
+pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+pl.xticks([])
+pl.yticks([])
+pl.title('Main coupling coefficients\nSinkhornTransport')
+
+pl.subplot(2, 3, 6)
+ot.plot.plot2D_samples_mat(Xs, Xt, ot_lpl1.coupling_, c=[.5, .5, 1])
+pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+pl.xticks([])
+pl.yticks([])
+pl.title('Main coupling coefficients\nSinkhornLpl1Transport')
+pl.tight_layout()
+
+
+##############################################################################
+# Fig 3 : plot transported samples
+# --------------------------------
+
+# display transported samples
+pl.figure(4, figsize=(10, 4))
+pl.subplot(1, 3, 1)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.5)
+pl.scatter(transp_Xs_emd[:, 0], transp_Xs_emd[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+pl.title('Transported samples\nEmdTransport')
+pl.legend(loc=0)
+pl.xticks([])
+pl.yticks([])
+
+pl.subplot(1, 3, 2)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.5)
+pl.scatter(transp_Xs_sinkhorn[:, 0], transp_Xs_sinkhorn[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+pl.title('Transported samples\nSinkhornTransport')
+pl.xticks([])
+pl.yticks([])
+
+pl.subplot(1, 3, 3)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.5)
+pl.scatter(transp_Xs_lpl1[:, 0], transp_Xs_lpl1[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+pl.title('Transported samples\nSinkhornLpl1Transport')
+pl.xticks([])
+pl.yticks([])
+
+pl.tight_layout()
+pl.show()
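A note on the cost matrix used in the script above: with `metric='sqeuclidean'`, `ot.dist` returns `M[i, j] = ||Xs_i - Xt_j||^2`. The following standalone sketch (plain NumPy on toy arrays, independent of POT) shows two equivalent ways of computing that matrix:

```python
import numpy as np

rng = np.random.RandomState(0)
Xs = rng.randn(5, 2)   # toy "source" samples
Xt = rng.randn(4, 2)   # toy "target" samples

# Direct broadcasting: M[i, j] = ||Xs[i] - Xt[j]||^2
M = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(axis=-1)

# Expanded form ||x||^2 + ||y||^2 - 2 x.y, which fast implementations use
M2 = ((Xs ** 2).sum(1)[:, None] + (Xt ** 2).sum(1)[None, :]
      - 2 * Xs.dot(Xt.T))

print(M.shape)              # (5, 4)
print(np.allclose(M, M2))   # True
```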
diff --git a/docs/source/auto_examples/plot_otda_d2.rst b/docs/source/auto_examples/plot_otda_d2.rst
new file mode 100644
index 0000000..2b716e1
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_d2.rst
@@ -0,0 +1,264 @@
+
+
+.. _sphx_glr_auto_examples_plot_otda_d2.py:
+
+
+===================================================
+OT for domain adaptation on empirical distributions
+===================================================
+
+This example introduces domain adaptation in a 2D setting. It makes the
+problem of domain adaptation explicit and introduces some optimal transport
+approaches to solve it.
+
+Quantities such as optimal couplings, the largest coupling coefficients
+and transported samples are plotted in order to give a visual understanding
+of what the transport methods are doing.
+
+
+
+.. code-block:: python
+
+
+ # Authors: Remi Flamary <remi.flamary@unice.fr>
+ # Stanislas Chambon <stan.chambon@gmail.com>
+ #
+ # License: MIT License
+
+ import matplotlib.pylab as pl
+ import ot
+
+
+
+
+
+
+
+
+generate data
+-------------
+
+
+
+.. code-block:: python
+
+
+ n_samples_source = 150
+ n_samples_target = 150
+
+ Xs, ys = ot.datasets.get_data_classif('3gauss', n_samples_source)
+ Xt, yt = ot.datasets.get_data_classif('3gauss2', n_samples_target)
+
+ # Cost matrix
+ M = ot.dist(Xs, Xt, metric='sqeuclidean')
+
+
+
+
+
+
+
+
+Instantiate the different transport algorithms and fit them
+-----------------------------------------------------------
+
+
+
+.. code-block:: python
+
+
+ # EMD Transport
+ ot_emd = ot.da.EMDTransport()
+ ot_emd.fit(Xs=Xs, Xt=Xt)
+
+ # Sinkhorn Transport
+ ot_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)
+ ot_sinkhorn.fit(Xs=Xs, Xt=Xt)
+
+ # Sinkhorn Transport with Group lasso regularization
+ ot_lpl1 = ot.da.SinkhornLpl1Transport(reg_e=1e-1, reg_cl=1e0)
+ ot_lpl1.fit(Xs=Xs, ys=ys, Xt=Xt)
+
+ # transport source samples onto target samples
+ transp_Xs_emd = ot_emd.transform(Xs=Xs)
+ transp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=Xs)
+ transp_Xs_lpl1 = ot_lpl1.transform(Xs=Xs)
+
+
+
+
+
+
+
+
+Fig 1 : plots source and target samples + matrix of pairwise distances
+-----------------------------------------------------------------------
+
+
+
+.. code-block:: python
+
+
+ pl.figure(1, figsize=(10, 10))
+ pl.subplot(2, 2, 1)
+ pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+ pl.xticks([])
+ pl.yticks([])
+ pl.legend(loc=0)
+ pl.title('Source samples')
+
+ pl.subplot(2, 2, 2)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+ pl.xticks([])
+ pl.yticks([])
+ pl.legend(loc=0)
+ pl.title('Target samples')
+
+ pl.subplot(2, 2, 3)
+ pl.imshow(M, interpolation='nearest')
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Matrix of pairwise distances')
+ pl.tight_layout()
+
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_d2_001.png
+ :align: center
+
+
+
+
+Fig 2 : plots optimal couplings for the different methods
+---------------------------------------------------------
+
+
+
+.. code-block:: python
+
+ pl.figure(2, figsize=(10, 6))
+
+ pl.subplot(2, 3, 1)
+ pl.imshow(ot_emd.coupling_, interpolation='nearest')
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Optimal coupling\nEMDTransport')
+
+ pl.subplot(2, 3, 2)
+ pl.imshow(ot_sinkhorn.coupling_, interpolation='nearest')
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Optimal coupling\nSinkhornTransport')
+
+ pl.subplot(2, 3, 3)
+ pl.imshow(ot_lpl1.coupling_, interpolation='nearest')
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Optimal coupling\nSinkhornLpl1Transport')
+
+ pl.subplot(2, 3, 4)
+ ot.plot.plot2D_samples_mat(Xs, Xt, ot_emd.coupling_, c=[.5, .5, 1])
+ pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Main coupling coefficients\nEMDTransport')
+
+ pl.subplot(2, 3, 5)
+ ot.plot.plot2D_samples_mat(Xs, Xt, ot_sinkhorn.coupling_, c=[.5, .5, 1])
+ pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Main coupling coefficients\nSinkhornTransport')
+
+ pl.subplot(2, 3, 6)
+ ot.plot.plot2D_samples_mat(Xs, Xt, ot_lpl1.coupling_, c=[.5, .5, 1])
+ pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+ pl.xticks([])
+ pl.yticks([])
+ pl.title('Main coupling coefficients\nSinkhornLpl1Transport')
+ pl.tight_layout()
+
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_d2_003.png
+ :align: center
+
+
+
+
+Fig 3 : plot transported samples
+--------------------------------
+
+
+
+.. code-block:: python
+
+
+ # display transported samples
+ pl.figure(4, figsize=(10, 4))
+ pl.subplot(1, 3, 1)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.5)
+ pl.scatter(transp_Xs_emd[:, 0], transp_Xs_emd[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+ pl.title('Transported samples\nEmdTransport')
+ pl.legend(loc=0)
+ pl.xticks([])
+ pl.yticks([])
+
+ pl.subplot(1, 3, 2)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.5)
+ pl.scatter(transp_Xs_sinkhorn[:, 0], transp_Xs_sinkhorn[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+ pl.title('Transported samples\nSinkhornTransport')
+ pl.xticks([])
+ pl.yticks([])
+
+ pl.subplot(1, 3, 3)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=0.5)
+ pl.scatter(transp_Xs_lpl1[:, 0], transp_Xs_lpl1[:, 1], c=ys,
+ marker='+', label='Transp samples', s=30)
+ pl.title('Transported samples\nSinkhornLpl1Transport')
+ pl.xticks([])
+ pl.yticks([])
+
+ pl.tight_layout()
+ pl.show()
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_d2_006.png
+ :align: center
+
+
+
+
+**Total running time of the script:** ( 0 minutes 32.084 seconds)
+
+
+
+.. container:: sphx-glr-footer
+
+
+ .. container:: sphx-glr-download
+
+ :download:`Download Python source code: plot_otda_d2.py <plot_otda_d2.py>`
+
+
+
+ .. container:: sphx-glr-download
+
+ :download:`Download Jupyter notebook: plot_otda_d2.ipynb <plot_otda_d2.ipynb>`
+
+.. rst-class:: sphx-glr-signature
+
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_otda_mapping.ipynb b/docs/source/auto_examples/plot_otda_mapping.ipynb
new file mode 100644
index 0000000..0374146
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_mapping.ipynb
@@ -0,0 +1,126 @@
+{
+ "nbformat_minor": 0,
+ "nbformat": 4,
+ "cells": [
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "%matplotlib inline"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+        "\n# OT mapping estimation for domain adaptation\n\n\nThis example shows how to use MappingTransport to jointly estimate the\ncoupling and approximate the transport map with either a linear or a\nkernelized mapping, as introduced in [8].\n\n[8] M. Perrot, N. Courty, R. Flamary, A. Habrard,\n    \"Mapping estimation for discrete optimal transport\",\n    Neural Information Processing Systems (NIPS), 2016.\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# Authors: Remi Flamary <remi.flamary@unice.fr>\n# Stanislas Chambon <stan.chambon@gmail.com>\n#\n# License: MIT License\n\nimport numpy as np\nimport matplotlib.pylab as pl\nimport ot"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+        "n_source_samples = 100\nn_target_samples = 100\ntheta = 2 * np.pi / 20\nnoise_level = 0.1\n\nXs, ys = ot.datasets.get_data_classif(\n    'gaussrot', n_source_samples, nz=noise_level)\nXs_new, _ = ot.datasets.get_data_classif(\n    'gaussrot', n_source_samples, nz=noise_level)\nXt, yt = ot.datasets.get_data_classif(\n    'gaussrot', n_target_samples, theta=theta, nz=noise_level)\n\n# one of the target modes changes its variance (no exact linear mapping)\nXt[yt == 2] *= 3\nXt = Xt + 4"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot data\n---------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(1, (10, 5))\npl.clf()\npl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')\npl.legend(loc=0)\npl.title('Source and target distributions')"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Instantiate the different transport algorithms and fit them\n-----------------------------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# MappingTransport with linear kernel\not_mapping_linear = ot.da.MappingTransport(\n kernel=\"linear\", mu=1e0, eta=1e-8, bias=True,\n max_iter=20, verbose=True)\n\not_mapping_linear.fit(Xs=Xs, Xt=Xt)\n\n# for original source samples, transform applies barycentric mapping\ntransp_Xs_linear = ot_mapping_linear.transform(Xs=Xs)\n\n# for out of source samples, transform applies the linear mapping\ntransp_Xs_linear_new = ot_mapping_linear.transform(Xs=Xs_new)\n\n\n# MappingTransport with gaussian kernel\not_mapping_gaussian = ot.da.MappingTransport(\n kernel=\"gaussian\", eta=1e-5, mu=1e-1, bias=True, sigma=1,\n max_iter=10, verbose=True)\not_mapping_gaussian.fit(Xs=Xs, Xt=Xt)\n\n# for original source samples, transform applies barycentric mapping\ntransp_Xs_gaussian = ot_mapping_gaussian.transform(Xs=Xs)\n\n# for out of source samples, transform applies the gaussian mapping\ntransp_Xs_gaussian_new = ot_mapping_gaussian.transform(Xs=Xs_new)"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot transported samples\n------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(2)\npl.clf()\npl.subplot(2, 2, 1)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=.2)\npl.scatter(transp_Xs_linear[:, 0], transp_Xs_linear[:, 1], c=ys, marker='+',\n label='Mapped source samples')\npl.title(\"Bary. mapping (linear)\")\npl.legend(loc=0)\n\npl.subplot(2, 2, 2)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=.2)\npl.scatter(transp_Xs_linear_new[:, 0], transp_Xs_linear_new[:, 1],\n c=ys, marker='+', label='Learned mapping')\npl.title(\"Estim. mapping (linear)\")\n\npl.subplot(2, 2, 3)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=.2)\npl.scatter(transp_Xs_gaussian[:, 0], transp_Xs_gaussian[:, 1], c=ys,\n marker='+', label='barycentric mapping')\npl.title(\"Bary. mapping (kernel)\")\n\npl.subplot(2, 2, 4)\npl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',\n label='Target samples', alpha=.2)\npl.scatter(transp_Xs_gaussian_new[:, 0], transp_Xs_gaussian_new[:, 1], c=ys,\n marker='+', label='Learned mapping')\npl.title(\"Estim. mapping (kernel)\")\npl.tight_layout()\n\npl.show()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 2",
+ "name": "python2",
+ "language": "python"
+ },
+ "language_info": {
+ "mimetype": "text/x-python",
+ "nbconvert_exporter": "python",
+ "name": "python",
+ "file_extension": ".py",
+ "version": "2.7.12",
+ "pygments_lexer": "ipython2",
+ "codemirror_mode": {
+ "version": 2,
+ "name": "ipython"
+ }
+ }
+ }
+} \ No newline at end of file
diff --git a/docs/source/auto_examples/plot_otda_mapping.py b/docs/source/auto_examples/plot_otda_mapping.py
new file mode 100644
index 0000000..167c3a1
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_mapping.py
@@ -0,0 +1,125 @@
+# -*- coding: utf-8 -*-
+"""
+===========================================
+OT mapping estimation for domain adaptation
+===========================================
+
+This example shows how to use MappingTransport to jointly estimate the
+coupling and approximate the transport map with either a linear or a
+kernelized mapping, as introduced in [8].
+
+[8] M. Perrot, N. Courty, R. Flamary, A. Habrard,
+ "Mapping estimation for discrete optimal transport",
+ Neural Information Processing Systems (NIPS), 2016.
+"""
+
+# Authors: Remi Flamary <remi.flamary@unice.fr>
+# Stanislas Chambon <stan.chambon@gmail.com>
+#
+# License: MIT License
+
+import numpy as np
+import matplotlib.pylab as pl
+import ot
+
+
+##############################################################################
+# Generate data
+# -------------
+
+n_source_samples = 100
+n_target_samples = 100
+theta = 2 * np.pi / 20
+noise_level = 0.1
+
+Xs, ys = ot.datasets.get_data_classif(
+ 'gaussrot', n_source_samples, nz=noise_level)
+Xs_new, _ = ot.datasets.get_data_classif(
+ 'gaussrot', n_source_samples, nz=noise_level)
+Xt, yt = ot.datasets.get_data_classif(
+ 'gaussrot', n_target_samples, theta=theta, nz=noise_level)
+
+# one of the target modes changes its variance (no exact linear mapping)
+Xt[yt == 2] *= 3
+Xt = Xt + 4
+
+##############################################################################
+# Plot data
+# ---------
+
+pl.figure(1, (10, 5))
+pl.clf()
+pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+pl.legend(loc=0)
+pl.title('Source and target distributions')
+
+
+##############################################################################
+# Instantiate the different transport algorithms and fit them
+# -----------------------------------------------------------
+
+# MappingTransport with linear kernel
+ot_mapping_linear = ot.da.MappingTransport(
+ kernel="linear", mu=1e0, eta=1e-8, bias=True,
+ max_iter=20, verbose=True)
+
+ot_mapping_linear.fit(Xs=Xs, Xt=Xt)
+
+# for original source samples, transform applies barycentric mapping
+transp_Xs_linear = ot_mapping_linear.transform(Xs=Xs)
+
+# for out-of-sample source data, transform applies the learned linear mapping
+transp_Xs_linear_new = ot_mapping_linear.transform(Xs=Xs_new)
+
+
+# MappingTransport with gaussian kernel
+ot_mapping_gaussian = ot.da.MappingTransport(
+ kernel="gaussian", eta=1e-5, mu=1e-1, bias=True, sigma=1,
+ max_iter=10, verbose=True)
+ot_mapping_gaussian.fit(Xs=Xs, Xt=Xt)
+
+# for original source samples, transform applies barycentric mapping
+transp_Xs_gaussian = ot_mapping_gaussian.transform(Xs=Xs)
+
+# for out-of-sample source data, transform applies the learned gaussian mapping
+transp_Xs_gaussian_new = ot_mapping_gaussian.transform(Xs=Xs_new)
+
+
+##############################################################################
+# Plot transported samples
+# ------------------------
+
+pl.figure(2)
+pl.clf()
+pl.subplot(2, 2, 1)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=.2)
+pl.scatter(transp_Xs_linear[:, 0], transp_Xs_linear[:, 1], c=ys, marker='+',
+ label='Mapped source samples')
+pl.title("Bary. mapping (linear)")
+pl.legend(loc=0)
+
+pl.subplot(2, 2, 2)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=.2)
+pl.scatter(transp_Xs_linear_new[:, 0], transp_Xs_linear_new[:, 1],
+ c=ys, marker='+', label='Learned mapping')
+pl.title("Estim. mapping (linear)")
+
+pl.subplot(2, 2, 3)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=.2)
+pl.scatter(transp_Xs_gaussian[:, 0], transp_Xs_gaussian[:, 1], c=ys,
+ marker='+', label='barycentric mapping')
+pl.title("Bary. mapping (kernel)")
+
+pl.subplot(2, 2, 4)
+pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=.2)
+pl.scatter(transp_Xs_gaussian_new[:, 0], transp_Xs_gaussian_new[:, 1], c=ys,
+ marker='+', label='Learned mapping')
+pl.title("Estim. mapping (kernel)")
+pl.tight_layout()
+
+pl.show()
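For the original source samples above, `transform` applies a barycentric mapping: each source point is sent to the coupling-weighted average of the target points. A minimal NumPy sketch of that projection on toy data (illustrative only; it is not POT's actual implementation, and the estimated linear/kernel map used for out-of-sample points is a separate mechanism):

```python
import numpy as np

rng = np.random.RandomState(0)
Xt = rng.randn(4, 2)        # toy target samples
G = rng.rand(5, 4)          # toy coupling between 5 source / 4 target points
G /= G.sum()                # normalize like a joint distribution

# Barycentric projection: row-normalize the coupling, then
# average the target samples with those weights.
weights = G / G.sum(axis=1, keepdims=True)
transp_Xs = weights.dot(Xt)

print(transp_Xs.shape)                      # (5, 2)
print(np.allclose(weights.sum(axis=1), 1))  # each row is a convex combination
```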
diff --git a/docs/source/auto_examples/plot_otda_mapping.rst b/docs/source/auto_examples/plot_otda_mapping.rst
new file mode 100644
index 0000000..6c1c780
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_mapping.rst
@@ -0,0 +1,228 @@
+
+
+.. _sphx_glr_auto_examples_plot_otda_mapping.py:
+
+
+===========================================
+OT mapping estimation for domain adaptation
+===========================================
+
+This example shows how to use MappingTransport to jointly estimate the
+coupling and approximate the transport map with either a linear or a
+kernelized mapping, as introduced in [8].
+
+[8] M. Perrot, N. Courty, R. Flamary, A. Habrard,
+ "Mapping estimation for discrete optimal transport",
+ Neural Information Processing Systems (NIPS), 2016.
+
+
+
+.. code-block:: python
+
+
+ # Authors: Remi Flamary <remi.flamary@unice.fr>
+ # Stanislas Chambon <stan.chambon@gmail.com>
+ #
+ # License: MIT License
+
+ import numpy as np
+ import matplotlib.pylab as pl
+ import ot
+
+
+
+
+
+
+
+
+Generate data
+-------------
+
+
+
+.. code-block:: python
+
+
+ n_source_samples = 100
+ n_target_samples = 100
+ theta = 2 * np.pi / 20
+ noise_level = 0.1
+
+ Xs, ys = ot.datasets.get_data_classif(
+ 'gaussrot', n_source_samples, nz=noise_level)
+ Xs_new, _ = ot.datasets.get_data_classif(
+ 'gaussrot', n_source_samples, nz=noise_level)
+ Xt, yt = ot.datasets.get_data_classif(
+ 'gaussrot', n_target_samples, theta=theta, nz=noise_level)
+
+    # one of the target modes changes its variance (no exact linear mapping)
+ Xt[yt == 2] *= 3
+ Xt = Xt + 4
+
+
+
+
+
+
+
+Plot data
+---------
+
+
+
+.. code-block:: python
+
+
+ pl.figure(1, (10, 5))
+ pl.clf()
+ pl.scatter(Xs[:, 0], Xs[:, 1], c=ys, marker='+', label='Source samples')
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o', label='Target samples')
+ pl.legend(loc=0)
+ pl.title('Source and target distributions')
+
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_mapping_001.png
+ :align: center
+
+
+
+
+Instantiate the different transport algorithms and fit them
+-----------------------------------------------------------
+
+
+
+.. code-block:: python
+
+
+ # MappingTransport with linear kernel
+ ot_mapping_linear = ot.da.MappingTransport(
+ kernel="linear", mu=1e0, eta=1e-8, bias=True,
+ max_iter=20, verbose=True)
+
+ ot_mapping_linear.fit(Xs=Xs, Xt=Xt)
+
+ # for original source samples, transform applies barycentric mapping
+ transp_Xs_linear = ot_mapping_linear.transform(Xs=Xs)
+
+    # for out-of-sample source data, transform applies the learned linear mapping
+ transp_Xs_linear_new = ot_mapping_linear.transform(Xs=Xs_new)
+
+
+ # MappingTransport with gaussian kernel
+ ot_mapping_gaussian = ot.da.MappingTransport(
+ kernel="gaussian", eta=1e-5, mu=1e-1, bias=True, sigma=1,
+ max_iter=10, verbose=True)
+ ot_mapping_gaussian.fit(Xs=Xs, Xt=Xt)
+
+ # for original source samples, transform applies barycentric mapping
+ transp_Xs_gaussian = ot_mapping_gaussian.transform(Xs=Xs)
+
+    # for out-of-sample source data, transform applies the learned gaussian mapping
+ transp_Xs_gaussian_new = ot_mapping_gaussian.transform(Xs=Xs_new)
+
+
+
+
+
+
+.. rst-class:: sphx-glr-script-out
+
+ Out::
+
+ It. |Loss |Delta loss
+ --------------------------------
+ 0|4.307233e+03|0.000000e+00
+ 1|4.296694e+03|-2.446759e-03
+ 2|4.296419e+03|-6.417421e-05
+ 3|4.296328e+03|-2.110209e-05
+ 4|4.296305e+03|-5.298603e-06
+ It. |Loss |Delta loss
+ --------------------------------
+ 0|4.325624e+02|0.000000e+00
+ 1|4.281958e+02|-1.009489e-02
+ 2|4.279370e+02|-6.042202e-04
+ 3|4.278109e+02|-2.947651e-04
+ 4|4.277212e+02|-2.096651e-04
+ 5|4.276589e+02|-1.456221e-04
+ 6|4.276141e+02|-1.048476e-04
+ 7|4.275803e+02|-7.906213e-05
+ 8|4.275531e+02|-6.360573e-05
+ 9|4.275314e+02|-5.076642e-05
+ 10|4.275129e+02|-4.325858e-05
+
+
+Plot transported samples
+------------------------
+
+
+
+.. code-block:: python
+
+
+ pl.figure(2)
+ pl.clf()
+ pl.subplot(2, 2, 1)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=.2)
+ pl.scatter(transp_Xs_linear[:, 0], transp_Xs_linear[:, 1], c=ys, marker='+',
+ label='Mapped source samples')
+ pl.title("Bary. mapping (linear)")
+ pl.legend(loc=0)
+
+ pl.subplot(2, 2, 2)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=.2)
+ pl.scatter(transp_Xs_linear_new[:, 0], transp_Xs_linear_new[:, 1],
+ c=ys, marker='+', label='Learned mapping')
+ pl.title("Estim. mapping (linear)")
+
+ pl.subplot(2, 2, 3)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=.2)
+ pl.scatter(transp_Xs_gaussian[:, 0], transp_Xs_gaussian[:, 1], c=ys,
+ marker='+', label='barycentric mapping')
+ pl.title("Bary. mapping (kernel)")
+
+ pl.subplot(2, 2, 4)
+ pl.scatter(Xt[:, 0], Xt[:, 1], c=yt, marker='o',
+ label='Target samples', alpha=.2)
+ pl.scatter(transp_Xs_gaussian_new[:, 0], transp_Xs_gaussian_new[:, 1], c=ys,
+ marker='+', label='Learned mapping')
+ pl.title("Estim. mapping (kernel)")
+ pl.tight_layout()
+
+ pl.show()
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_mapping_003.png
+ :align: center
+
+
+
+
+**Total running time of the script:** ( 0 minutes 0.747 seconds)
+
+
+
+.. container:: sphx-glr-footer
+
+
+ .. container:: sphx-glr-download
+
+ :download:`Download Python source code: plot_otda_mapping.py <plot_otda_mapping.py>`
+
+
+
+ .. container:: sphx-glr-download
+
+ :download:`Download Jupyter notebook: plot_otda_mapping.ipynb <plot_otda_mapping.ipynb>`
+
+.. rst-class:: sphx-glr-signature
+
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/plot_otda_mapping_colors_images.ipynb b/docs/source/auto_examples/plot_otda_mapping_colors_images.ipynb
new file mode 100644
index 0000000..56caa8a
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_mapping_colors_images.ipynb
@@ -0,0 +1,144 @@
+{
+ "nbformat_minor": 0,
+ "nbformat": 4,
+ "cells": [
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "%matplotlib inline"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "\n# OT for image color adaptation with mapping estimation\n\n\nOT for domain adaptation with image color adaptation [6] with mapping\nestimation [8].\n\n[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized\n discrete optimal transport. SIAM Journal on Imaging Sciences, 7(3),\n 1853-1882.\n[8] M. Perrot, N. Courty, R. Flamary, A. Habrard, \"Mapping estimation for\n discrete optimal transport\", Neural Information Processing Systems (NIPS),\n 2016.\n\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+        "# Authors: Remi Flamary <remi.flamary@unice.fr>\n# Stanislas Chambon <stan.chambon@gmail.com>\n#\n# License: MIT License\n\nimport numpy as np\nfrom scipy import ndimage\nimport matplotlib.pylab as pl\nimport ot\n\nr = np.random.RandomState(42)\n\n\ndef im2mat(I):\n    \"\"\"Converts an image to a matrix (one pixel per line)\"\"\"\n    return I.reshape((I.shape[0] * I.shape[1], I.shape[2]))\n\n\ndef mat2im(X, shape):\n    \"\"\"Converts a matrix back to an image\"\"\"\n    return X.reshape(shape)\n\n\ndef minmax(I):\n    return np.clip(I, 0, 1)"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Generate data\n-------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "# Loading images\nI1 = ndimage.imread('../data/ocean_day.jpg').astype(np.float64) / 256\nI2 = ndimage.imread('../data/ocean_sunset.jpg').astype(np.float64) / 256\n\n\nX1 = im2mat(I1)\nX2 = im2mat(I2)\n\n# training samples\nnb = 1000\nidx1 = r.randint(X1.shape[0], size=(nb,))\nidx2 = r.randint(X2.shape[0], size=(nb,))\n\nXs = X1[idx1, :]\nXt = X2[idx2, :]"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Domain adaptation for pixel distribution transfer\n-------------------------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+        "# EMDTransport\not_emd = ot.da.EMDTransport()\not_emd.fit(Xs=Xs, Xt=Xt)\ntransp_Xs_emd = ot_emd.transform(Xs=X1)\nImage_emd = minmax(mat2im(transp_Xs_emd, I1.shape))\n\n# SinkhornTransport\not_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)\not_sinkhorn.fit(Xs=Xs, Xt=Xt)\ntransp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=X1)\nImage_sinkhorn = minmax(mat2im(transp_Xs_sinkhorn, I1.shape))\n\not_mapping_linear = ot.da.MappingTransport(\n    mu=1e0, eta=1e-8, bias=True, max_iter=20, verbose=True)\not_mapping_linear.fit(Xs=Xs, Xt=Xt)\n\nX1tl = ot_mapping_linear.transform(Xs=X1)\nImage_mapping_linear = minmax(mat2im(X1tl, I1.shape))\n\not_mapping_gaussian = ot.da.MappingTransport(\n    mu=1e0, eta=1e-2, sigma=1, bias=False, max_iter=10, verbose=True)\not_mapping_gaussian.fit(Xs=Xs, Xt=Xt)\n\nX1tn = ot_mapping_gaussian.transform(Xs=X1)  # use the estimated mapping\nImage_mapping_gaussian = minmax(mat2im(X1tn, I1.shape))"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot original images\n--------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(1, figsize=(6.4, 3))\npl.subplot(1, 2, 1)\npl.imshow(I1)\npl.axis('off')\npl.title('Image 1')\n\npl.subplot(1, 2, 2)\npl.imshow(I2)\npl.axis('off')\npl.title('Image 2')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot pixel values distribution\n------------------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(2, figsize=(6.4, 5))\n\npl.subplot(1, 2, 1)\npl.scatter(Xs[:, 0], Xs[:, 2], c=Xs)\npl.axis([0, 1, 0, 1])\npl.xlabel('Red')\npl.ylabel('Blue')\npl.title('Image 1')\n\npl.subplot(1, 2, 2)\npl.scatter(Xt[:, 0], Xt[:, 2], c=Xt)\npl.axis([0, 1, 0, 1])\npl.xlabel('Red')\npl.ylabel('Blue')\npl.title('Image 2')\npl.tight_layout()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ },
+ {
+ "source": [
+ "Plot transformed images\n-----------------------\n\n"
+ ],
+ "cell_type": "markdown",
+ "metadata": {}
+ },
+ {
+ "execution_count": null,
+ "cell_type": "code",
+ "source": [
+ "pl.figure(3, figsize=(10, 5))\n\npl.subplot(2, 3, 1)\npl.imshow(I1)\npl.axis('off')\npl.title('Im. 1')\n\npl.subplot(2, 3, 4)\npl.imshow(I2)\npl.axis('off')\npl.title('Im. 2')\n\npl.subplot(2, 3, 2)\npl.imshow(Image_emd)\npl.axis('off')\npl.title('EmdTransport')\n\npl.subplot(2, 3, 5)\npl.imshow(Image_sinkhorn)\npl.axis('off')\npl.title('SinkhornTransport')\n\npl.subplot(2, 3, 3)\npl.imshow(Image_mapping_linear)\npl.axis('off')\npl.title('MappingTransport (linear)')\n\npl.subplot(2, 3, 6)\npl.imshow(Image_mapping_gaussian)\npl.axis('off')\npl.title('MappingTransport (gaussian)')\npl.tight_layout()\n\npl.show()"
+ ],
+ "outputs": [],
+ "metadata": {
+ "collapsed": false
+ }
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 2",
+ "name": "python2",
+ "language": "python"
+ },
+ "language_info": {
+ "mimetype": "text/x-python",
+ "nbconvert_exporter": "python",
+ "name": "python",
+ "file_extension": ".py",
+ "version": "2.7.12",
+ "pygments_lexer": "ipython2",
+ "codemirror_mode": {
+ "version": 2,
+ "name": "ipython"
+ }
+ }
+ }
+} \ No newline at end of file
diff --git a/docs/source/auto_examples/plot_otda_mapping_colors_images.py b/docs/source/auto_examples/plot_otda_mapping_colors_images.py
new file mode 100644
index 0000000..5f1e844
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_mapping_colors_images.py
@@ -0,0 +1,174 @@
+# -*- coding: utf-8 -*-
+"""
+=====================================================
+OT for image color adaptation with mapping estimation
+=====================================================
+
+OT for domain adaptation with image color adaptation [6] with mapping
+estimation [8].
+
+[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized
+ discrete optimal transport. SIAM Journal on Imaging Sciences, 7(3),
+ 1853-1882.
+[8] M. Perrot, N. Courty, R. Flamary, A. Habrard, "Mapping estimation for
+ discrete optimal transport", Neural Information Processing Systems (NIPS),
+ 2016.
+
+"""
+
+# Authors: Remi Flamary <remi.flamary@unice.fr>
+# Stanislas Chambon <stan.chambon@gmail.com>
+#
+# License: MIT License
+
+import numpy as np
+from scipy import ndimage
+import matplotlib.pylab as pl
+import ot
+
+r = np.random.RandomState(42)
+
+
+def im2mat(I):
+ """Converts an image to a matrix (one pixel per line)"""
+ return I.reshape((I.shape[0] * I.shape[1], I.shape[2]))
+
+
+def mat2im(X, shape):
+ """Converts a matrix back to an image"""
+ return X.reshape(shape)
+
+
+def minmax(I):
+ return np.clip(I, 0, 1)
+
+
+##############################################################################
+# Generate data
+# -------------
+
+# Loading images
+I1 = ndimage.imread('../data/ocean_day.jpg').astype(np.float64) / 256
+I2 = ndimage.imread('../data/ocean_sunset.jpg').astype(np.float64) / 256
+
+
+X1 = im2mat(I1)
+X2 = im2mat(I2)
+
+# training samples
+nb = 1000
+idx1 = r.randint(X1.shape[0], size=(nb,))
+idx2 = r.randint(X2.shape[0], size=(nb,))
+
+Xs = X1[idx1, :]
+Xt = X2[idx2, :]
+
+
+##############################################################################
+# Domain adaptation for pixel distribution transfer
+# -------------------------------------------------
+
+# EMDTransport
+ot_emd = ot.da.EMDTransport()
+ot_emd.fit(Xs=Xs, Xt=Xt)
+transp_Xs_emd = ot_emd.transform(Xs=X1)
+Image_emd = minmax(mat2im(transp_Xs_emd, I1.shape))
+
+# SinkhornTransport
+ot_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)
+ot_sinkhorn.fit(Xs=Xs, Xt=Xt)
+transp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=X1)
+Image_sinkhorn = minmax(mat2im(transp_Xs_sinkhorn, I1.shape))
+
+ot_mapping_linear = ot.da.MappingTransport(
+ mu=1e0, eta=1e-8, bias=True, max_iter=20, verbose=True)
+ot_mapping_linear.fit(Xs=Xs, Xt=Xt)
+
+X1tl = ot_mapping_linear.transform(Xs=X1)
+Image_mapping_linear = minmax(mat2im(X1tl, I1.shape))
+
+ot_mapping_gaussian = ot.da.MappingTransport(
+ mu=1e0, eta=1e-2, sigma=1, bias=False, max_iter=10, verbose=True)
+ot_mapping_gaussian.fit(Xs=Xs, Xt=Xt)
+
+X1tn = ot_mapping_gaussian.transform(Xs=X1) # use the estimated mapping
+Image_mapping_gaussian = minmax(mat2im(X1tn, I1.shape))
+
+
+##############################################################################
+# Plot original images
+# --------------------
+
+pl.figure(1, figsize=(6.4, 3))
+pl.subplot(1, 2, 1)
+pl.imshow(I1)
+pl.axis('off')
+pl.title('Image 1')
+
+pl.subplot(1, 2, 2)
+pl.imshow(I2)
+pl.axis('off')
+pl.title('Image 2')
+pl.tight_layout()
+
+
+##############################################################################
+# Plot pixel values distribution
+# ------------------------------
+
+pl.figure(2, figsize=(6.4, 5))
+
+pl.subplot(1, 2, 1)
+pl.scatter(Xs[:, 0], Xs[:, 2], c=Xs)
+pl.axis([0, 1, 0, 1])
+pl.xlabel('Red')
+pl.ylabel('Blue')
+pl.title('Image 1')
+
+pl.subplot(1, 2, 2)
+pl.scatter(Xt[:, 0], Xt[:, 2], c=Xt)
+pl.axis([0, 1, 0, 1])
+pl.xlabel('Red')
+pl.ylabel('Blue')
+pl.title('Image 2')
+pl.tight_layout()
+
+
+##############################################################################
+# Plot transformed images
+# -----------------------
+
+pl.figure(3, figsize=(10, 5))
+
+pl.subplot(2, 3, 1)
+pl.imshow(I1)
+pl.axis('off')
+pl.title('Im. 1')
+
+pl.subplot(2, 3, 4)
+pl.imshow(I2)
+pl.axis('off')
+pl.title('Im. 2')
+
+pl.subplot(2, 3, 2)
+pl.imshow(Image_emd)
+pl.axis('off')
+pl.title('EmdTransport')
+
+pl.subplot(2, 3, 5)
+pl.imshow(Image_sinkhorn)
+pl.axis('off')
+pl.title('SinkhornTransport')
+
+pl.subplot(2, 3, 3)
+pl.imshow(Image_mapping_linear)
+pl.axis('off')
+pl.title('MappingTransport (linear)')
+
+pl.subplot(2, 3, 6)
+pl.imshow(Image_mapping_gaussian)
+pl.axis('off')
+pl.title('MappingTransport (gaussian)')
+pl.tight_layout()
+
+pl.show()
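The `im2mat` / `mat2im` / `minmax` helpers in the example above are plain numpy code with no POT dependency; a minimal standalone round-trip check (using a small synthetic image instead of the JPEG inputs, which are assumed unavailable here) would look like:

```python
import numpy as np


def im2mat(img):
    # flatten an (H, W, C) image into an (H*W, C) matrix, one pixel per row
    return img.reshape((img.shape[0] * img.shape[1], img.shape[2]))


def mat2im(X, shape):
    # restore the original (H, W, C) layout
    return X.reshape(shape)


def minmax(img):
    # clamp values into the displayable [0, 1] range
    return np.clip(img, 0, 1)


# round trip on a 4x5 RGB synthetic image
I = np.random.RandomState(0).rand(4, 5, 3)
X = im2mat(I)                 # shape (20, 3): 20 pixels, RGB columns
I_back = mat2im(X, I.shape)   # back to (4, 5, 3)
assert X.shape == (20, 3)
assert np.allclose(I, I_back)
assert minmax(I * 2).max() <= 1.0
```

This mirrors how the example feeds pixel matrices to the `ot.da` transports and converts the transported pixels back into a displayable image.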
diff --git a/docs/source/auto_examples/plot_otda_mapping_colors_images.rst b/docs/source/auto_examples/plot_otda_mapping_colors_images.rst
new file mode 100644
index 0000000..86b1312
--- /dev/null
+++ b/docs/source/auto_examples/plot_otda_mapping_colors_images.rst
@@ -0,0 +1,305 @@
+
+
+.. _sphx_glr_auto_examples_plot_otda_mapping_colors_images.py:
+
+
+=====================================================
+OT for image color adaptation with mapping estimation
+=====================================================
+
+OT for domain adaptation with image color adaptation [6] with mapping
+estimation [8].
+
+[6] Ferradans, S., Papadakis, N., Peyre, G., & Aujol, J. F. (2014). Regularized
+ discrete optimal transport. SIAM Journal on Imaging Sciences, 7(3),
+ 1853-1882.
+[8] M. Perrot, N. Courty, R. Flamary, A. Habrard, "Mapping estimation for
+ discrete optimal transport", Neural Information Processing Systems (NIPS),
+ 2016.
+
+
+
+
+.. code-block:: python
+
+
+ # Authors: Remi Flamary <remi.flamary@unice.fr>
+ # Stanislas Chambon <stan.chambon@gmail.com>
+ #
+ # License: MIT License
+
+ import numpy as np
+ from scipy import ndimage
+ import matplotlib.pylab as pl
+ import ot
+
+ r = np.random.RandomState(42)
+
+
+ def im2mat(I):
+ """Converts an image to a matrix (one pixel per line)"""
+ return I.reshape((I.shape[0] * I.shape[1], I.shape[2]))
+
+
+ def mat2im(X, shape):
+ """Converts a matrix back to an image"""
+ return X.reshape(shape)
+
+
+ def minmax(I):
+ return np.clip(I, 0, 1)
+
+
+
+
+
+
+
+
+Generate data
+-------------
+
+
+
+.. code-block:: python
+
+
+ # Loading images
+ I1 = ndimage.imread('../data/ocean_day.jpg').astype(np.float64) / 256
+ I2 = ndimage.imread('../data/ocean_sunset.jpg').astype(np.float64) / 256
+
+
+ X1 = im2mat(I1)
+ X2 = im2mat(I2)
+
+ # training samples
+ nb = 1000
+ idx1 = r.randint(X1.shape[0], size=(nb,))
+ idx2 = r.randint(X2.shape[0], size=(nb,))
+
+ Xs = X1[idx1, :]
+ Xt = X2[idx2, :]
+
+
+
+
+
+
+
+
+Domain adaptation for pixel distribution transfer
+-------------------------------------------------
+
+
+
+.. code-block:: python
+
+
+ # EMDTransport
+ ot_emd = ot.da.EMDTransport()
+ ot_emd.fit(Xs=Xs, Xt=Xt)
+ transp_Xs_emd = ot_emd.transform(Xs=X1)
+ Image_emd = minmax(mat2im(transp_Xs_emd, I1.shape))
+
+ # SinkhornTransport
+ ot_sinkhorn = ot.da.SinkhornTransport(reg_e=1e-1)
+ ot_sinkhorn.fit(Xs=Xs, Xt=Xt)
+ transp_Xs_sinkhorn = ot_sinkhorn.transform(Xs=X1)
+ Image_sinkhorn = minmax(mat2im(transp_Xs_sinkhorn, I1.shape))
+
+ ot_mapping_linear = ot.da.MappingTransport(
+ mu=1e0, eta=1e-8, bias=True, max_iter=20, verbose=True)
+ ot_mapping_linear.fit(Xs=Xs, Xt=Xt)
+
+ X1tl = ot_mapping_linear.transform(Xs=X1)
+ Image_mapping_linear = minmax(mat2im(X1tl, I1.shape))
+
+ ot_mapping_gaussian = ot.da.MappingTransport(
+ mu=1e0, eta=1e-2, sigma=1, bias=False, max_iter=10, verbose=True)
+ ot_mapping_gaussian.fit(Xs=Xs, Xt=Xt)
+
+ X1tn = ot_mapping_gaussian.transform(Xs=X1) # use the estimated mapping
+ Image_mapping_gaussian = minmax(mat2im(X1tn, I1.shape))
+
+
+
+
+
+
+.. rst-class:: sphx-glr-script-out
+
+ Out::
+
+ It. |Loss |Delta loss
+ --------------------------------
+ 0|3.680512e+02|0.000000e+00
+ 1|3.592454e+02|-2.392562e-02
+ 2|3.590671e+02|-4.960473e-04
+ 3|3.589736e+02|-2.604894e-04
+ 4|3.589161e+02|-1.602816e-04
+ 5|3.588766e+02|-1.099971e-04
+ 6|3.588476e+02|-8.084400e-05
+ 7|3.588256e+02|-6.131161e-05
+ 8|3.588083e+02|-4.807549e-05
+ 9|3.587943e+02|-3.899414e-05
+ 10|3.587827e+02|-3.245280e-05
+ 11|3.587729e+02|-2.721256e-05
+ 12|3.587646e+02|-2.316249e-05
+ 13|3.587574e+02|-2.000192e-05
+ 14|3.587512e+02|-1.748898e-05
+ 15|3.587457e+02|-1.535131e-05
+ 16|3.587408e+02|-1.366515e-05
+ 17|3.587364e+02|-1.210563e-05
+ 18|3.587325e+02|-1.097138e-05
+ 19|3.587310e+02|-4.099596e-06
+ It. |Loss |Delta loss
+ --------------------------------
+ 0|3.784805e+02|0.000000e+00
+ 1|3.646476e+02|-3.654847e-02
+ 2|3.642970e+02|-9.615381e-04
+ 3|3.641622e+02|-3.699897e-04
+ 4|3.640886e+02|-2.021154e-04
+ 5|3.640419e+02|-1.280913e-04
+ 6|3.640096e+02|-8.898145e-05
+ 7|3.639858e+02|-6.514301e-05
+ 8|3.639677e+02|-4.977195e-05
+ 9|3.639534e+02|-3.936050e-05
+ 10|3.639417e+02|-3.205223e-05
+
+
+Plot original images
+--------------------
+
+
+
+.. code-block:: python
+
+
+ pl.figure(1, figsize=(6.4, 3))
+ pl.subplot(1, 2, 1)
+ pl.imshow(I1)
+ pl.axis('off')
+ pl.title('Image 1')
+
+ pl.subplot(1, 2, 2)
+ pl.imshow(I2)
+ pl.axis('off')
+ pl.title('Image 2')
+ pl.tight_layout()
+
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_001.png
+ :align: center
+
+
+
+
+Plot pixel values distribution
+------------------------------
+
+
+
+.. code-block:: python
+
+
+ pl.figure(2, figsize=(6.4, 5))
+
+ pl.subplot(1, 2, 1)
+ pl.scatter(Xs[:, 0], Xs[:, 2], c=Xs)
+ pl.axis([0, 1, 0, 1])
+ pl.xlabel('Red')
+ pl.ylabel('Blue')
+ pl.title('Image 1')
+
+ pl.subplot(1, 2, 2)
+ pl.scatter(Xt[:, 0], Xt[:, 2], c=Xt)
+ pl.axis([0, 1, 0, 1])
+ pl.xlabel('Red')
+ pl.ylabel('Blue')
+ pl.title('Image 2')
+ pl.tight_layout()
+
+
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_003.png
+ :align: center
+
+
+
+
+Plot transformed images
+-----------------------
+
+
+
+.. code-block:: python
+
+
+ pl.figure(3, figsize=(10, 5))
+
+ pl.subplot(2, 3, 1)
+ pl.imshow(I1)
+ pl.axis('off')
+ pl.title('Im. 1')
+
+ pl.subplot(2, 3, 4)
+ pl.imshow(I2)
+ pl.axis('off')
+ pl.title('Im. 2')
+
+ pl.subplot(2, 3, 2)
+ pl.imshow(Image_emd)
+ pl.axis('off')
+ pl.title('EmdTransport')
+
+ pl.subplot(2, 3, 5)
+ pl.imshow(Image_sinkhorn)
+ pl.axis('off')
+ pl.title('SinkhornTransport')
+
+ pl.subplot(2, 3, 3)
+ pl.imshow(Image_mapping_linear)
+ pl.axis('off')
+ pl.title('MappingTransport (linear)')
+
+ pl.subplot(2, 3, 6)
+ pl.imshow(Image_mapping_gaussian)
+ pl.axis('off')
+ pl.title('MappingTransport (gaussian)')
+ pl.tight_layout()
+
+ pl.show()
+
+
+
+.. image:: /auto_examples/images/sphx_glr_plot_otda_mapping_colors_images_004.png
+ :align: center
+
+
+
+
+**Total running time of the script:** ( 2 minutes 5.213 seconds)
+
+
+
+.. container:: sphx-glr-footer
+
+
+ .. container:: sphx-glr-download
+
+ :download:`Download Python source code: plot_otda_mapping_colors_images.py <plot_otda_mapping_colors_images.py>`
+
+
+
+ .. container:: sphx-glr-download
+
+ :download:`Download Jupyter notebook: plot_otda_mapping_colors_images.ipynb <plot_otda_mapping_colors_images.ipynb>`
+
+.. rst-class:: sphx-glr-signature
+
+ `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_
diff --git a/docs/source/auto_examples/searchindex b/docs/source/auto_examples/searchindex
new file mode 100644
index 0000000..2cad500
--- /dev/null
+++ b/docs/source/auto_examples/searchindex
Binary files differ
diff --git a/docs/source/conf.py b/docs/source/conf.py
index ff08899..4105d87 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -50,7 +50,7 @@ sys.path.insert(0, os.path.abspath("../.."))
#needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
-# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'sphinx.ext.autodoc',
@@ -62,7 +62,7 @@ extensions = [
'sphinx.ext.ifconfig',
'sphinx.ext.viewcode',
'sphinx.ext.napoleon',
-# 'sphinx_gallery.gen_gallery',
+ #'sphinx_gallery.gen_gallery',
]
# Add any paths that contain templates here, relative to this directory.
@@ -261,7 +261,7 @@ latex_elements = {
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'POT.tex', u'POT Python Optimal Transport library',
- u'Rémi Flamary, Nicolas Courty', 'manual'),
+ author, 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
@@ -305,7 +305,7 @@ man_pages = [
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'POT', u'POT Python Optimal Transport library Documentation',
- author, 'POT', 'One line description of project.',
+ author, 'POT', 'Python Optimal Transport library.',
'Miscellaneous'),
]
@@ -326,9 +326,9 @@ texinfo_documents = [
intersphinx_mapping = {'https://docs.python.org/': None}
sphinx_gallery_conf = {
- 'examples_dirs': '../../examples',
+ 'examples_dirs': ['../../examples','../../examples/da'],
'gallery_dirs': 'auto_examples',
- 'mod_example_dir': '../modules/generated/',
+ 'backreferences_dir': '../modules/generated/',
'reference_url': {
'numpy': 'http://docs.scipy.org/doc/numpy-1.9.1',
'scipy': 'http://docs.scipy.org/doc/scipy-0.17.0/reference'}
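For context, the conf.py hunk above turns `examples_dirs` into a list of directories and renames the old `mod_example_dir` option to `backreferences_dir`; the resulting configuration block, reconstructed from the diff (not verified against a live sphinx-gallery build), is:

```python
# sphinx-gallery settings after this change; paths are the ones from the diff
sphinx_gallery_conf = {
    # a list lets sphinx-gallery scan both the main and the 'da' example dirs
    'examples_dirs': ['../../examples', '../../examples/da'],
    # output directory for the generated gallery pages
    'gallery_dirs': 'auto_examples',
    # 'backreferences_dir' is the renamed successor of 'mod_example_dir';
    # it holds the per-function mini-gallery snippets
    'backreferences_dir': '../modules/generated/',
    'reference_url': {
        'numpy': 'http://docs.scipy.org/doc/numpy-1.9.1',
        'scipy': 'http://docs.scipy.org/doc/scipy-0.17.0/reference'},
}
```
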
diff --git a/docs/source/examples.rst b/docs/source/examples.rst
deleted file mode 100644
index f209543..0000000
--- a/docs/source/examples.rst
+++ /dev/null
@@ -1,39 +0,0 @@
-
-
-Examples
-============
-
-1D Optimal transport
----------------------
-
-.. literalinclude:: ../../examples/demo_OT_1D.py
-
-2D Optimal transport on empirical distributions
------------------------------------------------
-
-.. literalinclude:: ../../examples/demo_OT_2D_samples.py
-
-1D Wasserstein barycenter
--------------------------
-
-.. literalinclude:: ../../examples/demo_barycenter_1D.py
-
-OT with user provided regularization
-------------------------------------
-
-.. literalinclude:: ../../examples/demo_optim_OTreg.py
-
-Domain adaptation with optimal transport
-----------------------------------------
-
-.. literalinclude:: ../../examples/demo_OTDA_classes.py
-
-Color transfer in images
-------------------------
-
-.. literalinclude:: ../../examples/demo_OTDA_color_images.py
-
-OT mapping estimation for domain adaptation
--------------------------------------------
-
-.. literalinclude:: ../../examples/demo_OTDA_mapping.py
diff --git a/docs/source/readme.rst b/docs/source/readme.rst
index c1e0017..1482838 100644
--- a/docs/source/readme.rst
+++ b/docs/source/readme.rst
@@ -150,27 +150,27 @@ Here is a list of the Python notebooks available
want a quick look:
- `1D optimal
- transport <https://github.com/rflamary/POT/blob/master/notebooks/Demo_1D_OT.ipynb>`__
+ transport <https://github.com/rflamary/POT/blob/master/notebooks/plot_OT_1D.ipynb>`__
- `OT Ground
- Loss <https://github.com/rflamary/POT/blob/master/notebooks/Demo_Ground_Loss.ipynb>`__
+ Loss <https://github.com/rflamary/POT/blob/master/notebooks/plot_OT_L1_vs_L2.ipynb>`__
- `Multiple EMD
- computation <https://github.com/rflamary/POT/blob/master/notebooks/Demo_Compute_EMD.ipynb>`__
+ computation <https://github.com/rflamary/POT/blob/master/notebooks/plot_compute_emd.ipynb>`__
- `2D optimal transport on empirical
- distributions <https://github.com/rflamary/POT/blob/master/notebooks/Demo_2D_OT_samples.ipynb>`__
+ distributions <https://github.com/rflamary/POT/blob/master/notebooks/plot_OT_2D_samples.ipynb>`__
- `1D Wasserstein
- barycenter <https://github.com/rflamary/POT/blob/master/notebooks/Demo_1D_barycenter.ipynb>`__
+ barycenter <https://github.com/rflamary/POT/blob/master/notebooks/plot_barycenter_1D.ipynb>`__
- `OT with user provided
- regularization <https://github.com/rflamary/POT/blob/master/notebooks/Demo_Optim_OTreg.ipynb>`__
+ regularization <https://github.com/rflamary/POT/blob/master/notebooks/plot_optim_OTreg.ipynb>`__
- `Domain adaptation with optimal
- transport <https://github.com/rflamary/POT/blob/master/notebooks/Demo_2D_OT_DomainAdaptation.ipynb>`__
+ transport <https://github.com/rflamary/POT/blob/master/notebooks/plot_otda_d2.ipynb>`__
- `Color transfer in
- images <https://github.com/rflamary/POT/blob/master/notebooks/Demo_Image_ColorAdaptation.ipynb>`__
+ images <https://github.com/rflamary/POT/blob/master/notebooks/plot_otda_color_images.ipynb>`__
- `OT mapping estimation for domain
- adaptation <https://github.com/rflamary/POT/blob/master/notebooks/Demo_2D_OTmapping_DomainAdaptation.ipynb>`__
+ adaptation <https://github.com/rflamary/POT/blob/master/notebooks/plot_otda_mapping.ipynb>`__
- `OT mapping estimation for color transfer in
- images <https://github.com/rflamary/POT/blob/master/notebooks/Demo_Image_ColorAdaptation_mapping.ipynb>`__
+ images <https://github.com/rflamary/POT/blob/master/notebooks/plot_otda_mapping_colors_images.ipynb>`__
- `Wasserstein Discriminant
- Analysis <https://github.com/rflamary/POT/blob/master/notebooks/Demo_Wasserstein_Discriminant_Analysis.ipynb>`__
+ Analysis <https://github.com/rflamary/POT/blob/master/notebooks/plot_WDA.ipynb>`__
You can also see the notebooks with `Jupyter
nbviewer <https://nbviewer.jupyter.org/github/rflamary/POT/tree/master/notebooks/>`__.
@@ -187,6 +187,9 @@ The contributors to this library are:
- `Michael Perrot <http://perso.univ-st-etienne.fr/pem82055/>`__
(Mapping estimation)
- `Léo Gautheron <https://github.com/aje>`__ (GPU implementation)
+- `Nathalie
+ Gayraud <https://www.linkedin.com/in/nathalie-t-h-gayraud/?ppe=1>`__
+- `Stanislas Chambon <https://slasnista.github.io/>`__
This toolbox benefit a lot from open source research and we would like
to thank the following persons for providing some code (in various