.. _sphx_glr_auto_examples_plot_WDA.py:

=================================
Wasserstein Discriminant Analysis
=================================

This example illustrates the use of WDA as proposed in [11].

[11] Flamary, R., Cuturi, M., Courty, N., & Rakotomamonjy, A. (2016).
Wasserstein Discriminant Analysis.

.. code-block:: python

    # Author: Remi Flamary
    #
    # License: MIT License

    import numpy as np
    import matplotlib.pylab as pl

    from ot.dr import wda, fda

Generate data
-------------

.. code-block:: python

    #%% parameters

    n = 1000  # nb samples in source and target datasets
    nz = 0.2

    # generate circle dataset
    t = np.random.rand(n) * 2 * np.pi
    ys = np.floor((np.arange(n) * 1.0 / n * 3)) + 1
    xs = np.concatenate(
        (np.cos(t).reshape((-1, 1)), np.sin(t).reshape((-1, 1))), 1)
    xs = xs * ys.reshape(-1, 1) + nz * np.random.randn(n, 2)

    t = np.random.rand(n) * 2 * np.pi
    yt = np.floor((np.arange(n) * 1.0 / n * 3)) + 1
    xt = np.concatenate(
        (np.cos(t).reshape((-1, 1)), np.sin(t).reshape((-1, 1))), 1)
    xt = xt * yt.reshape(-1, 1) + nz * np.random.randn(n, 2)

    nbnoise = 8

    xs = np.hstack((xs, np.random.randn(n, nbnoise)))
    xt = np.hstack((xt, np.random.randn(n, nbnoise)))

Plot data
---------

.. code-block:: python

    #%% plot samples
    pl.figure(1, figsize=(6.4, 3.5))

    pl.subplot(1, 2, 1)
    pl.scatter(xs[:, 0], xs[:, 1], c=ys, marker='+', label='Source samples')
    pl.legend(loc=0)
    pl.title('Discriminant dimensions')

    pl.subplot(1, 2, 2)
    pl.scatter(xs[:, 2], xs[:, 3], c=ys, marker='+', label='Source samples')
    pl.legend(loc=0)
    pl.title('Other dimensions')
    pl.tight_layout()

.. image:: /auto_examples/images/sphx_glr_plot_WDA_001.png
    :align: center

Compute Fisher Discriminant Analysis
------------------------------------

.. code-block:: python

    #%% Compute FDA
    p = 2

    Pfda, projfda = fda(xs, ys, p)

Compute Wasserstein Discriminant Analysis
-----------------------------------------

.. code-block:: python

    #%% Compute WDA
    p = 2
    reg = 1e0
    k = 10
    maxiter = 100

    Pwda, projwda = wda(xs, ys, p, reg, k, maxiter=maxiter)
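As a reminder of what the call to ``wda`` above optimizes, here is a sketch of the objective from [11] (notation simplified; the implementation in ``ot.dr`` may differ in its details). The projection :math:`P` with orthonormal columns maximizes the ratio of between-class to within-class regularized Wasserstein distances:

.. math::

    P^\star = \arg\max_{P^\top P = I_p}
    \frac{\sum_{i}\sum_{j > i} W_\lambda(P^\top X^i, P^\top X^j)}
         {\sum_{i} W_\lambda(P^\top X^i, P^\top X^i)}

where :math:`X^c` denotes the samples of class :math:`c` and :math:`W_\lambda` is the entropically regularized Wasserstein distance computed with Sinkhorn iterations. In the call above, ``reg`` sets the entropic regularization strength, ``k`` the number of Sinkhorn iterations, and the problem is solved by gradient descent on the Stiefel manifold, which is why the solver log below reports gradient norms.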
.. rst-class:: sphx-glr-script-out

 Out::

    Compiling cost function...
    Computing gradient of cost function...
     iter              cost val         grad. norm
        1   +5.4993226050368416e-01 5.18285173e-01
        2   +3.4883000507542844e-01 1.96795818e-01
        3   +2.9841234004693890e-01 2.33029475e-01
        4   +2.3976476757548179e-01 1.38593951e-01
        5   +2.3614468346177828e-01 1.19615394e-01
        6   +2.2586536502789240e-01 4.82430685e-02
        7   +2.2451030967794622e-01 2.56564039e-02
        8   +2.2421446331083625e-01 1.47932578e-02
        9   +2.2407441444450052e-01 1.12040327e-03
       10   +2.2407365923337522e-01 3.78899763e-04
       11   +2.2407356874011675e-01 1.79740810e-05
       12   +2.2407356862959993e-01 1.25643005e-05
       13   +2.2407356853043561e-01 1.40415001e-06
       14   +2.2407356852925220e-01 3.41183585e-07
    Terminated - min grad norm reached after 14 iterations, 6.78 seconds.

Plot 2D projections
-------------------

.. code-block:: python

    #%% plot samples

    xsp = projfda(xs)
    xtp = projfda(xt)

    xspw = projwda(xs)
    xtpw = projwda(xt)

    pl.figure(2)

    pl.subplot(2, 2, 1)
    pl.scatter(xsp[:, 0], xsp[:, 1], c=ys, marker='+', label='Projected samples')
    pl.legend(loc=0)
    pl.title('Projected training samples FDA')

    pl.subplot(2, 2, 2)
    pl.scatter(xtp[:, 0], xtp[:, 1], c=ys, marker='+', label='Projected samples')
    pl.legend(loc=0)
    pl.title('Projected test samples FDA')

    pl.subplot(2, 2, 3)
    pl.scatter(xspw[:, 0], xspw[:, 1], c=ys, marker='+', label='Projected samples')
    pl.legend(loc=0)
    pl.title('Projected training samples WDA')

    pl.subplot(2, 2, 4)
    pl.scatter(xtpw[:, 0], xtpw[:, 1], c=ys, marker='+', label='Projected samples')
    pl.legend(loc=0)
    pl.title('Projected test samples WDA')
    pl.tight_layout()

    pl.show()

.. image:: /auto_examples/images/sphx_glr_plot_WDA_003.png
    :align: center

**Total running time of the script:** ( 0 minutes 7.637 seconds)

.. container:: sphx-glr-footer

    .. container:: sphx-glr-download

        :download:`Download Python source code: plot_WDA.py <plot_WDA.py>`

    .. container:: sphx-glr-download

        :download:`Download Jupyter notebook: plot_WDA.ipynb <plot_WDA.ipynb>`
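Beyond visual inspection of the projections, a simple quantitative check is nearest-neighbour accuracy in the projected space. The helper below is a minimal sketch in plain NumPy (not part of the original script); it assumes you would feed it the outputs of ``projfda``/``projwda`` computed above.

```python
import numpy as np


def nn_accuracy(Xtr, ytr, Xte, yte):
    """1-nearest-neighbour accuracy of test points against training points."""
    # Pairwise squared Euclidean distances, shape (n_test, n_train).
    d2 = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(axis=-1)
    # Each test point takes the label of its closest training point.
    pred = ytr[d2.argmin(axis=1)]
    return float((pred == yte).mean())


# Hypothetical usage with the projections from the script above:
#   nn_accuracy(projwda(xs), ys, projwda(xt), yt)   # WDA subspace
#   nn_accuracy(projfda(xs), ys, projfda(xt), yt)   # FDA subspace
```

A higher score on the held-out ``xt`` indicates that the learned subspace retains more class structure; on this dataset the noise dimensions penalize FDA, which only uses class means and covariances.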
.. rst-class:: sphx-glr-signature

    `Generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_