.. _sphx_glr_auto_examples_plot_optim_OTreg.py:


==================================
Regularized OT with generic solver
==================================

This example illustrates the use of the generic solver for regularized OT with a
user-designed regularization term. It uses the conditional gradient algorithm as
in [6] and the generalized conditional gradient as proposed in [5] and [7].

[5] N. Courty, R. Flamary, D. Tuia, A. Rakotomamonjy, "Optimal Transport for
Domain Adaptation," IEEE Transactions on Pattern Analysis and Machine
Intelligence, vol. PP, no. 99, pp. 1-1.

[6] S. Ferradans, N. Papadakis, G. Peyré, J.-F. Aujol, "Regularized discrete
optimal transport," SIAM Journal on Imaging Sciences, 7(3), 1853-1882, 2014.

[7] A. Rakotomamonjy, R. Flamary, N. Courty, "Generalized conditional gradient:
analysis of convergence and applications," arXiv preprint arXiv:1510.06567, 2015.
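For convenience, the problem addressed by the generic solver can be summarized as
follows (this formulation is added here as a reading aid; see the POT
documentation of ``ot.optim.cg`` and ``ot.optim.gcg`` for the authoritative
statement). Given a differentiable regularizer :math:`f` supplied by the user
together with its gradient, ``ot.optim.cg`` looks for

.. math::

    \gamma^* = \arg\min_{\gamma}\ \langle \gamma, M \rangle_F
               + \mathrm{reg} \cdot f(\gamma)

    \text{s.t.}\quad \gamma \mathbf{1} = a, \quad
                     \gamma^T \mathbf{1} = b, \quad \gamma \geq 0,

while ``ot.optim.gcg`` additionally includes an entropic term
:math:`\mathrm{reg}_1 \sum_{i,j} \gamma_{i,j} \log \gamma_{i,j}` in the objective
and applies the generalized conditional gradient of [5] and [7].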
.. code-block:: python

    import numpy as np
    import matplotlib.pylab as pl
    import ot


Generate data
-------------

.. code-block:: python

    #%% parameters

    n = 100  # nb bins

    # bin positions
    x = np.arange(n, dtype=np.float64)

    # Gaussian distributions
    a = ot.datasets.get_1D_gauss(n, m=20, s=5)  # m = mean, s = std
    b = ot.datasets.get_1D_gauss(n, m=60, s=10)

    # loss matrix
    M = ot.dist(x.reshape((n, 1)), x.reshape((n, 1)))
    M /= M.max()
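Before solving anything, it can help to look at the inputs. The following sketch
is an addition to the example; it reuses ``x``, ``a``, ``b``, ``M`` and the
``pl`` alias defined above, plots the two marginals and displays the normalized
cost matrix. Figure numbers 1 and 2 are chosen so as not to clash with the
figures used below.

.. code-block:: python

    #%% (optional) visualize the marginals and the cost matrix
    pl.figure(1, figsize=(6, 3))
    pl.plot(x, a, 'b', label='Source distribution a')
    pl.plot(x, b, 'r', label='Target distribution b')
    pl.legend()

    pl.figure(2, figsize=(4, 4))
    pl.imshow(M, interpolation='nearest')
    pl.title('Cost matrix M')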
Solve EMD
---------

.. code-block:: python

    #%% EMD

    G0 = ot.emd(a, b, M)

    pl.figure(3, figsize=(5, 5))
    ot.plot.plot1D_mat(a, b, G0, 'OT matrix G0')


.. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_003.png
    :align: center
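The exact plan returned by ``ot.emd`` provides a useful baseline for the
regularized solvers below. The small sketch that follows is an addition to the
example; it reuses the variables defined above, checks that ``G0`` satisfies the
marginal constraints and prints the unregularized transport cost.

.. code-block:: python

    #%% sanity checks on the exact OT plan
    print('max row-marginal error:', np.abs(G0.sum(axis=1) - a).max())
    print('max col-marginal error:', np.abs(G0.sum(axis=0) - b).max())
    print('transport cost <G0, M>:', np.sum(G0 * M))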
Solve EMD with Frobenius norm regularization
--------------------------------------------

.. code-block:: python

    #%% Example with Frobenius norm regularization

    def f(G):
        return 0.5 * np.sum(G**2)

    def df(G):
        return G

    reg = 1e-1

    Gl2 = ot.optim.cg(a, b, M, reg, f, df, verbose=True)

    pl.figure(3)
    ot.plot.plot1D_mat(a, b, Gl2, 'OT matrix Frob. reg')


.. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_004.png
    :align: center


.. rst-class:: sphx-glr-script-out

 Out::

    It.  |Loss        |Delta loss
    --------------------------------
        0|1.760578e-01|0.000000e+00
        1|1.669467e-01|-5.457501e-02
        2|1.665639e-01|-2.298130e-03
        3|1.664378e-01|-7.572776e-04
        4|1.664077e-01|-1.811855e-04
        5|1.663912e-01|-9.936787e-05
        6|1.663852e-01|-3.555826e-05
        7|1.663814e-01|-2.305693e-05
        8|1.663785e-01|-1.760450e-05
        9|1.663767e-01|-1.078011e-05
       10|1.663751e-01|-9.525192e-06
       11|1.663737e-01|-8.396466e-06
       12|1.663727e-01|-6.086938e-06
       13|1.663720e-01|-4.042609e-06
       14|1.663713e-01|-4.160914e-06
       15|1.663707e-01|-3.823502e-06
       16|1.663702e-01|-3.022440e-06
       17|1.663697e-01|-3.181249e-06
       18|1.663692e-01|-2.698532e-06
       19|1.663687e-01|-3.258253e-06
    It.  |Loss        |Delta loss
    --------------------------------
       20|1.663682e-01|-2.741118e-06
       21|1.663678e-01|-2.624135e-06
       22|1.663673e-01|-2.645179e-06
       23|1.663670e-01|-1.957237e-06
       24|1.663666e-01|-2.261541e-06
       25|1.663663e-01|-1.851305e-06
       26|1.663660e-01|-1.942296e-06
       27|1.663657e-01|-2.092896e-06
       28|1.663653e-01|-1.924361e-06
       29|1.663651e-01|-1.625455e-06
       30|1.663648e-01|-1.641123e-06
       31|1.663645e-01|-1.566666e-06
       32|1.663643e-01|-1.338514e-06
       33|1.663641e-01|-1.222711e-06
       34|1.663639e-01|-1.221805e-06
       35|1.663637e-01|-1.440781e-06
       36|1.663634e-01|-1.520091e-06
       37|1.663632e-01|-1.288193e-06
       38|1.663630e-01|-1.123055e-06
       39|1.663628e-01|-1.024487e-06
    It.  |Loss        |Delta loss
    --------------------------------
       40|1.663627e-01|-1.079606e-06
       41|1.663625e-01|-1.172093e-06
       42|1.663623e-01|-1.047880e-06
       43|1.663621e-01|-1.010577e-06
       44|1.663619e-01|-1.064438e-06
       45|1.663618e-01|-9.882375e-07
       46|1.663616e-01|-8.532647e-07
       47|1.663615e-01|-9.930189e-07
       48|1.663613e-01|-8.728955e-07
       49|1.663612e-01|-9.524214e-07
       50|1.663610e-01|-9.088418e-07
       51|1.663609e-01|-7.639430e-07
       52|1.663608e-01|-6.662611e-07
       53|1.663607e-01|-7.133700e-07
       54|1.663605e-01|-7.648141e-07
       55|1.663604e-01|-6.557516e-07
       56|1.663603e-01|-7.304213e-07
       57|1.663602e-01|-6.353809e-07
       58|1.663601e-01|-7.968279e-07
       59|1.663600e-01|-6.367159e-07
    It.  |Loss        |Delta loss
    --------------------------------
       60|1.663599e-01|-5.610790e-07
       61|1.663598e-01|-5.787466e-07
       62|1.663596e-01|-6.937777e-07
       63|1.663596e-01|-5.599432e-07
       64|1.663595e-01|-5.813048e-07
       65|1.663594e-01|-5.724600e-07
       66|1.663593e-01|-6.081892e-07
       67|1.663592e-01|-5.948732e-07
       68|1.663591e-01|-4.941833e-07
       69|1.663590e-01|-5.213739e-07
       70|1.663589e-01|-5.127355e-07
       71|1.663588e-01|-4.349251e-07
       72|1.663588e-01|-5.007084e-07
       73|1.663587e-01|-4.880265e-07
       74|1.663586e-01|-4.931950e-07
       75|1.663585e-01|-4.981309e-07
       76|1.663584e-01|-3.952959e-07
       77|1.663584e-01|-4.544857e-07
       78|1.663583e-01|-4.237579e-07
       79|1.663582e-01|-4.382386e-07
    It.  |Loss        |Delta loss
    --------------------------------
       80|1.663582e-01|-3.646051e-07
       81|1.663581e-01|-4.197994e-07
       82|1.663580e-01|-4.072764e-07
       83|1.663580e-01|-3.994645e-07
       84|1.663579e-01|-4.842721e-07
       85|1.663578e-01|-3.276486e-07
       86|1.663578e-01|-3.737346e-07
       87|1.663577e-01|-4.282043e-07
       88|1.663576e-01|-4.020937e-07
       89|1.663576e-01|-3.431951e-07
       90|1.663575e-01|-3.052335e-07
       91|1.663575e-01|-3.500538e-07
       92|1.663574e-01|-3.063176e-07
       93|1.663573e-01|-3.576367e-07
       94|1.663573e-01|-3.224681e-07
       95|1.663572e-01|-3.673221e-07
       96|1.663572e-01|-3.635561e-07
       97|1.663571e-01|-3.527236e-07
       98|1.663571e-01|-2.788548e-07
       99|1.663570e-01|-2.727141e-07
    It.  |Loss        |Delta loss
    --------------------------------
      100|1.663570e-01|-3.127278e-07
      101|1.663569e-01|-2.637504e-07
      102|1.663569e-01|-2.922750e-07
      103|1.663568e-01|-3.076454e-07
      104|1.663568e-01|-2.911509e-07
      105|1.663567e-01|-2.403398e-07
      106|1.663567e-01|-2.439790e-07
      107|1.663567e-01|-2.634542e-07
      108|1.663566e-01|-2.452203e-07
      109|1.663566e-01|-2.852991e-07
      110|1.663565e-01|-2.165490e-07
      111|1.663565e-01|-2.450250e-07
      112|1.663564e-01|-2.685294e-07
      113|1.663564e-01|-2.821800e-07
      114|1.663564e-01|-2.237390e-07
      115|1.663563e-01|-1.992842e-07
      116|1.663563e-01|-2.166739e-07
      117|1.663563e-01|-2.086064e-07
      118|1.663562e-01|-2.435945e-07
      119|1.663562e-01|-2.292497e-07
    It.  |Loss        |Delta loss
    --------------------------------
      120|1.663561e-01|-2.366209e-07
      121|1.663561e-01|-2.138746e-07
      122|1.663561e-01|-2.009637e-07
      123|1.663560e-01|-2.386258e-07
      124|1.663560e-01|-1.927442e-07
      125|1.663560e-01|-2.081681e-07
      126|1.663559e-01|-1.759123e-07
      127|1.663559e-01|-1.890771e-07
      128|1.663559e-01|-1.971315e-07
      129|1.663558e-01|-2.101983e-07
      130|1.663558e-01|-2.035645e-07
      131|1.663558e-01|-1.984492e-07
      132|1.663557e-01|-1.849064e-07
      133|1.663557e-01|-1.795703e-07
      134|1.663557e-01|-1.624087e-07
      135|1.663557e-01|-1.689557e-07
      136|1.663556e-01|-1.644308e-07
      137|1.663556e-01|-1.618007e-07
      138|1.663556e-01|-1.483013e-07
      139|1.663555e-01|-1.708771e-07
    It.  |Loss        |Delta loss
    --------------------------------
      140|1.663555e-01|-2.013847e-07
      141|1.663555e-01|-1.721217e-07
      142|1.663554e-01|-2.027911e-07
      143|1.663554e-01|-1.764565e-07
      144|1.663554e-01|-1.677151e-07
      145|1.663554e-01|-1.351982e-07
      146|1.663553e-01|-1.423360e-07
      147|1.663553e-01|-1.541112e-07
      148|1.663553e-01|-1.491601e-07
      149|1.663553e-01|-1.466407e-07
      150|1.663552e-01|-1.801524e-07
      151|1.663552e-01|-1.714107e-07
      152|1.663552e-01|-1.491257e-07
      153|1.663552e-01|-1.513799e-07
      154|1.663551e-01|-1.354539e-07
      155|1.663551e-01|-1.233818e-07
      156|1.663551e-01|-1.576219e-07
      157|1.663551e-01|-1.452791e-07
      158|1.663550e-01|-1.262867e-07
      159|1.663550e-01|-1.316379e-07
    It.  |Loss        |Delta loss
    --------------------------------
      160|1.663550e-01|-1.295447e-07
      161|1.663550e-01|-1.283286e-07
      162|1.663550e-01|-1.569222e-07
      163|1.663549e-01|-1.172942e-07
      164|1.663549e-01|-1.399809e-07
      165|1.663549e-01|-1.229432e-07
      166|1.663549e-01|-1.326191e-07
      167|1.663548e-01|-1.209694e-07
      168|1.663548e-01|-1.372136e-07
      169|1.663548e-01|-1.338395e-07
      170|1.663548e-01|-1.416497e-07
      171|1.663548e-01|-1.298576e-07
      172|1.663547e-01|-1.190590e-07
      173|1.663547e-01|-1.167083e-07
      174|1.663547e-01|-1.069425e-07
      175|1.663547e-01|-1.217780e-07
      176|1.663547e-01|-1.140754e-07
      177|1.663546e-01|-1.160707e-07
      178|1.663546e-01|-1.101798e-07
      179|1.663546e-01|-1.114904e-07
    It.  |Loss        |Delta loss
    --------------------------------
      180|1.663546e-01|-1.064022e-07
      181|1.663546e-01|-9.258231e-08
      182|1.663546e-01|-1.213120e-07
      183|1.663545e-01|-1.164296e-07
      184|1.663545e-01|-1.188762e-07
      185|1.663545e-01|-9.394153e-08
      186|1.663545e-01|-1.028656e-07
      187|1.663545e-01|-1.115348e-07
      188|1.663544e-01|-9.768310e-08
      189|1.663544e-01|-1.021806e-07
      190|1.663544e-01|-1.086303e-07
      191|1.663544e-01|-9.879008e-08
      192|1.663544e-01|-1.050210e-07
      193|1.663544e-01|-1.002463e-07
      194|1.663543e-01|-1.062747e-07
      195|1.663543e-01|-9.348538e-08
      196|1.663543e-01|-7.992512e-08
      197|1.663543e-01|-9.558020e-08
      198|1.663543e-01|-9.993772e-08
      199|1.663543e-01|-8.588499e-08
    It.  |Loss        |Delta loss
    --------------------------------
      200|1.663543e-01|-8.737134e-08
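The ``Loss`` column printed by ``ot.optim.cg`` is the composite objective, i.e.
the linear OT cost plus the regularization term. As a quick check, it can be
recomputed from the returned plan; the sketch below is an addition to the
example and uses the ``f`` and ``reg`` defined in this section.

.. code-block:: python

    #%% relate the printed loss to the returned plan
    cost_term = np.sum(Gl2 * M)   # linear transport cost
    reg_term = reg * f(Gl2)       # Frobenius (squared l2) penalty
    print('objective = {:e} (cost {:e} + regularization {:e})'.format(
        cost_term + reg_term, cost_term, reg_term))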
Solve EMD with entropic regularization
--------------------------------------

.. code-block:: python

    #%% Example with entropic regularization

    def f(G):
        return np.sum(G * np.log(G))

    def df(G):
        return np.log(G) + 1.

    reg = 1e-3

    Ge = ot.optim.cg(a, b, M, reg, f, df, verbose=True)

    pl.figure(4, figsize=(5, 5))
    ot.plot.plot1D_mat(a, b, Ge, 'OT matrix Entrop. reg')


.. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_006.png
    :align: center


.. rst-class:: sphx-glr-script-out

 Out::

    It.  |Loss        |Delta loss
    --------------------------------
        0|1.692289e-01|0.000000e+00
        1|1.617643e-01|-4.614437e-02
        2|1.612546e-01|-3.161037e-03
        3|1.611040e-01|-9.349544e-04
        4|1.610346e-01|-4.310179e-04
        5|1.610072e-01|-1.701719e-04
        6|1.609947e-01|-7.759814e-05
        7|1.609934e-01|-7.941439e-06
        8|1.609841e-01|-5.797180e-05
        9|1.609838e-01|-1.559407e-06
       10|1.609685e-01|-9.530282e-05
       11|1.609666e-01|-1.142129e-05
       12|1.609541e-01|-7.799970e-05
       13|1.609496e-01|-2.780416e-05
       14|1.609385e-01|-6.887105e-05
       15|1.609334e-01|-3.174241e-05
       16|1.609231e-01|-6.420777e-05
       17|1.609115e-01|-7.189949e-05
       18|1.608815e-01|-1.865331e-04
       19|1.608799e-01|-1.013039e-05
    It.  |Loss        |Delta loss
    --------------------------------
       20|1.608695e-01|-6.468606e-05
       21|1.608686e-01|-5.738419e-06
       22|1.608661e-01|-1.495923e-05
       23|1.608657e-01|-2.784611e-06
       24|1.608633e-01|-1.512408e-05
       25|1.608624e-01|-5.397916e-06
       26|1.608617e-01|-4.115218e-06
       27|1.608561e-01|-3.503396e-05
       28|1.608479e-01|-5.098773e-05
       29|1.608452e-01|-1.659203e-05
       30|1.608399e-01|-3.298319e-05
       31|1.608330e-01|-4.302183e-05
       32|1.608310e-01|-1.273465e-05
       33|1.608280e-01|-1.827713e-05
       34|1.608231e-01|-3.039842e-05
       35|1.608212e-01|-1.229256e-05
       36|1.608200e-01|-6.900556e-06
       37|1.608159e-01|-2.554039e-05
       38|1.608103e-01|-3.521137e-05
       39|1.608058e-01|-2.795180e-05
    It.  |Loss        |Delta loss
    --------------------------------
       40|1.608040e-01|-1.119118e-05
       41|1.608027e-01|-8.193369e-06
       42|1.607994e-01|-2.026719e-05
       43|1.607985e-01|-5.819902e-06
       44|1.607978e-01|-4.048170e-06
       45|1.607978e-01|-3.007470e-07
       46|1.607950e-01|-1.705375e-05
       47|1.607927e-01|-1.430186e-05
       48|1.607925e-01|-1.166526e-06
       49|1.607911e-01|-9.069406e-06
       50|1.607910e-01|-3.804209e-07
       51|1.607910e-01|-5.942399e-08
       52|1.607910e-01|-2.321380e-07
       53|1.607907e-01|-1.877655e-06
       54|1.607906e-01|-2.940224e-07
       55|1.607877e-01|-1.814208e-05
       56|1.607841e-01|-2.236496e-05
       57|1.607810e-01|-1.951355e-05
       58|1.607804e-01|-3.578228e-06
       59|1.607789e-01|-9.442277e-06
    It.  |Loss        |Delta loss
    --------------------------------
       60|1.607779e-01|-5.997371e-06
       61|1.607754e-01|-1.564408e-05
       62|1.607742e-01|-7.693285e-06
       63|1.607727e-01|-9.030547e-06
       64|1.607719e-01|-5.103894e-06
       65|1.607693e-01|-1.605420e-05
       66|1.607676e-01|-1.047837e-05
       67|1.607675e-01|-6.026848e-07
       68|1.607655e-01|-1.240216e-05
       69|1.607632e-01|-1.434674e-05
       70|1.607618e-01|-8.829808e-06
       71|1.607606e-01|-7.581824e-06
       72|1.607590e-01|-1.009457e-05
       73|1.607586e-01|-2.222963e-06
       74|1.607577e-01|-5.564775e-06
       75|1.607574e-01|-1.932763e-06
       76|1.607573e-01|-8.148685e-07
       77|1.607554e-01|-1.187660e-05
       78|1.607546e-01|-4.557651e-06
       79|1.607537e-01|-5.911902e-06
    It.  |Loss        |Delta loss
    --------------------------------
       80|1.607529e-01|-4.710187e-06
       81|1.607528e-01|-8.866080e-07
       82|1.607522e-01|-3.620627e-06
       83|1.607514e-01|-5.091281e-06
       84|1.607498e-01|-9.932095e-06
       85|1.607487e-01|-6.852804e-06
       86|1.607478e-01|-5.373596e-06
       87|1.607473e-01|-3.287295e-06
       88|1.607470e-01|-1.666655e-06
       89|1.607469e-01|-5.293790e-07
       90|1.607466e-01|-2.051914e-06
       91|1.607456e-01|-6.422797e-06
       92|1.607456e-01|-1.110433e-07
       93|1.607451e-01|-2.803849e-06
       94|1.607451e-01|-2.608066e-07
       95|1.607441e-01|-6.290352e-06
       96|1.607429e-01|-7.298455e-06
       97|1.607429e-01|-8.969905e-09
       98|1.607427e-01|-7.923968e-07
       99|1.607427e-01|-3.519286e-07
    It.  |Loss        |Delta loss
    --------------------------------
      100|1.607426e-01|-3.563804e-07
      101|1.607410e-01|-1.004042e-05
      102|1.607410e-01|-2.124801e-07
      103|1.607398e-01|-7.556935e-06
      104|1.607398e-01|-7.606853e-08
      105|1.607385e-01|-8.058684e-06
      106|1.607383e-01|-7.393061e-07
      107|1.607381e-01|-1.504958e-06
      108|1.607377e-01|-2.508807e-06
      109|1.607371e-01|-4.004631e-06
      110|1.607365e-01|-3.580156e-06
      111|1.607364e-01|-2.563573e-07
      112|1.607354e-01|-6.390137e-06
      113|1.607348e-01|-4.119553e-06
      114|1.607339e-01|-5.299475e-06
      115|1.607335e-01|-2.316767e-06
      116|1.607330e-01|-3.444737e-06
      117|1.607324e-01|-3.467980e-06
      118|1.607320e-01|-2.374632e-06
      119|1.607319e-01|-7.978255e-07
    It.  |Loss        |Delta loss
    --------------------------------
      120|1.607312e-01|-4.221434e-06
      121|1.607310e-01|-1.324597e-06
      122|1.607304e-01|-3.650359e-06
      123|1.607298e-01|-3.732712e-06
      124|1.607295e-01|-1.994082e-06
      125|1.607289e-01|-3.954139e-06
      126|1.607286e-01|-1.532372e-06
      127|1.607286e-01|-1.167223e-07
      128|1.607283e-01|-2.157376e-06
      129|1.607279e-01|-2.253077e-06
      130|1.607274e-01|-3.301532e-06
      131|1.607269e-01|-2.650754e-06
      132|1.607264e-01|-3.595551e-06
      133|1.607262e-01|-1.159425e-06
      134|1.607258e-01|-2.512411e-06
      135|1.607255e-01|-1.998792e-06
      136|1.607251e-01|-2.486536e-06
      137|1.607246e-01|-2.782996e-06
      138|1.607246e-01|-2.922470e-07
      139|1.607242e-01|-2.071131e-06
    It.  |Loss        |Delta loss
    --------------------------------
      140|1.607237e-01|-3.154193e-06
      141|1.607235e-01|-1.194962e-06
      142|1.607232e-01|-2.035251e-06
      143|1.607232e-01|-6.027855e-08
      144|1.607229e-01|-1.555696e-06
      145|1.607228e-01|-1.081740e-06
      146|1.607225e-01|-1.881070e-06
      147|1.607224e-01|-4.100096e-07
      148|1.607223e-01|-7.785200e-07
      149|1.607222e-01|-2.094072e-07
      150|1.607220e-01|-1.440814e-06
      151|1.607217e-01|-1.997794e-06
      152|1.607214e-01|-2.011022e-06
      153|1.607212e-01|-8.808854e-07
      154|1.607211e-01|-7.245877e-07
      155|1.607207e-01|-2.217159e-06
      156|1.607201e-01|-3.817891e-06
      157|1.607200e-01|-7.409600e-07
      158|1.607198e-01|-1.497698e-06
      159|1.607195e-01|-1.729666e-06
    It.  |Loss        |Delta loss
    --------------------------------
      160|1.607195e-01|-2.115187e-07
      161|1.607192e-01|-1.643727e-06
      162|1.607192e-01|-1.712969e-07
      163|1.607189e-01|-1.805877e-06
      164|1.607189e-01|-1.209827e-07
      165|1.607185e-01|-2.060002e-06
      166|1.607182e-01|-1.961341e-06
      167|1.607181e-01|-1.020366e-06
      168|1.607179e-01|-9.760982e-07
      169|1.607178e-01|-7.219236e-07
      170|1.607175e-01|-1.837718e-06
      171|1.607174e-01|-3.337578e-07
      172|1.607173e-01|-5.298564e-07
      173|1.607173e-01|-6.864278e-08
      174|1.607173e-01|-2.008419e-07
      175|1.607171e-01|-1.375630e-06
      176|1.607168e-01|-1.911257e-06
      177|1.607167e-01|-2.709815e-07
      178|1.607167e-01|-1.390953e-07
      179|1.607165e-01|-1.199675e-06
    It.  |Loss        |Delta loss
    --------------------------------
      180|1.607165e-01|-1.457259e-07
      181|1.607163e-01|-1.049154e-06
      182|1.607163e-01|-2.753577e-09
      183|1.607163e-01|-6.972814e-09
      184|1.607161e-01|-1.552100e-06
      185|1.607159e-01|-1.068596e-06
      186|1.607157e-01|-1.247724e-06
      187|1.607155e-01|-1.158164e-06
      188|1.607155e-01|-2.616199e-07
      189|1.607154e-01|-3.595874e-07
      190|1.607154e-01|-5.334527e-08
      191|1.607153e-01|-3.452744e-07
      192|1.607153e-01|-1.239593e-07
      193|1.607152e-01|-8.184984e-07
      194|1.607150e-01|-1.316308e-06
      195|1.607150e-01|-7.100882e-09
      196|1.607148e-01|-1.393958e-06
      197|1.607146e-01|-1.242735e-06
      198|1.607144e-01|-1.123993e-06
      199|1.607143e-01|-3.512071e-07
    It.  |Loss        |Delta loss
    --------------------------------
      200|1.607143e-01|-2.151971e-10
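Entropy-regularized OT can also be solved directly with the Sinkhorn algorithm.
As a point of comparison, the sketch below (an addition to the example, reusing
``a``, ``b``, ``M`` and the entropic ``reg`` defined just above) computes the
Sinkhorn plan for the same regularization strength; since the conditional
gradient is still decreasing after 200 iterations, only rough agreement between
the two plans should be expected.

.. code-block:: python

    #%% compare with the dedicated Sinkhorn solver for entropic OT
    Gs = ot.sinkhorn(a, b, M, reg)
    print('max absolute difference |Ge - Gs|:', np.abs(Ge - Gs).max())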
Solve EMD with Frobenius norm + entropic regularization
-------------------------------------------------------

.. code-block:: python

    #%% Example with Frobenius norm + entropic regularization with gcg

    def f(G):
        return 0.5 * np.sum(G**2)

    def df(G):
        return G

    reg1 = 1e-3
    reg2 = 1e-1

    Gel2 = ot.optim.gcg(a, b, M, reg1, reg2, f, df, verbose=True)

    pl.figure(5, figsize=(5, 5))
    ot.plot.plot1D_mat(a, b, Gel2, 'OT entropic + matrix Frob. reg')
    pl.show()


.. image:: /auto_examples/images/sphx_glr_plot_optim_OTreg_008.png
    :align: center


.. rst-class:: sphx-glr-script-out

 Out::

    It.  |Loss        |Delta loss
    --------------------------------
        0|1.693084e-01|0.000000e+00
        1|1.610121e-01|-5.152589e-02
        2|1.609378e-01|-4.622297e-04
        3|1.609284e-01|-5.830043e-05
        4|1.609284e-01|-1.111407e-12

**Total running time of the script:** ( 0 minutes 1.809 seconds)


.. container:: sphx-glr-footer


  .. container:: sphx-glr-download

     :download:`Download Python source code: plot_optim_OTreg.py <plot_optim_OTreg.py>`


  .. container:: sphx-glr-download

     :download:`Download Jupyter notebook: plot_optim_OTreg.ipynb <plot_optim_OTreg.ipynb>`


.. rst-class:: sphx-glr-signature

    `Generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_