author     Cédric Vincent-Cuaz <cedvincentcuaz@gmail.com>  2023-03-16 08:05:54 +0100
committer  GitHub <noreply@github.com>                     2023-03-16 08:05:54 +0100
commit     583501652517c4f1dbd8572e9f942551a9e54a1f (patch)
tree       fadb96f888924b2d1bef01b78486e97a88ebcd42 /test/test_gromov.py
parent     8f56effe7320991ebdc6457a2cf1d3b6648a09d1 (diff)
[MRG] fix bugs of gw_entropic and armijo to run on gpu (#446)
* update gw / srgw / generic cg solver
* correct pep8 on current state
* fix bug previous tests
* fix pep8
* fix bug srGW constC in loss and gradient
* fix doc html
* fix doc html
* start updating test_optim.py
* update tests gromov and optim - plus fix gromov dependencies
* add symmetry feature to entropic gw
* add symmetry feature to entropic gw
* add example for sr(F)GW matchings
* small stuff
* remove (reg,M) from line-search/ complete srgw tests with backend
* remove backend repetitions / rename fG to costG/ fix innerlog to True
* fix pep8
* take comments into account / new nx parameters still to test
* factor (f)gw2 + test new backend parameters in ot.gromov + harmonize stopping criteria
* split gromov.py in ot/gromov/ + update test_gromov with helper_backend functions
* manual documentation gromov
* remove circular autosummary
* trying stuff
* debug documentation
* alphabetic ordering of module
* merge into branch
* add note in entropic gw solvers
* fix examples/gromov doc
* add fixed issue to releases.md
* fix bugs of gw_entropic and armijo to run on gpu
* add pr to releases.md
* fix pep8
* fix call to backend in line_search_armijo
* correct docstring generic_conditional_gradient
---------
Co-authored-by: Rémi Flamary <remi.flamary@gmail.com>
Diffstat (limited to 'test/test_gromov.py')
-rw-r--r--  test/test_gromov.py  16
1 file changed, 16 insertions, 0 deletions
diff --git a/test/test_gromov.py b/test/test_gromov.py
index cfccce7..80b6df4 100644
--- a/test/test_gromov.py
+++ b/test/test_gromov.py
@@ -214,6 +214,7 @@ def test_gromov2_gradients():
         C11 = torch.tensor(C1, requires_grad=True, device=device)
         C12 = torch.tensor(C2, requires_grad=True, device=device)
 
+        # Test with exact line-search
         val = ot.gromov_wasserstein2(C11, C12, p1, q1)
 
         val.backward()
@@ -224,6 +225,21 @@ def test_gromov2_gradients():
         assert C11.shape == C11.grad.shape
         assert C12.shape == C12.grad.shape
 
+        # Test with armijo line-search
+        q1.grad = None
+        p1.grad = None
+        C11.grad = None
+        C12.grad = None
+        val = ot.gromov_wasserstein2(C11, C12, p1, q1, armijo=True)
+
+        val.backward()
+
+        assert val.device == p1.device
+        assert q1.shape == q1.grad.shape
+        assert p1.shape == p1.grad.shape
+        assert C11.shape == C11.grad.shape
+        assert C12.shape == C12.grad.shape
+
 
 def test_gw_helper_backend(nx):
     n_samples = 20  # nb samples
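The test above exercises POT's Armijo line search (reached via `armijo=True`), which this commit fixes to run on GPU. For readers unfamiliar with the rule being exercised, here is a minimal standalone sketch of backtracking Armijo line search in plain Python. This is not POT's `line_search_armijo` (whose signature and backend handling differ); the function name, defaults, and toy objective below are illustrative only.

```python
def armijo_line_search(f, x, d, grad, alpha0=1.0, c1=1e-4, tau=0.5, max_iter=20):
    """Backtracking (Armijo) line search: shrink the step alpha until
    f(x + alpha * d) <= f(x) + c1 * alpha * <grad, d> holds."""
    fx = f(x)
    slope = sum(g * di for g, di in zip(grad, d))  # directional derivative <grad, d>
    alpha = alpha0
    for _ in range(max_iter):
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        if f(x_new) <= fx + c1 * alpha * slope:  # sufficient-decrease condition
            return alpha
        alpha *= tau  # backtrack: shrink the step
    return alpha

# Toy example: minimize f(x) = x0^2 + x1^2 from x = (3, 4)
# along the steepest-descent direction d = -grad.
f = lambda x: x[0] ** 2 + x[1] ** 2
x = [3.0, 4.0]
grad = [2 * x[0], 2 * x[1]]
d = [-g for g in grad]
alpha = armijo_line_search(f, x, d, grad)
```

With these numbers the full step `alpha0=1.0` overshoots (it lands at `(-3, -4)`, no decrease), so one backtrack to `alpha=0.5` satisfies the condition. The GPU bug class the commit addresses is orthogonal to this logic: it concerns keeping intermediate tensors on the caller's device/backend during such a search.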