author	MathieuCarriere <mathieu.carriere3@gmail.com>	2021-11-01 14:36:11 +0100
committer	MathieuCarriere <mathieu.carriere3@gmail.com>	2021-11-01 14:36:11 +0100
commit	c4269eef025d4e6c7a763cd99b5dada647693c1d (patch)
tree	802bb37fac936d817ea074aaadcb277b0d41c24a /src/python/doc/ls_simplex_tree_tflow_itf_ref.rst
parent	10be82856aee6eb7f4e704757b70c9dab6fe28b8 (diff)
fix doc
Diffstat (limited to 'src/python/doc/ls_simplex_tree_tflow_itf_ref.rst')
 src/python/doc/ls_simplex_tree_tflow_itf_ref.rst | 11 +++++++++--
 1 file changed, 9 insertions(+), 2 deletions(-)
diff --git a/src/python/doc/ls_simplex_tree_tflow_itf_ref.rst b/src/python/doc/ls_simplex_tree_tflow_itf_ref.rst
index 7baf611c..26cf1ff2 100644
--- a/src/python/doc/ls_simplex_tree_tflow_itf_ref.rst
+++ b/src/python/doc/ls_simplex_tree_tflow_itf_ref.rst
@@ -10,7 +10,7 @@ TensorFlow layer for lower-star persistence on simplex trees
 Example of gradient computed from lower-star filtration of a simplex tree
 -------------------------------------------------------------------------
 
-.. code-block:: python
+.. testcode::
 
     from gudhi.tensorflow import *
     import numpy as np
@@ -47,8 +47,15 @@ Example of gradient computed from lower-star filtration of a simplex tree
     with tf.GradientTape() as tape:
         dgm = sl.call(F)
         loss = tf.math.reduce_sum(tf.square(.5*(dgm[:,1]-dgm[:,0])))
+
     grads = tape.gradient(loss, [F])
-    print(grads[0].numpy())
+    print(grads[0].indices.numpy())
+    print(grads[0].values.numpy())
+
+.. testoutput::
+
+    [2 4]
+    [-1. 1.]
 
 Documentation for LowerStarSimplexTreeLayer
 -------------------------------------------
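The patched example prints `grads[0].indices` and `grads[0].values` separately because, when a loss depends on only a few entries of a variable (as with the birth/death filtration values picked out by the lower-star layer), `tape.gradient` returns a sparse `tf.IndexedSlices` rather than a dense tensor. A minimal sketch of the same effect, using plain `tf.gather` in place of GUDHI's `LowerStarSimplexTreeLayer` (the filtration values and the indices `[2, 4]` below are chosen for illustration, not taken from the patched doc):

```python
import tensorflow as tf

# Filtration-like variable; values are illustrative only.
F = tf.Variable([6.0, 4.0, 2.0, 4.0, 4.0], trainable=True)

with tf.GradientTape() as tape:
    # Mimic a persistence diagram built from two entries of F.
    # tf.gather makes the dependence on F sparse, like a lower-star layer
    # that only reads the vertex values realizing each birth/death.
    dgm = tf.reshape(tf.gather(F, [2, 4]), [1, 2])
    loss = tf.math.reduce_sum(tf.square(.5 * (dgm[:, 1] - dgm[:, 0])))

grads = tape.gradient(loss, [F])

# The gradient through tf.gather is a tf.IndexedSlices: a pair of
# (indices, values) instead of a dense tensor over all of F.
print(type(grads[0]).__name__)
print(grads[0].indices.numpy())
print(grads[0].values.numpy())
```

This sparse form is presumably why the doc was changed: the old `print(grads[0].numpy())` assumes a dense tensor, whereas an `IndexedSlices` exposes its data through `.indices` and `.values` (or via `tf.convert_to_tensor` to densify it).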