-rw-r--r--  src/Alpha_complex/utilities/README                       148
-rw-r--r--  src/Bitmap_cubical_complex/utilities/README               42
-rw-r--r--  src/Bottleneck_distance/utilities/README                  32
-rw-r--r--  src/Nerve_GIC/doc/Intro_graph_induced_complex.h            6
-rw-r--r--  src/Nerve_GIC/include/gudhi/GIC.h                         11
-rwxr-xr-x  src/Nerve_GIC/utilities/KeplerMapperVisuFromTxtFile.py    97
-rw-r--r--  src/Nerve_GIC/utilities/covercomplex.md                   73
-rw-r--r--  src/Rips_complex/utilities/README                         67
-rw-r--r--  src/Witness_complex/utilities/README                      70
-rw-r--r--  src/common/utilities/README                               46
10 files changed, 365 insertions, 227 deletions
diff --git a/src/Alpha_complex/utilities/README b/src/Alpha_complex/utilities/README
index 1cd2ca95..56bce602 100644
--- a/src/Alpha_complex/utilities/README
+++ b/src/Alpha_complex/utilities/README
@@ -1,16 +1,35 @@
-# Alpha_complex #
-
-## `alpha_complex_3d_persistence` ##
+---
+layout: page
+title: "Alpha complex"
+meta_title: "alphacomplex"
+subheadline: ""
+teaser: ""
+permalink: "/alphacomplex/"
+---
+{::comment}
+These flags above are here for web site generation, please leave them as they are.
+cf. https://gitlab.inria.fr/GUDHI/website
+Must be in conformity with _data/navigation.yml
+{:/comment}
+
+
+
+## alpha_complex_3d_persistence ##
This program computes the persistent homology with coefficient field Z/pZ of the 3D alpha complex built from a 3D point cloud. The output diagram contains one bar per line, written with the convention:
-`p dim birth death`
+```
+p dim birth death
+```
where `dim` is the dimension of the homological feature, `birth` and `death` are respectively the birth and death of the feature, and `p` is the characteristic of the field *Z/pZ* used for homology coefficients (`p` must be a prime number).
**Usage**
-`alpha_complex_3d_persistence [options] <input OFF file>`
-where
-`<input OFF file>` is the path to the input point cloud in [nOFF ASCII format](http://www.geomview.org/docs/html/OFF.html).
+
+```
+ alpha_complex_3d_persistence [options] <input OFF file>
+```
+
+where `<input OFF file>` is the path to the input point cloud in [nOFF ASCII format](http://www.geomview.org/docs/html/OFF.html).
**Allowed options**
@@ -20,43 +39,38 @@ where
* `-m [ --min-persistence ]` (default = 0) Minimal lifetime of homology feature to be recorded. Enter a negative value to see zero length intervals.
**Example**
-`alpha_complex_3d_persistence ../../data/points/tore3D_300.off -p 2 -m 0.45`
-outputs:
```
-Simplex_tree dim: 3
-2 0 0 inf
-2 1 0.0682162 1.0001
-2 1 0.0934117 1.00003
-2 2 0.56444 1.03938
+alpha_complex_3d_persistence ../../data/points/tore3D_300.off -p 2 -m 0.45
```
-Here we retrieve expected Betti numbers on a tore 3D:
-```
-Betti numbers[0] = 1
-Betti numbers[1] = 2
-Betti numbers[2] = 1
-```
+N.B.:
-N.B.:
* `alpha_complex_3d_persistence` only accepts OFF files in dimension 3.
* Filtration values are alpha square values.
+## exact_alpha_complex_3d_persistence ##
+
+Same as `alpha_complex_3d_persistence`, but using exact computation.
+It is slower, but it is necessary when points are on a grid for instance.
-## `exact_alpha_complex_3d_persistence` ##
-Same as `alpha_complex_3d_persistence`, but using exact computation. It is slower, but it is necessary when points are on a grid for instance.
+## weighted_alpha_complex_3d_persistence ##
-## `weighted_alpha_complex_3d_persistence` ##
Same as `alpha_complex_3d_persistence`, but using weighted points.
**Usage**
-`weighted_alpha_complex_3d_persistence [options] <input OFF file> <weights input file>`
+
+```
+ weighted_alpha_complex_3d_persistence [options] <input OFF file> <weights input file>
+```
+
where
-`<input OFF file>` is the path to the input point cloud in [nOFF ASCII format](http://www.geomview.org/docs/html/OFF.html).
-`<input weights file>` is the path to the file containing the weights of the points (one value per line).
+
+* `<input OFF file>` is the path to the input point cloud in [nOFF ASCII format](http://www.geomview.org/docs/html/OFF.html).
+* `<input weights file>` is the path to the file containing the weights of the points (one value per line).
**Allowed options**
@@ -66,32 +80,33 @@ where
* `-m [ --min-persistence ]` (default = 0) Minimal lifetime of homology feature to be recorded. Enter a negative value to see zero length intervals.
**Example**
-`weighted_alpha_complex_3d_persistence ../../data/points/tore3D_300.off ../../data/points/tore3D_300.weights -p 2 -m 0.45`
-outputs:
```
-Simplex_tree dim: 3
-2 0 -1 inf
-2 1 -0.931784 0.000103311
-2 1 -0.906588 2.60165e-05
-2 2 -0.43556 0.0393798
+ weighted_alpha_complex_3d_persistence ../../data/points/tore3D_300.off ../../data/points/tore3D_300.weights -p 2 -m 0.45
```
+
N.B.:
+
* Weights values are explained on CGAL [Alpha shape](https://doc.cgal.org/latest/Alpha_shapes_3/index.html#title0)
and [Regular triangulation](https://doc.cgal.org/latest/Triangulation_3/index.html#Triangulation3secclassRegulartriangulation) documentation.
* Filtration values are alpha square values.
-## `periodic_alpha_complex_3d_persistence` ##
+## periodic_alpha_complex_3d_persistence ##
Same as `alpha_complex_3d_persistence`, but using periodic alpha shape 3d.
Refer to the [CGAL's 3D Periodic Triangulations User Manual](https://doc.cgal.org/latest/Periodic_3_triangulation_3/index.html) for more details.
**Usage**
-`periodic_alpha_complex_3d_persistence [options] <input OFF file> <cuboid file>`
+
+```
+ periodic_alpha_complex_3d_persistence [options] <input OFF file> <cuboid file>
+```
+
where
-`<input OFF file>` is the path to the input point cloud in [nOFF ASCII format](http://www.geomview.org/docs/html/OFF.html).
-`<cuboid file>` is the path to the file describing the periodic domain. It must be in the format described [here](http://gudhi.gforge.inria.fr/doc/latest/fileformats.html#FileFormatsIsoCuboid).
+
+* `<input OFF file>` is the path to the input point cloud in [nOFF ASCII format](http://www.geomview.org/docs/html/OFF.html).
+* `<cuboid file>` is the path to the file describing the periodic domain. It must be in the format described [here](http://gudhi.gforge.inria.fr/doc/latest/fileformats.html#FileFormatsIsoCuboid).
**Allowed options**
@@ -102,46 +117,36 @@ where
**Example**
-`periodic_alpha_complex_3d_persistence ../../data/points/grid_10_10_10_in_0_1.off ../../data/points/iso_cuboid_3_in_0_1.txt -p 3 -m 1.0`
-outputs:
```
-Periodic Delaunay computed.
-Simplex_tree dim: 3
-3 0 0 inf
-3 1 0.0025 inf
-3 1 0.0025 inf
-3 1 0.0025 inf
-3 2 0.005 inf
-3 2 0.005 inf
-3 2 0.005 inf
-3 3 0.0075 inf
+periodic_alpha_complex_3d_persistence ../../data/points/grid_10_10_10_in_0_1.off ../../data/points/iso_cuboid_3_in_0_1.txt -p 3 -m 1.0
```
-Here we retrieve expected Betti numbers on an 3D iso-oriented cuboids:
-```
-Betti numbers[0] = 1
-Betti numbers[1] = 3
-Betti numbers[2] = 3
-Betti numbers[3] = 1
-```
+N.B.:
-N.B.:
* Cuboid file must be in the format described [here](http://gudhi.gforge.inria.fr/doc/latest/fileformats.html#FileFormatsIsoCuboid).
* Filtration values are alpha square values.
+## alpha_complex_persistence ##
-## `alpha_complex_persistence` ##
-This program computes the persistent homology with coefficient field Z/pZ of the dD alpha complex built from a dD point cloud. The output diagram contains one bar per line, written with the convention:
+This program computes the persistent homology with coefficient field Z/pZ of the dD alpha complex built from a dD point cloud.
+The output diagram contains one bar per line, written with the convention:
-`p dim birth death`
+```
+ p dim birth death
+```
-where `dim` is the dimension of the homological feature, `birth` and `death` are respectively the birth and death of the feature, and `p` is the characteristic of the field *Z/pZ* used for homology coefficients (`p` must be a prime number).
+where `dim` is the dimension of the homological feature, `birth` and `death` are respectively the birth and death of the feature,
+and `p` is the characteristic of the field *Z/pZ* used for homology coefficients (`p` must be a prime number).
**Usage**
-`alpha_complex_persistence [options] <input OFF file>`
+
+```
+ alpha_complex_persistence [options] <input OFF file>
+```
+
where
`<input OFF file>` is the path to the input point cloud in [nOFF ASCII format](http://www.geomview.org/docs/html/OFF.html).
@@ -154,24 +159,11 @@ where
* `-m [ --min-persistence ]` (default = 0) Minimal lifetime of homology feature to be recorded. Enter a negative value to see zero length intervals.
**Example**
-`alpha_complex_persistence -r 32 -p 2 -m 0.45 ../../data/points/tore3D_300.off`
-outputs:
```
-Alpha complex is of dimension 3 - 9273 simplices - 300 vertices.
-Simplex_tree dim: 3
-2 0 0 inf
-2 1 0.0682162 1.0001
-2 1 0.0934117 1.00003
-2 2 0.56444 1.03938
-```
-
-Here we retrieve expected Betti numbers on a tore 3D:
-```
-Betti numbers[0] = 1
-Betti numbers[1] = 2
-Betti numbers[2] = 1
+ alpha_complex_persistence -r 32 -p 2 -m 0.45 ../../data/points/tore3D_300.off
```
N.B.:
+
* Filtration values are alpha square values.
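Since each bar of the output is one `p dim birth death` line, the diagrams produced by these utilities are easy to post-process. A minimal parsing sketch (the field order follows the convention stated above; `inf` marks an essential class):

```python
def read_persistence(lines):
    """Parse 'p dim birth death' lines into (dim, birth, death) tuples.
    An 'inf' death value is mapped to float('inf')."""
    bars = []
    for line in lines:
        fields = line.split()
        if len(fields) != 4:
            continue  # skip blank or malformed lines
        p, dim, birth, death = fields
        bars.append((int(dim), float(birth), float(death)))
    return bars

# Sample lines in the format described above:
sample = ["2 0 0 inf", "2 1 0.0682162 1.0001", "2 2 0.56444 1.03938"]
print(read_persistence(sample))
```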
diff --git a/src/Bitmap_cubical_complex/utilities/README b/src/Bitmap_cubical_complex/utilities/README
index ddff7034..393b2654 100644
--- a/src/Bitmap_cubical_complex/utilities/README
+++ b/src/Bitmap_cubical_complex/utilities/README
@@ -1,18 +1,40 @@
-# Bitmap_cubical_complex #
+---
+layout: page
+title: "Bitmap cubical complex"
+meta_title: "cubicalcomplex"
+subheadline: ""
+teaser: ""
+permalink: "/cubicalcomplex/"
+---
+{::comment}
+These flags above are here for web site generation, please leave them as they are.
+cf. https://gitlab.inria.fr/GUDHI/website
+Must be in conformity with _data/navigation.yml
+{:/comment}
-## `cubical_complex_persistence` ##
-This program computes persistent homology, by using the Bitmap_cubical_complex class, of cubical complexes provided in text files in Perseus style. See [here](http://gudhi.gforge.inria.fr/doc/latest/fileformats.html#FileFormatsPerseus) for a description of the file format.
-Example:
+## cubical_complex_persistence ##
+This program computes persistent homology, by using the Bitmap_cubical_complex class, of cubical complexes provided in text files in Perseus style.
+See [here](http://gudhi.gforge.inria.fr/doc/latest/fileformats.html#FileFormatsPerseus) for a description of the file format.
-* Create a Cubical Complex from the Perseus style file `CubicalTwoSphere.txt`, computes Persistence cohomology from it and writes the results in a persistence file `CubicalTwoSphere.txt_persistence`:
-`cubical_complex_persistence data/bitmap/CubicalTwoSphere.txt`
+**Example**
-## `periodic_cubical_complex_persistence` ##
+```
+ cubical_complex_persistence data/bitmap/CubicalTwoSphere.txt
+```
+
+* Creates a Cubical Complex from the Perseus style file `CubicalTwoSphere.txt`,
+computes persistent cohomology from it and writes the results in a persistence file `CubicalTwoSphere.txt_persistence`.
+
+## periodic_cubical_complex_persistence ##
Same as above, but with periodic boundary conditions.
-Example:
+**Example**
+
+```
+ periodic_cubical_complex_persistence data/bitmap/3d_torus.txt
+```
-* Create a Periodical Cubical Complex from the Perseus style file `3d_torus.txt`, computes Persistence cohomology from it and writes the results in a persistence file `3d_torus.txt_persistence`:
-`periodic_cubical_complex_persistence data/bitmap/3d_torus.txt`
+* Creates a Periodic Cubical Complex from the Perseus style file `3d_torus.txt`,
+computes persistent cohomology from it and writes the results in a persistence file `3d_torus.txt_persistence`.
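A rough sketch of building such a Perseus-style input in memory, assuming the layout described on the linked file-formats page (embedding dimension first, then one size per direction, then one filtration value per top-dimensional cell):

```python
def perseus_lines(sizes, values):
    """Lay out a Perseus-style bitmap cubical complex as text lines:
    the embedding dimension, the size in each direction, then one
    filtration value per top-dimensional cell (layout assumed from
    the GUDHI file-formats page)."""
    expected = 1
    for s in sizes:
        expected *= s
    if len(values) != expected:
        raise ValueError("need one value per top-dimensional cell")
    lines = [str(len(sizes))] + [str(s) for s in sizes]
    lines += [str(v) for v in values]
    return lines

# A 2x2 "image" with four filtration values:
print("\n".join(perseus_lines([2, 2], [0, 1, 2, 3])))
```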
diff --git a/src/Bottleneck_distance/utilities/README b/src/Bottleneck_distance/utilities/README
index d9fdd252..04e1c4bd 100644
--- a/src/Bottleneck_distance/utilities/README
+++ b/src/Bottleneck_distance/utilities/README
@@ -1,10 +1,30 @@
-# Bottleneck_distance #
+---
+layout: page
+title: "Bottleneck distance"
+meta_title: "bottleneckdistance"
+subheadline: ""
+teaser: ""
+permalink: "/bottleneckdistance/"
+---
+{::comment}
+These flags above are here for web site generation, please leave them as they are.
+cf. https://gitlab.inria.fr/GUDHI/website
+Must be in conformity with _data/navigation.yml
+{:/comment}
+
+
+
+## bottleneck_read_file_example ##
-## `bottleneck_read_file_example` ##
This program computes the Bottleneck distance between two persistence diagram files.
-Usage:
-`bottleneck_read_file_example <file_1.pers> <file_2.pers> [<tolerance>]`
+**Usage**
+
+```
+ bottleneck_read_file_example <file_1.pers> <file_2.pers> [<tolerance>]
+```
+
where
-`<file_1.pers>` and `<file_2.pers>` must be in the format described [here](http://gudhi.gforge.inria.fr/doc/latest/fileformats.html#FileFormatsPers).
-`<tolerance>` is an error bound on the bottleneck distance (set by default to the smallest positive double value).
+
+* `<file_1.pers>` and `<file_2.pers>` must be in the format described [here](http://gudhi.gforge.inria.fr/doc/latest/fileformats.html#FileFormatsPers).
+* `<tolerance>` is an error bound on the bottleneck distance (set by default to the smallest positive double value).
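For intuition, the bottleneck distance this utility computes can be brute-forced on tiny diagrams. An illustrative sketch only (not the algorithm the utility uses, which relies on an efficient matching search): points are matched across diagrams at L-infinity cost, or to the diagonal at half their persistence.

```python
from itertools import permutations

def bottleneck_naive(diag1, diag2):
    """Brute-force bottleneck distance between two small persistence
    diagrams, given as lists of (birth, death) pairs. Each point may
    be matched to a point of the other diagram or to the diagonal."""
    n = len(diag1) + len(diag2)
    # Pad each diagram with one "diagonal slot" per point of the other.
    a = list(diag1) + [None] * len(diag2)
    b = list(diag2) + [None] * len(diag1)

    def cost(p, q):
        if p is None and q is None:
            return 0.0          # diagonal matched to diagonal: free
        if p is None:           # q matched to the diagonal
            return (q[1] - q[0]) / 2.0
        if q is None:           # p matched to the diagonal
            return (p[1] - p[0]) / 2.0
        return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

    # Minimize, over all perfect matchings, the largest edge cost.
    return min(max(cost(a[i], b[pi[i]]) for i in range(n))
               for pi in permutations(range(n)))

print(bottleneck_naive([(2.7, 3.7)], [(2.8, 4.45), (9.5, 14.1)]))
```

The factorial search makes this usable only for a handful of points; the utility scales far better.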
diff --git a/src/Nerve_GIC/doc/Intro_graph_induced_complex.h b/src/Nerve_GIC/doc/Intro_graph_induced_complex.h
index f2409087..474f0f0e 100644
--- a/src/Nerve_GIC/doc/Intro_graph_induced_complex.h
+++ b/src/Nerve_GIC/doc/Intro_graph_induced_complex.h
@@ -77,8 +77,8 @@ namespace cover_complex {
*
* \include Nerve_GIC/Nerve.txt
*
- * The program also writes a file SC.txt. The first three lines in this file are the location of the input point cloud
- * and the function used to compute the cover.
+ * The program also writes a file ../../data/points/human_sc.txt. The first three lines in this file are the location
+ * of the input point cloud and the function used to compute the cover.
* The fourth line contains the number of vertices nv and edges ne of the Nerve.
* The next nv lines represent the vertices. Each line contains the vertex ID,
* the number of data points it contains, and their average color function value.
@@ -118,7 +118,7 @@ namespace cover_complex {
*
* the program outputs SC.off. Using e.g.
*
- * \code $> geomview SC.off
+ * \code $> geomview ../../data/points/human_sc.off
* \endcode
*
* one can obtain the following visualization:
diff --git a/src/Nerve_GIC/include/gudhi/GIC.h b/src/Nerve_GIC/include/gudhi/GIC.h
index 58831bbf..ff95b913 100644
--- a/src/Nerve_GIC/include/gudhi/GIC.h
+++ b/src/Nerve_GIC/include/gudhi/GIC.h
@@ -934,7 +934,7 @@ class Cover_complex {
}
graphic << "}";
graphic.close();
- std::cout << ".dot file generated. It can be visualized with e.g. neato." << std::endl;
+ std::cout << mapp << " file generated. It can be visualized with e.g. neato." << std::endl;
}
public: // Create a .txt file that can be compiled with KeplerMapper.
@@ -944,7 +944,7 @@ class Cover_complex {
void write_info() {
int num_simplices = simplices.size();
int num_edges = 0;
- std::string mapp = point_cloud_name + "_sc.dot";
+ std::string mapp = point_cloud_name + "_sc.txt";
std::ofstream graphic(mapp.c_str());
for (int i = 0; i < num_simplices; i++)
@@ -970,7 +970,8 @@ class Cover_complex {
if (cover_color[simplices[i][0]].first > mask && cover_color[simplices[i][1]].first > mask)
graphic << name2id[simplices[i][0]] << " " << name2id[simplices[i][1]] << std::endl;
graphic.close();
- std::cout << ".txt generated. It can be visualized with e.g. python KeplerMapperVisuFromTxtFile.py and firefox."
+ std::cout << mapp
+ << " generated. It can be visualized with e.g. python KeplerMapperVisuFromTxtFile.py and firefox."
<< std::endl;
}
@@ -988,7 +989,7 @@ class Cover_complex {
std::vector<std::vector<int> > edges, faces;
int numsimplices = simplices.size();
- std::string mapp = point_cloud_name + "_sc.dot";
+ std::string mapp = point_cloud_name + "_sc.off";
std::ofstream graphic(mapp.c_str());
graphic << "OFF" << std::endl;
@@ -1016,7 +1017,7 @@ class Cover_complex {
for (int i = 0; i < numfaces; i++)
graphic << 3 << " " << faces[i][0] << " " << faces[i][1] << " " << faces[i][2] << std::endl;
graphic.close();
- std::cout << ".off generated. It can be visualized with e.g. geomview." << std::endl;
+ std::cout << mapp << " generated. It can be visualized with e.g. geomview." << std::endl;
}
// *******************************************************************************************************************
diff --git a/src/Nerve_GIC/utilities/KeplerMapperVisuFromTxtFile.py b/src/Nerve_GIC/utilities/KeplerMapperVisuFromTxtFile.py
index d2897774..c811f610 100755
--- a/src/Nerve_GIC/utilities/KeplerMapperVisuFromTxtFile.py
+++ b/src/Nerve_GIC/utilities/KeplerMapperVisuFromTxtFile.py
@@ -3,6 +3,7 @@
import km
import numpy as np
from collections import defaultdict
+import argparse
"""This file is part of the Gudhi Library. The Gudhi library
(Geometric Understanding in Higher Dimensions) is a generic C++
@@ -30,43 +31,59 @@ __author__ = "Mathieu Carriere"
__copyright__ = "Copyright (C) 2017 INRIA"
__license__ = "GPL v3"
-network = {}
-mapper = km.KeplerMapper(verbose=0)
-data = np.zeros((3,3))
-projected_data = mapper.fit_transform( data, projection="sum", scaler=None )
-
-f = open('SC.txt','r')
-nodes = defaultdict(list)
-links = defaultdict(list)
-custom = defaultdict(list)
-
-dat = f.readline()
-lens = f.readline()
-color = f.readline();
-param = [float(i) for i in f.readline().split(" ")]
-
-nums = [int(i) for i in f.readline().split(" ")]
-num_nodes = nums[0]
-num_edges = nums[1]
-
-for i in range(0,num_nodes):
- point = [float(j) for j in f.readline().split(" ")]
- nodes[ str(int(point[0])) ] = [ int(point[0]), point[1], int(point[2]) ]
- links[ str(int(point[0])) ] = []
- custom[ int(point[0]) ] = point[1]
-
-m = min([custom[i] for i in range(0,num_nodes)])
-M = max([custom[i] for i in range(0,num_nodes)])
-
-for i in range(0,num_edges):
- edge = [int(j) for j in f.readline().split(" ")]
- links[ str(edge[0]) ].append( str(edge[1]) )
- links[ str(edge[1]) ].append( str(edge[0]) )
-
-network["nodes"] = nodes
-network["links"] = links
-network["meta"] = lens
-
-mapper.visualize(network, color_function = color, path_html="SC.html", title=dat,
-graph_link_distance=30, graph_gravity=0.1, graph_charge=-120, custom_tooltips=custom, width_html=0,
-height_html=0, show_tooltips=True, show_title=True, show_meta=True, res=param[0],gain=param[1], minimum=m,maximum=M)
+parser = argparse.ArgumentParser(description='Creates an HTML Kepler Mapper '
+                                 'file to visualize a SC.txt file.',
+                                 epilog='Example: '
+                                 './KeplerMapperVisuFromTxtFile.py '
+                                 '-f ../../data/points/human.off_sc.txt '
+                                 '- Constructs a human.off_sc.html file.')
+parser.add_argument("-f", "--file", type=str, required=True)
+
+args = parser.parse_args()
+
+with open(args.file, 'r') as f:
+ network = {}
+ mapper = km.KeplerMapper(verbose=0)
+ data = np.zeros((3,3))
+ projected_data = mapper.fit_transform( data, projection="sum", scaler=None )
+
+ nodes = defaultdict(list)
+ links = defaultdict(list)
+ custom = defaultdict(list)
+
+ dat = f.readline()
+ lens = f.readline()
+ color = f.readline();
+ param = [float(i) for i in f.readline().split(" ")]
+
+ nums = [int(i) for i in f.readline().split(" ")]
+ num_nodes = nums[0]
+ num_edges = nums[1]
+
+ for i in range(0,num_nodes):
+ point = [float(j) for j in f.readline().split(" ")]
+ nodes[ str(int(point[0])) ] = [ int(point[0]), point[1], int(point[2]) ]
+ links[ str(int(point[0])) ] = []
+ custom[ int(point[0]) ] = point[1]
+
+ m = min([custom[i] for i in range(0,num_nodes)])
+ M = max([custom[i] for i in range(0,num_nodes)])
+
+ for i in range(0,num_edges):
+ edge = [int(j) for j in f.readline().split(" ")]
+ links[ str(edge[0]) ].append( str(edge[1]) )
+ links[ str(edge[1]) ].append( str(edge[0]) )
+
+ network["nodes"] = nodes
+ network["links"] = links
+ network["meta"] = lens
+
+ html_output_filename = args.file.rsplit('.', 1)[0] + '.html'
+ mapper.visualize(network, color_function = color, path_html=html_output_filename, title=dat,
+ graph_link_distance=30, graph_gravity=0.1, graph_charge=-120, custom_tooltips=custom, width_html=0,
+ height_html=0, show_tooltips=True, show_title=True, show_meta=True, res=param[0],gain=param[1], minimum=m,maximum=M)
+ message = repr(html_output_filename) + " is generated. You can now use your favorite web browser to visualize it."
+ print(message)
diff --git a/src/Nerve_GIC/utilities/covercomplex.md b/src/Nerve_GIC/utilities/covercomplex.md
new file mode 100644
index 00000000..692e44e7
--- /dev/null
+++ b/src/Nerve_GIC/utilities/covercomplex.md
@@ -0,0 +1,73 @@
+---
+layout: page
+title: "Cover complex"
+meta_title: "covercomplex"
+subheadline: ""
+teaser: ""
+permalink: "/covercomplex/"
+---
+{::comment}
+These flags above are here for web site generation, please leave them as they are.
+cf. https://gitlab.inria.fr/GUDHI/website
+Must be in conformity with _data/navigation.yml
+{:/comment}
+
+
+
+## Nerve ##
+This program builds the Nerve of a point cloud sampled on an OFF file.
+The cover C comes from the preimages of intervals covering a coordinate function,
+which are then refined into their connected components using the triangulation of the .OFF file.
+
+The program also writes a file SC.txt.
+The first three lines in this file are the location of the input point cloud and the function used to compute the cover.
+The fourth line contains the number of vertices nv and edges ne of the Nerve. The next nv lines represent the vertices.
+Each line contains the vertex ID, the number of data points it contains, and their average color function value.
+Finally, the next ne lines represent the edges, characterized by the ID of their vertices.
+
+**Usage**
+
+`Nerve <OFF input file> coordinate resolution gain [--v]`
+
+where
+
+* `coordinate` is the coordinate function to cover
+* `resolution` is the number of intervals
+* `gain` is the gain for each interval
+* `--v` is optional, it activates verbose mode.
+
+**Example**
+
+`Nerve ../../data/points/human.off 2 10 0.3`
+
+* Builds the Nerve of a point cloud sampled on a 3D human shape (human.off).
+The cover C comes from the preimages of intervals (10 intervals with gain 0.3) covering the height function (coordinate 2).
+
+`python KeplerMapperVisuFromTxtFile.py -f ../../data/points/human.off_sc.txt`
+
+* Constructs `human.off_sc.html` file. You can now use your favorite web browser to visualize it.
+
+## VoronoiGIC ##
+
+This utility builds the Graph Induced Complex (GIC) of a point cloud.
+It subsamples *N* points in the point cloud, which act as seeds of a geodesic Voronoï diagram.
+Each cell of the diagram is then an element of C.
+
+The program also writes a file `*_sc.off`, that is an OFF file that can be visualized with GeomView.
+
+**Usage**
+
+`VoronoiGIC <OFF input file> samples_number [--v]`
+
+where
+
+* `samples_number` is the number of samples to take from the point cloud
+* `--v` is optional, it activates verbose mode.
+
+**Example**
+
+`VoronoiGIC ../../data/points/human.off 700`
+
+* Builds the Voronoi Graph Induced Complex with 700 subsamples from the `human.off` file.
+`../../data/points/human_sc.off` can be visualized with GeomView.
+
diff --git a/src/Rips_complex/utilities/README b/src/Rips_complex/utilities/README
index 4d20c806..44a37543 100644
--- a/src/Rips_complex/utilities/README
+++ b/src/Rips_complex/utilities/README
@@ -1,6 +1,20 @@
-# Rips_complex #
-
-## `rips_persistence` ##
+---
+layout: page
+title: "Rips complex"
+meta_title: "ripscomplex"
+subheadline: ""
+teaser: ""
+permalink: "/ripscomplex/"
+---
+{::comment}
+These flags above are here for web site generation, please leave them as they are.
+cf. https://gitlab.inria.fr/GUDHI/website
+Must be in conformity with _data/navigation.yml
+{:/comment}
+
+
+
+## rips_persistence ##
This program computes the persistent homology with coefficient field *Z/pZ* of a Rips complex defined on a set of input points. The output diagram contains one bar per line, written with the convention:
`p dim birth death`
@@ -8,6 +22,7 @@ This program computes the persistent homology with coefficient field *Z/pZ* of a
where `dim` is the dimension of the homological feature, `birth` and `death` are respectively the birth and death of the feature, and `p` is the characteristic of the field *Z/pZ* used for homology coefficients (`p` must be a prime number).
**Usage**
+
`rips_persistence [options] <OFF input file>`
**Allowed options**
@@ -22,53 +37,25 @@ where `dim` is the dimension of the homological feature, `birth` and `death` are
Beware: this program may use a lot of RAM and take a lot of time if `max-edge-length` is set to a large value.
**Example 1 with Z/2Z coefficients**
-`rips_persistence ../../data/points/tore3D_1307.off -r 0.25 -m 0.5 -d 3 -p 2`
-outputs:
-```
-2 0 0 inf
-2 1 0.0983494 inf
-2 1 0.104347 inf
-2 2 0.138335 inf
-```
+`rips_persistence ../../data/points/tore3D_1307.off -r 0.25 -m 0.5 -d 3 -p 2`
**Example 2 with Z/3Z coefficients**
-rips_persistence ../../data/points/tore3D_1307.off -r 0.25 -m 0.5 -d 3 -p 3
-
-outputs:
-```
-3 0 0 inf
-3 1 0.0983494 inf
-3 1 0.104347 inf
-3 2 0.138335 inf
-```
+`rips_persistence ../../data/points/tore3D_1307.off -r 0.25 -m 0.5 -d 3 -p 3`
+## rips_distance_matrix_persistence ##
+Same as `rips_persistence` but taking a distance matrix as input.
-## `rips_distance_matrix_persistence` ##
-Same as `rips_persistence` but taking a distance matrix as input.
-
**Usage**
-`rips_persistence [options] <CSV input file>`
-where
+
+`rips_persistence [options] <CSV input file>`
+
+where
`<CSV input file>` is the path to the file containing a distance matrix. It can be a square or a lower-triangular matrix. The separator is ';'.
**Example**
-`rips_distance_matrix_persistence data/distance_matrix/full_square_distance_matrix.csv -r 15 -d 3 -p 3 -m 0`
-outputs:
-```
-The complex contains 46 simplices
- and has dimension 3
-3 0 0 inf
-3 0 0 8.94427
-3 0 0 7.28011
-3 0 0 6.08276
-3 0 0 5.83095
-3 0 0 5.38516
-3 0 0 5
-3 1 11 12.0416
-3 1 6.32456 6.7082
-```
+`rips_distance_matrix_persistence data/distance_matrix/full_square_distance_matrix.csv -r 15 -d 3 -p 3 -m 0`
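Such a ';'-separated lower-triangular input can be produced from a point set in a few lines. A sketch, assuming (as the lower-triangular layout suggests) that the first point contributes an empty first row:

```python
def distance_matrix_csv(points):
    """Render a lower-triangular Euclidean distance matrix with ';'
    separators, in the shape described above (row i lists distances
    to points 0..i-1, so the first row is empty)."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    rows = []
    for i, p in enumerate(points):
        rows.append(";".join("%g" % dist(p, points[j]) for j in range(i)))
    return "\n".join(rows)

print(distance_matrix_csv([(0, 0), (3, 4), (0, 1)]))
```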
diff --git a/src/Witness_complex/utilities/README b/src/Witness_complex/utilities/README
index 1141033e..5cdb1f88 100644
--- a/src/Witness_complex/utilities/README
+++ b/src/Witness_complex/utilities/README
@@ -1,18 +1,34 @@
-# Witness_complex #
+---
+layout: page
+title: "Witness complex"
+meta_title: "witnesscomplex"
+subheadline: ""
+teaser: ""
+permalink: "/witnesscomplex/"
+---
+{::comment}
+These flags above are here for web site generation, please leave them as they are.
+cf. https://gitlab.inria.fr/GUDHI/website
+Must be in conformity with _data/navigation.yml
+{:/comment}
+
For more details about the witness complex, please read the [user manual of the package](http://gudhi.gforge.inria.fr/doc/latest/group__witness__complex.html).
-## `weak_witness_persistence` ##
-This program computes the persistent homology with coefficient field *Z/pZ* of a Weak witness complex defined on a set of input points. The output diagram contains one bar per line, written with the convention:
+## weak_witness_persistence ##
+This program computes the persistent homology with coefficient field *Z/pZ* of a Weak witness complex defined on a set of input points.
+The output diagram contains one bar per line, written with the convention:
`p dim birth death`
-where `dim` is the dimension of the homological feature, `birth` and `death` are respectively the birth and death of the feature, and `p` is the characteristic of the field *Z/pZ* used for homology coefficients.
+where `dim` is the dimension of the homological feature, `birth` and `death` are respectively the birth and death of the feature,
+and `p` is the characteristic of the field *Z/pZ* used for homology coefficients.
+
+**Usage**
-*Usage*
`weak_witness_persistence [options] <OFF input file>`
-*Allowed options*
+**Allowed options**
* `-h [ --help ]` Produce help message
* `-l [ --landmarks ]` Number of landmarks to choose from the point cloud.
@@ -22,33 +38,28 @@ where `dim` is the dimension of the homological feature, `birth` and `death` are
* `-m [ --min-persistence ]` (default = 0) Minimal lifetime of homology feature to be recorded. Enter a negative value to see zero length intervals.
* `-d [ --cpx-dimension ]` (default = 2147483647) Maximal dimension of the weak witness complex we want to compute.
-*Example*
-`weak_witness_persistence data/points/tore3D_1307.off -l 20 -a 0.5 -m 0.006`
+**Example**
-outputs:
-```
-Successfully read 1307 points.
-Ambient dimension is 3.
-The complex contains 732 simplices and has dimension 8
-11 0 0 inf
-11 1 0 inf
-11 2 0.0275251 0.0534586
-11 1 0 0.0239952
-```
+`weak_witness_persistence data/points/tore3D_1307.off -l 20 -a 0.5 -m 0.006`
N.B.: output is random as the 20 landmarks are chosen randomly.
-## `strong_witness_persistence` ##
-This program computes the persistent homology with coefficient field *Z/pZ* of a Strong witness complex defined on a set of input points. The output diagram contains one bar per line, written with the convention:
+
+## strong_witness_persistence ##
+
+This program computes the persistent homology with coefficient field *Z/pZ* of a Strong witness complex defined on a set of input points.
+The output diagram contains one bar per line, written with the convention:
`p dim birth death`
-where `dim` is the dimension of the homological feature, `birth` and `death` are respectively the birth and death of the feature, and `p` is the characteristic of the field *Z/pZ* used for homology coefficients.
+where `dim` is the dimension of the homological feature, `birth` and `death` are respectively the birth and death of the feature,
+and `p` is the characteristic of the field *Z/pZ* used for homology coefficients.
+
+**Usage**
-*Usage*
`strong_witness_persistence [options] <OFF input file>`
-*Allowed options*
+**Allowed options**
* `-h [ --help ]` Produce help message
* `-l [ --landmarks ]` Number of landmarks to choose from the point cloud.
@@ -58,17 +69,8 @@ where `dim` is the dimension of the homological feature, `birth` and `death` are
* `-m [ --min-persistence ]` (default = 0) Minimal lifetime of homology feature to be recorded. Enter a negative value to see zero length intervals.
* `-d [ --cpx-dimension ]` (default = 2147483647) Maximal dimension of the weak witness complex we want to compute.
-*Example*
-`strong_witness_persistence data/points/tore3D_1307.off -l 20 -a 0.5 -m 0.06`
+**Example**
-outputs:
-```
-Successfully read 1307 points.
-Ambient dimension is 3.
-The complex contains 1836 simplices and has dimension 8
-11 0 0 inf
-11 1 0.00674748 inf
-11 2 0.0937751 0.235354
-```
+`strong_witness_persistence data/points/tore3D_1307.off -l 20 -a 0.5 -m 0.06`
N.B.: output is random as the 20 landmarks are chosen randomly.
diff --git a/src/common/utilities/README b/src/common/utilities/README
index 18fa8cc4..f39c63b8 100644
--- a/src/common/utilities/README
+++ b/src/common/utilities/README
@@ -1,19 +1,43 @@
-# Pointset generator #
+---
+layout: page
+title: "Pointset generator"
+meta_title: "pointsetgenerator"
+subheadline: ""
+teaser: ""
+permalink: "/pointsetgenerator/"
+---
+{::comment}
+These flags above are here for web site generation, please leave them as they are.
+cf. https://gitlab.inria.fr/GUDHI/website
+Must be in conformity with _data/navigation.yml
+{:/comment}
-## `off_file_from_shape_generator` ##
+## off_file_from_shape_generator ##
Generates a pointset and save it in an OFF file. Command-line is:
-`off_file_from_shape_generator on|in sphere|cube|curve|torus|klein <filename> <num_points> <dimension> <parameter1> <parameter2>...`
+
+```
+off_file_from_shape_generator on|in sphere|cube|curve|torus|klein <filename> <num_points> <dimension> <parameter1> <parameter2>...
+```
Warning: "on cube" generator is not available!
-Examples:
+**Examples**
+
+```
+off_file_from_shape_generator on sphere onSphere.off 1000 3 15.2
+```
+
+* Generates an onSphere.off file with 1000 points randomized on a sphere of dimension 3 and radius 15.2.
+
+```
+off_file_from_shape_generator in sphere inSphere.off 100 2
+```
+
+* Generates an inSphere.off file with 100 points randomized in a sphere of dimension 2 (circle) and radius 1.0 (default).
-* Generate an onSphere.off file with 1000 points randomized on a sphere of dimension 3 and radius 15.2:
-`off_file_from_shape_generator on sphere onSphere.off 1000 3 15.2`
-
-* Generate an inSphere.off file with 100 points randomized in a sphere of dimension 2 (circle) and radius 1.0 (default):
-`off_file_from_shape_generator in sphere inSphere.off 100 2`
+```
+off_file_from_shape_generator in cube inCube.off 10000 3 5.8
+```
-* Generates a inCube.off file with 10000 points randomized in a cube of dimension 3 and side 5.8:
-`off_file_from_shape_generator in cube inCube.off 10000 3 5.8`
+* Generates an inCube.off file with 10000 points randomized in a cube of dimension 3 and side 5.8.
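For quick experiments, a minimal 3D OFF file like the ones this generator writes can also be laid out by hand. A sketch of the basic layout (the `OFF` header, a `vertices faces edges` count line, then one coordinate line per point):

```python
def off_lines(points):
    """Lay out a point cloud in minimal OFF form: the 'OFF' header,
    the counts of vertices, faces and edges (no faces or edges for a
    bare point cloud), then one coordinate line per point."""
    lines = ["OFF", "%d 0 0" % len(points)]
    lines += [" ".join("%g" % c for c in p) for p in points]
    return "\n".join(lines)

print(off_lines([(0, 0, 1), (0, 1, 0), (1, 0, 0)]))
```

Higher-dimensional clouds use the nOFF variant instead, as noted in the usage sections above.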