Commit abe3a47

Merge branch 'master' into better-dist
2 parents e818733 + cbdf979

45 files changed (+3172 / -1465 lines)

.github/workflows/labeler.yml

Lines changed: 2 additions & 2 deletions
@@ -1,7 +1,7 @@
 name: "Pull Request Labeler"
 
 on:
-  pull_request:
+  pull_request_target:
    branches:
      - 'master'
 jobs:
@@ -11,4 +11,4 @@ jobs:
      pull-requests: write
    runs-on: ubuntu-latest
    steps:
-      - uses: actions/labeler@v5
+      - uses: actions/labeler@v5

CONTRIBUTORS.md

Lines changed: 1 addition & 1 deletion
@@ -48,7 +48,7 @@ The contributors to this library are:
 * [Camille Le Coz](https://www.linkedin.com/in/camille-le-coz-8593b91a1/) (EMD2 debug)
 * [Eduardo Fernandes Montesuma](https://eddardd.github.io/my-personal-blog/) (Free support sinkhorn barycenter)
 * [Theo Gnassounou](https://github.com/tgnassou) (OT between Gaussian distributions)
-* [Clément Bonet](https://clbonet.github.io) (Wassertstein on circle, Spherical Sliced-Wasserstein)
+* [Clément Bonet](https://clbonet.github.io) (Wasserstein on circle, Spherical Sliced-Wasserstein)
 * [Ronak Mehta](https://ronakrm.github.io) (Efficient Discrete Multi Marginal Optimal Transport Regularization)
 * [Xizheng Yu](https://github.com/x12hengyu) (Efficient Discrete Multi Marginal Optimal Transport Regularization)
 * [Sonia Mazelet](https://github.com/SoniaMaz8) (Template based GNN layers)

README.md

Lines changed: 6 additions & 3 deletions
@@ -40,8 +40,7 @@ POT provides the following generic OT solvers (links to examples):
 * [Sampled solver of Gromov Wasserstein](https://pythonot.github.io/auto_examples/gromov/plot_gromov.html) for large-scale problem with any loss functions [33]
 * Non regularized [free support Wasserstein barycenters](https://pythonot.github.io/auto_examples/barycenters/plot_free_support_barycenter.html) [20].
 * [One dimensional Unbalanced OT](https://pythonot.github.io/auto_examples/unbalanced-partial/plot_UOT_1D.html) with KL relaxation and [barycenter](https://pythonot.github.io/auto_examples/unbalanced-partial/plot_UOT_barycenter_1D.html) [10, 25]. Also [exact unbalanced OT](https://pythonot.github.io/auto_examples/unbalanced-partial/plot_unbalanced_ot.html) with KL and quadratic regularization and the [regularization path of UOT](https://pythonot.github.io/auto_examples/unbalanced-partial/plot_regpath.html) [41]
-* [Partial Wasserstein and Gromov-Wasserstein](https://pythonot.github.io/auto_examples/unbalanced-partial/plot_partial_wass_and_gromov.html) (exact [29] and entropic [3]
-formulations).
+* [Partial Wasserstein and Gromov-Wasserstein](https://pythonot.github.io/auto_examples/unbalanced-partial/plot_partial_wass_and_gromov.html) and [Partial Fused Gromov-Wasserstein](https://pythonot.github.io/auto_examples/gromov/plot_partial_fgw.html) (exact [29] and entropic [3] formulations).
 * [Sliced Wasserstein](https://pythonot.github.io/auto_examples/sliced-wasserstein/plot_variance.html) [31, 32] and Max-sliced Wasserstein [35] that can be used for gradient flows [36].
 * [Wasserstein distance on the circle](https://pythonot.github.io/auto_examples/plot_compute_wasserstein_circle.html) [44, 45]
 * [Spherical Sliced Wasserstein](https://pythonot.github.io/auto_examples/sliced-wasserstein/plot_variance_ssw.html) [46]
@@ -51,7 +50,7 @@ POT provides the following generic OT solvers (links to examples):
 * [Efficient Discrete Multi Marginal Optimal Transport Regularization](https://pythonot.github.io/auto_examples/others/plot_demd_gradient_minimize.html) [50].
 * [Several backends](https://pythonot.github.io/quickstart.html#solving-ot-with-multiple-backends) for easy use of POT with [Pytorch](https://pytorch.org/)/[jax](https://github.com/google/jax)/[Numpy](https://numpy.org/)/[Cupy](https://cupy.dev/)/[Tensorflow](https://www.tensorflow.org/) arrays.
 * [Smooth Strongly Convex Nearest Brenier Potentials](https://pythonot.github.io/auto_examples/others/plot_SSNB.html#sphx-glr-auto-examples-others-plot-ssnb-py) [58], with an extension to bounding potentials using [59].
-* Gaussian Mixture Model OT [69]
+* [Gaussian Mixture Model OT](https://pythonot.github.io/auto_examples/others/plot_GMMOT_plan.html#sphx-glr-auto-examples-others-plot-gmmot-plan-py) [69].
 * [Co-Optimal Transport](https://pythonot.github.io/auto_examples/others/plot_COOT.html) [49] and
 [unbalanced Co-Optimal Transport](https://pythonot.github.io/auto_examples/others/plot_learning_weights_with_COOT.html) [71].
 * Fused unbalanced Gromov-Wasserstein [70].
@@ -391,3 +390,7 @@ Artificial Intelligence.
 [72] Thibault Séjourné, François-Xavier Vialard, and Gabriel Peyré (2021). [The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation](https://proceedings.neurips.cc/paper/2021/file/4990974d150d0de5e6e15a1454fe6b0f-Paper.pdf). Neural Information Processing Systems (NeurIPS).
 
 [73] Séjourné, T., Vialard, F. X., & Peyré, G. (2022). [Faster Unbalanced Optimal Transport: Translation Invariant Sinkhorn and 1-D Frank-Wolfe](https://proceedings.mlr.press/v151/sejourne22a.html). In International Conference on Artificial Intelligence and Statistics (pp. 4995-5021). PMLR.
+
+[74] Chewi, S., Maunu, T., Rigollet, P., & Stromme, A. J. (2020). [Gradient descent algorithms for Bures-Wasserstein barycenters](https://proceedings.mlr.press/v125/chewi20a.html). In Conference on Learning Theory (pp. 1276-1304). PMLR.
+
+[75] Altschuler, J., Chewi, S., Gerber, P. R., & Stromme, A. (2021). [Averaging on the Bures-Wasserstein manifold: dimension-free convergence of gradient descent](https://papers.neurips.cc/paper_files/paper/2021/hash/b9acb4ae6121c941324b2b1d3fac5c30-Abstract.html). Advances in Neural Information Processing Systems, 34, 22132-22145.
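
The Bures-Wasserstein references [74, 75] added above relate to the Gaussian OT tooling listed in RELEASES.md below (`ot.gaussian.bures_wasserstein_distance`, PR #680). A minimal sketch of how such a call could look; the `(ms, mt, Cs, Ct)` argument order and the returned quantity are assumptions, not taken from this commit:

# Hedged sketch: Bures-Wasserstein distance between two Gaussians with POT.
# Assumes ot.gaussian.bures_wasserstein_distance(ms, mt, Cs, Ct) as named in PR #680.
import numpy as np
import ot

ms, mt = np.zeros(2), np.ones(2)              # means of the two Gaussians
Cs = np.eye(2)                                # source covariance
Ct = np.array([[2.0, 0.5], [0.5, 1.0]])       # target covariance

# Closed form: W_2^2 = ||ms - mt||^2 + Tr(Cs + Ct - 2 (Cs^{1/2} Ct Cs^{1/2})^{1/2})
d = ot.gaussian.bures_wasserstein_distance(ms, mt, Cs, Ct)
print(float(d))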

RELEASES.md

Lines changed: 10 additions & 0 deletions
@@ -6,12 +6,22 @@
 - Implement CG solvers for partial FGW (PR #687)
 - Added feature `grad=last_step` for `ot.solvers.solve` (PR #693)
 - Automatic PR labeling and release file update check (PR #704)
+- Reorganize sub-module `ot/lp/__init__.py` into separate files (PR #714)
+- Fix warning raise when import the library (PR #716)
+- Implement projected gradient descent solvers for entropic partial FGW (PR #702)
+- Fix documentation in the module `ot.gaussian` (PR #718)
+- Refactored `ot.bregman._convolutional` to improve readability (PR #709)
+- Added `ot.gaussian.bures_barycenter_gradient_descent` (PR #680)
+- Added `ot.gaussian.bures_wasserstein_distance` (PR #680)
+- `ot.gaussian.bures_wasserstein_distance` can be batched (PR #680)
 
 #### Closed issues
 - Fixed `ot.mapping` solvers which depended on deprecated `cvxpy` `ECOS` solver (PR #692, Issue #668)
 - Fixed numerical errors in `ot.gmm` (PR #690, Issue #689)
 - Add version number to the documentation (PR #696)
 - Update doc for default regularization in `ot.unbalanced` sinkhorn solvers (Issue #691, PR #700)
+- Clean documentation for `gromov`, `lp` and `unbalanced` folders (PR #710)
+- Clean references in documentation (PR #722)
 
 ## 0.9.5
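
The `grad=last_step` entry above (PR #693) concerns how `ot.solvers.solve` exposes gradients when used with an autodiff backend. A hedged sketch of a call exercising it; only the keyword name comes from the changelog line, the torch usage and the exact behaviour below are assumptions:

# Hedged sketch: differentiating an entropic OT value through ot.solve,
# asking (per the changelog entry) to backpropagate only through the last
# Sinkhorn step. Assumes the torch backend and the grad="last_step" keyword.
import torch
import ot

xs = torch.randn(50, 2, requires_grad=True)   # source samples
xt = torch.randn(60, 2)                       # target samples
M = ot.dist(xs, xt)                           # squared Euclidean cost matrix

res = ot.solve(M, reg=1.0, grad="last_step")  # uniform weights by default
res.value.backward()                          # gradient w.r.t. xs via the cost
print(xs.grad.shape)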

examples/gromov/plot_barycenter_fgw.py

Lines changed: 5 additions & 5 deletions
@@ -91,7 +91,7 @@ def build_noisy_circular_graph(
     g = nx.Graph()
     g.add_nodes_from(list(range(N)))
     for i in range(N):
-        noise = float(np.random.normal(mu, sigma, 1))
+        noise = np.random.normal(mu, sigma, 1)[0]
         if with_noise:
             g.add_node(i, attr_name=math.sin((2 * i * math.pi / N)) + noise)
         else:
@@ -107,7 +107,7 @@ def build_noisy_circular_graph(
         if i == N - 1:
             g.add_edge(i, 1)
     g.add_edge(N, 0)
-    noise = float(np.random.normal(mu, sigma, 1))
+    noise = np.random.normal(mu, sigma, 1)[0]
     if with_noise:
         g.add_node(N, attr_name=math.sin((2 * N * math.pi / N)) + noise)
     else:
@@ -157,7 +157,7 @@ def graph_colors(nx_graph, vmin=0, vmax=7):
     plt.subplot(3, 3, i + 1)
     g = X0[i]
     pos = nx.kamada_kawai_layout(g)
-    nx.draw(
+    nx.draw_networkx(
         g,
         pos=pos,
         node_color=graph_colors(g, vmin=-1, vmax=1),
@@ -173,7 +173,7 @@ def graph_colors(nx_graph, vmin=0, vmax=7):
 
 # %% We compute the barycenter using FGW. Structure matrices are computed using the shortest_path distance in the graph
 # Features distances are the euclidean distances
-Cs = [shortest_path(nx.adjacency_matrix(x).todense()) for x in X0]
+Cs = [shortest_path(nx.adjacency_matrix(x).toarray()) for x in X0]
 ps = [np.ones(len(x.nodes())) / len(x.nodes()) for x in X0]
 Ys = [
     np.array([v for (k, v) in nx.get_node_attributes(x, "attr_name").items()]).reshape(
@@ -199,7 +199,7 @@ def graph_colors(nx_graph, vmin=0, vmax=7):
 
 # %%
 pos = nx.kamada_kawai_layout(bary)
-nx.draw(
+nx.draw_networkx(
     bary, pos=pos, node_color=graph_colors(bary, vmin=-1, vmax=1), with_labels=False
 )
 plt.suptitle("Barycenter", fontsize=20)
