Sliced OT Distances
Functions

Generates n_projections samples from the uniform distribution on the unit sphere of dimension \(d-1\): \(\mathcal{U}(\mathcal{S}^{d-1})\)
out (ndarray, shape (d, n_projections)) – The uniform unit vectors on the sphere
Examples
>>> n_projections = 100
>>> d = 5
>>> projs = get_random_projections(d, n_projections)
>>> np.allclose(np.sum(np.square(projs), 0), 1.)
True
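A standard way to draw such directions, and a reasonable mental model for this function, is to sample i.i.d. Gaussian entries and normalize each column. The following is a minimal NumPy sketch under that assumption (the helper name random_unit_directions is illustrative), not the library implementation itself.

import numpy as np

def random_unit_directions(d, n_projections, seed=None):
    # Sketch only: the columns of a Gaussian matrix, normalized to unit norm,
    # are uniformly distributed on the unit sphere S^{d-1}.
    rng = np.random.RandomState(seed)
    projections = rng.normal(0., 1., (d, n_projections))
    return projections / np.linalg.norm(projections, axis=0, keepdims=True)

projs = random_unit_directions(5, 100, seed=0)
print(np.allclose(np.sum(projs**2, axis=0), 1.))  # True, shape (d, n_projections)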
Computes a Monte-Carlo approximation of the max p-Sliced Wasserstein distance
\[\mathcal{Max-SWD}_p(\mu, \nu) = \underset{\theta \in \mathbb{S}^{d-1}}{\max} [\mathcal{W}_p^p(\theta_\# \mu, \theta_\# \nu)]^{\frac{1}{p}}\]
where:
\(\theta_\# \mu\) stands for the pushforward of \(\mu\) by the projection \(\mathbb{R}^d \ni X \mapsto \langle \theta, X \rangle\)
X_s (ndarray, shape (n_samples_a, dim)) – samples in the source domain
X_t (ndarray, shape (n_samples_b, dim)) – samples in the target domain
a (ndarray, shape (n_samples_a,), optional) – sample weights in the source domain
b (ndarray, shape (n_samples_b,), optional) – sample weights in the target domain
n_projections (int, optional) – Number of projections used for the Monte-Carlo approximation
p (float, optional (default=2)) – Power p used for computing the sliced Wasserstein
projections (shape (dim, n_projections), optional) – Projection matrix (n_projections and seed are not used in this case)
seed (int or RandomState or None, optional) – Seed used for random number generator
log (bool, optional) – if True, sliced_wasserstein_distance returns the projections used and their associated EMD.
cost (float) – Sliced Wasserstein Cost
log (dict, optional) – log dictionary, returned only if log==True in parameters
Examples
>>> n_samples_a = 20
>>> X = np.random.normal(0., 1., (n_samples_a, 5))
>>> sliced_wasserstein_distance(X, X, seed=0)
0.0
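Beyond the self-distance above, a quick hedged check on two shifted Gaussian clouds looks as follows; it assumes the max-sliced estimator is exposed as ot.sliced.max_sliced_wasserstein_distance, which may differ across POT versions.

import numpy as np
import ot

rng = np.random.RandomState(0)
X_s = rng.normal(0., 1., (200, 5))        # source samples
X_t = rng.normal(0., 1., (200, 5)) + 2.   # target samples with shifted mean

# Assumed entry point; check the ot.sliced module of your installation.
d_max = ot.sliced.max_sliced_wasserstein_distance(X_s, X_t, n_projections=100, p=2, seed=0)
print(d_max)  # positive, driven by the best discriminating direction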
References
Computes a Monte-Carlo approximation of the p-Sliced Wasserstein distance
\[\mathcal{SWD}_p(\mu, \nu) = \underset{\theta \sim \mathcal{U}(\mathbb{S}^{d-1})}{\mathbb{E}}\left(\mathcal{W}_p^p(\theta_\# \mu, \theta_\# \nu)\right)^{\frac{1}{p}}\]
where:
\(\theta_\# \mu\) stands for the pushforward of \(\mu\) by the projection \(X \in \mathbb{R}^d \mapsto \langle \theta, X \rangle\)
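To make the Monte-Carlo approximation concrete, here is a minimal NumPy sketch of the estimator for two equal-size, uniformly weighted samples: draw random directions, project both clouds, compute the 1D \(\mathcal{W}_p^p\) between the sorted projections, then average over directions and take the \(1/p\) root. This is an illustrative reimplementation, not the library code.

import numpy as np

def sliced_wasserstein_mc(X_s, X_t, n_projections=100, p=2, seed=None):
    # Monte-Carlo sliced Wasserstein for two equal-size, uniformly
    # weighted samples X_s, X_t of shape (n, d).
    rng = np.random.RandomState(seed)
    d = X_s.shape[1]
    # Random directions theta ~ U(S^{d-1}): normalized Gaussian columns.
    thetas = rng.normal(size=(d, n_projections))
    thetas /= np.linalg.norm(thetas, axis=0, keepdims=True)
    # Project both clouds onto every direction and sort the projections;
    # the 1D W_p^p between equal-size uniform samples is the mean |x_(i) - y_(i)|^p.
    proj_s = np.sort(X_s @ thetas, axis=0)
    proj_t = np.sort(X_t @ thetas, axis=0)
    wp_p = np.mean(np.abs(proj_s - proj_t) ** p, axis=0)
    # Average over directions, then take the 1/p root.
    return np.mean(wp_p) ** (1.0 / p)

rng = np.random.RandomState(0)
X_s = rng.normal(0., 1., (300, 5))
X_t = rng.normal(0., 1., (300, 5)) + 1.
print(sliced_wasserstein_mc(X_s, X_t, seed=0))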
X_s (ndarray, shape (n_samples_a, dim)) – samples in the source domain
X_t (ndarray, shape (n_samples_b, dim)) – samples in the target domain
a (ndarray, shape (n_samples_a,), optional) – sample weights in the source domain
b (ndarray, shape (n_samples_b,), optional) – sample weights in the target domain
n_projections (int, optional) – Number of projections used for the Monte-Carlo approximation
p (float, optional (default=2)) – Power p used for computing the sliced Wasserstein
projections (shape (dim, n_projections), optional) – Projection matrix (n_projections and seed are not used in this case)
seed (int or RandomState or None, optional) – Seed used for random number generator
log (bool, optional) – if True, sliced_wasserstein_distance returns the projections used and their associated EMD.
cost (float) – Sliced Wasserstein Cost
log (dict, optional) – log dictionary, returned only if log==True in parameters
Examples
>>> n_samples_a = 20
>>> X = np.random.normal(0., 1., (n_samples_a, 5))
>>> sliced_wasserstein_distance(X, X, seed=0)
0.0
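A hedged usage sketch with two different samples and log=True, which also returns the drawn projections; it assumes the function is exposed as ot.sliced_wasserstein_distance and that the log dictionary contains a 'projections' entry (key name assumed, it may differ across versions).

import numpy as np
import ot

rng = np.random.RandomState(0)
X_s = rng.normal(0., 1., (300, 5))
X_t = rng.normal(0., 1., (300, 5)) + 1.

# Monte-Carlo estimate with 200 random directions.
cost, log = ot.sliced_wasserstein_distance(X_s, X_t, n_projections=200, p=2, seed=0, log=True)
print(cost)
print(log["projections"].shape)  # assumed log key holding the projection matrix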
References
Compute the spherical sliced-Wasserstein discrepancy.
\[SSW_p(\mu,\nu) = \left(\int_{\mathbb{V}_{d,2}} W_p^p(P^U_\#\mu, P^U_\#\nu)\ \mathrm{d}\sigma(U)\right)^{\frac{1}{p}}\]
where:
\(P^U_\# \mu\) stands for the pushforward of \(\mu\) by the projection \(\forall x\in S^{d-1},\ P^U(x) = \frac{U^Tx}{\|U^Tx\|_2}\)
The function runs on the backend of the input arrays, but the TensorFlow and JAX backends are not supported.
X_s (ndarray, shape (n_samples_a, dim)) – Samples in the source domain
X_t (ndarray, shape (n_samples_b, dim)) – Samples in the target domain
a (ndarray, shape (n_samples_a,), optional) – sample weights in the source domain
b (ndarray, shape (n_samples_b,), optional) – sample weights in the target domain
n_projections (int, optional) – Number of projections used for the Monte-Carlo approximation
p (float, optional (default=2)) – Power p used for computing the spherical sliced Wasserstein
projections (shape (n_projections, dim, 2), optional) – Projection matrix (n_projections and seed are not used in this case)
seed (int or RandomState or None, optional) – Seed used for random number generator
log (bool, optional) – if True, sliced_wasserstein_sphere returns the projections used and their associated EMD.
cost (float) – Spherical Sliced Wasserstein Cost
log (dict, optional) – log dictionary, returned only if log==True in parameters
Examples
>>> n_samples_a = 20
>>> X = np.random.normal(0., 1., (n_samples_a, 5))
>>> X = X / np.sqrt(np.sum(X**2, -1, keepdims=True))
>>> sliced_wasserstein_sphere(X, X, seed=0)
0.0
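The sketch below compares two point clouds on \(S^2\), each obtained by normalizing noisy samples around a different center so that all inputs lie on the unit sphere; it assumes sliced_wasserstein_sphere is exposed at the top level of ot (the helper sphere_sample is illustrative).

import numpy as np
import ot

rng = np.random.RandomState(0)

def sphere_sample(center, n=200, scale=0.3):
    # Illustrative helper: noisy points around a center, projected back
    # onto the unit sphere so that all inputs satisfy ||x||_2 = 1.
    x = center + scale * rng.normal(size=(n, 3))
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

X_s = sphere_sample(np.array([1., 0., 0.]))
X_t = sphere_sample(np.array([0., 0., 1.]))

ssw = ot.sliced_wasserstein_sphere(X_s, X_t, n_projections=100, p=2, seed=0)
print(ssw)  # strictly positive since the two clouds are centered differently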
References
Compute the 2-spherical sliced Wasserstein distance w.r.t. a uniform distribution.
\[SSW_2(\mu_n, \nu)\]
where
\(\mu_n=\sum_{i=1}^n \alpha_i \delta_{x_i}\)
\(\nu=\mathrm{Unif}(S^1)\)
X_s (ndarray, shape (n_samples_a, dim)) – Samples in the source domain
a (ndarray, shape (n_samples_a,), optional) – sample weights in the source domain
n_projections (int, optional) – Number of projections used for the Monte-Carlo approximation
seed (int or RandomState or None, optional) – Seed used for random number generator
log (bool, optional) – if True, sliced_wasserstein_sphere_unif returns the projections used and their associated EMD.
cost (float) – Spherical Sliced Wasserstein Cost
log (dict, optional) – log dictionary, returned only if log==True in parameters
Examples
>>> np.random.seed(42)
>>> x0 = np.random.randn(500,3)
>>> x0 = x0 / np.sqrt(np.sum(x0**2, -1, keepdims=True))
>>> ssw = sliced_wasserstein_sphere_unif(x0, seed=42)
>>> np.allclose(sliced_wasserstein_sphere_unif(x0, seed=42), 0.01734, atol=1e-3)
True

References
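As a hedged illustration of the estimator's behaviour, the sketch below checks that the discrepancy to the uniform distribution shrinks as the sample grows larger and more uniform; it assumes sliced_wasserstein_sphere_unif is exposed at the top level of ot.

import numpy as np
import ot

rng = np.random.RandomState(0)

for n in (100, 1000, 10000):
    # Uniform-ish samples on S^2: normalized isotropic Gaussians.
    x = rng.normal(size=(n, 3))
    x = x / np.linalg.norm(x, axis=-1, keepdims=True)
    ssw = ot.sliced_wasserstein_sphere_unif(x, n_projections=200, seed=0)
    print(n, ssw)  # the discrepancy should decrease as n grows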