14 changes: 7 additions & 7 deletions doc/gallery/applications/normalizing_flows_in_pytensor.ipynb
@@ -34,7 +34,7 @@
"- $f(\\cdot)$ if the (known!) PDF of the variable $X$\n",
"- $G(\\cdot)$ is a function with nice properties.\n",
"\n",
"The \"nice properties\" require (in the most general case) that $G(x)$ is a $C^1$ diffeomorphism, which means that it is 1) continuous and differentiable almost everywhere; 2) it is bijective, and 3) its derivaties are also bijective. \n",
"The \"nice properties\" require (in the most general case) that $G(x)$ is a $C^1$ diffeomorphism, which means that it is 1) continuous and differentiable almost everywhere; 2) it is bijective, and 3) its derivatives are also bijective. \n",
"\n",
"A simpler requirement is that $G(x)$ is continuous, bijective, and monotonic. That will get us 99% of the way there. Hey, $\\exp$ is continuous, bijective, and monotonic -- what a coincidence!\n"
]
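Aside (not part of the PR diff): a minimal NumPy/SciPy check of the change-of-variables identity the cell above describes, using $G = \exp$ and a standard normal base variable purely as an illustration; the values below are arbitrary.

```python
import numpy as np
from scipy import stats

# Z = G(X) = exp(X) with X ~ Normal(0, 1). The change-of-variables formula gives
#   log p_Z(z) = log f(G^{-1}(z)) + log |d/dz G^{-1}(z)|, with G^{-1}(z) = ln(z)
z = np.array([0.1, 0.5, 1.0, 2.0, 5.0])
by_hand = stats.norm.logpdf(np.log(z)) - np.log(z)  # |d/dz ln(z)| = 1/z

# scipy's lognorm with s=1, scale=1 is exactly the distribution of exp(Normal(0, 1))
reference = stats.lognorm(s=1.0).logpdf(z)
print(np.allclose(by_hand, reference))  # True
```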
@@ -412,7 +412,7 @@
],
"source": [
"z_values = pt.dvector(\"z_values\")\n",
"# The funtion `pm.logp` does the magic!\n",
"# The function `pm.logp` does the magic!\n",
"z_logp = pm.logp(z, z_values, jacobian=True)\n",
"# We do this rewrite to make the computation more stable.\n",
"rewrite_graph(z_logp).dprint()"
@@ -668,7 +668,7 @@
"id": "5f9a7a50",
"metadata": {},
"source": [
"Theese distribution are essentially the same."
"These distribution are essentially the same."
]
},
{
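Aside (not part of the PR diff): a sketch of how the `z_logp` graph built in the cell above could be evaluated. It assumes `z`, `z_values`, and the PyTensor/PyMC imports from earlier cells that are not shown in this diff; the input values are arbitrary illustrations.

```python
import numpy as np
import pytensor

# Compile the log-probability graph into a callable function.
logp_fn = pytensor.function([z_values], z_logp)
print(logp_fn(np.array([0.5, 1.0, 2.0])))

# Quick one-off evaluation without compiling, for comparison.
print(z_logp.eval({z_values: np.array([0.5, 1.0, 2.0])}))
```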
@@ -715,7 +715,7 @@
"\n",
"So, the inverse of their composition is $G^{-1} \\equiv (J^{-1} \\circ H^{-1}) = J^{-1}(H^{-1}(x)) = J^{-1}(\\ln(x)) = \\frac{\\ln(x) - a}{b}$\n",
"\n",
"For the correction term, we need the determinant of the jacobian. Since $G$ is a scalar function, this is just the absolutel value of the gradient:\n",
"For the correction term, we need the determinant of the jacobian. Since $G$ is a scalar function, this is just the absolute value of the gradient:\n",
"\n",
"$$\\left | \\frac{\\partial}{\\partial x}G^{-1} \\right | = \\left | \\frac{\\partial}{\\partial x} \\frac{\\ln(x) - a}{b} \\right | = \\left | \\frac{1}{b} \\cdot \\frac{1}{x} \\right | $$\n",
"\n",
@@ -733,7 +733,7 @@
"source": [
"### Solution by hand\n",
"\n",
"We now implement theis analytic procesure in PyTensor:"
"We now implement this analytic procedure in PyTensor:"
]
},
{
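Aside (not part of the PR diff): a hedged sketch of how the hand-derived density above could be assembled in PyTensor. It assumes, as the composition $G^{-1} = J^{-1} \circ H^{-1}$ suggests, a standard normal base variable and $b > 0$; the parameter values are illustrative and the notebook's actual cell is not shown in this diff.

```python
import numpy as np
import pytensor
import pytensor.tensor as pt
from scipy import stats

a, b = 0.5, 1.3                      # illustrative values only
x = pt.dvector("x")

g_inv = (pt.log(x) - a) / b                               # G^{-1}(x) = (ln(x) - a) / b
base_logpdf = -0.5 * g_inv**2 - 0.5 * np.log(2 * np.pi)   # standard normal log-density at G^{-1}(x)
log_jacobian = -np.log(b) - pt.log(x)                     # log|1 / (b x)|, assuming b > 0

logp_by_hand = pytensor.function([x], base_logpdf + log_jacobian)

# Cross-check against scipy's lognormal parametrization (mu = a, sigma = b).
xs = np.array([0.5, 1.0, 2.0, 5.0])
print(np.allclose(logp_by_hand(xs), stats.lognorm(s=b, scale=np.exp(a)).logpdf(xs)))  # True
```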
@@ -803,7 +803,7 @@
"id": "bcd081d3",
"metadata": {},
"source": [
"We can verify these values are exaclty what we are expecting:"
"We can verify these values are exactly what we are expecting:"
]
},
{
@@ -859,7 +859,7 @@
"id": "46834a6f",
"metadata": {},
"source": [
"As above let's verify taht the results are consistent and correct:"
"As above let's verify that the results are consistent and correct:"
]
},
{