diff --git a/docs/source/contributing/developer_guide_implementing_distribution.md b/docs/source/contributing/developer_guide_implementing_distribution.md index 32fa236400..335d676db0 100644 --- a/docs/source/contributing/developer_guide_implementing_distribution.md +++ b/docs/source/contributing/developer_guide_implementing_distribution.md @@ -1,3 +1,4 @@ +(implementing-a-distribution)= # Implementing a Distribution This guide provides an overview on how to implement a distribution for PyMC version `>=4.0.0`. diff --git a/docs/source/learn/core_notebooks/index.md b/docs/source/learn/core_notebooks/index.md index 480d8d957b..d849b078da 100644 --- a/docs/source/learn/core_notebooks/index.md +++ b/docs/source/learn/core_notebooks/index.md @@ -9,4 +9,5 @@ GLM_linear model_comparison posterior_predictive dimensionality +pymc_aesara ::: diff --git a/docs/source/learn/core_notebooks/pymc_aesara.ipynb b/docs/source/learn/core_notebooks/pymc_aesara.ipynb new file mode 100644 index 0000000000..329a3aa6d4 --- /dev/null +++ b/docs/source/learn/core_notebooks/pymc_aesara.ipynb @@ -0,0 +1,2378 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "(pymc_aesara)=\n", + "\n", + "# PyMC and Aesara\n", + "\n", + "**Authors:** [Ricardo Vieira](https://github.com/ricardoV94) and [Juan Orduz](https://juanitorduz.github.io/)\n", + "\n", + "In this notebook we want to give an introduction of how PyMC models translate to Aesara graphs. The purpose is not to give a detailed description of all [`aesara`](https://github.com/aesara-devs/aesara)'s capabilities but rather focus on the main concepts to understand its connection with PyMC. For a more detailed description of the project please refer to the official documentation." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "heading_collapsed": true, + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "## Prepare Notebook\n", + "\n", + "First import the required libraries." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "hidden": true, + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "# Aesara version: 2.6.6\n", + "# PyMC version: 4.0.0\n", + "\n" + ] + } + ], + "source": [ + "import aesara\n", + "import aesara.tensor as at\n", + "import pymc as pm\n", + "import matplotlib.pyplot as plt\n", + "import numpy as np\n", + "import scipy.stats\n", + "\n", + "\n", + "print(f\"\"\"\n", + "# Aesara version: {aesara.__version__}\n", + "# PyMC version: {pm.__version__}\n", + "\"\"\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "## Introduction to Aesara\n", + "\n", + "We start by looking into `aesara`. According to their documentation\n", + "\n", + "> Aesara is a Python library that allows one to define, optimize, and efficiently evaluate mathematical expressions involving multi-dimensional arrays." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "![aesara logo](https://raw.githubusercontent.com/aesara-devs/aesara/main/doc/images/aesara_logo_2400.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "### A simple example\n", + "\n", + "To begin, we define some aesara tensors and show how to perform some basic operations." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "x type: TensorType(float64, ())\n", + "x name = x\n", + "---\n", + "y type: TensorType(float64, (None,))\n", + "y name = y\n", + "\n" + ] + } + ], + "source": [ + "x = at.scalar(name=\"x\")\n", + "y = at.vector(name=\"y\")\n", + "\n", + "print(f\"\"\"\n", + "x type: {x.type}\n", + "x name = {x.name}\n", + "---\n", + "y type: {y.type}\n", + "y name = {y.name}\n", + "\"\"\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Now that we have defined the `x` and `y` tensors, we can create a new one by adding them together." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [], + "source": [ + "z = x + y\n", + "z.name = \"x + y\"" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "To make the computation a bit more complex, let us take the logarithm of the resulting tensor." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [], + "source": [ + "w = at.log(z)\n", + "w.name = \"log(x + y)\"" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "We can use the {func}`~aesara.dprint` function to print the computational graph of any given tensor." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Elemwise{log,no_inplace} [id A] 'log(x + y)'\n", + " |Elemwise{add,no_inplace} [id B] 'x + y'\n", + " |InplaceDimShuffle{x} [id C]\n", + " | |x [id D]\n", + " |y [id E]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "aesara.dprint(obj=w)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Note that this graph does not do any computation (yet!). It is simply defining the sequence of steps to be done. We can use {func}`~aesara.function` to define a callable object so that we can push values through the graph."
+ ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [], + "source": [ + "f = aesara.function(inputs=[x, y], outputs=w)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Now that the graph is compiled, we can push some concrete values:" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([0., 1.])" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "f(x=0, y=[1, np.e])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + ":::{tip}\n", + "Sometimes we just want to debug, we can use {meth}`~aesara.graph.basic.Variable.eval` for that:\n", + ":::" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([0., 1.])" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "w.eval({x: 0, y:[1, np.e]})" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "You can set intermediate values as well" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([0., 1.])" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "w.eval({z: [1, np.e]})" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "### Aesara is clever!\n", + "\n", + "One of the most important features of `aesara` is that it can automatically optimize the mathematical operations inside a graph. Let's consider a simple example:" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Elemwise{true_div,no_inplace} [id A] 'a / b'\n", + " |a [id B]\n", + " |b [id C]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 10, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "a = at.scalar(name=\"a\")\n", + "b = at.scalar(name=\"b\")\n", + "\n", + "c = a / b\n", + "c.name = \"a / b\"\n", + "\n", + "aesara.dprint(c)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Now let us multiply `b` times `c`. This should result in simply `a`." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Elemwise{mul,no_inplace} [id A] 'b * c'\n", + " |b [id B]\n", + " |Elemwise{true_div,no_inplace} [id C] 'a / b'\n", + " |a [id D]\n", + " |b [id B]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "d = b * c\n", + "d.name = \"b * c\"\n", + "\n", + "aesara.dprint(d)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "The graph shows the full computation, but once we compile it the operation becomes the identity on `a` as expected." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "DeepCopyOp [id A] 'a' 0\n", + " |a [id B]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "g = aesara.function(inputs=[a, b], outputs=d)\n", + "\n", + "aesara.dprint(g)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "### What is in an Aesara graph?\n", + "\n", + "The following diagram shows the basic structure of an `aesara` graph.\n", + "\n", + "![aesara graph](https://raw.githubusercontent.com/aesara-devs/aesara/main/doc/tutorial/apply.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "We can make these concepts more tangible by explicitly indicating them in the first example from the section above. Let us compute the graph components for the tensor `z`. " + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "z type: TensorType(float64, (None,))\n", + "z name = x + y\n", + "z owner = Elemwise{add,no_inplace}(InplaceDimShuffle{x}.0, y)\n", + "z owner inputs = [InplaceDimShuffle{x}.0, y]\n", + "z owner op = Elemwise{add,no_inplace}\n", + "z owner output = [x + y]\n", + "\n" + ] + } + ], + "source": [ + "print(f\"\"\"\n", + "z type: {z.type}\n", + "z name = {z.name}\n", + "z owner = {z.owner}\n", + "z owner inputs = {z.owner.inputs}\n", + "z owner op = {z.owner.op}\n", + "z owner output = {z.owner.outputs}\n", + "\"\"\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "The following code snippet helps us understand these concepts by going through the computational graph of `w`. The actual code is not as important here; the focus is on the outputs."
+ ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "---\n", + "Checking variable log(x + y) of type TensorType(float64, (None,))\n", + " > Op is Elemwise{log,no_inplace}\n", + " > Input 0 is x + y\n", + "---\n", + "Checking variable x + y of type TensorType(float64, (None,))\n", + " > Op is Elemwise{add,no_inplace}\n", + " > Input 0 is InplaceDimShuffle{x}.0\n", + " > Input 1 is y\n", + "---\n", + "Checking variable InplaceDimShuffle{x}.0 of type TensorType(float64, (1,))\n", + " > Op is InplaceDimShuffle{x}\n", + " > Input 0 is x\n", + "---\n", + "Checking variable y of type TensorType(float64, (None,))\n", + " > y is a root variable\n", + "---\n", + "Checking variable x of type TensorType(float64, ())\n", + " > x is a root variable\n" + ] + } + ], + "source": [ + "# start from the top\n", + "stack = [w]\n", + "\n", + "while stack:\n", + " print(\"---\")\n", + " var = stack.pop(0)\n", + " print(f\"Checking variable {var} of type {var.type}\")\n", + " # check variable is not a root variable\n", + " if var.owner is not None:\n", + " print(f\" > Op is {var.owner.op}\")\n", + " # loop over the inputs\n", + " for i, input in enumerate(var.owner.inputs):\n", + " print(f\" > Input {i} is {input}\")\n", + " stack.append(input)\n", + " else:\n", + " print(f\" > {var} is a root variable\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Note that this is very similar to the output of {func}`~aesara.dprint` function introduced above." + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Elemwise{log,no_inplace} [id A] 'log(x + y)'\n", + " |Elemwise{add,no_inplace} [id B] 'x + y'\n", + " |InplaceDimShuffle{x} [id C]\n", + " | |x [id D]\n", + " |y [id E]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 15, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "aesara.dprint(w)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "### Graph manipulation 101\n", + "\n", + "Another interesting feature of Aesara is the ability to manipulate the computational graph, something that is not possible with TensorFlow or PyTorch. Here we continue with the example above in order to illustrate the main idea around this technique." + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[x, y]" + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# get input tensors\n", + "list(aesara.graph.graph_inputs(graphs=[w]))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "As a simple example, let's add an {func}`~aesara.tensor.exp` before the {func}`~aesara.tensor.log` (to get the identity function)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [], + "source": [ + "parent_of_w = w.owner.inputs[0] # get z tensor\n", + "new_parent_of_w = at.exp(parent_of_w) # modify the parent of w\n", + "new_parent_of_w.name = \"exp(x + y)\"" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Note that the graph of `w` has actually not changed:" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Elemwise{log,no_inplace} [id A] 'log(x + y)'\n", + " |Elemwise{add,no_inplace} [id B] 'x + y'\n", + " |InplaceDimShuffle{x} [id C]\n", + " | |x [id D]\n", + " |y [id E]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "aesara.dprint(w)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "To modify the graph we need to use the {func}`~aesara.clone_replace` function, which *returns a copy of the initial subgraph with the corresponding substitutions.*" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Elemwise{log,no_inplace} [id A] 'log(exp(x + y))'\n", + " |Elemwise{exp,no_inplace} [id B] 'exp(x + y)'\n", + " |Elemwise{add,no_inplace} [id C] 'x + y'\n", + " |InplaceDimShuffle{x} [id D]\n", + " | |x [id E]\n", + " |y [id F]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 19, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "new_w = aesara.clone_replace(output=[w], replace={parent_of_w: new_parent_of_w})[0]\n", + "new_w.name = \"log(exp(x + y))\"\n", + "aesara.dprint(new_w)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Finally, we can test the modified graph by passing some input to the new graph." + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([1. , 2.71828183])" + ] + }, + "execution_count": 20, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "new_w.eval({x: 0, y:[1, np.e]})" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "As expected, the new graph is just the identity function." 
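+ , + "\n", + "\n", + "We can also double-check that the original `w` graph was left untouched by {func}`~aesara.clone_replace` (a quick, optional sanity check reusing the same inputs as before):\n", + "\n", + "```python\n", + "# the original graph still computes log(x + y) for the same inputs\n", + "w.eval({x: 0, y: [1, np.e]})  # array([0., 1.])\n", + "```"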
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + ":::{note}\n", + "Again, note that `aesara` is clever enough to omit the `exp` and `log` once we compile the function.\n", + ":::" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Elemwise{add,no_inplace} [id A] 'x + y' 1\n", + " |InplaceDimShuffle{x} [id B] 0\n", + " | |x [id C]\n", + " |y [id D]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "f = aesara.function(inputs=[x, y], outputs=new_w)\n", + "\n", + "aesara.dprint(f)" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([1. , 2.71828183])" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "f(x=0, y=[1, np.e])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "### Aesara RandomVariables\n", + "\n", + "Now that we have seen aesara's basics we want to move in the direction of random variables.\n", + "\n", + "How do we generate random numbers in [`numpy`](https://github.com/numpy/numpy)? To illustrate it we can sample from a normal distribution:" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfQAAAF1CAYAAAAeOhj3AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAAfwUlEQVR4nO3deZxkZX3v8c+X1V2WaRGY0SGKJogmeieEJCQXJcYF4xCv4UJEUUm4GterURFURMVgTFyiiQmRVQhIQCMJeCPiFhNBB0VkU0ccZBCYEUQEFQV/949zJhZN90zPTFdV99Of9+vVr66z1Dm/U1Vd3zrP89TpVBWSJGl+22LcBUiSpM1noEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0DUWSd6c5LRZ3maSnJTk+0m+OJvbXkiSPD/J5zdi/VVJfq+/fWSSD85iLbcn+aX+9slJ3jaL2/77JG+cre1trtl+7LTwbDXuAjRaSfYB/hJ4DHA3cBXwyqr60lgLmx37AE8GFlfVHeMuZiGqqrfPZL0knwFOq6r1BlhVPWA26kryfOBPqmqfgW2/aDa2PVtm+thJ0zHQF5AkDwL+DXgxcBawDfA7wJ3jrGsWPRxYNV2YJ9mqqu4acU1D1+JxtXhM0rDZ5L6wPAqgqs6oqrur6sdV9YmqugwgySOSfCrJzUm+l+T0JNutu3PftPqaJJcluSPJCUl2SvLxJD9M8skk2/frLk1SSQ5P8t0kNyT58+kKS7J3kv9KcmuSrybZd2DZ85Nc0+/j20meM8X9DwM+CPxm30x7TJJ9k6xO8rokNwInJdk2yXv6mr7b396238a69V+bZE1f8wFJnp7kG0luSXLkeo5h/yRfSXJbkuuSvHk9667b16sH9vWCgeUPTnJqkrVJrk3yhiRbDDwe/5nk3UluBt7cN0f/Xf9c3N4vf2h/fN9PcnWSxw9s/4gk3+of0yuT/OF0tU5R+3P7mm5OctSkZf/dlZLkPklO69e7NcmX+tfLsXQfJN/f1/r+fv1K8pIk3wS+OTDvkQO7WJTkgr7uzyZ5eL/eutfbVgO1fCbJnyT5FeDv+cVr49Z++T2a8JP8aZKV/fN8bpJdBpZVkhcl+WZ/LH+bJNM8PpO3u2+S1QPTr0tyfX8MX0+y3xSP3brjOTTJd9L9PR41sI37Jjmlf26v6l+zq5nG+urPpO6vyY9l/zi+Ld3f5+1J/jXJjuneH27rn9elk/b18nR/s99L8s4kWyTZpn9sHzuw7kOS/CjJxHS1a+YM9IXlG8Dd/RvB09KH74AAfwHsAvwKsAR486R1/hdds/ajgD8APg4cCUzQvZ5ePmn9JwK7A78PvC59X+s9dprsCpwHvA3YAfhz4JwkE0nuD/wN8LSqeiDwW8Clk7dRVScALwK+UFUPqKqj+0UP7bf5cOBw4Chgb+DXgF8F9gLeMLCphwL3AXYF3gT8I3AI8D/oQuiNSXabvP/eHcDzgO2A/YEXJzlgmnXX7evB/b4OA/524Dl5X7/sl4D/2W/3BQP3/Q3gGmAn4Nh+3oH9sSyia3X5AvDlfvps4F0D9/9WfzwPBo4BTkuy83pqBSDJHsAHgOfSvU52BBZPs/qh/faX9Ou9CPhxVR0F/Afw0v65eunAfQ7oj22Pabb5HOCt/TFdCpy+oZqr6iru+drYborjehLda/9AYGfgWuDMSas9A/h14HH9ek/Z0L6n2M+jgZcCv96/np8CrFrPXfYBHg3sB7yp/3ACcDSwlO718WS61+iGbE79B9E957sCj6B7bZ1E97d1VV/PoD8ElgFPAJYDL6yqn9I9poO1
HgxcWFVrN6IWTcNAX0Cq6ja6N4iiC6q1/ZnITv3ylVV1QVXd2f+BvYsuTAa9r6puqqrr6d6UL66qr1TVT4CPAo+ftP4xVXVHVX2N7g3g4ClKOwQ4v6rOr6qfV9UFwArg6f3ynwN7JrlvVd1QVVdsxGH/HDi6P6Yf0wXCW6pqTX+Mx9C9Ua3zM+DYqvoZ3ZvPIuC9VfXDfr9X0n0QuJeq+kxVfa0/hsuAM7j34zfoZ30tP6uq84HbgUcn2ZLuDfT1/X5XAX89qc7vVtX7ququ/rgAPlpVlww8Fz+pqlOr6m7gwww8N1X1z1X13b7WD9OdEe+1gccS4NnAv1XV56rqTuCNdI/xdMe3I/DIvkXokv41uD5/UVW3DBzTZOcN7PsourPuJTOoe0OeA5xYVV/ut/36fttLB9Y5rqpurarvAJ+m+1C4se4GtgX2SLJ1Va2qqm+tZ/1j+pa0rwJf5RevvQOBt1fV96tqNd2H3g3ZnPpPqqpvVdUP6D7Ef6uqPtl3i/wz9/67f0f/PH4HeA+/+Ls/BTh4oHXjucCHNqIOrYeBvsBU1VVV9fyqWgzsSXeW9R6Avjn0zL458DbgNLpAG3TTwO0fTzE9eRDTdQO3r+33N9nDgT/qmwJv7ZtE9wF27vvD/zfdGdYNSc5L8sszP2LW9gG3zi59HdPVdHMfgOuOBzZ8jAAk+Y0kn07XTP6DvubJj9+gmyf1E/+o3/YiYOsp6tx1YHrwcV1nxs9NkucluXTg8d5zA7Wus8vgvvvn5+Zp1v0Q8O/Amem6N/4yydYb2P5UxzXl8qq6HbiFqV9TG+ser4t+2zdzz8f8xoHb656rjVJVK4FX0rV8ren/3tZX/3T7vMfzwIYft/VtayZm5e++qi7u971v/3f8SODcjahD62GgL2BVdTVwMt2bOcDb6c7eH1tVD6I7c56yn3AjDJ49PQz47hTrXAd8qKq2G/i5f1Ud19f571X1ZLqm0KvpWhdmavK/E/wu3QeIDdW0Kf6J7s1pSVU9mK7fdlMev+/Rnd1OrvP6gelN/jeJ6fqd/5Gu6XfHvgn6cmZW6w0MPKdJ7kd3Fn4vfcvDMVW1B11XyTPoug7WV/+Gjmtw3w+ga/L9Ll13B8D9BtZ96EZs9x6vi76rZ0fu+ZjP1B3rqYOq+qd+tP3D+7resQn7uIF7dnVsTivFeuvdROv7uz+F7r3lucDZkz5wazMY6AtIkl9ONwhrcT+9hK4p7KJ+lQfSNfv+oO/Xfs0s7PaNSe6X5DF0fcAfnmKd04A/SPKUJFumG0y1b5LFfavB8v4N9s6+vumaeGfiDOANff/8Irp+8tn6PvwDgVuq6idJ9gL+eFM20rcQnAUcm+SBfQC/ahbrvD9dkKwFSDcYb8/13uMXzgaekWSfJNsAb2Ga95EkT0zy2L4L4Ta6Dynrnrub6Pp/N9bTB/b9VuCiqrqu7z65Hjikfw29kK6vd52bgMX9/aZyBvCCJL+WbpDk2+m6k1ZtQo2X9nXukOShdGfkQNeHnuRJ/T5+Qnd2uymv57OA1yfZvv9bfemG7rCBen83ycOSPJiuu2FzvaavbQnwCu75d38aXR/7IcCps7Av9Qz0heWHdAOOLk5yB12QXw68ul9+DN0glh/QDVL7yCzs87PASuBC4K+q6hOTV6iq6+gGzhxJFzLX0X2Y2KL/eRXdJ/xb6PqkX7wZ9byNrn/+MuBrdIPGZutiJX8GvCXJD+k+KJy1Gdt6Gd2Z0zXA5+nO/k/c7AqBqrqSrk/+C3RB91jgP2d43yuAl/T13AB8H5hudPVD6T4A3EY3cOqz/KK/9L3As9ON0p5J/+86/0Q3AOsWuoGKgwOs/pTudXMz3XUW/mtg2aeAK4Abk3xviuP6JN14gHP643oE3TiGTfEhuv7uVcAnuGeYbQscR9cKcyPwEDYtQN9C97h/G/gk3eO8SV8/7cesfJjub+ISuq+2bq6P9du6lO695ISB/V1H93dXdONwNEtStcktd9K0+sFE3wa2Lr9PLA1VkhcDB1XV+gZhjqqWAnbvxwtMt86JdAM73zDdOtp4XlhGkuaZ/iuGv0TXyrI7XSvb+8da1Az1H/afxb1Hxmsz2eQuSfPPNsA/0HWjfYquifvvxlrRDCR5K1033zur6tvjrqc1NrlLktQAz9AlSWqAgS5JUgPm9aC4RYsW1dKlS8ddhiRJI3PJJZd8r6ru9Q9t5nWgL126lBUrVoy7DEmSRibJtVPNt8ldkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJasC8/m9rkjbf0iPOG9m+Vh23/8j2JS00nqFLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGeOlXaQ4a5eVYJbXBM3RJkhpgoEuS1AADXZKkBgwt0JOcmGRNkssnzX9ZkquTXJHkLwfmvz7JyiRfT/KUYdUlSVKLhjko7mTg/cCp62YkeSKwHPjVqrozyUP6+XsABwGPAXYBPpnkUVV19xDrkySpGUM7Q6+qzwG3TJr9YuC4qrqzX2dNP385cGZV3VlV3wZWAnsNqzZJkloz6j70RwG/k+TiJJ9N8uv9/F2B6wbWW93Pu5ckhydZkWTF2rVrh1yuJEnzw6gDfStgB2Bv4DXAWUmyMRuoquOrallVLZuYmBhGjZIkzTujDvTVwEeq80Xg58Ai4HpgycB6i/t5kiRpBkYd6P8CPBEgyaOAbYDvAecCByXZNsluwO7AF0dcmyRJ89bQRrknOQPYF1iUZDVwNHAicGL/VbafAodWVQFXJDkLuBK4C3iJI9wlSZq5oQV6VR08zaJDpln/WODYYdUjSVLLvFKcJEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGjC0QE9yYpI1SS6fYtmrk1SSRf10kvxNkpVJLkvyhGHVJUlSi7Ya4rZPBt4PnDo4M8kS4PeB7wzMfhqwe//zG8AH+t+SGrL0iPNGtq9Vx+0/sn1Jc8HQztCr6nPALVMsejfwWqAG5i0HTq3ORcB2SXYeVm2SJLVmpH3oSZYD11fVVyct2hW4bmB6dT9vqm0cnmRFkhVr164dUqWSJM0vIwv0JPcDjgTetDnbqarjq2pZVS2bmJiYneIkSZrnhtmHPtkjgN2AryYBWAx8OclewPXAkoF1F/fzJEnSDIzsDL2qvlZVD6mqpVW1lK5Z/QlVdSNwLvC8frT73sAPquqGUdUmSdJ8N8yvrZ0BfAF4dJLVSQ5bz+rnA9cAK4F/BP5sWHVJktSioTW5V9XBG1i+dOB2AS8ZVi2SJLXOK8VJktQAA12SpAYY6JIkNcBAlySpAQa
6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBW427AEkahqVHnDeyfa06bv+R7UuajmfokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaMLRAT3JikjVJLh+Y984kVye5LMlHk2w3sOz1SVYm+XqSpwyrLkmSWjTMM/STgadOmncBsGdVPQ74BvB6gCR7AAcBj+nv83dJthxibZIkNWVogV5VnwNumTTvE1V1Vz95EbC4v70cOLOq7qyqbwMrgb2GVZskSa0ZZx/6C4GP97d3Ba4bWLa6nydJkmZgLJd+TXIUcBdw+ibc93DgcICHPexhs1yZNL1RXkpUkjbWyM/QkzwfeAbwnKqqfvb1wJKB1Rb38+6lqo6vqmVVtWxiYmKotUqSNF+MNNCTPBV4LfDMqvrRwKJzgYOSbJtkN2B34IujrE2SpPlsaE3uSc4A9gUWJVkNHE03qn1b4IIkABdV1Yuq6ookZwFX0jXFv6Sq7h5WbZIktWZogV5VB08x+4T1rH8scOyw6pEkqWVeKU6SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNGFqgJzkxyZoklw/M2yHJBUm+2f/evp+fJH+TZGWSy5I8YVh1SZLUomGeoZ8MPHXSvCOAC6tqd+DCfhrgacDu/c/hwAeGWJckSc0ZWqBX1eeAWybNXg6c0t8+BThgYP6p1bkI2C7JzsOqTZKk1oy6D32nqrqhv30jsFN/e1fguoH1Vvfz7iXJ4UlWJFmxdu3a4VUqSdI8MrZBcVVVQG3C/Y6vqmVVtWxiYmIIlUmSNP+MOtBvWteU3v9e08+/HlgysN7ifp4kSZqBUQf6ucCh/e1DgY8NzH9eP9p9b+AHA03zkiRpA7Ya1oaTnAHsCyxKsho4GjgOOCvJYcC1wIH96ucDTwdWAj8CXjCsuiRJatHQAr2qDp5m0X5TrFvAS4ZViyRJrfNKcZIkNcBAlySpATMK9CQXzmSeJEkaj/X2oSe5D3A/uoFt2wPpFz2IaS78IkmSRm9Dg+L+D/BKYBfgEn4R6LcB7x9eWZIkaWOsN9Cr6r3Ae5O8rKreN6KaJEnSRprR19aq6n1JfgtYOnifqjp1SHVJkqSNMKNAT/Ih4BHApcDd/ewCDHRJkuaAmV5YZhmwR38BGEmSNMfM9HvolwMPHWYhkiRp0830DH0RcGWSLwJ3rptZVc8cSlWSJGmjzDTQ3zzMIiRJ0uaZ6Sj3zw67EEmStOlmOsr9h3Sj2gG2AbYG7qiqBw2rMEmSNHMzPUN/4LrbSQIsB/YeVlGSJGnjbPR/W6vOvwBPmf1yJEnSpphpk/uzBia3oPte+k+GUpEkSdpoMx3l/gcDt+8CVtE1u0uSpDlgpn3oLxh2IZIkadPNqA89yeIkH02ypv85J8niYRcnSZJmZqaD4k4CzqX7v+i7AP/az5MkSXPATAN9oqpOqqq7+p+TgYkh1iVJkjbCTAP95iSHJNmy/zkEuHmYhUmSpJmbaaC/EDgQuBG4AXg28Pwh1SRJkjbSTL+29hbg0Kr6PkCSHYC/ogt6SZI0ZjMN9MetC3OAqrolyeOHVJM0Y0uPOG/cJUjSnDDTJvctkmy/bqI/Q5/phwFJkjRkMw3lvwa+kOSf++k/Ao7d1J0m+b/An9D9B7evAS8AdgbOBHYELgGeW1U/3dR9SJK0kMzoDL2qTgWeBdzU/zyrqj60KTtMsivwcmBZVe0JbAkcBLwDeHdVPRL4PnDYpmxfkqSFaMbN5lV1JXDlLO73vkl+BtyPbuT8k4A/7pefArwZ+MAs7U+SpKZt9L9P3VxVdT3dCPnv0AX5D+ia2G+tqrv61VYDu466NkmS5quRB3o/uG45sBvdZWTvDzx1I+5/eJIVSVasXbt2SFVKkjS/jDzQgd8Dvl1Va6vqZ8BHgN8GtkuyrgtgMXD9VHeuquOrallVLZuY8OqzkiTBeAL9O8DeSe6XJMB+dH3zn6a7Ah3AocDHxlCbJEnz0jj60C8Gzga+TPeVtS2A44HXAa9KspLuq2snjLo2SZLmq7FcHKaqjgaOnjT7GmCvMZQjSdK8N44md0mSNMsMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQH+T3NJ2kxLjzhvZPtaddz+I9uX5hfP0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBowl0JNsl+TsJFcnuSrJbybZIckFSb7Z/95+HLVJkjQfjesM/b3A/6uqXwZ+FbgKOAK4sKp2By7spyVJ0gyMPNCTPBj4XeAEgKr6aVXdCiwHTulXOwU4YNS1SZI0X43jDH03YC1wUpKvJPlgkvsDO1XVDf06NwI7TXXnJIcnWZFkxdq1a0dUsiRJc9s4An0r4AnAB6rq8cAdTGper6oCaqo7V9XxVbWsqpZNTEwMvVhJkuaDcQT6amB1VV3cT59NF/A3JdkZoP+9Zgy1SZI0L4080KvqRuC6JI/uZ+0HXAmcCxzazzsU+Nioa5Mkab7aakz7fRlwepJtgGuAF9B9uDgryWHAtcCBY6pNkqR5ZyyBXlWXAsumWLTfiEuRJKkJXilOkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQFjC/QkWyb5SpJ/66d3S3JxkpVJPpxkm3HVJknSfDPOM/RXAFcNTL8DeHdVPRL4PnDYWKqSJGkeGkugJ1kM7A98sJ8O8CTg7H6VU4ADxlGbJEnz0bjO0N8DvBb4eT+9I3BrVd3VT68Gdh1DXZIkzUsjD/QkzwDWVNUlm3j/w5OsSLJi7dq1s1ydJEnz0zjO0H8beGaSVcCZdE3t7wW2S7JVv85i4Pqp7lxVx1fVsqpaNjExMYp6JUma80Ye6FX1+qpaXFVLgYOAT1XVc4BPA8/uVzsU+Nioa5Mkab6aS99Dfx3wqiQr6frUTxhzPZIkzRtbbXiV4amqzwCf6W9fA+w1znokSZqv5tIZuiRJ2kQGuiRJDTDQJUlqgIEuSVIDxjooTpK0cZYecd7I9rXquP1Hti9tPs/QJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBh
jokiQ1wECXJKkBBrokSQ3wwjKadaO88IUkqeMZuiRJDTDQJUlqgIEuSVIDDHRJkhrgoLgFwoFqktQ2z9AlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqwMgDPcmSJJ9OcmWSK5K8op+/Q5ILknyz/739qGuTJGm+GscZ+l3Aq6tqD2Bv4CVJ9gCOAC6sqt2BC/tpSZI0AyMP9Kq6oaq+3N/+IXAVsCuwHDilX+0U4IBR1yZJ0nw11j70JEuBxwMXAztV1Q39ohuBnaa5z+FJViRZsXbt2tEUKknSHDe2QE/yAOAc4JVVddvgsqoqoKa6X1UdX1XLqmrZxMTECCqVJGnuG0ugJ9maLsxPr6qP9LNvSrJzv3xnYM04apMkaT4axyj3ACcAV1XVuwYWnQsc2t8+FPjYqGuTJGm+2moM+/xt4LnA15Jc2s87EjgOOCvJYcC1wIFjqE2SpHlp5IFeVZ8HMs3i/UZZiyRJrfBKcZIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDVgHJd+lSTNA0uPOG9k+1p13P4j21erPEOXJKkBBrokSQ0w0CVJaoCBLklSAxwUJ0kaOwfgbT7P0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAX4PXZK0oLT6nXfP0CVJaoCBLklSA2xyH6NRNvtIktrmGbokSQ2Yc4Ge5KlJvp5kZZIjxl2PJEnzwZwK9CRbAn8LPA3YAzg4yR7jrUqSpLlvrvWh7wWsrKprAJKcCSwHrhzFzu3TliTNV3PqDB3YFbhuYHp1P0+SJK3HXDtD36AkhwOH95O3J/n6GMtZBHxvjPsfJ499YfLYF6aFfOywGcefd8xyJZ2HTzVzrgX69cCSgenF/bz/VlXHA8ePsqjpJFlRVcvGXcc4eOwe+0LjsS/MY4f5c/xzrcn9S8DuSXZLsg1wEHDumGuSJGnOm1Nn6FV1V5KXAv8ObAmcWFVXjLksSZLmvDkV6ABVdT5w/rjrmKE50fQ/Jh77wuSxL0wL+dhhnhx/qmrcNUiSpM001/rQJUnSJjDQN0OStya5LMmlST6RZJdx1zQqSd6Z5Or++D+aZLtx1zRKSf4oyRVJfp5kzo9+nQ0L9bLMSU5MsibJ5eOuZdSSLEny6SRX9q/3V4y7plFJcp8kX0zy1f7Yjxl3TRtik/tmSPKgqrqtv/1yYI+qetGYyxqJJL8PfKofyPgOgKp63ZjLGpkkvwL8HPgH4M+rasWYSxqq/rLM3wCeTHfBpy8BB1fVSK7iOE5Jfhe4HTi1qvYcdz2jlGRnYOeq+nKSBwKXAAcskOc9wP2r6vYkWwOfB15RVReNubRpeYa+GdaFee/+wIL5dFRVn6iqu/rJi+iuGbBgVNVVVTXOixqN2n9flrmqfgqsuyxz86rqc8At465jHKrqhqr6cn/7h8BVLJCrd1bn9n5y6/5nTr/HG+ibKcmxSa4DngO8adz1jMkLgY+PuwgNlZdlXuCSLAUeD1w85lJGJsmWSS4F1gAXVNWcPnYDfQOSfDLJ5VP8LAeoqqOqaglwOvDS8VY7uzZ07P06RwF30R1/U2Zy/NJCkOQBwDnAKye1TDatqu6uql+ja4HcK8mc7nKZc99Dn2uq6vdmuOrpdN+fP3qI5YzUho49yfOBZwD7VYODMTbiuV8INnhZZrWp7z8+Bzi9qj4y7nrGoapuTfJp4KnAnB0c6Rn6Zkiy+8DkcuDqcdUyakmeCrwWeGZV/Wjc9WjovCzzAtQPDDsBuKqq3jXuekYpycS6b+8kuS/dgNA5/R7vKPfNkOQc4NF0o52vBV5UVQvirCXJSmBb4OZ+1kULZYQ/QJI/BN4HTAC3ApdW1VPGWtSQJXk68B5+cVnmY8db0WgkOQPYl+4/bt0EHF1VJ4y1qBFJsg/wH8DX6N7nAI7sr+jZtCSPA06he71vAZxVVW8Zb1XrZ6BLktQAm9wlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDfj/5JXosFuF6EMAAAAASUVORK5CYII=", + "text/plain": [ + "
" + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "a = np.random.normal(loc=0, scale=1, size=1_000)\n", + "\n", + "fig, ax = plt.subplots(figsize=(8, 6))\n", + "ax.hist(a, color=\"C0\", bins=15)\n", + "ax.set(title=\"Samples from a normal distribution using numpy\", ylabel=\"count\");" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Now let's try to do it in Aesara." + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "TensorType(float64, ())" + ] + }, + "execution_count": 24, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "y = at.random.normal(loc=0, scale=1, name=\"y\")\n", + "y.type" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Next, we show the graph using {func}`~aesara.dprint`." + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "normal_rv{0, (0, 0), floatX, False}.1 [id A] 'y'\n", + " |RandomGeneratorSharedVariable() [id B]\n", + " |TensorConstant{[]} [id C]\n", + " |TensorConstant{11} [id D]\n", + " |TensorConstant{0} [id E]\n", + " |TensorConstant{1} [id F]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 25, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "aesara.dprint(y)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "The inputs are always in the following order:\n", + "1. `rng` shared variable\n", + "2. `size`\n", + "3. `dtype` (number code)\n", + "4. `arg1`\n", + "5. `arg2`\n", + "6. `argn`" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "We *could* sample by calling {meth}`~aesara.graph.basic.Variable.eval`. on the random variable." + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array(0.92905265)" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "y.eval()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Note however that these samples are always the same!" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Sample 0: 0.929052652756385\n", + "Sample 1: 0.929052652756385\n", + "Sample 2: 0.929052652756385\n", + "Sample 3: 0.929052652756385\n", + "Sample 4: 0.929052652756385\n", + "Sample 5: 0.929052652756385\n", + "Sample 6: 0.929052652756385\n", + "Sample 7: 0.929052652756385\n", + "Sample 8: 0.929052652756385\n", + "Sample 9: 0.929052652756385\n" + ] + } + ], + "source": [ + "for i in range(10):\n", + " print(f\"Sample {i}: {y.eval()}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "We always get the same samples! This has to do with the random seed step in the graph, i.e. 
`RandomGeneratorSharedVariable` (we will not go deeper into this subject here). We will show how to generate different samples with `pymc` below." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "---" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "## PyMC\n", + "\n", + "![pymc logo](https://raw.githubusercontent.com/pymc-devs/pymc/main/docs/logos/PyMC.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To do so, we start by defining a `pymc` normal distribution." + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "normal_rv{0, (0, 0), floatX, False}.1 [id A]\n", + " |RandomGeneratorSharedVariable() [id B]\n", + " |TensorConstant{[]} [id C]\n", + " |TensorConstant{11} [id D]\n", + " |TensorConstant{0} [id E]\n", + " |TensorConstant{1.0} [id F]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "x = pm.Normal.dist(mu=0, sigma=1)\n", + "aesara.dprint(x)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Observe that `x` is just a normal `RandomVariable`, which is the same as `y` except for the `rng`." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "We can try to generate samples by calling {meth}`~aesara.graph.basic.Variable.eval` as above." + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Sample 0: -0.8310657629670953\n", + "Sample 1: -0.8310657629670953\n", + "Sample 2: -0.8310657629670953\n", + "Sample 3: -0.8310657629670953\n", + "Sample 4: -0.8310657629670953\n", + "Sample 5: -0.8310657629670953\n", + "Sample 6: -0.8310657629670953\n", + "Sample 7: -0.8310657629670953\n", + "Sample 8: -0.8310657629670953\n", + "Sample 9: -0.8310657629670953\n" + ] + } + ], + "source": [ + "for i in range(10):\n", + " print(f\"Sample {i}: {x.eval()}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "As before, we get the same value for all iterations. The correct way to generate random samples is using {func}`~pymc.draw`."
+ ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfQAAAF1CAYAAAAeOhj3AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAAfh0lEQVR4nO3de5gkdX3v8fcHFhBQLoYRgUWXKJogmmj2EJN4EuIlojHB5FEPHC+gJESjRk+MdxNAxYMxUYnGJEQQUAQJaiBKTkBEiYmgCyJyUzeC7iKww/2iEhe/54+qkWaY3Z2d3e6e+c379Tz9TNelq75V3T2f/v2qujpVhSRJWti2GHcBkiRp0xnokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0zQtJjkzysc28zCT5SJJbk3xlcy57MUlyaJIvbcT81yZ5en//LUk+vBlruSvJz/b3T0zyzs247L9P8ueba3mbanPvO7VvybgL0HgleQrwl8DjgHuBq4DXVtVXx1rY5vEU4BnA0qq6e9zFLEZV9a7ZzJfkC8DHqmq9AVZVD94cdSU5FPiDqnrKwLJfvjmWvbnMdt9JUwz0RSzJDsBngFcApwNbA/8TuGecdW1GjwSuXVeYJ1lSVWtHXNPQtbhdLW6TtLnZ5b64PQagqk6tqnur6odVdU5VXQaQ5FFJPp/k5iQ3JTklyU5TD+67Vl+f5LIkdyc5PsmuSf41yZ1JPpdk537eZUkqyeFJvp/k+iR/tq7Ckjw5yX8muS3J15PsPzDt0CTf6ddxTZIXzvD4w4APA7/Sd9MelWT/JKuTvDHJDcBHkmyT5P19Td/v72/TL2Nq/jckWdPX/Nwkz07yrSS3JHnLerbht5N8LckdSVYlOXI9806t63UD63rpwPQdk5ycZDLJd5O8LckWA/vjP5K8L8nNwJF9d/SH+ufirn76w/vtuzXJ1UmeOLD8NyX5r36fXpnk99ZV6wy1v7iv6eYkb5027aeHUpI8KMnH+vluS/LV/vVyNN0HyQ/2tX6wn7+SvDLJt4FvD4x79MAqdklybl/3F5M8sp9v6vW2ZKCWLyT5gyQ/D/w99702buun368LP8kfJlnZP89nJdl9YFoleXmSb/fb8rdJso79M325+ydZPTD8xiTX9dvwzSRPm2HfTW3PIUm+l+79+NaBZWyb5KT+ub2qf82uZh36Zf1JuvfRTUnek2SLJFv32/v4gXkfluQHSSayke+JJFumO3Qw9dq6OMme66pLm6iqvC3SG7ADcDNwEvAsYOdp0x9N12W9DTABXAC8f2D6tcCFwK7AHsAa4BLgicCDgM8DR/TzLgMKOBXYHng8MAk8vZ9+JF2XK/2ybgaeTfeh8xn98ET/2DuAx/bz7gY8bh3bdyjwpYHh/YG1wLv7bdoWeHu/DQ/rl/+fwDumzf8XwFbAH/Y1fxx4CN1hih8Ce61j/fv327kF8ATgRuC565l3bV/PVv22/2DqOQFOBs7s17sM+BZw2MB2rgVeTdfrti1wInAT8EsDz8U1wEuALYF3AucPrP/5wO59rf8LuBvYbab9OK3ufYC7gF/v9+l7+1pmel7/CPgXYLu+hl8CduinfYGuC3xw2QWcCzwU2HZg3KP7+ycCdw6s+9ipOrnv9bZkYHk/XcdM29Qv7539/af2++9J/bI/AFwwrbbPADsBj6B7XRywjn300+UOPNer+/uPBVYBuw/U/agZ9t3U9vxj//z+Al1P2s/3048BvgjsDCwFLptaxzpqKuD8ft8+gu71NLVvPgS8e2De1wD/Mpf3BPB64Bv9dqav+2fG/b+v1Zst9EWsqu6gO8489Y9ism+J7NpPX1lV51bVPVU1SffP+jemLeYDVXVjVV0H/DtwUVV9rap+BHyaLtwHHVVVd1fVN4CPAAfPUNqLgLOr6uyq+klVnQusoAs5gJ8A+ybZtqqur6orNmKzf0L3IeOeqvoh8ELg7VW1pt/Go4AXD8z/Y+DoqvoxcBqwC3BsVd3Zr/dKun9SD1BVX6iqb/TbcBndh5np+2/Qj/taflxVZ9MF5WOTbAkcBLy5X++1wF9Pq/P7VfWBqlrbbxfAp6vq4oHn4kdVdXJV3Qt8goHnpqr+qaq+39f6CboW8X4b2JcAzwM+U1UXVNU9wJ/T7eN1bd/P0AXyvX1td2xg+f+3qm4Z2KbpPjuw7rfStbo3RwvwhcAJVXVJv+w398teNjDPMVV1W1V9jy4cf3EO67mX7gPDPkm2qqprq+q/1jP/UdX1pH0d+Dr3vfZeALyrqm6tqtXA38xi3e/u9+33gPdz33vxJODggR6HFwMfHXjcxrwn/gB4W1V9szpfr6qbZ1Gb5sBAX+Sq6qqqOrSqlgL70rXS3g/Qd4ee1ncH3gF8jO7NO+jGgfs/nGF4+klMqwbuf7df33SPBJ7fd2Xe1neJPoWuxXg3XQvy5cD1ST6b5Odmv8VM9gE3Zfe+jnXVdHMfgFPbAxveRgCS/HKS89N1k9/e1zx9/w26ue5/nPgH/bJ3oWsNTa9zj4Hhwf06ZdbPTZKXJLl0YH/vu4Fap+w+uO7++VnXP+yPAv8GnJbu8MZfJtlqA8ufabtmnF5VdwG3MPNramPd73XRL/tm7r/Pbxi4P/VcbZSqWgm8lq41vqZ/v62v/nWt837PAxveb9Pn+enrvqou6pe9f//eejRw1sC8G/Oe2BNY3wcUbUYGun6qqq6m6x7ctx/1LrrW++Orage6lvOMxwk3wmDr6RHA92eYZxXw0araaeC2fVUd09f5b1X1DLru9qvpehdma/rPC36f7gPEhmqai4/T/SPcs6p2pDtuO5f9dxNdq2h6ndcNDM/5ZxP7487/CLyKrjt0J+ByZlfr9Qw8p0m2o2uFP0Df83BUVe0D/CrwHLpDAOurf0PbNbjuB9N1IX+f7pABdN37Ux6+Ecu93+siyfZ023XdOh+xbnevpw6q6uPVnW3/yL6ud89hHdfTdbVPmU0vxfreiyfRvd9fDJwx7UPwxlgFPGqOj9VGMtAXsSQ/l+4krKX98J503W4X9rM8hK7b9/Yke9AdD9tUf55kuySPA15K1/U73ceA30nyzP6kmgf1J+Ms7XsNDuz/wd7T17euLt7ZOBV4W3/Czy50xwY31/fhHwLcUlU/SrIf8L/nspC+NXQ6cHSSh/QB/Kebsc7t6YJkEiDdyXj7rvcR9zkDeE6SpyTZmu4cgBn/ryT5zSSP7w8h3EH3IWXqubsR+Nk51P7sgXW/A7iwqlb1h0+uA17Uv4Zexv2D5UZgaf+4mZwKvDTJL6Y7SfJddIeTrp1DjZf2dT40ycPpWuQAJHlskqf26/gRXet2Lq/n04E3J9m5f6++ahaPeX0//550x8kH34sfA36PLtRPnkM9Uz4Mv
CPJ3uk8IcmMH/i06Qz0xe1O4JeBi5LcTRfklwOv66cfRXdS0O3AZ4FPbYZ1fhFYCZwH/FVVnTN9hqpaBRwIvIUuZFbRfZjYor/9KV1r4ha6Y9Kv2IR63kl3fP4yupN3LunHbQ5/DLw9yZ10HxRO34RlvZqupfcd4Et0rf8TNrlCoKqupDsm/2W6oHs88B+zfOwVwCv7eq4HbgXWdXb1w+k+ANxBd72DL3Lfsdljgef1Z2nP5vjvlI8DR9C9Fn6JLoCm/CHd6+ZmupO1/nNg2ueBK4Abktw0w3Z9ju58gE/22/UouvMY5uKjdMe7rwXO4f7BuQ3dCW030XWnP4zueP3Gejvdfr8G+Bzdft7Q10/PBC6m+8DxWeD4qQn9e/ASug96/z6Heqa8l+51fw7d83483Ul9GoJUzbmnTpq1/mSia4Ctyu8TS0OV5BXAQVU140mYSQrYuz+Gv65lnEB3suXbhlSmNjMvLCNJC1yS3egOWXwZ2Juul+2Dm7C8ZcDv88BvqWges8tdkha+rYF/oDuM9nm67vQPzWVBSd5Bd+jtPVV1zWarUENnl7skSQ2whS5JUgMMdEmSGrCgT4rbZZddatmyZeMuQ5Kkkbn44otvqqqJ6eMXdKAvW7aMFStWjLsMSZJGJsl3Zxpvl7skSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUgAX9a2uSNoMjdxzhum4f3bqkRWZoLfQkJyRZk+TyaeNfneTqJFck+cuB8W9OsjLJN5M8c1h1SZLUomG20E8EPgicPDUiyW8CBwK/UFX3JHlYP34f4CDgccDuwOeSPKaq7h1ifZIkNWNoLfSqugC4ZdroVwDHVNU9/Txr+vEHAqdV1T1VdQ2wEthvWLVJktSaUZ8U9xjgfya5KMkXk/yPfvwewKqB+Vb34x4gyeFJViRZMTk5OeRyJUlaGEYd6EuAhwJPBl4PnJ4kG7OAqjquqpZX1fKJiYlh1ChJ0oIz6kBfDXyqOl8BfgLsAlwH7Dkw39J+nCRJmoVRB/o/A78JkOQxwNbATcBZwEFJtkmyF7A38JUR1yZJ0oI1tLPck5wK7A/skmQ1cARwAnBC/1W2/wYOqaoCrkhyOnAlsBZ4pWe4S5I0e0ML9Ko6eB2TXrSO+Y8Gjh5WPZIktcwrxUkaHa9KJw2N13KXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkN8NKv0nw0ykukSmqCLXRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGjC0QE9yQpI1SS6fYdrrklSSXfrhJPmbJCuTXJbkScOqS5KkFg2zhX4icMD0kUn2BH4L+N7A6GcBe/e3w4G/G2JdkiQ1Z2iBXlUXALfMMOl9wBuAGhh3IHBydS4Edkqy27BqkySpNSM9hp7kQOC6qvr6tEl7AKsGhlf342ZaxuFJViRZMTk5OaRKJUlaWEYW6Em2A94C/MWmLKeqjquq5VW1fGJiYvMUJ0nSArdkhOt6FLAX8PUkAEuBS5LsB1wH7Dkw79J+nCRJmoWRtdCr6htV9bCqWlZVy+i61Z9UVTcAZwEv6c92fzJwe1VdP6raJEla6Ib5tbVTgS8Dj02yOslh65n9bOA7wErgH4E/HlZdkiS1aGhd7lV18AamLxu4X8Arh1WLJEmt80pxkiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSA4YW6ElOSLImyeUD496T5OoklyX5dJKdBqa9OcnKJN9M8sxh1SVJUouG2UI/EThg2rhzgX2r6gnAt4A3AyTZBzgIeFz/mA8l2XKItUmS1JShBXpVXQDcMm3cOVW1th+8EFja3z8QOK2q7qmqa4CVwH7Dqk2SpNaM8xj6y4B/7e/vAawamLa6HydJkmZhLIGe5K3AWuCUOTz28CQrkqyYnJzc/MVJkrQAjTzQkxwKPAd4YVVVP/o6YM+B2Zb24x6gqo6rquVVtXxiYmKotUqStFAsGeXKkhwAvAH4jar6wcCks4CPJ3kvsDuwN/CVUdYmqTFH7jjCdd0+unVJ6zC0QE9yKrA/sEuS1cARdGe1bwOcmwTgwqp6eVVdkeR04Eq6rvhXVtW9w6pNkqTWDC3Qq+rgGUYfv575jwaOHlY9kiS1zCvFSZLUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqwNACPckJSdYkuXxg3EOTnJvk2/3fnfvxSfI3SVYmuSzJk4ZVlyRJLVoyxGWfCHwQOHlg3JuA86rqmCRv6offCDwL2Lu//TLwd/1faf44csdxVyBJ6zS0FnpVXQDcMm30gcBJ/f2TgOcOjD+5OhcCOyXZbVi1SZLUmlEfQ9+1qq7v798A7Nrf3wNYNTDf6n7cAyQ5PMmKJCsmJyeHV6kkSQvI2E6Kq6oCag6PO66qllfV8omJiSFUJknSwjPqQL9xqiu9/7umH38dsOfAfEv7cZIkaRZGHehnAYf09w8BzhwY/5L+bPcnA7cPdM1LkqQNGNpZ7klOBfYHdkmyGjgCOAY4PclhwHeBF/Sznw08G1gJ/AB46bDqkiSpRUML9Ko6eB2TnjbDvAW8cli1SJLUOq8UJ0lSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUgFkFepLzZjNOkiSNx3p/nCXJg4Dt6H4xbWcg/aQdgD2GXJskSZqlDf3a2h8BrwV2By7mvkC/A/jg8MqSJEkbY72BXlXHAscmeXVVfWBENUmSpI00q99Dr6oPJPlVYNngY6rq5CHVJUmSNsKsAj3JR4FHAZcC9/ajCzDQJUmaB2YV6MByYJ+qqmEWI0mS5ma230O/HHj4MAuRJElzN9sW+i7AlUm+AtwzNbKqfncoVUmSpI0y20A/cphFSJKkTTPbs9y/OOxCJEnS3M32LPc76c5qB9ga2Aq4u6p2GFZhkiRp9mbbQn/I1P0kAQ4EnjysoiRJ0sbZ6F9bq84/A8/c/OVIkqS5mG2X++8PDG5B9730Hw2lIkmStNFme5b77wzcXwtcS9ft
LkmS5oHZHkN/6bALkaQF68gdR7iu20e3Li0oszqGnmRpkk8nWdPfPplk6bCLkyRJszPbk+I+ApxF97vouwP/0o+bkyT/J8kVSS5PcmqSByXZK8lFSVYm+USSree6fEmSFpvZBvpEVX2kqtb2txOBibmsMMkewJ8Ay6tqX2BL4CDg3cD7qurRwK3AYXNZviRJi9FsA/3mJC9KsmV/exFw8yasdwmwbZIlwHbA9cBTgTP66ScBz92E5UuStKjMNtBfBrwAuIEufJ8HHDqXFVbVdcBfAd/rl3U7cDFwW1Wt7WdbDewx0+OTHJ5kRZIVk5OTcylBkqTmzDbQ3w4cUlUTVfUwuoA/ai4rTLIz3Vfe9qI7Hr89cMBsH19Vx1XV8qpaPjExp15/SZKaM9tAf0JV3To1UFW3AE+c4zqfDlxTVZNV9WPgU8CvATv1XfAAS4Hr5rh8SZIWndkG+hZ9yxqAJA9l9helme57wJOTbNdfF/5pwJXA+XRd+QCHAGfOcfmSJC06sw3lvwa+nOSf+uHnA0fPZYVVdVGSM4BL6K469zXgOOCzwGlJ3tmPO34uy5ckaTGa7ZXiTk6ygu5MdIDfr6or57rSqjoCOGLa6O8A+811mZIkLWaz7jbvA3zOIS5JkoZno38+VZIkzT8GuiRJDTDQJUlqwFy/eibND6P82UpJmsdsoUuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhowlkBPslOSM5JcneSqJL+S5KFJzk3y7f7vzuOoTZKkhWhcLfRjgf9XVT8H/AJwFfAm4Lyq2hs4rx+WJEmzMPJAT7Ij8OvA8QBV9d9VdRtwIHBSP9tJwHNHXZskSQvVOFroewGTwEeSfC3Jh5NsD+xaVdf389wA7DqG2iRJWpDGEehLgCcBf1dVTwTuZlr3elUVUDM9OMnhSVYkWTE5OTn0YiVJWgjGEeirgdVVdVE/fAZdwN+YZDeA/u+amR5cVcdV1fKqWj4xMTGSgiVJmu9GHuhVdQOwKslj+1FPA64EzgIO6ccdApw56tokSVqoloxpva8GTkmyNfAd4KV0Hy5OT3IY8F3gBWOqTZKkBWcsgV5VlwLLZ5j0tBGXIklSE7xSnCRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWrAknEXoAYdueO4K5CkRccWuiRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSA8YW6Em2TPK1JJ/ph/dKclGSlUk+kWTrcdUmSdJCM84W+muAqwaG3w28r6oeDdwKHDaWqiRJWoDGEuhJlgK/DXy4Hw7wVOCMfpaTgOeOozZJkhaicf186vuBNwAP6Yd/Britqtb2w6uBPWZ6YJLDgcMBHvGIRwy3Skmab0b588RH3j66dWmTjbyFnuQ5wJqqunguj6+q46pqeVUtn5iY2MzVSZK0MI2jhf5rwO8meTbwIGAH4FhgpyRL+lb6UuC6MdQmSdKCNPIWelW9uaqWVtUy4CDg81X1QuB84Hn9bIcAZ466NkmSFqr59D30NwJ/mmQl3TH148dcjyRJC8a4TooDoKq+AHyhv/8dYL9x1iNJ0kI1n1rokiRpjgx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhpgoEuS1AADXZKkBhjokiQ1wECXJKkBBrokSQ0w0CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNWDJqFeYZE/gZGBXoIDjqurYJA8FPgEsA64FXlBVt466PklS78gdR7iu20e3rkaNo4W+FnhdVe0DPBl4ZZJ9gDcB51XV3sB5/bAkSZqFkbfQq+p64Pr+/p1JrgL2AA4E9u9nOwn4AvDGUdfXrFF+0pYkjdxYj6EnWQY8EbgI2LUPe4Ab6LrkZ3rM4UlWJFkxOTk5mkIlSZrnxhboSR4MfBJ4bVXdMTitqoru+PoDVNVxVbW8qpZPTEyMoFJJkua/sQR6kq3owvyUqvpUP/rGJLv103cD1oyjNkmSFqKRB3qSAMcDV1XVewcmnQUc0t8/BDhz1LVJkrRQjfykOODXgBcD30hyaT/uLcAxwOlJDgO+C7xgDLVJkrQgjeMs9y8BWcfkp42yFkmSWuGV4iRJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkNMNAlSWqAgS5JUgMMdEmSGmCgS5LUAANdkqQGGOiSJDXAQJckqQEGuiRJDTDQJUlqgIEuSVIDDHRJkhqwZNwFSJI0UkfuOMJ13T6yVdlClySpAbbQJUnjN8pWc6NsoUuS1ABb6OPkJ1JJ0mZiC12SpAYY6JIkNcBAlySpAR5DH+QxbUnSAmULXZKkBsy7QE9yQJJvJlmZ5E3jrkeSpIVgXgV6ki2BvwWeBewDHJxkn/FWJUnS/DevAh3YD1hZVd+pqv8GTgMOHHNNkiTNe/Mt0PcAVg0Mr+7HSZKk9VhwZ7knORw4vB+8K8k3x1jOLsBNY1z/fOA+cB+A+wDcB+A+gOn74KgMYx2PnGnkfAv064A9B4aX9uN+qqqOA44bZVHrkmRFVS0fdx3j5D5wH4D7ANwH4D6A8e6D+dbl/lVg7yR7JdkaOAg4a8w1SZI0782rFnpVrU3yKuDfgC2BE6rqijGXJUnSvDevAh2gqs4Gzh53HbM0L7r+x8x94D4A9wG4D8B9AGPcB6mqca1bkiRtJvPtGLokSZoDA30TJHlHksuSXJrknCS7j7umUUvyniRX9/vh00l2GndNo5bk+UmuSPKTJIvqDF8v1QxJTkiyJsnl465lHJLsmeT8JFf274PXjLumUUvyoCRfSfL1fh8cNZY67HKfuyQ7VNUd/f0/AfapqpePuayRSvJbwOf7ExrfDVBVbxxzWSOV5OeBnwD/APxZVa0Yc0kj0V+q+VvAM+guAvVV4OCqunKshY1Ykl8H7gJOrqp9x13PqCXZDditqi5J8hDgYuC5i+l1kCTA9lV1V5KtgC8Br6mqC0dZhy30TTAV5r3tgUX36aiqzqmqtf3ghXTXDlhUquqqqhrnBY7GxUs1A1V1AXDLuOsYl6q6vqou6e/fCVzFIrvCZ3Xu6ge36m8jzwMDfRMlOTrJKuCFwF+Mu54xexnwr+MuQiPjpZp1P0mWAU8ELhpzKSOXZMsklwJrgHOrauT7wEDfgCSfS3L5DLcDAarqrVW1J3AK8KrxVjscG9oH/TxvBdbS7YfmzGYfSIt
ZkgcDnwReO633clGoqnur6hfpein3SzLywy/z7nvo801VPX2Ws55C9/35I4ZYzlhsaB8kORR4DvC0avSkjI14HSwmG7xUsxaH/rjxJ4FTqupT465nnKrqtiTnAwcAIz1R0hb6Jkiy98DggcDV46plXJIcALwB+N2q+sG469FIealmTZ0QdjxwVVW9d9z1jEOSialv+CTZlu5E0ZHngWe5b4IknwQeS3eG83eBl1fVomqhJFkJbAPc3I+6cBGe6f97wAeACeA24NKqeuZYixqRJM8G3s99l2o+erwVjV6SU4H96X5l60bgiKo6fqxFjVCSpwD/DnyD7n8hwFv6q34uCkmeAJxE9z7YAji9qt4+8joMdEmSFj673CVJaoCBLklSAwx0SZIaYKBLktQAA12SpAYY6JIkNcBAlySpAQa6JEkN+P+cyD0k4yru1wAAAABJRU5ErkJggg==", + "text/plain": [ + "
" + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "fig, ax = plt.subplots(figsize=(8, 6))\n", + "ax.hist(pm.draw(x, draws=1_000), color=\"C1\", bins=15)\n", + "ax.set(title=\"Samples from a normal distribution using pymc\", ylabel=\"count\");" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Yay! We learned how to sample from a `pymc` distribution!" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "### What is going on behind the scenes?" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "We can now look into how this is done inside a {class}`~pymc.Model`." + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "normal_rv{0, (0, 0), floatX, False}.1 [id A]\n", + " |RandomGeneratorSharedVariable() [id B]\n", + " |TensorConstant{[]} [id C]\n", + " |TensorConstant{11} [id D]\n", + " |TensorConstant{0} [id E]\n", + " |TensorConstant{1.0} [id F]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 31, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "with pm.Model() as model:\n", + " z = pm.Normal(name=\"z\", mu=np.array([0, 0]), sigma=np.array([1, 2]))\n", + "\n", + "aesara.dprint(x)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "We are just creating random variables like we saw before, but now registering them in a `pymc` model. To extract the list of random variables we can simply do:" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[z]" + ] + }, + "execution_count": 32, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "model.basic_RVs" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "normal_rv{0, (0, 0), floatX, False}.1 [id A] 'z'\n", + " |RandomGeneratorSharedVariable() [id B]\n", + " |TensorConstant{[]} [id C]\n", + " |TensorConstant{11} [id D]\n", + " |TensorConstant{(2,) of 0} [id E]\n", + " |TensorConstant{[1. 2.]} [id F]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "aesara.dprint(model.basic_RVs[0])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "We can try to sample via {meth}`~aesara.graph.basic.Variable.eval` as above and it is no surprise that we are getting the same samples at each iteration." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Sample 0: [-1.97758523 3.67000753]\n", + "Sample 1: [-1.97758523 3.67000753]\n", + "Sample 2: [-1.97758523 3.67000753]\n", + "Sample 3: [-1.97758523 3.67000753]\n", + "Sample 4: [-1.97758523 3.67000753]\n", + "Sample 5: [-1.97758523 3.67000753]\n", + "Sample 6: [-1.97758523 3.67000753]\n", + "Sample 7: [-1.97758523 3.67000753]\n", + "Sample 8: [-1.97758523 3.67000753]\n", + "Sample 9: [-1.97758523 3.67000753]\n" + ] + } + ], + "source": [ + "for i in range(10):\n", + " print(f\"Sample {i}: {z.eval()}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Again, the correct way of sampling is via {func}`~pymc.draw`. " + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Sample 0: [-1.13812635 0.73080569]\n", + "Sample 1: [-2.53843834 0.63905086]\n", + "Sample 2: [ 1.23414716 -0.6985418 ]\n", + "Sample 3: [-0.07509846 0.7046108 ]\n", + "Sample 4: [-0.4407329 -3.47301204]\n", + "Sample 5: [0.34642982 1.03938203]\n", + "Sample 6: [ 0.4471714 -1.04602358]\n", + "Sample 7: [0.5208584 0.85399692]\n", + "Sample 8: [-0.79061811 -1.33057063]\n", + "Sample 9: [-1.62328737 3.10770738]\n" + ] + } + ], + "source": [ + "for i in range(10):\n", + " print(f\"Sample {i}: {pm.draw(z)}\")" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAeIAAAHiCAYAAAA06c+jAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAAeRUlEQVR4nO3deZCkd33f8c9nes5d7a6u1YF2kYSR5CgSh2uFuRwMCCxARpXEBxiIsZ3aMhU5UBEQQIkNFduhTArLBBzXGmQCUhCYw6GEFB0RmDgYgS5kCS1YHLoQ0q6kXc1eM9Pd3/zRz5JhNdqV9vmuvjPT71eVSjvbvZ/+Ps9096d/T8/044gQAACoMVI9AAAAw4wiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKUcTAImL7vbYvKbjd99j+6FN9uwAoYkCSZPvFtr9me7vth23/X9tnVc+V4fHK3XbYfqYkRcQfR8S/fgJZX7F9wOsBeOJGqwcAqtleLelySW+R9BlJ45J+QdJM5VzDxvZoRHSr5wCeaqyIAelUSYqIT0VELyJ2R8TVEXGrJNn+GdvX2X7I9lbbl9o+fO8/tv1D2++wfavtnbY/ZvtY21fanrZ9re0jmuue1KxEN9r+ke37bb/98Qaz/fxmpb7N9rds/+K8y95s+/vNbfzA9hsOdgfMXzXbnrR9SbO922x/s9meP9LgBcqHbe+w/eHm+i9srrO9+f8L5+WebPur8/bDR+bdzt598Tu275Z0XfP3f237x03eV23/03l5H7f9582+3dEcuTjO9kW2H7G92fZzD3Y/ABUoYkD6rqSe7f9u+1V7S3MeS/rPkp4m6Z9IWi/pvftc519KeoUGpf7Lkq6U9B5JazV4nP3bfa7/UkmnSHqlpH9v++x9h7J9gqQvSfpDSUdKerukz9lea3ulpA9JelVErJL0Qkm3POktX9hvSlqjwXYeJel3Je2OiAsl/R9J50fEYRFxvu0jmxk/1Fz3g5K+ZPuoJut/SPpGc9l7Jb1pgdt7iQb79Zear6/UYN8cI+kmSZfuc/1fk/QfJB2twVGLv2+ud7SkzzYzAEsGRYyhFxGPSnqxpJD0l5K22P6i7WOby++MiGsiYiYitmjwRP+SfWL+a0Q8EBH3aVBW10fEzRGxR9IXJO27SntfROyMiH+Q9FeSXr/AaG+UdEVEXBER/Yi4RtINkl7dXN6XdIbtqYi4PyJu389m/lqzuv3Jf/u57pwGxfnM5gjBjc0+WshrJP1jRHwyIroR8SlJmyX9su2nSzpL0u9HxGxE/J2kLy6Q8d5mX+yWpIi4OCKmI2JGg/J+tu01867/hWamvft2T0R8IiJ6kj6tx+5rYFGjiAFJEXFHRLw5ItZJOkOD1e9FktQclr3M9n22H5V0iQarr/kemPfn3Qt8fdg+179n3p/vam5vXydK+tV9yvPFko6PiJ2Sfl2D1er9tr9k+2f3s4mfiYjD5/+3n+t+UtJVki5rDp//ie2xx7nu05r557tL0gnNZQ9HxK55l92jx/rJ39nu2H6/7e81+/qHzUXz9/eT3dfAokYRA/uIiM2SPq5BIUvSH2uwWj4zIlZrsFJ1y5tZP+/PT5f0owWuc4+kT+5ToCsj4v3NnFdFxCskHa/BKvQvW86kJncuIt4XEadrcMj7XEn/au/F+1z9Rxq8YJjv6ZLuk3S/pCNtr5h32Xo91vzM35B0nqSzNTg8flLz9233N7BoUcQYerZ/1vYFttc1X6/X4FDx15urrJK0Q9L25n3bdyTc7H+0vaL5Qa
Tf0uCQ6r4u0eAQ7y81K8VJ279oe12zSj+vea94ppmvnzCXbL/U9pm2O5Ie1eBQ9d7sByQ9Y97Vr5B0qu3fsD1q+9clnS7p8oi4S4ND6e+1PW77BRq8f74/q5rteUjSCg1eBAHLGkUMSNOSfl7S9bZ3alDAt0m6oLn8fZJ+TtJ2DX4w6fMJt/m3ku6U9L8l/ZeIuHrfK0TEPRqsDt8jaYsGK+R3aPC4HZH07zRYkT6swXvWb0mYS5KO0+CHnh6VdEcz6yeby/5M0q80P6H8oYh4SIMV8wUalOc7JZ0bEVub679B0guay/5Qgxcc+/u1sE9ocGj7Pknf1v9/MQQsW47Y90gTgEPF9kmSfiBpbBh/Z9b2pyVtjog/qJ4FWCxYEQM4ZGyf5cHvYY/YPkeDFf7fFI8FLCp8shaAQ+k4DQ7lHyXpXklviYiba0cCFhcOTQMAUIhD0wAAFKKIAQAoVPIe8bgnYlIrK24aAICn3LQe2RoRaxe6rKSIJ7VSP++XV9w0KniZHniJlM/PADAEro3P7vtRsD+xTJ8hAQBYGihiAAAKUcQAABSiiAEAKEQRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFKGIAAApRxAAAFKKIAQAoRBEDAFCIIgYAoBBFDABAIYoYAIBCo9UDYAhEPyfHSa8bF9s8SdzppOREr5cQkrSPgSGwuJ5JAAAYMhQxAACFKGIAAApRxAAAFKKIAQAoRBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgUEoR2z7c9mdtb7Z9h+0XZOQCALDcjSbl/Jmk/xURv2J7XNKKpNzhk3Wy+cV0YvZFtk0j4+MpOf25bkqOO52UnCwZ84xMrUyYROrv3pOSE925lByPjqXkZM2z2B5bODiti9j2Gkn/TNKbJSkiZiXNts0FAGAYZLycOlnSFkl/Zftm2x+1/ZiXw7Y32r7B9g1zmkm4WQAAlr6MIh6V9HOS/ltEPFfSTknv2vdKEbEpIjZExIYxTSTcLAAAS19GEd8r6d6IuL75+rMaFDMAADiA1kUcET+WdI/t05q/ermkb7fNBQBgGGT91PTvSbq0+Ynp70v6raRcAACWtZQijohbJG3IyAIAYJjwyVoAABSiiAEAKEQRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFKGIAAApRxAAAFKKIAQAolPVZ08gS/ZQYj46l5ER3LiFkcW1TlpHJnNN5ejTnYejx8ZScDDGzyM457pw1R8rjIVPSYwu1WBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKUcQAABSiiAEAKJRzRnLkSTqBeZqEeUYmJxIGkTy6uO6uPu6YnKCHt+XkTIzn5Ix22mf0eu0zJI1MpsQoZmdTcvpz3ZQcRT8nB8vCInvWBwBguFDEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKUcQAABSiiAEAKEQRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFFteZ1pcyJ72mSTpheHRzcjw6lpCRdDeLSInx6lUpOTGRs11x4nEpOVn7Z2THTOsMr8y5//UffiQlxxMTKTmLbeUS/ZzvefR6CSE53/NhtNjuVwAADBWKGACAQhQxAACFKGIAAApRxAAAFEorYtsd2zfbvjwrEwCA5S5zRfxWSXck5gEAsOylFLHtdZJeI+mjGXkAAAyLrBXxRZLeKYnf6AYA4EloXcS2z5X0YETceIDrbbR9g+0b5tT+k3sAAFgOMlbEL5L0Wts/lHSZpJfZvmTfK0XEpojYEBEbxpTzcXMAACx1rYs4It4dEesi4iRJr5N0XUS8sfVkAAAMAX6PGACAQqlnX4qIr0j6SmYmAADLGStiAAAKUcQAABSiiAEAKJT6HvGT4pavASLps0PazrE3ptNJyYluznaNTE3l5KxelZKTYuWKlJje2tUpObOH5/wa3tj0XErO3GE5D+fOqvbb1ZlZmTCJNHLUmpQcb5vOydm9JyWnv2NnSk7MLaLPZEh6Lk17bl9CWBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKUcQAABSiiAEAKJRzJvGDsVhO/pw2RyclZWRqKiXHnZx5+o+2P6H6yPHHJkwiza47IiVn9zHtT3wvSXJOTG8y5/XwzmOSvudj7TOmHs55XK36wa6UnP7qo1NyOjd+JyUny8hYzlN4f3a2fYhZ1x0s9hwAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKUcQAABSiiAEAKEQRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFRqsHqObRsZSc6PVScjprVqfk9HftSskZOXFd64zemqmESaRtp+bkzKxJiVE/566jXetz7jsaycmZeKDTOmN82gmTSL2J9rNI0uj0TErOyJFHpORoYjwlJrY8lJLjfqTkZIikh4OinxR06LEiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKUcQAABSiiAEAKEQRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFWhex7fW2v2z727Zvt/3WjMEAABgGGWdf6kq6ICJusr1K0o22r4mIbydkAwCwrLVeEUfE/RFxU/PnaUl3SDqhbS4AAMMg9T1i2ydJeq6k6zNzAQBYrjIOTUuSbB8m6XOS3hYRjy5w+UZJGyVpUisSbjDnNUR051JysubJMrL2qJygbvuzdD905mEJg0iRc4547Twx54ThsTrnvvOi076XkvONu09MyfHa9vtny5ErEyaRojOZkjP10FhKzsoHt6XkaPt0Skx0uyk5I1Pt93NvOmebFp2s5/Z4/ItSbsH2mAYlfGlEfH7BGSI2RcSGiNgwpomMmwUAYMnL+KlpS/qYpDsi4oPtRwIAYHhkrIhfJOlNkl5m+5bmv1cn5AIAsOy1fo84Iv5OkhNmAQBg6CyunzACAGDIUMQAABSiiAEAKEQRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFKGIAAApRxAAAFKKIAQAoRBEDAFCo9UkfDlrbky1HzsndPZpzwvAsMTOTkuMjD0/J2XPiEa0zVj6Qc/LyH70w5+560uk/Ssl507rrU3J+MLM2JWf9aY+k5Dwws6p1xoNHt8+QpB/fenJKzp7DOyk5k8cflZLT2b4rJWdkJOd8O92tD6fkZMh6To7uXEpOVtfsDytiAAAKUcQAABSiiAEAKEQRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFKGIAAApRxAAAFKKIAQAoRBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAACo2W3XL0y276UPBY0q4cTcqZmUmJGd0x1zpj5oiphEmk0VOmU3IenF6VkvPp+zek5Fx52hUpOff1cvbP2
+8+r3XG/TtWJ0wibTsj53li5V05a47Vk52UnM6OpDVQP1Ji3MnYrpx9E71eSs5SwooYAIBCFDEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKUcQAABSiiAEAKEQRAwBQiCIGAKAQRQwAQKGks9AfBLd8DRA5JwxPk3Uy66Sc/lE5J2bvHjbWOmPbKTmv9+Zmc+6unanZlJw/fcZfp+T8+bbTUnLOnLw3JefklVtbZ2zdsyJhEmmbjkjJGduVEqMYTVq77N6TEtPflbRhCSLrOTDrub1tx+z1FHQNK2IAAApRxAAAFEopYtvn2P6O7TttvysjEwCAYdC6iG13JH1E0qsknS7p9bZPb5sLAMAwyFgRP0/SnRHx/YiYlXSZpPMScgEAWPYyivgESffM+/re5u8AAMABPGW/vmR7o6SNkjSpnF9tAABgqctYEd8naf28r9c1f/dTImJTRGyIiA1jmki4WQAAlr6MIv6mpFNsn2x7XNLrJH0xIRcAgGWv9aHpiOjaPl/SVZI6ki6OiNtbTwYAwBBIeY84Iq6QdEVGFgAAw4RP1gIAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUeso+4vIxot/u3zvnNYRHnJMzPp6So4iUmJHpPTk5aybbh7T8Vu81OTWbkrNyIifnE4+8ICXnbUd9LSVn0p2UnIumj22dMT2TcL+RNLY953E++XDOnXB0e87jKlYlfczvgzkx0eslhCQ90LMstnn2gxUxAACFKGIAAApRxAAAFKKIAQAoRBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEKj1QNU68/mnCS+Mz6ekiM7J2c05yTx3an2OSNzCYNI2rN5TUrOjrU5A12nU1Nyts1NpeQ8NHNYSs5d249onfHIIysTJpHWPJASo5FupOR4rpeTs+WRnJwVK1JyYq7bPiNn10jRTwpaOlgRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFKGIAAApRxAAAFKKIAQAoRBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUoogBACg0Wj3AQUs6ebRHx1Jyejt3peR0Vuac6Fu9nP0zdd+O1hlH93JOEj/99PGUnJ27c77n2368NiXnG8/JeT18wqrtKTkZxn4wmZLTmU2JUW/CKTmeznmcq9tLienvypknunOtM7KeS6Ob89yVxknr1Xj8i1gRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFKGIAAApRxAAAFKKIAQAoRBEDAFCIIgYAoFCrIrb9Adubbd9q+wu2D0+aCwCAodB2RXyNpDMi4lmSvivp3e1HAgBgeLQq4oi4OiK6zZdfl7Su/UgAAAyPzPeIf1vSlYl5AAAsewc8DaLtayUdt8BFF0bE/2yuc6GkrqRL95OzUdJGSZpU0qn+AABY4g5YxBFx9v4ut/1mSedKenlEPO4ZFyNik6RNkrTaR+7nzIwAAAyPAxbx/tg+R9I7Jb0kIpLOmA0AwPBoVcSSPixpQtI1tiXp6xHxu0/oX7rl29PRb/fv98Z051JyPDqWktPbmfN6ZvTR6ZQcTRzROmLqzq0Jg0i7jj0+JWfV3Skxmn66U3L2fO2olJzvTuTkdGbaZ4zlPKy0YksvJWfqvpzHVazKeVvNc90DX+mJ5MwkfLOU8/yV9Vy66CR1zf60KuKIeGbWIAAADCM+WQsAgEIUMQAAhShiAAAKUcQAABSiiAEAKEQRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFKGIAAApRxAAAFKKIAQAoRBEDAFCIIgYAoFCr8xG30vZky+Y1xP7EzGxKjqdzTqieYc33dqfkdFfk3O17E+MpOVNbeik5s6sWz2PiiM07U3Jm1+Ts4849D6TkpOnmfM+zRHeueoR8WR3RtquegMXzyAUAYAhRxAAAFKKIAQAoRBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKjVYPcNCiXz3BT/FIzq702ERKTuzalZLjsYTt6kf7DEkju1el5IwmzXP0jTMpOd1V4yk5Uw/2UnJmjmp/H+xNdBImkabu3p6So9Gcx2d/y9aUHE/kPM57O3ak5MgJa7JF9py86ObZD1bEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKUcQAABSiiAEAKEQRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFKGIAAArlnC0bafp7ck42707Oidl7j7Q/MfvIZM5J0Ecenk7J8faknInxlJzOyhUpOUq674z9Y699SNK+0c7dKTHRS9gmSf25bkqOsnKcs5bKeL6Ibj9hkuGU8l20fYHtsH10Rh4AAMOidRHbXi/plZLubj8OAADDJWNF/KeS3ikpErIAABgqrYrY9nmS7ouIbz2B6260fYPtG+aU814WAABL3QF/WMv2tZKOW+CiCyW9R4PD0gcUEZskbZKk1T6S1TMAAHoCRRwRZy/097bPlHSypG/ZlqR1km6y/byI+HHqlAAALFMH/etLEfEPko7Z+7XtH0raEBFbE+YCAGAo8IEeAAAUSvtAj4g4KSsLAIBhwYoYAIBCFDEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgUNrvEQ+7tBOGR87JtdNO0p1x4vHBR6C2FtunU3KUdJL4NDOzOTljOQ/nmN7RPmTnrvYZkmI2ad8kcaeTkzOS85jIet6J7lxKDg4OK2IAAApRxAAAFKKIAQAoRBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKjVYPsGxEv3qCn+ac11judFpn9HbsTJhEGhnLubt6xYqUnNi5KyVHoznb1U/az07YzzHXTZhE8ohTcvpJ82Q9ziMlBcsFK2IAAApRxAAAFKKIAQAoRBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhXLOSI7FJ+sE5t2cnAz92dmUnKxXn56YSMnpbduWkuPRsZScmOu2zvCIEyaR1Onk5CTdd9I46V6Y9DhHLVbEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhVoXse3fs73Z9u22/yRjKAAAhkWrD/Sw/VJJ50l6dkTM2D4mZywAAIZD2xXxWyS9PyJmJCkiHmw/EgAAw6NtEZ8q6RdsX2/7b22flTEUAADD4oCHpm1fK+m4BS66sPn3R0p6vqSzJH3G9jMiIhbI2ShpoyRNakWbmQEAWDYOWMQRcfbjXWb7LZI+3xTvN2z3JR0tacsCOZskbZKk1T7yMUUNAMAwanto+m8kvVSSbJ8qaVzS1paZAAAM
jbanQbxY0sW2b5M0K+k3FzosDQAAFtaqiCNiVtIbk2YBAGDo8MlaAAAUantoGnjqOOd1Y392NiVHc92cnCTR66XkuNNpnZG1jz06lpKz6ES/egIsIqyIAQAoRBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEIUMQAAhShiAAAKUcQAABQarR4AWLIW28ndk+aJXkpMiujOVY8AHHKsiAEAKEQRAwBQiCIGAKAQRQwAQCGKGACAQhQxAACFKGIAAApRxAAAFKKIAQAoRBEDAFCIIgYAoBBFDABAIYoYAIBCFDEAAIUoYgAAClHEAAAUckQ89Tdqb5F011N+w3mOlrS1eohDhG1bepbrdkls21LFtj3WiRGxdqELSop4qbN9Q0RsqJ7jUGDblp7lul0S27ZUsW1PDoemAQAoRBEDAFCIIj44m6oHOITYtqVnuW6XxLYtVWzbk8B7xAAAFGJFDABAIYr4INn+T7ZvtX2L7attP616piy2P2B7c7N9X7B9ePVMGWz/qu3bbfdtL4uf6LR9ju3v2L7T9ruq58li+2LbD9q+rXqWbLbX2/6y7W8398e3Vs+Uwfak7W/Y/lazXe+rnimb7Y7tm21fnplLER+8D0TEsyLiOZIul/T7xfNkukbSGRHxLEnflfTu4nmy3CbpX0j6avUgGWx3JH1E0qsknS7p9bZPr50qzcclnVM9xCHSlXRBRJwu6fmS/s0y+b7NSHpZRDxb0nMknWP7+bUjpXurpDuyQynigxQRj877cqWkZfNme0RcHRHd5suvS1pXOU+WiLgjIr5TPUei50m6MyK+HxGzki6TdF7xTCki4quSHq6e41CIiPsj4qbmz9MaPLGfUDtVezGwo/lyrPlv2Twv2l4n6TWSPpqdTRG3YPuPbN8j6Q1aXivi+X5b0pXVQ2BBJ0i6Z97X92oZPKEPE9snSXqupOuLR0nRHLq9RdKDkq6JiGWxXY2LJL1TUj87mCLeD9vX2r5tgf/Ok6SIuDAi1ku6VNL5tdM+OQfatuY6F2pwGO3SukmfnCeyXcBiYPswSZ+T9LZ9jrAtWRHRa96uWyfpebbPKB4phe1zJT0YETceivzRQxG6XETE2U/wqpdKukLSHxzCcVIdaNtsv1nSuZJeHkvod9yexPdsObhP0vp5X69r/g6LnO0xDUr40oj4fPU82SJim+0va/A+/3L4gbsXSXqt7VdLmpS02vYlEfHGjHBWxAfJ9inzvjxP0uaqWbLZPkeDQzCvjYhd1fPgcX1T0im2T7Y9Lul1kr5YPBMOwLYlfUzSHRHxwep5stheu/c3LGxPSXqFlsnzYkS8OyLWRcRJGjzOrssqYYkibuP9zSHPWyW9UoOfplsuPixplaRrml/P+ovqgTLY/ue275X0Aklfsn1V9UxtND9Qd76kqzT4gZ/PRMTttVPlsP0pSX8v6TTb99r+neqZEr1I0pskvax5fN3SrLSWuuMlfbl5TvymBu8Rp/6az3LFJ2sBAFCIFTEAAIUoYgAAClHEAAAUoogBAChEEQMAUIgiBgCgEEUMAEAhihgAgEL/D0EF77PI/OdGAAAAAElFTkSuQmCC", + "text/plain": [ + "
" + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "fig, ax = plt.subplots(figsize=(8, 8))\n", + "z_draws = pm.draw(vars=z, draws=10_000)\n", + "ax.hist2d(x=z_draws[:, 0], y=z_draws[:, 1], bins=25)\n", + "ax.set(title=\"Samples Histogram\");" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "### Enough with Random Variables, I want to see some (log)probabilities!" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Recall we have defined the following model above:" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/latex": [ + "$$\n", + " \\begin{array}{rcl}\n", + " \\text{z} &\\sim & \\operatorname{N}(\\text{},~\\text{})\n", + " \\end{array}\n", + " $$" + ], + "text/plain": [ + "" + ] + }, + "execution_count": 37, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "model" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "`pymc` is able to convert `RandomVariable`s to their respective probability functions. One simple way is to use {func}`~pymc.logp`, which takes as first input a RandomVariable, and as second input the value at which the logp is evaluated (we will discuss this in more detail later)." + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [], + "source": [ + "z_value = at.vector(name=\"z\")\n", + "z_logp = pm.logp(rv=z, value=z_value)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "`z_logp` contains the Aesara graph that represents the log-probability of the normal random variable `z`, evaluated at `z_value`." + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Check{sigma > 0} [id A] 'z_logprob'\n", + " |Elemwise{sub,no_inplace} [id B]\n", + " | |Elemwise{sub,no_inplace} [id C]\n", + " | | |Elemwise{mul,no_inplace} [id D]\n", + " | | | |InplaceDimShuffle{x} [id E]\n", + " | | | | |TensorConstant{-0.5} [id F]\n", + " | | | |Elemwise{pow,no_inplace} [id G]\n", + " | | | |Elemwise{true_div,no_inplace} [id H]\n", + " | | | | |Elemwise{sub,no_inplace} [id I]\n", + " | | | | | |z [id J]\n", + " | | | | | |TensorConstant{(2,) of 0} [id K]\n", + " | | | | |TensorConstant{[1. 2.]} [id L]\n", + " | | | |InplaceDimShuffle{x} [id M]\n", + " | | | |TensorConstant{2} [id N]\n", + " | | |InplaceDimShuffle{x} [id O]\n", + " | | |Elemwise{log,no_inplace} [id P]\n", + " | | |Elemwise{sqrt,no_inplace} [id Q]\n", + " | | |TensorConstant{6.283185307179586} [id R]\n", + " | |Elemwise{log,no_inplace} [id S]\n", + " | |TensorConstant{[1. 2.]} [id L]\n", + " |All [id T]\n", + " |Elemwise{gt,no_inplace} [id U]\n", + " |TensorConstant{[1. 
2.]} [id L]\n", + " |InplaceDimShuffle{x} [id V]\n", + " |TensorConstant{0.0} [id W]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 39, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "aesara.dprint(z_logp)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + ":::{tip}\n", + "There is also a handy `pymc` function to compute the log cumulative probability of a random variable: {func}`~pymc.logcdf`.\n", + ":::" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Observe that, as explained at the beginning, there has been no computation yet. The actual computation is performed after compiling and passing the input. For illustration purposes only, we will again use the handy {meth}`~aesara.graph.basic.Variable.eval` method." + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([-0.91893853, -1.61208571])" + ] + }, + "execution_count": 40, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "z_logp.eval({z_value: [0, 0]})" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "This is nothing more than evaluating the log probability of a normal distribution." + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([-0.91893853, -1.61208571])" + ] + }, + "execution_count": 41, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "scipy.stats.norm.logpdf(x=np.array([0, 0]), loc=np.array([0, 0]), scale=np.array([1, 2]))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "`pymc` models provide some helpful routines to facilitate the conversion of `RandomVariable`s to probability functions. {meth}`~pymc.Model.logpt`, for instance, can be used to extract the joint log-probability of all variables in the model:" + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Elemwise{mul,no_inplace} [id A]\n", + " |Check{sigma > 0} [id B] 'z_logprob'\n", + " | |Elemwise{sub,no_inplace} [id C]\n", + " | | |Elemwise{sub,no_inplace} [id D]\n", + " | | | |Elemwise{mul,no_inplace} [id E]\n", + " | | | | |InplaceDimShuffle{x} [id F]\n", + " | | | | | |TensorConstant{-0.5} [id G]\n", + " | | | | |Elemwise{pow,no_inplace} [id H]\n", + " | | | | |Elemwise{true_div,no_inplace} [id I]\n", + " | | | | | |Elemwise{sub,no_inplace} [id J]\n", + " | | | | | | |z [id K]\n", + " | | | | | | |TensorConstant{(2,) of 0} [id L]\n", + " | | | | | |TensorConstant{[1. 2.]} [id M]\n", + " | | | | |InplaceDimShuffle{x} [id N]\n", + " | | | | |TensorConstant{2} [id O]\n", + " | | | |InplaceDimShuffle{x} [id P]\n", + " | | | |Elemwise{log,no_inplace} [id Q]\n", + " | | | |Elemwise{sqrt,no_inplace} [id R]\n", + " | | | |TensorConstant{6.283185307179586} [id S]\n", + " | | |Elemwise{log,no_inplace} [id T]\n", + " | | |TensorConstant{[1. 2.]} [id M]\n", + " | |All [id U]\n", + " | |Elemwise{gt,no_inplace} [id V]\n", + " | |TensorConstant{[1. 
2.]} [id M]\n", + " | |InplaceDimShuffle{x} [id W]\n", + " | |TensorConstant{0.0} [id X]\n", + " |InplaceDimShuffle{x} [id Y]\n", + " |TensorConstant{1.0} [id Z]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 42, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "aesara.dprint(model.logpt(sum=False))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Because we only have one variable, this function is equivalent to what we obtained by manually calling `pm.logp` before. We can also use a helper {meth}`~pymc.Model.compile_logp` to return an already compiled Aesara function of the model logp." + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [], + "source": [ + "logp_function = model.compile_logp(sum=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "This function expects a \"point\" dictionary as input. We could create it ourselves, but just to illustrate another useful {class}`~pymc.Model` method, let's call {meth}`~pymc.Model.initial_point`, which returns the point that most samplers use when deciding where to start sampling." + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'z': array([0., 0.])}" + ] + }, + "execution_count": 44, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "point = model.initial_point()\n", + "point" + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[array([-0.91893853, -1.61208571])]" + ] + }, + "execution_count": 45, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "logp_function(point)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "### What are value variables and why are they important?\n", + "\n", + "As we have seen above, a logp graph does not have random variables. Instead, it is defined in terms of input (value) variables. When we want to sample, each random variable (RV) is replaced by a logp function evaluated at the respective input (value) variable. Let's see how this works through some examples. 
RV and value variables can be observed in these [`scipy`](https://github.com/scipy/scipy) operations:" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# Equivalent to rv = pm.Normal(\"rv\", 0, 1)\n", + "rv = scipy.stats.norm(0, 1)\n", + "rv" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([0.68972784, 2.21155968, 0.67678162])" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# Equivalent to rv_draw = pm.draw(rv, 3)\n", + "rv.rvs(3)" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "-1.7001885332046727" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# Equivalent to rv_logp = pm.logp(rv, 1.25)\n", + "rv.logpdf(1.25)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Next, let's look at how these value variables behave in a slightly more complex model." + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [], + "source": [ + "with pm.Model() as model_2:\n", + "    mu = pm.Normal(name=\"mu\", mu=0, sigma=2)\n", + "    sigma = pm.HalfNormal(name=\"sigma\", sigma=3)\n", + "    x = pm.Normal(name=\"x\", mu=mu, sigma=sigma)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Each model RV is related to a \"value variable\":" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{mu: mu, sigma: sigma_log__, x: x}" + ] + }, + "execution_count": 50, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "model_2.rvs_to_values" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Observe that for `sigma` the associated value variable is on the *log* scale, as in practice we require unbounded values for NUTS sampling." + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[mu, sigma_log__, x]" + ] + }, + "execution_count": 51, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "model_2.value_vars" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "Now that we know how to extract the model variables, we can compute the element-wise log-probability of the model for specific values.\n",
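+ "\n", + "Before doing so, here is a small sanity check that is not part of the original notebook code: it maps the log-scale value variable `sigma_log__` back to the constrained (positive) scale with `at.exp`, reusing the `eval` trick shown earlier (the only assumption is the `model_2` defined in this section).\n", + "\n", + "```python\n", + "# Sketch: recover the constrained sigma implied by sigma_log__ = -10.\n", + "sigma_value = model_2.rvs_to_values[sigma]  # this is sigma_log__\n", + "at.exp(sigma_value).eval({sigma_value: -10})  # ~4.54e-05, i.e. np.exp(-10)\n", + "```\n", + "\n", + "With that mapping in mind, let's evaluate the element-wise logp at a concrete point:"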
+ ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ -1.61208571, -11.32440364, 9.08106147])" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# extract values as aesara.tensor.var.TensorVariable\n", + "mu_value = model_2.rvs_to_values[mu]\n", + "sigma_log_value = model_2.rvs_to_values[sigma]\n", + "x_value = model_2.rvs_to_values[x]\n", + "# element-wise log-probability of the model (we do not take the sum)\n", + "logp_graph = at.stack(model_2.logpt(sum=False))\n", + "# evaluate by passing concrete values\n", + "logp_graph.eval({mu_value: 0, sigma_log_value: -10, x_value: 0})" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "This is equivalent to:" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "mu_value -> -1.612085713764618\n", + "sigma_log_value -> -11.324403641427345 \n", + "x_value -> 9.081061466795328\n", + "\n" + ] + } + ], + "source": [ + "print(f\"\"\"\n", + "mu_value -> {scipy.stats.norm.logpdf(x=0, loc=0, scale=2)}\n", + "sigma_log_value -> {-10 + scipy.stats.halfnorm.logpdf(x=np.exp(-10), loc=0, scale=3)} \n", + "x_value -> {scipy.stats.norm.logpdf(x=0, loc=0, scale=np.exp(-10))}\n", + "\"\"\")\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + ":::{note}\n", + "For `sigma_log_value` we add the $-10$ term so that the `scipy` and `aesara` results match, because of the Jacobian of the log transformation.\n", + ":::" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "As we already saw, we can also use the method {meth}`~pymc.Model.compile_logp` to obtain a compiled Aesara function of the model logp, which takes a dictionary of `{value variable name: value}` as input:" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "pycharm": { + "name": "#%%\n" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[array(-1.61208571), array(-11.32440364), array(9.08106147)]" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "model_2.compile_logp(sum=False)({\"mu\": 0, \"sigma_log__\": -10, \"x\": 0})" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "The {class}`~pymc.Model` class also has methods to extract the gradient ({meth}`~pymc.Model.dlogpt`) and the Hessian ({meth}`~pymc.Model.d2logpt`) of the logp." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, + "source": [ + "If you want to go deeper into the internals of `aesara` RandomVariables and `pymc` distributions, please take a look at the [distribution developer guide](implementing-a-distribution).\n",
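+ "\n", + "As a final check, the following sketch (not part of the original notebook, and assuming that `sum=True` is the default of {meth}`~pymc.Model.compile_logp`) confirms that the summed model logp is simply the sum of the element-wise terms computed above:\n", + "\n", + "```python\n", + "# Sketch: the joint (summed) logp should equal the sum of the element-wise terms.\n", + "point = {\"mu\": 0, \"sigma_log__\": -10, \"x\": 0}\n", + "summed = model_2.compile_logp()(point)  # sum=True by default\n", + "elemwise = model_2.compile_logp(sum=False)(point)  # the three arrays shown above\n", + "np.isclose(summed, np.sum([np.sum(term) for term in elemwise]))  # expected: True\n", + "```"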
+ ] + } + ], + "metadata": { + "colab": { + "collapsed_sections": [ + "dIYxBNT_5HgW", + "JhmIBByY6T9h", + "k7spOYvvTdZL", + "qIMN2gHGzKof", + "aMM0sJy6z7Ur", + "WY6bUgqZztC3", + "58gng2f17_P7", + "C8Us4nEyhRdu", + "wkZR0gDWRAgK", + "wPw9kCvASOeJ", + "WdZcUfvLUkwK", + "EHH1hP0uzaWG", + "I_Ph4o7ZW_XD", + "jOI-E76fZ2bG", + "Rq2W2fmobmFg" + ], + "name": "PyMC intro.ipynb", + "provenance": [] + }, + "hide_input": false, + "interpreter": { + "hash": "322221ae0b6adf1db1274c5f417c2cb5b37d259e740acb22a87dc0305ae08c77" + }, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.9.12" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +}