author	SIPB	2024-12-03 14:46:38 -0500
committer	SIPB	2024-12-03 14:46:38 -0500
commit	7462968826ca42383491e7441b495ef8d6eaf8b7 (patch)
tree	634660aef605e3829c5fa4bf7b61bb1b756a6eee
parent	a24288e28c4b53fdd6467ed4eed626fa0586bf72 (diff)
Latest blog post and graphs
-rw-r--r--	blog.md	171
-rw-r--r--	grokking.png	bin 0 -> 20549 bytes
-rw-r--r--	index.html	466
-rw-r--r--	insane-shortest-paths.ipynb	452
-rw-r--r--	training-2d-histogram.png	bin 8764 -> 12615 bytes
-rw-r--r--	training-loss.png	bin 16108 -> 20425 bytes
-rw-r--r--	transformer_shortest_paths.ipynb	1445
7 files changed, 1583 insertions, 951 deletions
diff --git a/blog.md b/blog.md
index 02a425e..ddeca55 100644
--- a/blog.md
+++ b/blog.md
@@ -1,5 +1,5 @@
---
-build: pandoc blog.md --citeproc -s -o index.html
+build: pandoc blog.md --citeproc --katex -s -o index.html
mkzip: zip project.zip index.html *.png
title: "6.7960 Project: Investigating Off-Distribution Generalization of Transformers"
bibliography: blog.bib
@@ -14,13 +14,7 @@ Anthony Wang, Alek Westover, Kevin Zhao
{xy,alekw,kevinmz}\@mit.edu
</div>
-## Abstract
-
-TODO
-
-## Introduction
-
-### Overview
+## Goals
Recently, LLMs have been developing very fast, and with that comes the concern of aligning the models to output true and productive statements. One common approach for ensuring this is to have a human in the loop rewarding the model for true outputs (e.g. RLHF), but one drawback of this approach is that humans can be poor judges of truthfulness. As LLMs become more capable, there might not even exist experts who can reliably judge whether the model's outputs, such as difficult mathematical proofs, are truthful. So, we'd like to propose a potential solution to this issue via **off-distribution generalization**: applying human-like intuition to solve problems outside the training distribution. Paul Christiano [proposed an experiment](https://www.alignmentforum.org/posts/BxersHYN2qcFoonwg/experimentally-evaluating-whether-honesty-generalizes?commentId=dsDA2BWpHPdgLvaXX) about shortest paths in a graph; our project essentially implements that experiment. To the best of our knowledge, although there has been research on applying machine learning to variations of graph search [@10.5555/3666122.3666260], no one has done our exact experiment yet.
@@ -30,7 +24,7 @@ One approach to solving this problem is to reward an LLM for truthful behavior o
COMMENT FROM KEVIN -- synthesize from introduction
-## Task
+### Task
We will use a synthetic task to test our hypothesis that models will generalize truthfully off-distribution. The synthetic task is computing the distance between various vertices in an input graph. Our experiment will have three parts:
@@ -38,7 +32,7 @@ We will use a synthetic task to test our hypothesis that models will generalize
2. Fine-tune a transformer to predict the distances between $s,t'$ for any $t'$ which is on the shortest path from $s$ to $t$, but only do fine-tuning on graphs with $n\in [8,16)$ vertices.
3. Test whether the transformer can accurately predict the distances between $s,t'$ for any $t'$ on the shortest path from $s$ to $t$ for graphs with $n\in [16,32)$ vertices.
-## Related Work
+### Related Work
COMMENT FROM ALEK
-- please remove all mentions of graph neural networks -- that is BS: there is no actual reason why you'd ever use a Neural network to solve shortest paths, the point of choosing a synthetic task is because there is a **simple ground truth** which makes it easy to evaluate whether or not our model is performing correctly. We'd also hoped that the simplicity of the task would make it more feasible to do with a limited compute budget, but apparently this task was too hard for our architecture.
@@ -50,7 +44,7 @@ There has been some research into the algorithmic optimization of GNNs and how t
- Tutsoy uses a graph-theory-based approach to model the epidemiological characteristics of infectious diseases, such as COVID-19 [@10.1109/TPAMI.2023.3256421]. We understand from his paper how GNN optimization may also be useful in researching novel diseases.
-### Theory
+## Methods
### Algorithm for Shortest Paths
@@ -58,14 +52,6 @@ The standard algorithm to find the shortest path in a graph between a source num
We will use this algorithm to verify the accuracy of our machine learning approach. Given $V$ vertices and $E$ edges, the runtime of this algorithm is thus $O(V + E)$; however, a machine learning approach may do better in time through parallelism, although at the expense of using much more memory.
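+
+For concreteness, here is a minimal version of this reference routine (a sketch with our own function name, not the exact verification code):
+
+```py
+from collections import deque
+
+def bfs_distance(adj, u, v):
+    """Length of the shortest path from u to v, or None if v is unreachable.
+    adj maps each vertex to the set of its neighbors."""
+    dist = {u: 0}
+    queue = deque([u])
+    while queue:
+        x = queue.popleft()
+        if x == v:
+            return dist[x]
+        for y in adj[x]:
+            if y not in dist:
+                dist[y] = dist[x] + 1
+                queue.append(y)
+    return None
+```
+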
-### Potential Mathematical Approaches to Shortest Paths
-
-Another way one can think of the shortest path of a graph is using a *matrix* to record which vertices are connected. Given vertices numbered $1$ to $V$, we denote the **adjacency matrix** $\textbf{M}$ of dimensions $V \times V$ as the matrix with element $\textbf{M}_{i, j} = 1$ if vertices $i$ and $j$ are connected by an edge and $\textbf{M}_{i, j} = 0$ if they are not. Now, we note that (1) For all $k$, $(\textbf{M}+I)^k_{i, j} = 0$ if and only if there exists no path from the vertex numbered $i$ to the vertex numbered $j$ that is distance $k$ or less due to Markov matrix processes. As a result, if the distance between vertices numbered $i$ and $j$ is $d$, then $\text{min}\left((\textbf{M}+I)^k_{i, j}, 1\right) = 1$ if $k \ge d$ and $\text{min}\left((\textbf{M}+I)^k_{i, j}, 1\right) = 0$ if $k < d$.
-
-With this information, because the distance between any two vertices is at most $V-1$ in a graph with $V$ vertices, we note that the *distance* matrix turns out to be simply $$\textbf{D} = \textbf{1}_{V \times V} \cdot V - \Sigma_{i=0}^{V-1}\text{min}\left((\textbf{M}+I)^k_{i, j}, 1\right).$$ The runtime to compute this is $O(V)$, although it will take more space to compute all powers of $\textbf{M}$.
-
-## Our Machine Learning Approach
-
### Data
We will represent an $n$ vertex, $m$ edge unweighted, undirected graph as a sequence of the endpoints of the $m$ edges, so $[a_1,b_1,a_2,b_2,\ldots,a_m,b_m]$ represents a graph with the edges $\{(a_i,b_i)\}$ for $1 \leq i \leq m$. We will pad all sequences to be the same length using the padding token 0.
@@ -78,36 +64,51 @@ We have three separate datasets.
- **Fine-tune data**: For each $n \in [8,16)$, we will generate several graphs on $n$ vertices. We generate these graphs by inserting $2n$ random edges into the graph. We select the target vertex to be a random vertex on the shortest path from $1$ to $2$.
- **Generalization testing data**: The same as the fine-tune data, except we sample $n \in [16,32)$ instead.
-As a side note, we are also curious whether the transformer learns to generalize to different distributions of graphs, such as denser graphs or graphs with different properties. Time permitting, we will also investigate this.
+We originally wrote Python code that generated the data inside the training loop, but this was slow and the data generation wasted a lot of training time. To get around this, we pre-generated the data before training and made the generation code multithreaded to speed it up.
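+
+For reference, here is a simplified single-process sketch of this generation procedure (function names are ours; the real generator is multithreaded, and labels come from the BFS routine above):
+
+```py
+import random
+
+def random_graph(n):
+    """Insert 2n random edges between vertices 1..n (duplicate edges may collide)."""
+    adj = {i: set() for i in range(1, n + 1)}
+    for _ in range(2 * n):
+        a, b = random.sample(range(1, n + 1), 2)
+        adj[a].add(b)
+        adj[b].add(a)
+    return adj
+
+def encode(adj, target, pad_len):
+    """Flatten the edge list to [a_1, b_1, ..., a_m, b_m], pad with 0, then append the target vertex."""
+    seq = [x for a in adj for b in adj[a] if a < b for x in (a, b)]
+    return seq + [0] * (pad_len - len(seq)) + [target]
+```
+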
### Architecture
+TODO: honestly not much to say here since it's a pretty typical arch
+
We plan to use a standard transformer architecture. We will ensure that the number of layers in our transformer is at least the diameter of the graph. By doing this, we ensure that there is an extremely simple circuit, namely BFS, that the transformer could in theory learn in order to perform the task. If the transformer actually learns a simple circuit of this kind, it seems more likely to generalize well. This is also our intuition for why it should be possible to fine-tune on a small amount of data for finding shortest paths to other vertices besides $2$: it seems like the model should already be computing these other distances as intermediate values in its computation of the distance to vertex $2$.
### Embeddings
-TODO: fix this
+Since the order of the edges in the input does not matter, we did not use positional encodings. Each edge $(u,v)$ with $u < v$ is embedded into a vector of dimension $d$ whose first $\frac{d}{2}$ elements are the learned embedding of $u$ and whose last $\frac{d}{2}$ elements are the learned embedding of $v$. The target vertex $t$ is embedded in the same way, with the first $\frac{d}{2}$ elements being the learned embedding of $t$ and the last $\frac{d}{2}$ a learned embedding of a special token.
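+
+A sketch of this embedding scheme (module and variable names are ours, not the exact training code):
+
+```py
+import torch
+import torch.nn as nn
+
+class GraphEmbedding(nn.Module):
+    """Embed each edge (u, v) as [emb(u); emb(v)] and the target t as [emb(t); emb(special)]."""
+    def __init__(self, num_vertices, d):
+        super().__init__()
+        assert d % 2 == 0
+        # index 0 is the padding token; index num_vertices + 1 is the special target-marker token
+        self.emb = nn.Embedding(num_vertices + 2, d // 2)
+        self.special = num_vertices + 1
+
+    def forward(self, edges, target):
+        # edges: (batch, num_edges, 2) vertex ids with u < v; target: (batch,)
+        e = torch.cat((self.emb(edges[..., 0]), self.emb(edges[..., 1])), dim=-1)
+        t = torch.cat((self.emb(target), self.emb(torch.full_like(target, self.special))), dim=-1)
+        # no positional encoding is added: the order of the edges carries no information
+        return torch.cat((e, t.unsqueeze(1)), dim=1)
+```
+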
-In order to facilitate performing this task with limited computational resources, we plan to use custom-made positional encodings that tell the model extra information about the
-structure of the problem, rather than the traditional sine/cosine positional encodings. (TODO: THIS IS OUTDATED) Specifically, our positional encodings are $v_1,v_1,v_2,v_2,\ldots,v_m,v_m,v_{m+1}$ where each $v_i$ is a random vector so each $v_i,v_j$ pair is nearly orthogonal with high probability. We will concatenate these with the token encodings rather than adding them. This should let the model easily have large attention scores between vertices corresponding to a single edge.
+## Training
-### Explicit transformer formula for shortest paths
+For our model, we used a model dimension of 64, four layers, and two heads per layer, for a total of 200545 parameters stored in bfloat16, which corresponds to around 3.2e6 bits. The number of possible graphs on 15 vertices generated using our procedure is approximately
+$$\frac{\binom{15}{2}^{15}}{15!} = 1.59\cdot10^{18}.$$
+This is because there are $\binom{15}{2}$ choices for each of the 15 edges and we don't care about the order of the edges. This is only an approximation because some edges might be duplicated. Each graph has an answer between 1 and 15, which requires around 4 bits, so memorizing all the answers would require $4\cdot1.59\cdot10^{18} = 6.36\cdot10^{18}$ bits, which is about $2\cdot10^{12}$ times larger than our model size.
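+
+These estimates can be reproduced in a few lines:
+
+```py
+import math
+
+graphs = math.comb(15, 2) ** 15 / math.factorial(15)  # ~1.59e18 graphs under this approximation
+model_bits = 200545 * 16                               # bfloat16 parameters -> ~3.2e6 bits
+answer_bits = 4 * graphs                               # ~4 bits per memorized answer
+print(f"{graphs:.2e} {answer_bits / model_bits:.1e}")  # 1.59e+18 2.0e+12
+```
+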
-## Results
-### Initial Results
-We used a model dimension of 64, four layers, and two heads per layer. We used MSE loss, the Adam optimizer, a learning rate of 8e-4, and a batch size of 131,072 for 8000 unique randomly generated batches. Our final MSE loss was 0.35546875.
+We used MSE loss, the Adam optimizer, a learning rate of 8e-4, and a batch size of 131072 for 8000 unique randomly generated batches. Our final MSE loss was approximately 0.3555.
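+
+A minimal sketch of this training setup (`sample_batch` is a hypothetical stand-in for our data generator, and `model` is the transformer described above):
+
+```py
+import torch
+
+device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
+
+def train(model, sample_batch, steps=8000, batch_size=131072, lr=8e-4):
+    """Train with MSE loss on `steps` freshly generated batches."""
+    opt = torch.optim.Adam(model.parameters(), lr=lr)
+    loss_fn = torch.nn.MSELoss()
+    for step in range(steps):
+        seqs, dists = sample_batch(batch_size)
+        pred = model(seqs.to(device))
+        loss = loss_fn(pred, dists.to(device).float())
+        opt.zero_grad()
+        loss.backward()
+        opt.step()
+```
+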
![](training-loss.png)
![](training-2d-histogram.png)
-### Fine Tuning
+One pattern we consistently noticed during training is that the loss often plateaus for many epochs before rapidly decreasing. For instance, this happened between epochs 100 and 300 in the graph above:
-After receiving our initial results, we fine-tuned with a learning rate of 1e-5, also with MSE and the same batch size. Our final results are shown below.
+![](grokking.png)
+
+"grokking" hypothesis: it's memorizing all length 2 paths?
+
+TODO: training curves for 1, 2, 3 length paths
+
+### Potential Mathematical Approaches to Shortest Paths? Delete this?
+
+Another way to think about shortest paths in a graph is to use a *matrix* to record which vertices are connected. Given vertices numbered $1$ to $V$, we denote the **adjacency matrix** $\textbf{M}$ of dimensions $V \times V$ as the matrix with element $\textbf{M}_{i, j} = 1$ if vertices $i$ and $j$ are connected by an edge and $\textbf{M}_{i, j} = 0$ if they are not. Since the $(i, j)$ entry of $(\textbf{M}+I)^k$ counts walks of length at most $k$ from $i$ to $j$ (the identity term lets shorter walks idle in place), we have $(\textbf{M}+I)^k_{i, j} = 0$ if and only if there exists no path from vertex $i$ to vertex $j$ of distance $k$ or less. As a result, if the distance between vertices $i$ and $j$ is $d$, then $\text{min}\left((\textbf{M}+I)^k_{i, j}, 1\right) = 1$ if $k \ge d$ and $\text{min}\left((\textbf{M}+I)^k_{i, j}, 1\right) = 0$ if $k < d$.
+
+With this information, because the distance between any two vertices is at most $V-1$ in a graph with $V$ vertices, the *distance* matrix is simply $$\textbf{D} = \textbf{1}_{V \times V} \cdot V - \sum_{k=0}^{V-1}\text{min}\left((\textbf{M}+I)^k, 1\right),$$ where the minimum is taken elementwise. Computing this takes $O(V)$ matrix multiplications, although it takes more space to store all the powers of $\textbf{M}$.
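+
+A direct sketch of this computation (dense integer matrix powers, so only practical for small $V$):
+
+```py
+import numpy as np
+
+def distance_matrix(M):
+    """All-pairs distances from an adjacency matrix via powers of (M + I); unreachable pairs get V."""
+    V = M.shape[0]
+    P = np.eye(V, dtype=np.int64)      # (M + I)^0
+    reach = np.zeros((V, V), dtype=np.int64)
+    for _ in range(V):
+        reach += np.minimum(P, 1)      # adds 1 for every k with a path of length <= k
+        P = P @ (M + np.eye(V, dtype=np.int64))
+    return V - reach
+```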
+
+## Fine-tuning results
+
+After obtaining our initial results, we fine-tuned the model with a learning rate of 1e-5, again with MSE loss and the same batch size. Our final results are shown in the images below.
![](fine-tuning-loss.png)
@@ -115,4 +116,116 @@ After receiving our initial results, we fine-tuned with a learning rate of 1e-5,
![](test-2d-histogram.png)
+TODO: Memorization? Do some math here to compute how many bits are required to memorize length 1, 2, 3 paths.
+
+## Complicated explicit transformer formula for shortest paths
+
+```py
+import torch
+import torch.nn as nn
+
+# Configuration
+NVTXS = 16
+MAXDIST = NVTXS + 1
+AVGDEG = 2
+SEQLEN = NVTXS + 1
+HIDDENDIM = 4 * NVTXS + 2
+
+# Start indices for different sections of the input data
+START_REACH = NVTXS + 1
+START_OUT = 2 * NVTXS + 1
+START_SELF = 3 * NVTXS + 1
+SRC_FLAG_IDX = START_SELF
+ANS_FLAG_IDX = 0
+NOTANS_FLAG_IDX = -1
+
+BIG = 20
+SUPABIG = 100
+MED = 10
+CURSE = 5
+
+class SillyTransformer(nn.Module):
+ def __init__(self, device):
+ super().__init__()
+ self.device = device
+
+ with torch.no_grad():
+ # Initialize weight parameters with specific configurations
+ self.mostKs = nn.ParameterList()
+ self.mostQs = nn.ParameterList()
+ self.mostVs = nn.ParameterList()
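+            # One fixed attention head per vertex; these heads are shared across every layer (see forward).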
+ for head in range(1, NVTXS + 1):
+ Q = nn.Parameter(torch.zeros((2, HIDDENDIM), device=device))
+ Q[0, START_REACH - 1 + head] = SUPABIG
+ Q[1, NOTANS_FLAG_IDX] = 1
+
+ K = nn.Parameter(torch.zeros((2, HIDDENDIM), device=device))
+ K[0, head] = 1
+ K[1, ANS_FLAG_IDX] = BIG
+
+ V = nn.Parameter(torch.zeros((NVTXS, HIDDENDIM), device=device))
+ for i in range(NVTXS):
+ V[i, START_SELF + i] = 1
+
+ self.mostKs.append(K)
+ self.mostQs.append(Q)
+ self.mostVs.append(V)
+
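+            # One additional attention head per layer, with layer-specific weights.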
+ self.weirdKs = nn.ParameterList()
+ self.weirdQs = nn.ParameterList()
+ self.weirdVs = nn.ParameterList()
+ for layer in range(NVTXS):
+ K = nn.Parameter(torch.zeros((3, HIDDENDIM), device=device))
+ K[0, NOTANS_FLAG_IDX] = -BIG
+ K[0, SRC_FLAG_IDX] = BIG+SUPABIG
+ K[1, NOTANS_FLAG_IDX] = -SUPABIG
+ K[1, NVTXS + 2] = BIG+SUPABIG
+ K[1, ANS_FLAG_IDX] = -BIG-SUPABIG
+ K[2, ANS_FLAG_IDX] = MED
+
+ Q = nn.Parameter(torch.zeros((3, HIDDENDIM), device=device))
+ Q[:, ANS_FLAG_IDX] = 1
+
+ V = nn.Parameter(torch.zeros((NVTXS, HIDDENDIM), device=device))
+ V[layer, SRC_FLAG_IDX] = 1
+
+ self.weirdKs.append(K)
+ self.weirdQs.append(Q)
+ self.weirdVs.append(V)
+
+ def forward(self, src):
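+        # Run NVTXS layers; each combines its layer-specific head with the shared per-vertex heads.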
+ for layer in range(NVTXS):
+ allKs = [self.weirdKs[layer]] + [x for x in self.mostKs]
+ allQs = [self.weirdQs[layer]] + [x for x in self.mostQs]
+ allVs = [self.weirdVs[layer]] + [x for x in self.mostVs]
+ head_outputs = []
+
+ for (K, Q, V) in zip(allKs, allQs, allVs):
+ ksrc = torch.matmul(src, K.unsqueeze(0).transpose(-2, -1))
+ qsrc = torch.matmul(src, Q.unsqueeze(0).transpose(-2, -1))
+ vsrc = torch.matmul(src, V.unsqueeze(0).transpose(-2, -1))
+
+ scores = torch.matmul(qsrc, ksrc.transpose(-2, -1))
+ attention_weights = torch.softmax(scores, dim=-1)
+ head_output = torch.matmul(attention_weights, vsrc)
+ head_outputs.append(head_output)
+
+ new_reaches = sum(head_outputs[1:])
+ BSZ = new_reaches.shape[0]
+
+ nodelta_nbrs = torch.zeros((BSZ, SEQLEN, NVTXS + 1), device=self.device)
+ morepadlol = torch.zeros((BSZ, SEQLEN, 1 + NVTXS), device=self.device)
+
+ src = src + torch.cat((nodelta_nbrs, new_reaches, head_outputs[0], morepadlol), dim=2)
+ src[:, :, START_REACH:START_REACH + NVTXS] = 2 * torch.sigmoid(src[:, :, START_REACH:START_REACH + NVTXS] * CURSE) - 1
+
+ canreach = src[:, 0, START_OUT:START_OUT + NVTXS]
+ final_output = 1 + torch.sum(1 - canreach, dim=1)
+ return final_output
+```
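+
+A quick smoke test of the construction above (the all-zero input is meaningless; this only checks that the shapes line up):
+
+```py
+model = SillyTransformer(torch.device("cpu"))
+dummy = torch.zeros(1, SEQLEN, HIDDENDIM)  # one batch element in the layout the START_* constants assume
+print(model(dummy).shape)                  # torch.Size([1])
+```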
+
+
+## Alek perturbed experiment
+
+## Conclusion
+
+just do bfs lol
+
## References
diff --git a/grokking.png b/grokking.png
new file mode 100644
index 0000000..848b4ab
--- /dev/null
+++ b/grokking.png
Binary files differ
diff --git a/index.html b/index.html
index 95d23ff..331fabc 100644
--- a/index.html
+++ b/index.html
@@ -165,7 +165,70 @@
margin: 0 0.8em 0.2em -1.6em;
vertical-align: middle;
}
- .display.math{display: block; text-align: center; margin: 0.5rem auto;}
+ /* CSS for syntax highlighting */
+ pre > code.sourceCode { white-space: pre; position: relative; }
+ pre > code.sourceCode > span { line-height: 1.25; }
+ pre > code.sourceCode > span:empty { height: 1.2em; }
+ .sourceCode { overflow: visible; }
+ code.sourceCode > span { color: inherit; text-decoration: inherit; }
+ div.sourceCode { margin: 1em 0; }
+ pre.sourceCode { margin: 0; }
+ @media screen {
+ div.sourceCode { overflow: auto; }
+ }
+ @media print {
+ pre > code.sourceCode { white-space: pre-wrap; }
+ pre > code.sourceCode > span { display: inline-block; text-indent: -5em; padding-left: 5em; }
+ }
+ pre.numberSource code
+ { counter-reset: source-line 0; }
+ pre.numberSource code > span
+ { position: relative; left: -4em; counter-increment: source-line; }
+ pre.numberSource code > span > a:first-child::before
+ { content: counter(source-line);
+ position: relative; left: -1em; text-align: right; vertical-align: baseline;
+ border: none; display: inline-block;
+ -webkit-touch-callout: none; -webkit-user-select: none;
+ -khtml-user-select: none; -moz-user-select: none;
+ -ms-user-select: none; user-select: none;
+ padding: 0 4px; width: 4em;
+ color: #aaaaaa;
+ }
+ pre.numberSource { margin-left: 3em; border-left: 1px solid #aaaaaa; padding-left: 4px; }
+ div.sourceCode
+ { }
+ @media screen {
+ pre > code.sourceCode > span > a:first-child::before { text-decoration: underline; }
+ }
+ code span.al { color: #ff0000; font-weight: bold; } /* Alert */
+ code span.an { color: #60a0b0; font-weight: bold; font-style: italic; } /* Annotation */
+ code span.at { color: #7d9029; } /* Attribute */
+ code span.bn { color: #40a070; } /* BaseN */
+ code span.bu { color: #008000; } /* BuiltIn */
+ code span.cf { color: #007020; font-weight: bold; } /* ControlFlow */
+ code span.ch { color: #4070a0; } /* Char */
+ code span.cn { color: #880000; } /* Constant */
+ code span.co { color: #60a0b0; font-style: italic; } /* Comment */
+ code span.cv { color: #60a0b0; font-weight: bold; font-style: italic; } /* CommentVar */
+ code span.do { color: #ba2121; font-style: italic; } /* Documentation */
+ code span.dt { color: #902000; } /* DataType */
+ code span.dv { color: #40a070; } /* DecVal */
+ code span.er { color: #ff0000; font-weight: bold; } /* Error */
+ code span.ex { } /* Extension */
+ code span.fl { color: #40a070; } /* Float */
+ code span.fu { color: #06287e; } /* Function */
+ code span.im { color: #008000; font-weight: bold; } /* Import */
+ code span.in { color: #60a0b0; font-weight: bold; font-style: italic; } /* Information */
+ code span.kw { color: #007020; font-weight: bold; } /* Keyword */
+ code span.op { color: #666666; } /* Operator */
+ code span.ot { color: #007020; } /* Other */
+ code span.pp { color: #bc7a00; } /* Preprocessor */
+ code span.sc { color: #4070a0; } /* SpecialChar */
+ code span.ss { color: #bb6688; } /* SpecialString */
+ code span.st { color: #4070a0; } /* String */
+ code span.va { color: #19177c; } /* Variable */
+ code span.vs { color: #4070a0; } /* VerbatimString */
+ code span.wa { color: #60a0b0; font-weight: bold; font-style: italic; } /* Warning */
/* CSS for citations */
div.csl-bib-body { }
div.csl-entry {
@@ -187,6 +250,24 @@
div.csl-indent {
margin-left: 2em;
} </style>
+ <script defer=""
+ src="https://cdn.jsdelivr.net/npm/katex@0.15.1/dist/katex.min.js"></script>
+ <script>document.addEventListener("DOMContentLoaded", function () {
+ var mathElements = document.getElementsByClassName("math");
+ var macros = [];
+ for (var i = 0; i < mathElements.length; i++) {
+ var texText = mathElements[i].firstChild;
+ if (mathElements[i].tagName == "SPAN") {
+ katex.render(texText.data, mathElements[i], {
+ displayMode: mathElements[i].classList.contains('display'),
+ throwOnError: false,
+ macros: macros,
+ fleqn: false
+ });
+}}});
+ </script>
+ <link rel="stylesheet"
+ href="https://cdn.jsdelivr.net/npm/katex@0.15.1/dist/katex.min.css" />
</head>
<body>
<header id="title-block-header">
@@ -194,17 +275,11 @@
Generalization of Transformers</h1>
</header>
<!-- Guidelines: https://www.dropbox.com/scl/fi/bet8enscln8ue36kd8t17/final_project_guidelines.pdf?rlkey=knd19cnumk51ho1y9crno56ib&e=2&dl=0 -->
-<!-- <div style="display: flex; justify-content: space-between;">
-
-<div style="flex: 1; margin: 5px; padding: 10px; border: 1px solid #ddd; text-align: center;"> -->
<div style="text-align:center">
<p>Anthony Wang, Alek Westover, Kevin Zhao</p>
<p>{xy,alekw,kevinmz}@mit.edu</p>
</div>
-<h2 id="abstract">Abstract</h2>
-<p>TODO</p>
-<h2 id="introduction">Introduction</h2>
-<h3 id="overview">Overview</h3>
+<h2 id="goals">Goals</h2>
<p>Recently, LLMs have been developing very fast, and with that comes
the concern of aligning the models to output true and productive
statements. One common approach for ensuring this is to have a human in
@@ -242,31 +317,28 @@ transform hand drawn cats into images of cats might be able to handle a
generalizing truthfully is simple, thus promoted by “Occam’s Razor”, and
aim to investigate that with this project.</p>
<p>COMMENT FROM KEVIN – synthesize from introduction</p>
-<h2 id="task">Task</h2>
+<h3 id="task">Task</h3>
<p>We will use a synthetic task to test our hypothesis that models will
generalize truthfully off-distribution. The synthetic task is computing
the distance between various vertices in an input graph. Our experiment
will have three parts:</p>
<ol type="1">
<li>Pre-train a transformer to predict the distance between two fixed
-vertices <span class="math inline"><em>s</em>, <em>t</em></span> on
-graphs with <span class="math inline"><em>n</em> ∈ [8, 32)</span>
-vertices.</li>
+vertices <span class="math inline">s,t</span> on graphs with <span
+class="math inline">n\in [8, 32)</span> vertices.</li>
<li>Fine-tune a transformer to predict the distances between <span
-class="math inline"><em>s</em>, <em>t</em>′</span> for any <span
-class="math inline"><em>t</em>′</span> which is on the shortest path
-from <span class="math inline"><em>s</em></span> to <span
-class="math inline"><em>t</em></span>, but only do fine-tuning on graphs
-with <span class="math inline"><em>n</em> ∈ [8, 16)</span>
-vertices.</li>
+class="math inline">s,t&#39;</span> for any <span
+class="math inline">t&#39;</span> which is on the shortest path from
+<span class="math inline">s</span> to <span
+class="math inline">t</span>, but only do fine-tuning on graphs with
+<span class="math inline">n\in [8,16)</span> vertices.</li>
<li>Test whether the transformer can accurately predict the distances
-between <span class="math inline"><em>s</em>, <em>t</em>′</span> for any
-<span class="math inline"><em>t</em>′</span> on the shortest path from
-<span class="math inline"><em>s</em></span> to <span
-class="math inline"><em>t</em></span> for graphs with <span
-class="math inline"><em>n</em> ∈ [16, 32)</span> vertices.</li>
+between <span class="math inline">s,t&#39;</span> for any <span
+class="math inline">t&#39;</span> on the shortest path from <span
+class="math inline">s</span> to <span class="math inline">t</span> for
+graphs with <span class="math inline">n\in [16,32)</span> vertices.</li>
</ol>
-<h2 id="related-work">Related Work</h2>
+<h3 id="related-work">Related Work</h3>
<p>COMMENT FROM ALEK – please remove all mentions of graph neural
networks – that is BS: there is no actual reason why you’d ever use a
Neural network to solve shortest paths, the point of choosing a
@@ -295,111 +367,67 @@ href="#ref-10.1109/TPAMI.2023.3256421" role="doc-biblioref">Tutsoy
2023</a>)</span>. We understand from his paper how GNN optimization may
also be useful in researching novel diseases.</p></li>
</ul>
-<h3 id="theory">Theory</h3>
+<h2 id="methods">Methods</h2>
<h3 id="algorithm-for-shortest-paths">Algorithm for Shortest Paths</h3>
<p>The standard algorithm to find the shortest path in a graph between a
-source numbered as <span class="math inline"><em>u</em></span> and sink
-numbered as <span class="math inline"><em>v</em></span> is
-<strong>breadth-first search (BFS)</strong>. The BFS algorithm maintains
-a mapping of visited vertices to their distances with respect to <span
-class="math inline"><em>u</em></span>, and each run of the algorithm
-goes through all the vertices newly visited in the previous run, and for
-each vertex, visits any of its unvisited neighbors. The algorithm
-terminates once either <span class="math inline"><em>v</em></span> is
-visited or the set of newly visited vertices in a single run is
-empty.</p>
+source numbered as <span class="math inline">u</span> and sink numbered
+as <span class="math inline">v</span> is <strong>breadth-first search
+(BFS)</strong>. The BFS algorithm maintains a mapping of visited
+vertices to their distances with respect to <span
+class="math inline">u</span>, and each run of the algorithm goes through
+all the vertices newly visited in the previous run, and for each vertex,
+visits any of its unvisited neighbors. The algorithm terminates once
+either <span class="math inline">v</span> is visited or the set of newly
+visited vertices in a single run is empty.</p>
<p>We will use this algorithm to verify the accuracy of our machine
-learning approach. Given <span class="math inline"><em>V</em></span>
-vertices and <span class="math inline"><em>E</em></span> edges, the
-runtime of this algorithm is thus <span
-class="math inline"><em>O</em>(<em>V</em> + <em>E</em>)</span>; however,
-a machine learning approach may do better in time through parallelism,
-although at the expense of using much more memory.</p>
-<h3 id="potential-mathematical-approaches-to-shortest-paths">Potential
-Mathematical Approaches to Shortest Paths</h3>
-<p>Another way one can think of the shortest path of a graph is using a
-<em>matrix</em> to record which vertices are connected. Given vertices
-numbered <span class="math inline">1</span> to <span
-class="math inline"><em>V</em></span>, we denote the <strong>adjacency
-matrix</strong> <span class="math inline"><strong>M</strong></span> of
-dimensions <span class="math inline"><em>V</em> × <em>V</em></span> as
-the matrix with element <span
-class="math inline"><strong>M</strong><sub><em>i</em>, <em>j</em></sub> = 1</span>
-if vertices <span class="math inline"><em>i</em></span> and <span
-class="math inline"><em>j</em></span> are connected by an edge and <span
-class="math inline"><strong>M</strong><sub><em>i</em>, <em>j</em></sub> = 0</span>
-if they are not. Now, we note that (1) For all <span
-class="math inline"><em>k</em></span>, <span
-class="math inline">(<strong>M</strong> + <em>I</em>)<sub><em>i</em>, <em>j</em></sub><sup><em>k</em></sup> = 0</span>
-if and only if there exists no path from the vertex numbered <span
-class="math inline"><em>i</em></span> to the vertex numbered <span
-class="math inline"><em>j</em></span> that is distance <span
-class="math inline"><em>k</em></span> or less due to Markov matrix
-processes. As a result, if the distance between vertices numbered <span
-class="math inline"><em>i</em></span> and <span
-class="math inline"><em>j</em></span> is <span
-class="math inline"><em>d</em></span>, then <span
-class="math inline">min((<strong>M</strong> + <em>I</em>)<sub><em>i</em>, <em>j</em></sub><sup><em>k</em></sup>, 1) = 1</span>
-if <span class="math inline"><em>k</em> ≥ <em>d</em></span> and <span
-class="math inline">min((<strong>M</strong> + <em>I</em>)<sub><em>i</em>, <em>j</em></sub><sup><em>k</em></sup>, 1) = 0</span>
-if <span class="math inline"><em>k</em> &lt; <em>d</em></span>.</p>
-<p>With this information, because the distance between any two vertices
-is at most <span class="math inline"><em>V</em> − 1</span> in a graph
-with <span class="math inline"><em>V</em></span> vertices, we note that
-the <em>distance</em> matrix turns out to be simply <span
-class="math display"><strong>D</strong> = <strong>1</strong><sub><em>V</em> × <em>V</em></sub> ⋅ <em>V</em> − <em>Σ</em><sub><em>i</em> = 0</sub><sup><em>V</em> − 1</sup>min((<strong>M</strong> + <em>I</em>)<sub><em>i</em>, <em>j</em></sub><sup><em>k</em></sup>, 1).</span>
-The runtime to compute this is <span
-class="math inline"><em>O</em>(<em>V</em>)</span>, although it will take
-more space to compute all powers of <span
-class="math inline"><strong>M</strong></span>.</p>
-<h2 id="our-machine-learning-approach">Our Machine Learning
-Approach</h2>
+learning approach. Given <span class="math inline">V</span> vertices and
+<span class="math inline">E</span> edges, the runtime of this algorithm
+is thus <span class="math inline">O(V + E)</span>; however, a machine
+learning approach may do better in time through parallelism, although at
+the expense of using much more memory.</p>
<h3 id="data">Data</h3>
-<p>We will represent an <span class="math inline"><em>n</em></span>
-vertex, <span class="math inline"><em>m</em></span> edge unweighted,
-undirected graph as sequence of the endpoints of the <span
-class="math inline"><em>m</em></span> edges, so <span
-class="math inline">[<em>a</em><sub>1</sub>, <em>b</em><sub>1</sub>, <em>a</em><sub>2</sub>, <em>b</em><sub>2</sub>, …, <em>a</em><sub><em>m</em></sub>, <em>b</em><sub><em>m</em></sub>]</span>
-represents a graph with the edges <span
-class="math inline">{(<em>a</em><sub><em>i</em></sub>, <em>b</em><sub><em>i</em></sub>)}</span>
-for <span class="math inline">1 ≤ <em>i</em> ≤ <em>m</em></span>. We
-will pad all sequences to be the same length using the padding token
-0.</p>
+<p>We will represent an <span class="math inline">n</span> vertex, <span
+class="math inline">m</span> edge unweighted, undirected graph as
+sequence of the endpoints of the <span class="math inline">m</span>
+edges, so <span
+class="math inline">[a_1,b_1,a_2,b_2,\ldots,a_m,b_m]</span> represents a
+graph with the edges <span class="math inline">\{(a_i,b_i)\}</span> for
+<span class="math inline">1 \leq i \leq m</span>. We will pad all
+sequences to be the same length using the padding token 0.</p>
<p>The full input to our model will additionally add the target vertex
after the padding tokens. The model is tasked with predicting the length
of the shortest path between vertex 1 and the target vertex <span
-class="math inline"><em>t</em></span>. If no such path exists, we define
-the length to be <span class="math inline"><em>n</em> + 1</span> which
-represents infinity. For example, an input-output pair for our model
-could look like <span
-class="math inline">[1, 3, 3, 2, 0, 0, 0, 0, 2]</span> and <span
-class="math inline">2</span> respectively.</p>
+class="math inline">t</span>. If no such path exists, we define the
+length to be <span class="math inline">n+1</span> which represents
+infinity. For example, an input-output pair for our model could look
+like <span class="math inline">[1, 3, 3, 2, 0, 0, 0, 0, 2]</span> and
+<span class="math inline">2</span> respectively.</p>
<p>We have three separate datasets.</p>
<ul>
<li><strong>Pre-train data</strong>: For each <span
-class="math inline"><em>n</em> ∈ [8, 32)</span>, we will generate
-several graphs on <span class="math inline"><em>n</em></span> vertices.
-We generate these graphs by inserting <span
-class="math inline">2<em>n</em></span> random edges into the graph. We
-always set the target vertex to be <span class="math inline">2</span>
-here.</li>
+class="math inline">n \in [8,32)</span>, we will generate several graphs
+on <span class="math inline">n</span> vertices. We generate these graphs
+by inserting <span class="math inline">2n</span> random edges into the
+graph. We always set the target vertex to be <span
+class="math inline">2</span> here.</li>
<li><strong>Fine-tune data</strong>: For each <span
-class="math inline"><em>n</em> ∈ [8, 16)</span>, we will generate
-several graphs on <span class="math inline"><em>n</em></span> vertices.
-We generate these graphs by inserting <span
-class="math inline">2<em>n</em></span> random edges into the graph. We
-select the target vertex to be a random vertex on the shortest path from
-<span class="math inline">1</span> to <span
+class="math inline">n \in [8,16)</span>, we will generate several graphs
+on <span class="math inline">n</span> vertices. We generate these graphs
+by inserting <span class="math inline">2n</span> random edges into the
+graph. We select the target vertex to be a random vertex on the shortest
+path from <span class="math inline">1</span> to <span
class="math inline">2</span>.</li>
<li><strong>Generalization testing data</strong>: The same as the
-fine-tune data, except we sample <span
-class="math inline"><em>n</em> ∈ [16, 32)</span> instead.</li>
+fine-tune data, except we sample <span class="math inline">n \in
+[16,32)</span> instead.</li>
</ul>
-<p>As a side note, we are also curious whether the transformer learns to
-generalize to different distributions of graphs, such as denser graphs
-or graphs with different properties. Time permitting, we will also
-investigate this.</p>
+<p>We wrote some Python code to generate the data during the training
+loop, but Python is slow and the data generation wasted a lot of time
+during training. To get around this, we pre-generated the data before
+training and made our Python code multithreaded to speed it up.</p>
<h3 id="architecture">Architecture</h3>
+<p>TODO: honestly not much to say here since it’s a pretty typical
+arch</p>
<p>We plan to use a standard transformer architecture. We will ensure
that the number of layers in our transformer is at least the diameter of
the graph. By doing this, we ensure that there is an extremely simple
@@ -413,39 +441,197 @@ model should be computing these other distances as intermediate values
in its computation to find the distance to vertex <span
class="math inline">2</span>.</p>
<h3 id="embeddings">Embeddings</h3>
-<p>TODO: fix this</p>
-<p>In order to facilitate performing this task with limited
-computational resources, we plan to use custom-made positional encodings
-that tell the model extra information about the structure of the
-problem, rather than the traditional sine/cosine positional encodings.
-(TODO: THIS IS OUTDATED) Specifically, our positional encodings are
-<span
-class="math inline"><em>v</em><sub>1</sub>, <em>v</em><sub>1</sub>, <em>v</em><sub>2</sub>, <em>v</em><sub>2</sub>, …, <em>v</em><sub><em>m</em></sub>, <em>v</em><sub><em>m</em></sub>, <em>v</em><sub><em>m</em> + 1</sub></span>
-where each <span
-class="math inline"><em>v</em><sub><em>i</em></sub></span> is a random
-vector so each <span
-class="math inline"><em>v</em><sub><em>i</em></sub>, <em>v</em><sub><em>j</em></sub></span>
-pair is nearly orthogonal with high probability. We will concatenate
-these with the token encodings rather than adding them. This should let
-the model easily have large attention scores between vertices
-corresponding to a single edge.</p>
-<h3 id="explicit-transformer-formula-for-shortest-paths">Explicit
-transformer formula for shortest paths</h3>
-<h2 id="results">Results</h2>
-<h3 id="initial-results">Initial Results</h3>
-<p>We used a model dimension of 64 64, four layers, and two heads per
-layer. We used MSE loss, the Adam optimizer, a learning rate of 8e-4,
-and a batch size of 131,072 for 8000 unique randomly generated batches.
-Our final MSE loss was 0.35546875.</p>
+<p>Since the order of the edges in the input does not matter, we did not
+use positional encodings. Each edge <span
+class="math inline">(u,v)</span> where <span class="math inline">u &lt;
+v</span> is embedded to a dimension of <span
+class="math inline">d</span> where the first <span
+class="math inline">\frac{d}{2}</span> elements are the learned
+embedding of <span class="math inline">u</span> and the last <span
+class="math inline">\frac{d}{2}</span> elements are the learned
+embedding of <span class="math inline">v</span>. For the target vertex
+<span class="math inline">t</span>, we also embedded to dimension <span
+class="math inline">d</span>, where the first <span
+class="math inline">\frac{d}{2}</span> elements are the learned
+embedding of <span class="math inline">t</span> and the last <span
+class="math inline">\frac{d}{2}</span> are a learned embedding of a
+special token.</p>
+<h2 id="training">Training</h2>
+<p>For our model, we used a model dimension of 64, four layers, and two
+heads per layer, for a total of 200545 parameters in bfloat16 which
+corresponds to around 3.2e6 bits. The number of possible graphs on 15
+vertices generated using our procedure is approximately</p>
+<p><span class="math display">\frac{\binom{15}{2}^{15}}{15!} =
+1.59\cdot10^{18}.</span></p>
+<p>This is because there are <span
+class="math inline">\binom{15}{2}</span> choices for each of the 15
+edges and we don’t care about the order of the edges. This is only an
+approximation because some edges might be duplicated. Each graph has an
+answer between 1 and 15 which requires around 4 bits, so memorizing all
+the answers requires <span class="math inline">4\cdot1.59\cdot10^{18} =
+6.36\cdot10^{18}</span>, which is <span
+class="math inline">2\cdot10^{12}</span> times larger than our model
+size.</p>
+<p>We used MSE loss, the Adam optimizer, a learning rate of 8e-4, and a
+batch size of 131072 for 8000 unique randomly generated batches. Our
+final MSE loss was approximately 0.3555.</p>
<p><img src="training-loss.png" /></p>
<p><img src="training-2d-histogram.png" /></p>
-<h3 id="fine-tuning">Fine Tuning</h3>
+<p>One pattern we consistently noticed during training is that the model
+often gets stuck and plateaus for many epochs before rapidly decreasing.
+For instance, this happened between epochs 100 and 300 in the graph
+above:</p>
+<p><img src="grokking.png" /></p>
+<p>“grokking” hypothesis: it’s memorizing all length 2 paths?</p>
+<p>TODO: training curves for 1, 2, 3 length paths</p>
+<h3
+id="potential-mathematical-approaches-to-shortest-paths-delete-this">Potential
+Mathematical Approaches to Shortest Paths? Delete this?</h3>
+<p>Another way one can think of the shortest path of a graph is using a
+<em>matrix</em> to record which vertices are connected. Given vertices
+numbered <span class="math inline">1</span> to <span
+class="math inline">V</span>, we denote the <strong>adjacency
+matrix</strong> <span class="math inline">\textbf{M}</span> of
+dimensions <span class="math inline">V \times V</span> as the matrix
+with element <span class="math inline">\textbf{M}_{i, j} = 1</span> if
+vertices <span class="math inline">i</span> and <span
+class="math inline">j</span> are connected by an edge and <span
+class="math inline">\textbf{M}_{i, j} = 0</span> if they are not. Now,
+we note that (1) For all <span class="math inline">k</span>, <span
+class="math inline">(\textbf{M}+I)^k_{i, j} = 0</span> if and only if
+there exists no path from the vertex numbered <span
+class="math inline">i</span> to the vertex numbered <span
+class="math inline">j</span> that is distance <span
+class="math inline">k</span> or less due to Markov matrix processes. As
+a result, if the distance between vertices numbered <span
+class="math inline">i</span> and <span class="math inline">j</span> is
+<span class="math inline">d</span>, then <span
+class="math inline">\text{min}\left((\textbf{M}+I)^k_{i, j}, 1\right) =
+1</span> if <span class="math inline">k \ge d</span> and <span
+class="math inline">\text{min}\left((\textbf{M}+I)^k_{i, j}, 1\right) =
+0</span> if <span class="math inline">k &lt; d</span>.</p>
+<p>With this information, because the distance between any two vertices
+is at most <span class="math inline">V-1</span> in a graph with <span
+class="math inline">V</span> vertices, we note that the
+<em>distance</em> matrix turns out to be simply <span
+class="math display">\textbf{D} = \textbf{1}_{V \times V} \cdot V -
+\Sigma_{i=0}^{V-1}\text{min}\left((\textbf{M}+I)^k_{i, j},
+1\right).</span> The runtime to compute this is <span
+class="math inline">O(V)</span>, although it will take more space to
+compute all powers of <span class="math inline">\textbf{M}</span>.</p>
+<h2 id="fine-tuning-results">Fine tuning results</h2>
<p>After receiving our initial results, we fine-tuned with a learning
rate of 1e-5, also with MSE and the same batch size. Our final results
-are shown below.</p>
+are shown in the images below.</p>
<p><img src="fine-tuning-loss.png" /></p>
<p><img src="fine-tuning-2d-histogram.png" /></p>
<p><img src="test-2d-histogram.png" /></p>
+<p>Memorization? Do some math here to compute how many bits required to
+memorize 1, 2, 3</p>
+<h2
+id="complicated-explicit-transformer-formula-for-shortest-paths">Complicated
+explicit transformer formula for shortest paths</h2>
+<div class="sourceCode" id="cb1"><pre class="sourceCode py"><code class="sourceCode python"><span id="cb1-1"><a href="#cb1-1" aria-hidden="true" tabindex="-1"></a><span class="co"># Configuration</span></span>
+<span id="cb1-2"><a href="#cb1-2" aria-hidden="true" tabindex="-1"></a>NVTXS <span class="op">=</span> <span class="dv">16</span></span>
+<span id="cb1-3"><a href="#cb1-3" aria-hidden="true" tabindex="-1"></a>MAXDIST <span class="op">=</span> NVTXS <span class="op">+</span> <span class="dv">1</span></span>
+<span id="cb1-4"><a href="#cb1-4" aria-hidden="true" tabindex="-1"></a>AVGDEG <span class="op">=</span> <span class="dv">2</span></span>
+<span id="cb1-5"><a href="#cb1-5" aria-hidden="true" tabindex="-1"></a>SEQLEN <span class="op">=</span> NVTXS <span class="op">+</span> <span class="dv">1</span></span>
+<span id="cb1-6"><a href="#cb1-6" aria-hidden="true" tabindex="-1"></a>HIDDENDIM <span class="op">=</span> <span class="dv">4</span> <span class="op">*</span> NVTXS <span class="op">+</span> <span class="dv">2</span></span>
+<span id="cb1-7"><a href="#cb1-7" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-8"><a href="#cb1-8" aria-hidden="true" tabindex="-1"></a><span class="co"># Start indices for different sections of the input data</span></span>
+<span id="cb1-9"><a href="#cb1-9" aria-hidden="true" tabindex="-1"></a>START_REACH <span class="op">=</span> NVTXS <span class="op">+</span> <span class="dv">1</span></span>
+<span id="cb1-10"><a href="#cb1-10" aria-hidden="true" tabindex="-1"></a>START_OUT <span class="op">=</span> <span class="dv">2</span> <span class="op">*</span> NVTXS <span class="op">+</span> <span class="dv">1</span></span>
+<span id="cb1-11"><a href="#cb1-11" aria-hidden="true" tabindex="-1"></a>START_SELF <span class="op">=</span> <span class="dv">3</span> <span class="op">*</span> NVTXS <span class="op">+</span> <span class="dv">1</span></span>
+<span id="cb1-12"><a href="#cb1-12" aria-hidden="true" tabindex="-1"></a>SRC_FLAG_IDX <span class="op">=</span> START_SELF</span>
+<span id="cb1-13"><a href="#cb1-13" aria-hidden="true" tabindex="-1"></a>ANS_FLAG_IDX <span class="op">=</span> <span class="dv">0</span></span>
+<span id="cb1-14"><a href="#cb1-14" aria-hidden="true" tabindex="-1"></a>NOTANS_FLAG_IDX <span class="op">=</span> <span class="op">-</span><span class="dv">1</span></span>
+<span id="cb1-15"><a href="#cb1-15" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-16"><a href="#cb1-16" aria-hidden="true" tabindex="-1"></a>BIG <span class="op">=</span> <span class="dv">20</span></span>
+<span id="cb1-17"><a href="#cb1-17" aria-hidden="true" tabindex="-1"></a>SUPABIG <span class="op">=</span> <span class="dv">100</span></span>
+<span id="cb1-18"><a href="#cb1-18" aria-hidden="true" tabindex="-1"></a>MED <span class="op">=</span> <span class="dv">10</span></span>
+<span id="cb1-19"><a href="#cb1-19" aria-hidden="true" tabindex="-1"></a>CURSE <span class="op">=</span> <span class="dv">5</span></span>
+<span id="cb1-20"><a href="#cb1-20" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-21"><a href="#cb1-21" aria-hidden="true" tabindex="-1"></a><span class="kw">class</span> SillyTransformer(nn.Module):</span>
+<span id="cb1-22"><a href="#cb1-22" aria-hidden="true" tabindex="-1"></a> <span class="kw">def</span> <span class="fu">__init__</span>(<span class="va">self</span>, device):</span>
+<span id="cb1-23"><a href="#cb1-23" aria-hidden="true" tabindex="-1"></a> <span class="bu">super</span>().<span class="fu">__init__</span>()</span>
+<span id="cb1-24"><a href="#cb1-24" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.device <span class="op">=</span> device</span>
+<span id="cb1-25"><a href="#cb1-25" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-26"><a href="#cb1-26" aria-hidden="true" tabindex="-1"></a> <span class="cf">with</span> torch.no_grad():</span>
+<span id="cb1-27"><a href="#cb1-27" aria-hidden="true" tabindex="-1"></a> <span class="co"># Initialize weight parameters with specific configurations</span></span>
+<span id="cb1-28"><a href="#cb1-28" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.mostKs <span class="op">=</span> nn.ParameterList()</span>
+<span id="cb1-29"><a href="#cb1-29" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.mostQs <span class="op">=</span> nn.ParameterList()</span>
+<span id="cb1-30"><a href="#cb1-30" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.mostVs <span class="op">=</span> nn.ParameterList()</span>
+<span id="cb1-31"><a href="#cb1-31" aria-hidden="true" tabindex="-1"></a> <span class="cf">for</span> head <span class="kw">in</span> <span class="bu">range</span>(<span class="dv">1</span>, NVTXS <span class="op">+</span> <span class="dv">1</span>):</span>
+<span id="cb1-32"><a href="#cb1-32" aria-hidden="true" tabindex="-1"></a> Q <span class="op">=</span> nn.Parameter(torch.zeros((<span class="dv">2</span>, HIDDENDIM), device<span class="op">=</span>device))</span>
+<span id="cb1-33"><a href="#cb1-33" aria-hidden="true" tabindex="-1"></a> Q[<span class="dv">0</span>, START_REACH <span class="op">-</span> <span class="dv">1</span> <span class="op">+</span> head] <span class="op">=</span> SUPABIG</span>
+<span id="cb1-34"><a href="#cb1-34" aria-hidden="true" tabindex="-1"></a> Q[<span class="dv">1</span>, NOTANS_FLAG_IDX] <span class="op">=</span> <span class="dv">1</span></span>
+<span id="cb1-35"><a href="#cb1-35" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-36"><a href="#cb1-36" aria-hidden="true" tabindex="-1"></a> K <span class="op">=</span> nn.Parameter(torch.zeros((<span class="dv">2</span>, HIDDENDIM), device<span class="op">=</span>device))</span>
+<span id="cb1-37"><a href="#cb1-37" aria-hidden="true" tabindex="-1"></a> K[<span class="dv">0</span>, head] <span class="op">=</span> <span class="dv">1</span></span>
+<span id="cb1-38"><a href="#cb1-38" aria-hidden="true" tabindex="-1"></a> K[<span class="dv">1</span>, ANS_FLAG_IDX] <span class="op">=</span> BIG</span>
+<span id="cb1-39"><a href="#cb1-39" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-40"><a href="#cb1-40" aria-hidden="true" tabindex="-1"></a> V <span class="op">=</span> nn.Parameter(torch.zeros((NVTXS, HIDDENDIM), device<span class="op">=</span>device))</span>
+<span id="cb1-41"><a href="#cb1-41" aria-hidden="true" tabindex="-1"></a> <span class="cf">for</span> i <span class="kw">in</span> <span class="bu">range</span>(NVTXS):</span>
+<span id="cb1-42"><a href="#cb1-42" aria-hidden="true" tabindex="-1"></a> V[i, START_SELF <span class="op">+</span> i] <span class="op">=</span> <span class="dv">1</span></span>
+<span id="cb1-43"><a href="#cb1-43" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-44"><a href="#cb1-44" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.mostKs.append(K)</span>
+<span id="cb1-45"><a href="#cb1-45" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.mostQs.append(Q)</span>
+<span id="cb1-46"><a href="#cb1-46" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.mostVs.append(V)</span>
+<span id="cb1-47"><a href="#cb1-47" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-48"><a href="#cb1-48" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.weirdKs <span class="op">=</span> nn.ParameterList()</span>
+<span id="cb1-49"><a href="#cb1-49" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.weirdQs <span class="op">=</span> nn.ParameterList()</span>
+<span id="cb1-50"><a href="#cb1-50" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.weirdVs <span class="op">=</span> nn.ParameterList()</span>
+<span id="cb1-51"><a href="#cb1-51" aria-hidden="true" tabindex="-1"></a> <span class="cf">for</span> layer <span class="kw">in</span> <span class="bu">range</span>(NVTXS):</span>
+<span id="cb1-52"><a href="#cb1-52" aria-hidden="true" tabindex="-1"></a> K <span class="op">=</span> nn.Parameter(torch.zeros((<span class="dv">3</span>, HIDDENDIM), device<span class="op">=</span>device))</span>
+<span id="cb1-53"><a href="#cb1-53" aria-hidden="true" tabindex="-1"></a> K[<span class="dv">0</span>, NOTANS_FLAG_IDX] <span class="op">=</span> <span class="op">-</span>BIG</span>
+<span id="cb1-54"><a href="#cb1-54" aria-hidden="true" tabindex="-1"></a> K[<span class="dv">0</span>, SRC_FLAG_IDX] <span class="op">=</span> BIG<span class="op">+</span>SUPABIG</span>
+<span id="cb1-55"><a href="#cb1-55" aria-hidden="true" tabindex="-1"></a> K[<span class="dv">1</span>, NOTANS_FLAG_IDX] <span class="op">=</span> <span class="op">-</span>SUPABIG</span>
+<span id="cb1-56"><a href="#cb1-56" aria-hidden="true" tabindex="-1"></a> K[<span class="dv">1</span>, NVTXS <span class="op">+</span> <span class="dv">2</span>] <span class="op">=</span> BIG<span class="op">+</span>SUPABIG</span>
+<span id="cb1-57"><a href="#cb1-57" aria-hidden="true" tabindex="-1"></a> K[<span class="dv">1</span>, ANS_FLAG_IDX] <span class="op">=</span> <span class="op">-</span>BIG<span class="op">-</span>SUPABIG</span>
+<span id="cb1-58"><a href="#cb1-58" aria-hidden="true" tabindex="-1"></a> K[<span class="dv">2</span>, ANS_FLAG_IDX] <span class="op">=</span> MED</span>
+<span id="cb1-59"><a href="#cb1-59" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-60"><a href="#cb1-60" aria-hidden="true" tabindex="-1"></a> Q <span class="op">=</span> nn.Parameter(torch.zeros((<span class="dv">3</span>, HIDDENDIM), device<span class="op">=</span>device))</span>
+<span id="cb1-61"><a href="#cb1-61" aria-hidden="true" tabindex="-1"></a> Q[:, ANS_FLAG_IDX] <span class="op">=</span> <span class="dv">1</span></span>
+<span id="cb1-62"><a href="#cb1-62" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-63"><a href="#cb1-63" aria-hidden="true" tabindex="-1"></a> V <span class="op">=</span> nn.Parameter(torch.zeros((NVTXS, HIDDENDIM), device<span class="op">=</span>device))</span>
+<span id="cb1-64"><a href="#cb1-64" aria-hidden="true" tabindex="-1"></a> V[layer, SRC_FLAG_IDX] <span class="op">=</span> <span class="dv">1</span></span>
+<span id="cb1-65"><a href="#cb1-65" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-66"><a href="#cb1-66" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.weirdKs.append(K)</span>
+<span id="cb1-67"><a href="#cb1-67" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.weirdQs.append(Q)</span>
+<span id="cb1-68"><a href="#cb1-68" aria-hidden="true" tabindex="-1"></a> <span class="va">self</span>.weirdVs.append(V)</span>
+<span id="cb1-69"><a href="#cb1-69" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-70"><a href="#cb1-70" aria-hidden="true" tabindex="-1"></a> <span class="kw">def</span> forward(<span class="va">self</span>, src):</span>
+<span id="cb1-71"><a href="#cb1-71" aria-hidden="true" tabindex="-1"></a> <span class="cf">for</span> layer <span class="kw">in</span> <span class="bu">range</span>(NVTXS):</span>
+<span id="cb1-72"><a href="#cb1-72" aria-hidden="true" tabindex="-1"></a> allKs <span class="op">=</span> [<span class="va">self</span>.weirdKs[layer]] <span class="op">+</span> [x <span class="cf">for</span> x <span class="kw">in</span> <span class="va">self</span>.mostKs]</span>
+<span id="cb1-73"><a href="#cb1-73" aria-hidden="true" tabindex="-1"></a> allQs <span class="op">=</span> [<span class="va">self</span>.weirdQs[layer]] <span class="op">+</span> [x <span class="cf">for</span> x <span class="kw">in</span> <span class="va">self</span>.mostQs]</span>
+<span id="cb1-74"><a href="#cb1-74" aria-hidden="true" tabindex="-1"></a> allVs <span class="op">=</span> [<span class="va">self</span>.weirdVs[layer]] <span class="op">+</span> [x <span class="cf">for</span> x <span class="kw">in</span> <span class="va">self</span>.mostVs]</span>
+<span id="cb1-75"><a href="#cb1-75" aria-hidden="true" tabindex="-1"></a> head_outputs <span class="op">=</span> []</span>
+<span id="cb1-76"><a href="#cb1-76" aria-hidden="true" tabindex="-1"></a> </span>
+<span id="cb1-77"><a href="#cb1-77" aria-hidden="true" tabindex="-1"></a> <span class="cf">for</span> (K, Q, V) <span class="kw">in</span> <span class="bu">zip</span>(allKs, allQs, allVs):</span>
+<span id="cb1-78"><a href="#cb1-78" aria-hidden="true" tabindex="-1"></a> ksrc <span class="op">=</span> torch.matmul(src, K.unsqueeze(<span class="dv">0</span>).transpose(<span class="op">-</span><span class="dv">2</span>, <span class="op">-</span><span class="dv">1</span>))</span>
+<span id="cb1-79"><a href="#cb1-79" aria-hidden="true" tabindex="-1"></a> qsrc <span class="op">=</span> torch.matmul(src, Q.unsqueeze(<span class="dv">0</span>).transpose(<span class="op">-</span><span class="dv">2</span>, <span class="op">-</span><span class="dv">1</span>))</span>
+<span id="cb1-80"><a href="#cb1-80" aria-hidden="true" tabindex="-1"></a> vsrc <span class="op">=</span> torch.matmul(src, V.unsqueeze(<span class="dv">0</span>).transpose(<span class="op">-</span><span class="dv">2</span>, <span class="op">-</span><span class="dv">1</span>))</span>
+<span id="cb1-81"><a href="#cb1-81" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-82"><a href="#cb1-82" aria-hidden="true" tabindex="-1"></a> scores <span class="op">=</span> torch.matmul(qsrc, ksrc.transpose(<span class="op">-</span><span class="dv">2</span>, <span class="op">-</span><span class="dv">1</span>))</span>
+<span id="cb1-83"><a href="#cb1-83" aria-hidden="true" tabindex="-1"></a> attention_weights <span class="op">=</span> torch.softmax(scores, dim<span class="op">=-</span><span class="dv">1</span>)</span>
+<span id="cb1-84"><a href="#cb1-84" aria-hidden="true" tabindex="-1"></a> head_output <span class="op">=</span> torch.matmul(attention_weights, vsrc)</span>
+<span id="cb1-85"><a href="#cb1-85" aria-hidden="true" tabindex="-1"></a> head_outputs.append(head_output)</span>
+<span id="cb1-86"><a href="#cb1-86" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-87"><a href="#cb1-87" aria-hidden="true" tabindex="-1"></a> new_reaches <span class="op">=</span> <span class="bu">sum</span>(head_outputs[<span class="dv">1</span>:])</span>
+<span id="cb1-88"><a href="#cb1-88" aria-hidden="true" tabindex="-1"></a> BSZ <span class="op">=</span> new_reaches.shape[<span class="dv">0</span>]</span>
+<span id="cb1-89"><a href="#cb1-89" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-90"><a href="#cb1-90" aria-hidden="true" tabindex="-1"></a> nodelta_nbrs <span class="op">=</span> torch.zeros((BSZ, SEQLEN, NVTXS <span class="op">+</span> <span class="dv">1</span>), device<span class="op">=</span><span class="va">self</span>.device)</span>
+<span id="cb1-91"><a href="#cb1-91" aria-hidden="true" tabindex="-1"></a> morepadlol <span class="op">=</span> torch.zeros((BSZ, SEQLEN, <span class="dv">1</span> <span class="op">+</span> NVTXS), device<span class="op">=</span><span class="va">self</span>.device)</span>
+<span id="cb1-92"><a href="#cb1-92" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-93"><a href="#cb1-93" aria-hidden="true" tabindex="-1"></a> src <span class="op">=</span> src <span class="op">+</span> torch.cat((nodelta_nbrs, new_reaches, head_outputs[<span class="dv">0</span>], morepadlol), dim<span class="op">=</span><span class="dv">2</span>)</span>
+<span id="cb1-94"><a href="#cb1-94" aria-hidden="true" tabindex="-1"></a> src[:, :, START_REACH:START_REACH <span class="op">+</span> NVTXS] <span class="op">=</span> <span class="dv">2</span> <span class="op">*</span> torch.sigmoid(src[:, :, START_REACH:START_REACH <span class="op">+</span> NVTXS] <span class="op">*</span> CURSE) <span class="op">-</span> <span class="dv">1</span></span>
+<span id="cb1-95"><a href="#cb1-95" aria-hidden="true" tabindex="-1"></a></span>
+<span id="cb1-96"><a href="#cb1-96" aria-hidden="true" tabindex="-1"></a> canreach <span class="op">=</span> src[:, <span class="dv">0</span>, START_OUT:START_OUT <span class="op">+</span> NVTXS]</span>
+<span id="cb1-97"><a href="#cb1-97" aria-hidden="true" tabindex="-1"></a> final_output <span class="op">=</span> <span class="dv">1</span> <span class="op">+</span> torch.<span class="bu">sum</span>(<span class="dv">1</span> <span class="op">-</span> canreach, dim<span class="op">=</span><span class="dv">1</span>)</span>
+<span id="cb1-98"><a href="#cb1-98" aria-hidden="true" tabindex="-1"></a> <span class="cf">return</span> final_output</span></code></pre></div>
+<h2 id="alek-perturbed-experiment">Alek perturbed experiment</h2>
+<h2 id="conclusion">Conclusion</h2>
+<p>In short: if you actually need shortest paths, just run BFS; the classical algorithm solves the task exactly.</p>
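+<p>For reference, a minimal sketch of that baseline, assuming the graph is given as an adjacency list (<code>bfs_distance</code> is an illustrative name, not a function from our codebase):</p>
+<pre><code>from collections import deque
+
+def bfs_distance(adj, s, t):
+    # adj[u] is an iterable of neighbors of vertex u
+    dist = {s: 0}
+    frontier = deque([s])
+    while frontier:
+        u = frontier.popleft()
+        for v in adj[u]:
+            if v not in dist:
+                dist[v] = dist[u] + 1
+                if v == t:
+                    return dist[v]
+                frontier.append(v)
+    return None  # t is not reachable from s</code></pre>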
<h2 class="unnumbered" id="references">References</h2>
<div id="refs" class="references csl-bib-body hanging-indent"
data-entry-spacing="0" role="list">
diff --git a/insane-shortest-paths.ipynb b/insane-shortest-paths.ipynb
index 72846c2..e74974b 100644
--- a/insane-shortest-paths.ipynb
+++ b/insane-shortest-paths.ipynb
@@ -2,7 +2,7 @@
"cells": [
{
"cell_type": "code",
- "execution_count": 1,
+ "execution_count": 11,
"execution_state": "idle",
"id": "86ce5f44-94f6-43b0-a0d1-091b8134ffb6",
"metadata": {},
@@ -11,231 +11,337 @@
"name": "stdout",
"output_type": "stream",
"text": [
- "[set(), set(), {5, 6}, {4}, {3}, {2, 6}, {2, 5}]\n",
- "[set(), {6}, set(), {4, 5, 6}, {3}, {3, 6}, {1, 3, 5}]\n",
- "[set(), {4}, set(), {4, 5}, {1, 3}, {3, 6}, {5}]\n",
- "[set(), {2, 6}, {1, 6}, {6}, set(), set(), {1, 2, 3}]\n",
- "[set(), {3}, {3}, {1, 2, 5, 6}, {5}, {3, 4}, {3}]\n",
- "[set(), {3, 6}, {4}, {1}, {2}, {6}, {1, 5}]\n",
- "[set(), {2, 3}, {1, 3, 6}, {1, 2, 4}, {3}, set(), {2}]\n",
- "[set(), {4}, set(), {4}, {1, 3, 5, 6}, {4}, {4}]\n",
- "[set(), {3, 4, 5}, {6}, {1}, {1, 6}, {1}, {2, 4}]\n",
- "[set(), {5, 6}, {6}, {6}, {5, 6}, {1, 4}, {1, 2, 3, 4}]\n"
+ "Total number of parameters: 44352\n"
]
}
],
"source": [
- "# -*- coding: utf-8 -*-\n",
- "\"\"\"how-tsp-should-be.ipynb\n",
- "\n",
- "Automatically generated by Colab.\n",
- "\n",
- "Original file is located at\n",
- " https://colab.research.google.com/drive/1InE1iW8ARzndPpvqH_9y22s81sOiHxPs\n",
- "\"\"\"\n",
- "\n",
- "from tqdm import tqdm\n",
"import torch\n",
"import torch.nn as nn\n",
- "import matplotlib as mpl\n",
- "import matplotlib.pyplot as plt\n",
- "from torch.utils.data import DataLoader, TensorDataset\n",
- "\n",
- "from math import sqrt\n",
- "from collections import deque\n",
- "import os\n",
"import random\n",
- "import pickle\n",
- "import ipdb\n",
+ "from collections import deque\n",
"\n",
- "# torch.manual_seed(30)\n",
- "# random.seed(30)\n",
+ "# Set manual seeds for reproducibility\n",
"torch.manual_seed(33)\n",
"random.seed(33)\n",
"\n",
- "device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n",
- "# assert device.type == \"cuda\", \"CUDA is not available. Please check your GPU setup.\"\n",
- "\n",
- "NVTXS = 6\n",
- "MAXDIST = NVTXS+1\n",
+ "# Configuration\n",
+ "NVTXS = 16\n",
+ "MAXDIST = NVTXS + 1\n",
"AVGDEG = 2\n",
"SEQLEN = NVTXS + 1\n",
- "HIDDENDIM = 4*NVTXS+2\n",
+ "HIDDENDIM = 4 * NVTXS + 2\n",
"\n",
- "# 0: ANSFLAG\n",
- "# 1:NVTXS+1 NBRS\n",
- "# NVTXS+1: 2*NVTXS+1 REACH\n",
- "# 2*NVTXS+1: 3*NVTXS+1 SELF\n",
- "# -1 NOTANSFLAG\n",
- "\n",
- "START_REACH = NVTXS+1\n",
- "START_OUT = 2*NVTXS+1\n",
- "START_SELF = 3*NVTXS+1\n",
+ "# Start indices for different sections of the input data\n",
+ "START_REACH = NVTXS + 1\n",
+ "START_OUT = 2 * NVTXS + 1\n",
+ "START_SELF = 3 * NVTXS + 1\n",
"SRC_FLAG_IDX = START_SELF\n",
- "SOURCE = 1\n",
- "TARGET = 2\n",
"ANS_FLAG_IDX = 0\n",
"NOTANS_FLAG_IDX = -1\n",
"\n",
- "def print_everything(data):\n",
- " print(\"NBRS\")\n",
- " print(data[0, 1:, 1:1+NVTXS])\n",
- " print(\"REACH\")\n",
- " print(data[0, 1:, START_REACH:START_REACH+NVTXS])\n",
- " print(\"ANSFLAG\")\n",
- " print(data[0, :, 0])\n",
- " print(\"MORE FLAGS\")\n",
- " print(data[0, :, -1])\n",
- " print(\"SELF\")\n",
- " print(data[0, 1:, START_SELF:START_SELF+NVTXS])\n",
- " print(\"OUT\")\n",
- " print(data[0, 0, START_OUT:START_OUT+NVTXS])\n",
- "\n",
- "\n",
- "def random_graph():\n",
- " data = torch.zeros((SEQLEN, HIDDENDIM))\n",
+ "# Determine device\n",
+ "device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n",
"\n",
- " for i in range(1,NVTXS+1):\n",
- " data[i, START_SELF-1+i] = 1\n",
+ "def random_graph(device):\n",
+ " \"\"\"Generate a random graph tensor.\"\"\"\n",
+ " data = torch.zeros((SEQLEN, HIDDENDIM), device=device)\n",
+ " \n",
+ " # Mark self vertices\n",
+ " for i in range(1, NVTXS + 1):\n",
+ " data[i, START_SELF - 1 + i] = 1\n",
"\n",
+ " # Create adjacency list\n",
" adj_list = [set() for _ in range(SEQLEN)]\n",
" indices = [random.randint(1, NVTXS) for _ in range(AVGDEG * NVTXS)]\n",
+ " \n",
" for i in range(0, len(indices), 2):\n",
" u = indices[i]\n",
" v = indices[i + 1]\n",
" if u != v:\n",
- " data[v,u] = 1\n",
- " data[u,v] = 1\n",
- " data[v,NVTXS+u] = 1\n",
- " data[u,NVTXS+v] = 1\n",
+ " # Bidirectional connections\n",
+ " data[v, u] = 1\n",
+ " data[u, v] = 1\n",
+ " data[v, NVTXS + u] = 1\n",
+ " data[u, NVTXS + v] = 1\n",
" adj_list[u].add(v)\n",
" adj_list[v].add(u)\n",
"\n",
+ " # Set flags\n",
" data[0, ANS_FLAG_IDX] = 1\n",
" data[1:, NOTANS_FLAG_IDX] = 1\n",
- "\n",
- " # TODO: this is kind of a hack\n",
- " data[0, START_REACH:START_REACH+NVTXS] = 1\n",
+ " data[0, START_REACH:START_REACH + NVTXS] = 1\n",
" return data, adj_list\n",
"\n",
- "\"\"\"\n",
- "input: G, represented as an adjacency list\n",
- "output: distance from SOURCE to TARGET\n",
- "\"\"\"\n",
"def SSSP(G):\n",
+ " \"\"\"Single Source Shortest Path algorithm.\"\"\"\n",
" dist = [MAXDIST for _ in G]\n",
- " dist[SOURCE] = 0\n",
- " frontier = deque()\n",
- " frontier.append(SOURCE)\n",
- " while len(frontier) > 0:\n",
+ " dist[1] = 0\n",
+ " frontier = deque([1])\n",
+ " while frontier:\n",
" vtx = frontier.popleft()\n",
" for x in G[vtx]:\n",
" if dist[x] == MAXDIST:\n",
" dist[x] = 1 + dist[vtx]\n",
" frontier.append(x)\n",
- " if x == TARGET:\n",
- " return dist[TARGET]\n",
+ " if x == 2:\n",
+ " return dist[2]\n",
" return MAXDIST\n",
"\n",
"def mkbatch(size):\n",
- " graphs1 = []\n",
- " distance1 = []\n",
+ " \"\"\"Create a batch of graph data.\"\"\"\n",
+ " graphs = []\n",
+ " distances = []\n",
"\n",
- " for i in range(size):\n",
- " data, adj_list = random_graph()\n",
+ " for _ in range(size):\n",
+ " data, adj_list = random_graph(device)\n",
" dist = SSSP(adj_list)\n",
- " graphs1.append(data)\n",
- " distance1.append(dist)\n",
- "\n",
- " print(adj_list)\n",
+ " graphs.append(data)\n",
+ " distances.append(dist)\n",
"\n",
- " data = torch.stack(graphs1)\n",
- " labels = torch.tensor(distance1, dtype=torch.float16)\n",
+ " data = torch.stack(graphs)\n",
+ " labels = torch.tensor(distances, dtype=torch.float32, device=device)\n",
" return data, labels\n",
"\n",
- "\"\"\"\n",
- "TODO: WRAP EVERYTHING in nn.Parameter(torch.zeros((1, HIDDENDIM)))\n",
- "and then do my perturbing parameters experiment\n",
- "\n",
- "TODO:\n",
- " USE activation magic to bring everything back to the 0/1 realm instead of possibly being 0/2 valued\n",
- "\"\"\"\n",
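+    "# Magnitude constants for the hand-coded attention weights: large key/query\n",
+    "# entries make the softmax nearly one-hot; CURSE sets the sigmoid sharpness\n",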
+ "BIG = 20\n",
+ "SUPABIG = 100\n",
+ "MED = 10\n",
+ "CURSE = 5\n",
"\n",
"class SillyTransformer(nn.Module):\n",
- " def __init__(self):\n",
+ " def __init__(self, device):\n",
" super().__init__()\n",
- " self.most_KQVs = []\n",
- " for head in range(1,NVTXS+1):\n",
- " Q = torch.zeros((2, HIDDENDIM))\n",
- " Q[0, START_REACH-1+head] = 1000\n",
- " Q[1, NOTANS_FLAG_IDX] = 1\n",
- "\n",
- " K = torch.zeros((2, HIDDENDIM))\n",
- " K[0, head] = 1\n",
- " K[1, ANS_FLAG_IDX] = 200\n",
- "\n",
- " V = torch.zeros((NVTXS,HIDDENDIM))\n",
- " for i in range(NVTXS):\n",
- " V[i, START_SELF+i] = 1\n",
- "\n",
- " self.most_KQVs.append((K, Q, V))\n",
- "\n",
- " self.weird_KQVs = []\n",
- " for layer in range(NVTXS):\n",
- " K = torch.zeros((3, HIDDENDIM))\n",
- " K[0, NOTANS_FLAG_IDX] = -1000\n",
- " K[0, SRC_FLAG_IDX] = +1100\n",
- " K[1, NOTANS_FLAG_IDX] = -1000\n",
- " K[1, NVTXS+TARGET] = +1100\n",
- " K[1, ANS_FLAG_IDX] = -1100\n",
- " K[2, ANS_FLAG_IDX] = 10\n",
- "\n",
- " Q = torch.zeros((3, HIDDENDIM))\n",
- " Q[:, ANS_FLAG_IDX] = 1\n",
- "\n",
- " V = torch.zeros((NVTXS, HIDDENDIM))\n",
- " V[layer, SRC_FLAG_IDX] = 1\n",
- "\n",
- " self.weird_KQVs.append((K, Q, V))\n",
+ " self.device = device\n",
+ "\n",
+ " with torch.no_grad():\n",
+ " # Initialize weight parameters with specific configurations\n",
+ " self.mostKs = nn.ParameterList()\n",
+ " self.mostQs = nn.ParameterList()\n",
+ " self.mostVs = nn.ParameterList()\n",
+ " for head in range(1, NVTXS + 1):\n",
+ " Q = nn.Parameter(torch.zeros((2, HIDDENDIM), device=device))\n",
+ " Q[0, START_REACH - 1 + head] = SUPABIG\n",
+ " Q[1, NOTANS_FLAG_IDX] = 1\n",
+    "\n",
+ " K = nn.Parameter(torch.zeros((2, HIDDENDIM), device=device))\n",
+ " K[0, head] = 1\n",
+ " K[1, ANS_FLAG_IDX] = BIG\n",
+ "\n",
+ " V = nn.Parameter(torch.zeros((NVTXS, HIDDENDIM), device=device))\n",
+ " for i in range(NVTXS):\n",
+ " V[i, START_SELF + i] = 1\n",
+ "\n",
+ " self.mostKs.append(K)\n",
+ " self.mostQs.append(Q)\n",
+ " self.mostVs.append(V)\n",
+ "\n",
+ " self.weirdKs = nn.ParameterList()\n",
+ " self.weirdQs = nn.ParameterList()\n",
+ " self.weirdVs = nn.ParameterList()\n",
+ " for layer in range(NVTXS):\n",
+ " K = nn.Parameter(torch.zeros((3, HIDDENDIM), device=device))\n",
+ " K[0, NOTANS_FLAG_IDX] = -BIG\n",
+    "                K[0, SRC_FLAG_IDX] = BIG + SUPABIG\n",
+    "                K[1, NOTANS_FLAG_IDX] = -SUPABIG\n",
+    "                K[1, NVTXS + 2] = BIG + SUPABIG\n",
+    "                K[1, ANS_FLAG_IDX] = -BIG - SUPABIG\n",
+ " K[2, ANS_FLAG_IDX] = MED\n",
+ "\n",
+ " Q = nn.Parameter(torch.zeros((3, HIDDENDIM), device=device))\n",
+ " Q[:, ANS_FLAG_IDX] = 1\n",
+ "\n",
+ " V = nn.Parameter(torch.zeros((NVTXS, HIDDENDIM), device=device))\n",
+ " V[layer, SRC_FLAG_IDX] = 1\n",
+ "\n",
+ " self.weirdKs.append(K)\n",
+ " self.weirdQs.append(Q)\n",
+ " self.weirdVs.append(V)\n",
"\n",
" def forward(self, src):\n",
- " for layer in range(NVTXS):\n",
- " allKQVs = [self.weird_KQVs[layer]] + self.most_KQVs\n",
- " head_outputs = []\n",
- " for (K, Q, V) in allKQVs:\n",
- " ksrc = torch.matmul(src, K.unsqueeze(0).transpose(-2, -1))\n",
- " qsrc = torch.matmul(src, Q.unsqueeze(0).transpose(-2, -1))\n",
- " vsrc = torch.matmul(src, V.unsqueeze(0).transpose(-2, -1))\n",
- "\n",
- " scores = torch.matmul(qsrc, ksrc.transpose(-2, -1))\n",
- " attention_weights = torch.softmax(scores, dim=-1)\n",
- " head_output = torch.matmul(attention_weights, vsrc)\n",
- " head_outputs.append(head_output)\n",
- "\n",
- " new_reaches = sum(head_outputs[1:])\n",
- " BSZ = new_reaches.shape[0]\n",
- "\n",
- " nodelta_nbrs = torch.zeros((BSZ, SEQLEN, NVTXS+1))\n",
- " morepadlol = torch.zeros((BSZ, SEQLEN, 1+NVTXS))\n",
- "\n",
- " DIFF = torch.cat((nodelta_nbrs, new_reaches, head_outputs[0], morepadlol), dim=2)\n",
- " src += torch.cat((nodelta_nbrs, new_reaches, head_outputs[0], morepadlol), dim=2)\n",
- " src[:, :, START_REACH:START_REACH+NVTXS] = 2*torch.sigmoid(src[:,:, START_REACH:START_REACH+NVTXS]*1000)-1\n",
- "\n",
- " # print(\"SRC\")\n",
- " # print_everything(src)\n",
- "\n",
- " canreach = src[:,0,START_OUT:START_OUT+NVTXS]\n",
- " # __import__('ipdb').set_trace()\n",
- " final_output = 1+torch.sum(1-canreach,dim=1)\n",
- " return final_output\n",
- "\n",
- "model = SillyTransformer()\n",
- "model.to(device)\n",
- "\n",
- "data, labels = mkbatch(10)\n",
- "assert torch.all(model(data) == labels)\n",
- "\n"
+ " for layer in range(NVTXS):\n",
+ " allKs = [self.weirdKs[layer]] + [x for x in self.mostKs]\n",
+ " allQs = [self.weirdQs[layer]] + [x for x in self.mostQs]\n",
+ " allVs = [self.weirdVs[layer]] + [x for x in self.mostVs]\n",
+ " head_outputs = []\n",
+ " \n",
+ " for (K, Q, V) in zip(allKs, allQs, allVs):\n",
+ " ksrc = torch.matmul(src, K.unsqueeze(0).transpose(-2, -1))\n",
+ " qsrc = torch.matmul(src, Q.unsqueeze(0).transpose(-2, -1))\n",
+ " vsrc = torch.matmul(src, V.unsqueeze(0).transpose(-2, -1))\n",
+ "\n",
+ " scores = torch.matmul(qsrc, ksrc.transpose(-2, -1))\n",
+ " attention_weights = torch.softmax(scores, dim=-1)\n",
+ " head_output = torch.matmul(attention_weights, vsrc)\n",
+ " head_outputs.append(head_output)\n",
+ "\n",
+ " new_reaches = sum(head_outputs[1:])\n",
+ " BSZ = new_reaches.shape[0]\n",
+ "\n",
+ " nodelta_nbrs = torch.zeros((BSZ, SEQLEN, NVTXS + 1), device=self.device)\n",
+ " morepadlol = torch.zeros((BSZ, SEQLEN, 1 + NVTXS), device=self.device)\n",
+ "\n",
+ " src = src + torch.cat((nodelta_nbrs, new_reaches, head_outputs[0], morepadlol), dim=2)\n",
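+    "            # squash the accumulated reach coordinates back toward the 0/1 range\n",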
+ " src[:, :, START_REACH:START_REACH + NVTXS] = 2 * torch.sigmoid(src[:, :, START_REACH:START_REACH + NVTXS] * CURSE) - 1\n",
+ "\n",
+ " canreach = src[:, 0, START_OUT:START_OUT + NVTXS]\n",
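+    "            # predicted distance = 1 + number of output coordinates that never flipped to 1\n",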
+ " final_output = 1 + torch.sum(1 - canreach, dim=1)\n",
+ " return final_output\n",
+ "\n",
+ "model = SillyTransformer(device).to(device)\n",
+ "params = sum(p.numel() for p in model.parameters())\n",
+ "print(f\"Total number of parameters: {params}\")\n",
+ "\n",
+ "def destroy_rand_weights(model):\n",
+ " weight_lists = [model.mostKs, model.mostQs, model.mostVs, \n",
+ " model.weirdKs, model.weirdQs, model.weirdVs]\n",
+ " random_list = random.choice(weight_lists)\n",
+ " random_matrix = random.choice(random_list)\n",
+ " random_matrix.data = torch.randn_like(random_matrix)\n",
+ "\n",
+ "optimizer = torch.optim.Adam(model.parameters(), lr=3e-5)\n",
+ "loss_fn = nn.MSELoss()"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 6,
+ "execution_state": "idle",
+ "id": "a9dd76f4-96f2-47b5-9bb9-a32a1b478dd4",
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Epoch [0/10000], Loss: 8.3387\n",
+ "Epoch [10/10000], Loss: 7.6416\n",
+ "Epoch [20/10000], Loss: 11.2689\n",
+ "Epoch [30/10000], Loss: 7.0312\n",
+ "Epoch [40/10000], Loss: 8.7287\n",
+ "Epoch [50/10000], Loss: 7.7182\n"
+ ]
+ },
+ {
+ "ename": "KeyboardInterrupt",
+ "evalue": "",
+ "output_type": "error",
+ "traceback": [
+ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
+ "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)",
+ "Cell \u001b[0;32mIn[6], line 11\u001b[0m\n\u001b[1;32m 9\u001b[0m loss \u001b[38;5;241m=\u001b[39m loss_fn(outputs, labels)\n\u001b[1;32m 10\u001b[0m optimizer\u001b[38;5;241m.\u001b[39mzero_grad()\n\u001b[0;32m---> 11\u001b[0m \u001b[43mloss\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbackward\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 12\u001b[0m optimizer\u001b[38;5;241m.\u001b[39mstep()\n\u001b[1;32m 13\u001b[0m train_err\u001b[38;5;241m.\u001b[39mappend(loss\u001b[38;5;241m.\u001b[39mitem())\n",
+ "File \u001b[0;32m~/.venv/lib64/python3.12/site-packages/torch/_tensor.py:581\u001b[0m, in \u001b[0;36mTensor.backward\u001b[0;34m(self, gradient, retain_graph, create_graph, inputs)\u001b[0m\n\u001b[1;32m 571\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m has_torch_function_unary(\u001b[38;5;28mself\u001b[39m):\n\u001b[1;32m 572\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m handle_torch_function(\n\u001b[1;32m 573\u001b[0m Tensor\u001b[38;5;241m.\u001b[39mbackward,\n\u001b[1;32m 574\u001b[0m (\u001b[38;5;28mself\u001b[39m,),\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 579\u001b[0m inputs\u001b[38;5;241m=\u001b[39minputs,\n\u001b[1;32m 580\u001b[0m )\n\u001b[0;32m--> 581\u001b[0m \u001b[43mtorch\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mautograd\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbackward\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 582\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mgradient\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mretain_graph\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mcreate_graph\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43minputs\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43minputs\u001b[49m\n\u001b[1;32m 583\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n",
+ "File \u001b[0;32m~/.venv/lib64/python3.12/site-packages/torch/autograd/__init__.py:347\u001b[0m, in \u001b[0;36mbackward\u001b[0;34m(tensors, grad_tensors, retain_graph, create_graph, grad_variables, inputs)\u001b[0m\n\u001b[1;32m 342\u001b[0m retain_graph \u001b[38;5;241m=\u001b[39m create_graph\n\u001b[1;32m 344\u001b[0m \u001b[38;5;66;03m# The reason we repeat the same comment below is that\u001b[39;00m\n\u001b[1;32m 345\u001b[0m \u001b[38;5;66;03m# some Python versions print out the first line of a multi-line function\u001b[39;00m\n\u001b[1;32m 346\u001b[0m \u001b[38;5;66;03m# calls in the traceback and some print out the last line\u001b[39;00m\n\u001b[0;32m--> 347\u001b[0m \u001b[43m_engine_run_backward\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 348\u001b[0m \u001b[43m \u001b[49m\u001b[43mtensors\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 349\u001b[0m \u001b[43m \u001b[49m\u001b[43mgrad_tensors_\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 350\u001b[0m \u001b[43m \u001b[49m\u001b[43mretain_graph\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 351\u001b[0m \u001b[43m \u001b[49m\u001b[43mcreate_graph\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 352\u001b[0m \u001b[43m \u001b[49m\u001b[43minputs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 353\u001b[0m \u001b[43m \u001b[49m\u001b[43mallow_unreachable\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mTrue\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 354\u001b[0m \u001b[43m \u001b[49m\u001b[43maccumulate_grad\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mTrue\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 355\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n",
+ "File \u001b[0;32m~/.venv/lib64/python3.12/site-packages/torch/autograd/graph.py:825\u001b[0m, in \u001b[0;36m_engine_run_backward\u001b[0;34m(t_outputs, *args, **kwargs)\u001b[0m\n\u001b[1;32m 823\u001b[0m unregister_hooks \u001b[38;5;241m=\u001b[39m _register_logging_hooks_on_whole_graph(t_outputs)\n\u001b[1;32m 824\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 825\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mVariable\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_execution_engine\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrun_backward\u001b[49m\u001b[43m(\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;66;43;03m# Calls into the C++ engine to run the backward pass\u001b[39;49;00m\n\u001b[1;32m 826\u001b[0m \u001b[43m \u001b[49m\u001b[43mt_outputs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\n\u001b[1;32m 827\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m \u001b[38;5;66;03m# Calls into the C++ engine to run the backward pass\u001b[39;00m\n\u001b[1;32m 828\u001b[0m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[1;32m 829\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m attach_logging_hooks:\n",
+ "\u001b[0;31mKeyboardInterrupt\u001b[0m: "
+ ]
+ }
+ ],
+ "source": [
+ "# destroy_rand_weights(model)\n",
+ "num_epochs = 10000\n",
+ "batch_size = 1<<9\n",
+ "train_err = []\n",
+ "for epoch in range(num_epochs):\n",
+ " model.train()\n",
+ " data, labels = mkbatch(batch_size)\n",
+ " outputs = model(data)\n",
+ " loss = loss_fn(outputs, labels)\n",
+ " optimizer.zero_grad()\n",
+ " loss.backward()\n",
+ " optimizer.step()\n",
+ " train_err.append(loss.item())\n",
+ " if epoch % 10 == 0:\n",
+ " print(f\"Epoch [{epoch}/{num_epochs}], Loss: {loss.item():.4f}\")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "execution_state": "idle",
+ "id": "dcbdebf6-5c9f-4491-a442-9271d2ba5696",
+ "metadata": {},
+ "outputs": [
+ {
+ "ename": "NameError",
+ "evalue": "name 'plt' is not defined",
+ "output_type": "error",
+ "traceback": [
+ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
+ "\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
+ "Cell \u001b[0;32mIn[3], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m \u001b[43mplt\u001b[49m\u001b[38;5;241m.\u001b[39msuptitle(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mMSE vs Epochs\u001b[39m\u001b[38;5;124m'\u001b[39m)\n\u001b[1;32m 2\u001b[0m plt\u001b[38;5;241m.\u001b[39mplot(train_err, label\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mTrain\u001b[39m\u001b[38;5;124m'\u001b[39m, color\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mblue\u001b[39m\u001b[38;5;124m'\u001b[39m)\n\u001b[1;32m 3\u001b[0m plt\u001b[38;5;241m.\u001b[39mxlabel(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mEpochs\u001b[39m\u001b[38;5;124m'\u001b[39m)\n",
+ "\u001b[0;31mNameError\u001b[0m: name 'plt' is not defined"
+ ]
+ }
+ ],
+ "source": [
+    "import matplotlib.pyplot as plt\n",
+    "\n",
+    "plt.suptitle('MSE vs Epochs')\n",
+ "plt.plot(train_err, label='Train', color='blue')\n",
+ "plt.xlabel('Epochs')\n",
+ "plt.ylabel('MSE')\n",
+ "plt.show()"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 12,
+ "execution_state": "idle",
+ "id": "30893731-9991-4df9-b6c6-380010569ee1",
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAvsAAAJOCAYAAAAphsiIAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjkuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8hTgPZAAAACXBIWXMAAA9hAAAPYQGoP6dpAAB1LklEQVR4nO3de3zO9f/H8ec1s4Nl18xhh9pYyDmEtJKUZQ6J8JVaQqK+kVM5VY7JopJIpG+RoqNDqdAcCjnEkPOpnMJGxtbIzK7P74/s+nW1Ydd2fVzb5XG/3T63m+v9+Vzv6/X5tK3XXnt/Xh+LYRiGAAAAAHgcL3cHAAAAAMAcJPsAAACAhyLZBwAAADwUyT4AAADgoUj2AQAAAA9Fsg8AAAB4KJJ9AAAAwEOR7AMAAAAeimQfAAAA8FAk+0AejRw5UhaLRX/88YfL5uzatasqVKjgsvk8zcyZM2WxWHTw4EGXzVlUrvkPP/wgi8WiH374wT5W2GLPLUbkrmXLlurRo4e7wyiyTp06pYCAAH333XfuDgUockj2kS8WiyVPm7uTgCZNmqhmzZpujcFTnDx5Un379lXVqlXl7++vcuXK6fbbb9fgwYOVnp7u7vBcrkmTJg5fy8HBwWrQoIE++OAD2Ww2d4fnlLFjx2rBggXuDkM7duzQY489phtvvFG+vr4KDw9XXFycduzYUaB5r+X5rVmzRiNHjtSZM2fy/J6ffvpJ33//vQYPHmwfy/5F6csvv8z1PV27dtUNN9zgkpgvJz/n4i6lS5fWk08+qWHDhrk7FKDI8XZ3ACiaPvroI4fXs2bNUkJCQo7xatWqXePIYIaUlBTVr19faWlpeuKJJ1S1alWdOnVKW7du1dSpU/Xf//7X9MTEHW666SbFx8dLl37ZmTVrlrp37669e/fq1VdfvebxvPfee/n6RWPs2LHq0KGD2rZta0pceTFv3jw98sgjCg4OVvfu3RUVFaWDBw/q/fff15dffqlPP/1UDz30UL7mvpbnt2bNGo0aNUpdu3ZVUFBQnt7z2muvqWnTpqpUqZLp8TkjP+fiTk8//bQmTZqk5cuX67777nN3OECRQbKPfHnsscccXq9bt04JCQk5xv/t3LlzKlGihMnRwdXef/99HT58WD/99JPuvPNOh31paWny8fFxW2xmslqtDl/TTz31lKpUqaK3335bL7/8sooXL57jPTabTRcuXJCfn5/L48nt84qCX3/9VZ07d9bNN9+slStXqmzZsvZ9ffv21d13363OnTtr69atuvnmm90aq6udOHFC3377raZNm+buUIq8atWqqWbNmpo5cybJPuAElvHANNlLaBITE9W4cWOVKFFCL7zwgnRpGdDIkSNzvKdChQrq2rWrw9iZM2fUr18/RUREyNfXV5UqVdK4ceNctpRi69at6tq1q26++Wb5+fkpNDRUTzzxhE6dOpXr8X/88Yc6duyowMBAlS5dWn379tX58+dzHPfxxx+rXr168vf3V3BwsDp16qQjR45cNZ5PP/1U9erVU8mSJRUYGKhatWrprbfeuuzxmZmZCg4OVrdu3XLsS0tLk5+fn55//nn72OTJk1WjRg2VKFFCpUqVUv369TVnzpwrxvTrr7+qWLFiuuOOO3LsCwwMzJHYrl+/Xi1btlSpUqUUEBCgW2+91eEcnL3m/7Zo0SLdfffdCggIUMmSJdWqVatcl4IsWLBANWvWlJ+fn2rWrKn58+fnaf7LKVGihO644w6dPXtWJ0+elC59Lffu3VuzZ89WjRo15Ovrq8WLF0uSjh49qieeeEIhISHy9fVVjRo19MEHH+SY9/fff1fbtm0VEBCgcuXKqX///srIyMhxXG5r9m02m9566y3VqlVLfn5+Klu2rJo3b66NGzfa4zt79qw+/PBD+5Kkf36PuTrG3Lz22ms6d+6cpk+f7pDoS1KZMmX07rvv6uzZsxo/fvwVz1X/uHcm25XOL/vY3bt3X/F79uDBg7JYLJo5c2aOz/vnz6qRI0dq4MCBkqSoqCj7513pnpJvv/1WFy9eVExMTJ6u1dXk5Ws/L99fVzuX7K/rL774QtWrV5e/v7+io6O1bds2SdK7776rSpUqyc/PT02aNMlxDVatWqX//Oc/ioyMlK+vryIiItS/f3/99ddfDsdlL1f67bffFBsbq4CAAIWHh2v06NEyDCPH+d9///1auHBhrvsA5I7KPkx16tQptWjRQp06ddJjjz2mkJAQp95/7tw53XPPPTp69KieeuopRUZGas2aNRo6dKiOHz+uiRMnFjjGhIQE/fbbb+rWrZtCQ0O1Y8cOTZ8+XTt27NC6descEgtJ6tixoypUqKD4+HitW7dOkyZN0unTpzVr1iz7Ma+88oqGDRumjh076sknn9TJkyc1efJkNW7cWJs3b77sn8wTEhL0yCOPqGnTpho3bpwkadeuXfrpp5/Ut2/fXN9TvHhxPfTQQ5o3b57effddhyr7ggULlJGRoU6dOkmXloH06dNHHTp0sCc8W7du1fr16/Xoo49e9hqVL19eWVlZ+uijj9SlS5erXs8HHnhAYWFh6tu3r0JDQ7Vr1y5988039nNw9pr/U3YMsbGxGjdunM6dO6epU6eqUaNG2rx5sz1B/P7779W+fXtVr15d8fHxOnXqlLp166abbrrpivFfzW+//aZixYo5/Ddcvny5Pv/8c/Xu3VtlypRRhQoVlJycrDvuuMOeNJUtW1aLFi1S9+7dlZaWpn79+kmS/vrrLzVt2lSHDx9Wnz59FB4ero8++kjLly/PUzzdu3fXzJkz1aJFCz355JO6ePGiVq1apXXr1ql+/fr66KOP9OSTT+r2229Xz549JUkVK1aUpGsW48KFC1WhQgXdfffdue5v3LixKlSooG+//TZP8/3Tlc4vW16+Z/OiXbt22rt3rz755BO9+eabKlOmjCTl+AXmn9asWaPSpUurfPnyue7/888/c73pP7dfpPL6tZ+X76+8nMuqVav09ddfq1evXpKk+Ph4PfDAAxo0aJDeeecdPfPMMzp9+rTGjx+vJ554wuHr4YsvvtC5c+f03//+V6VLl9bPP/+syZMn6/fff9cXX3zhcF5ZWVlq3ry57rjjDo0fP16LFy/WiBEjdPHiRY0ePdrh2Hr16unNN9/Ujh07uB8LyCsDcIFevXoZ//5yuueeewxJxrRp03IcL8kYMWJEjvHy5csbXbp0sb9++eWXjYCAAGPv3r0Oxw0ZMsQoVqyYcfjw4SvGdc899xg1atS44jHnzp3LMfbJJ58YkoyVK1fax0aMGGFIMh588EGHY5955hlDkvHLL78YhmEYBw8eNIoVK2a88sorDsdt27bN8Pb2dhjv0qWLUb58efvrvn37GoGBgcbFixevGPO/LVmyxJBkLFy40GG8ZcuWxs0332x/3aZNm6tej9wkJSUZZcuWNSQZVatWNZ5++mljzpw5xpkzZxyOu3jxohEVFWWUL1/eOH36tMM+m81m/3der/mMGTMMScaBAwcMwzCMP//80wgKCjJ69OiRIz6r1eo
wXqdOHSMsLMwhxu+//96Q5HDNL+eee+4xqlatapw8edI4efKksWvXLqNPnz6GJKN169b24yQZXl5exo4dOxze3717dyMsLMz4448/HMY7depkWK1W+zWYOHGiIcn4/PPP7cecPXvWqFSpkiHJWLFihX38318vy5cvNyQZffr0yRH/P693QECAw/eVmTH+25kzZwxJRps2bS57jGEYxoMPPmhIMtLS0nI912zZ34f/dLnzy+v37IEDBwxJxowZM3LM8e+fVa+99prD1+TVNGrUyKhXr16O8RUrVhiSrrgFBATYj3fmaz+v319XOhdJhq+vr8O+d99915BkhIaG2v87GYZhDB06NMc8ucUQHx9vWCwW49ChQ/axLl26GJKMZ5991j5ms9mMVq1aGT4+PsbJkycd5lizZo0hyfjss89yzA8gdyzjgal8fX1zXV6SV1988YXuvvtulSpVSn/88Yd9i4mJUVZWllauXFngGP39/e3/Pn/+vP744w/7cpVNmzblOD67ypXt2WeflSR7S7h58+bJZrOpY8eODjGHhoaqcuXKWrFixWVjCQoK0tmzZ5WQkODUOdx3330qU6aMPvvsM/vY6dOnlZCQoIcffthh/t9//10bNmxwav6QkBD98ssvevrpp3X69GlNmzZNjz76qMqVK6eXX37Z/if1zZs368CBA+rXr1+Ov178s1rv7DXPlpCQoDNnzuiRRx5xuLbFihVTw4YN7df2+PHj2rJli7p06SKr1Wp///3336/q1avn+bx3796tsmXLqmzZsqpWrZomT56sVq1a5Vjmcs899zjMaxiG5s6dq9atW8swDIdYY2NjlZqaaj/P7777TmFhYerQoYP9/SVKlLBXqa9k7ty5slgsGjFiRI59V/rryLWM8c8//5QklSxZ8orHZe9PS0u76pzOutr3rJlOnTqlUqVKXXb/8OHDlZCQkGNr1qyZw3F5/dpXAb6//q1p06YOS6kaNmwoSWrfvr3Df8/s8d9++y3XGM6ePas//vhDd955pwzD0ObNm3N8Vu/eve3/zv5L04ULF7R06VKH47KvpStbIAOejmU8MNWNN95YoJs39+3bp61bt172z+QnTpwoQHR/S0lJ0ahRo/Tpp5/mmC81NTXH8ZUrV3Z4XbFiRXl5ednXrO7bt0+GYeQ4LtuVbrJ85pln9Pnnn6tFixa68cYb1axZM3Xs2FHNmze/4jl4e3urffv2mjNnjjIyMuTr66t58+YpMzPTIdkfPHiwli5dqttvv12VKlVSs2bN9Oijj+quu+664vySFBYWpqlTp+qdd97Rvn37tGTJEo0bN07Dhw9XWFiYnnzySf3666+SdNU/rzt7zbPt27dPuvTLTW4CAwMlSYcOHZJy+W8lSVWqVMlzwlOhQgW99957slgs8vPzU+XKlVWuXLkcx0VFRTm8PnnypM6cOaPp06dr+vTpuc6dfd6HDh1SpUqVciTnVapUuWp8v/76q8LDwxUcHJyn83FHjNlJYXbSfzl5/aUgP672PWu2K60vr1WrVq7r+T/++GOH13n92lcBvr/+LTIy0uF19i/OERERuY6fPn3aPnb48GENHz5cX3/9tcN4bjF4eXnluDH7lltukS7dT/FP2dfyar/MAvh/JPsw1T+rO3mRlZXl8Npms+n+++/XoEGDcj0++38IBdGxY0etWbNGAwcOVJ06dXTDDTfIZrOpefPmeboJ+N//07HZbLJYLFq0aJGKFSuW4/grtagsV66ctmzZoiVLlmjRokVatGiRZsyYoccff1wffvjhFePo1KmT3n33XS1atEht27bV559/rqpVq6p27dr2Y6pVq6Y9e/bom2++0eLFizV37ly98847Gj58uEaNGnXVc80+31tuuUW33HKLWrVqpcqVK2v27Nl68skn8/R+FeCaZ+/76KOPFBoammO/t7drf6QFBATk6cbKf3+dZ8f52GOPXfYeh1tvvdVFUebPtYrRarUqLCxMW7duveJxW7du1Y033mhPWi+XzP37Z0R+/HtuMz+rdOnSOZLd/HDma7+gP9Oy5fbz60rj2Yl4VlaW7r//fqWkpGjw4MGqWrWqAgICdPToUXXt2rVAzRWyr2X2PQYAro5kH25RqlSpHA9yuXDhgo4fP+4wVrFiRaWnp7usk8W/nT59WsuWLdOoUaM0fPhw+3h2FS03+/btc6jk7t+/Xzabzf7n7ooVK8owDEVFReXrlxEfHx+1bt1arVu3ls1m0zPPPKN3331Xw4YNu2Kf7saNGyssLEyfffaZGjVqpOXLl+vFF1/McVxAQIAefvhhPfzww7pw4YLatWunV155RUOHDnW6XeTNN9+sUqVK2f+7Zd8YuX379sv+N8vPNc+WPX+5cuWu+DWRfTNkbnPu2bPnqp9TUGXLllXJkiWVlZV11a/d8uXLa/v27TIMwyHpzEucFStW1JIlS5SSknLF6n5uyey1ilGSHnjgAb333ntavXq1GjVqlGP/qlWrdPDgQT311FP2sdx+Rugff7W52vn909W+Z7OXhvz78/LzWf9WtWpVzZ0716n35CavX/vOfH+ZVR3ftm2b9u7dqw8//FCPP/64ffxyyxNtNpt+++03h5+Xe/fulS79de2fDhw4IPEMF8AprNmHW1SsWDHHevvp06fnqKR17NhRa9eu1ZIlS3LMcebMGV28eLFAcWRXqP79Z/YrdfmZMmWKw+vJkydLklq0aCFd6thRrFgxjRo1Kse8hmFcsb3kv/d5eXnZq6tXa3Po5eWlDh06aOHChfroo4908eJFhyU8uc3v4+Oj6tWryzAMZWZmXnbu9evX6+zZsznGf/75Z506dcq+nOO2225TVFSUJk6cmCNxyr4W+bnm2WJjYxUYGKixY8fmGm92O8ywsDDVqVNHH374ocOSgYSEBO3cufOqn1NQxYoVU/v27TV37lxt3779snFKUsuWLXXs2DGHJ6lmt6m8mvbt28swjFz/KvPP6xsQEJDjv8e1ilGSBg4cKH9/fz311FM5vgZTUlL09NNPq0SJEvZWkLr0MyI1NdXhLwLHjx/PtX1qbuf3T1f7ng0MDFSZMmVy/Ex65513cv0s5fKLweVER0fr9OnTDuvZ8yOvX/vOfH85ey55lVsMhmFcsYXw22+/7XDs22+/reLFi6tp06YOxyUmJspqtapGjRoujRnwZFT24RZPPvmknn76abVv317333+/fvnlFy1ZsiTHn2YHDhyor7/+Wg888IC6du2qevXq6ezZs9q2bZu+/PJLHTx48Kp/zj158qTGjBmTYzwqKkpxcXFq3Lixxo8fr8zMTN144436/vvv7dWj3Bw4cEAPPvigmjdvrrVr1+rjjz/Wo48+al8uU7FiRY0ZM0ZDhw7VwYMH1bZtW5UsWVIHDhzQ/Pnz1bNnT4e+9/++LikpKbrvvvt000036dChQ5o8ebLq1KmTp0rWww8/rMmTJ2vEiBGqVatWjvc0a9ZMoaGhuuuuuxQSEqJdu3bp7bffVqtWra64Vvqjjz7S7Nmz9dBDD6levXry8fHRrl279MEHH8jPz8/+/A
QvLy9NnTpVrVu3Vp06ddStWzeFhYVp9+7d2rFjh5YsWaLAwECnr3m2wMBATZ06VZ07d9Ztt92mTp06qWzZsjp8+LC+/fZb3XXXXfakIT4+Xq1atVKjRo30xBNPKCUlxf6MgfT09Kt+VkG9+uqrWrFihRo2bKgePXqoevXqSklJ0aZNm7R06VKlpKRIknr06KG3335bjz/+uBITExUWFqaPPvooTw+fu/fee9W5c2dNmjRJ+/btsy/TWLVqle699177TY/16tXT0qVLNWHCBIWHhysqKkoNGza8JjHq0pr5Dz/8UHFxcapVq1aOJ+j+8ccf+uSTTxxaZnbq1EmDBw/WQw89pD59+tjbTN5yyy057rm43Pllu9r3rC5977366qt68sknVb9+fa1cudJeXf73Z0nSiy++qE6dOql48eJq3bq1PXH+t1atWsnb21tLly7N0w3Nl5PXr31nvr+cPZe8qlq1qipWrKjnn39eR48eVWBgoObOnXvZ5Ux+fn5avHixunTpooYNG2rRokX69ttv9cILL+S4XyshIUGtW7dmzT7gDHe3A4JnuFzrzcu1eczKyjIGDx5slClTxihRooQRGxtr7N+/P0frTeNSy7mhQ4calSpVMnx8fIwyZcoYd955p/H6668bFy5cuGJc2e0/c9uaNm1qGIZh/P7778ZDDz1kBAUFGVar1fjPf/5jHDt2LEfLvew2fjt37jQ6dOhglCxZ0ihVqpTRu3dv46+//srx2XPnzjUaNWpkBAQEGAEBAUbVqlWNXr16GXv27LEf8+/2gl9++aXRrFkzo1y5coaPj48RGRlpPPXUU8bx48ev+t/AuNSyLiIiwpBkjBkzJsf+d99912jcuLFRunRpw9fX16hYsaIxcOBAIzU19Yrzbt261Rg4cKBx2223GcHBwYa3t7cRFhZm/Oc//zE2bdqU4/jVq1cb999/v1GyZEkjICDAuPXWW43Jkyfb9+f1mv+79Wa2FStWGLGxsYbVajX8/PyMihUrGl27djU2btzocNzcuXONatWqGb6+vkb16tWNefPmXbal47/lpW2rcalFYa9evXLdl5ycbPTq1cuIiIgwihcvboSGhhpNmzY1pk+f7nDcoUOHjAcffNAoUaKEUaZMGaNv377G4sWLr9p607jU7vS1114zqlatavj4+Bhly5Y1WrRoYSQmJtqP2b17t9G4cWPD39/fkOTwPebqGK9k69atxiOPPGKEhYXZP+uRRx4xtm3bluvx33//vVGzZk3Dx8fHqFKlivHxxx/n2nrzcufnzPfsuXPnjO7duxtWq9UoWbKk0bFjR+PEiRO5tgl++eWXjRtvvNHw8vLKUxvOBx980P7zJlt2680vvvgi1/d06dLFofXmP993ta/9vH5/Xelccvu6zm5R+tprr131XHbu3GnExMQYN9xwg1GmTBmjR48exi+//JKjxWn2ef76669Gs2bNjBIlShghISHGiBEjjKysLIfP2bVrlyHJWLp06RWvNwBHFoPH0AEAPNDIkSM1atQonTx50q03dK5atUpNmjTR7t27L9ul63rVtWtXffnll3n6a1u/fv20cuVKJSYmUtkHnMCafQAATHT33XerWbNmGj9+vLtDKbJOnTql//3vfxozZgyJPuAk1uwDAGCyRYsWuTuEIq106dLX5F4bwBNR2QcAAAA8FGv2AQAAAA9FZR8AAADwUCT7AAAAgIfiBt1Lj+o+duyYSpYsyV3+AADA4xiGoT///FPh4eHy8ioctd7z58/rwoULpn6Gj4+P/Pz8TP2Mwo5kX9KxY8cUERHh7jAAAABMdeTIEd10003uDkPnz59XVPkblHQiy9TPCQ0N1YEDB67rhJ9kX1LJkiUlSY3UUt4q7u5wAAAAXOqiMrVa39lzHne7cOGCkk5k6VBiBQWWNOcvDWl/2lS+3kFduHCBZP96l710x1vF5W0h2QcAAB7mUu/FwrZc+YaSFt1Q0pyYbCpc5+ouhWPRFgAAAACXo7IPAAAAt8gybMoy6YlPWYbNnImLGCr7AAAAgIeisg8AAAC3sMmQTeaU9s2at6ihsg8AAAB4KCr7AAAAcAubbDJrZb15MxctVPYBAAAAD0VlHwAAAG6RZRjKMsxZW2/WvEUNlX0AAADAQ1HZBwAAgFvQjcd8VPYBAAAAD0VlHwAAAG5hk6EsKvumorIPAAAAeCgq+wAAAHAL1uybj8o+AAAA4KGo7AMAAMAt6LNvPir7AAAAgIeisg8AAAC3sF3azJobVPYBAAAAj0VlHwAAAG6RZWKffbPmLWqo7AMAAAAeiso+AAAA3CLL+Hsza25Q2QcAAAA8FpV9AAAAuAXdeMxHZR8AAADwUFT2AQAA4BY2WZQli2lzg8o+AAAA4LGo7AMAAMAtbMbfm1lzg8o+AAAA4LGo7AMAAMAtskxcs2/WvEUNlX0AAADAQ1HZBwAAgFtQ2TcflX0AAADAQ1HZBwAAgFvYDItshkl99k2at6ihsg8AAAB4KCr7AAAAcAvW7JuPyj4AAADgoajsAwAAwC2y5KUsk2rPWabMWvRQ2QcAAAA8FJV9AAAAuIVhYjceg248EpV9AAAAwHOR7AMAAMAtsrvxmLU5Y+XKlWrdurXCw8NlsVi0YMGCHMfs2rVLDz74oKxWqwICAtSgQQMdPnzYvv/8+fPq1auXSpcurRtuuEHt27dXcnKyS65VfpHsAwAA4Lp39uxZ1a5dW1OmTMl1/6+//qpGjRqpatWq+uGHH7R161YNGzZMfn5+9mP69++vhQsX6osvvtCPP/6oY8eOqV27dtfwLHJizT4AAADcIsvwUpZhUjcew7njW7RooRYtWlx2/4svvqiWLVtq/Pjx9rGKFSva/52amqr3339fc+bM0X333SdJmjFjhqpVq6Z169bpjjvuyM9pFBiVfQAAAOAKbDabvv32W91yyy2KjY1VuXLl1LBhQ4elPomJicrMzFRMTIx9rGrVqoqMjNTatWvdFDnJPgAAANzEJots8jJp+3vNflpamsOWkZHhdJwnTpxQenq6Xn31VTVv3lzff/+9HnroIbVr104//vijJCkpKUk+Pj4KCgpyeG9ISIiSkpJcdMWcR7IPAAAAjxURESGr1Wrf4uPjnZ7DZrNJktq0aaP+/furTp06GjJkiB544AFNmzbNhKhdhzX7AAAAcIv8dM1xZm5JOnLkiAIDA+3jvr6+Ts9VpkwZeXt7q3r16g7j1apV0+rVqyVJoaGhunDhgs6cOeNQ3U9OTlZoaGgBzqRgqOwDAADAYwUGBjps+Un2fXx81KBBA+3Zs8dhfO/evSpfvrwkqV69eipevLiWLVtm379nzx4dPnxY0dHRLjiT/KGyDwAAALcwtxuPc+140tPTtX//fvvrAwcOaMuWLQoODlZkZKQGDhyohx9+WI0bN9a9996rxYsXa+HChfrhhx8kSVarVd27d9eAAQMUHByswMBAPfvss4qOjnZbJx6R7AMAAADSxo0bde+999pfDxgwQJLUpUsXzZw5Uw899JCmTZum+Ph49enTR1WqVNHcuXPVqFEj+
3vefPNNeXl5qX379srIyFBsbKzeeecdt5xPNothOPlrjwdKS0uT1WpVE7WRt6W4u8MBAABwqYtGpn7QV0pNTXVYv+4u2bnX3F9uUUDJYqZ8xtk/s9S+9t5Cc87uwpp9AAAAwEOxjAcAAABuYZOXskyqPdt03S9ekdxd2V+5cqVat26t8PBwWSwWh6eQ/dvTTz8ti8WiiRMnOoynpKQoLi5OgYGBCgoKUvfu3ZWenn4NogcAAEBBZN+ga9YGNyf7Z8+eVe3atTVlypQrHjd//nytW7dO4eHhOfbFxcVpx44dSkhI0DfffKOVK1eqZ8+eJkYNAAAAFA1uXcbTokULtWjR4orHHD16VM8++6yWLFmiVq1aOezbtWuXFi9erA0bNqh+/fqSpMmTJ6tly5Z6/fXXc/3lAAAAAIWDTV6ysYzHVIX67xs2m02dO3fWwIEDVaNGjRz7165dq6CgIHuiL0kxMTHy8vLS+vXrr3G0AAAAQOFSqG/QHTdunLy9vdWnT59c9yclJalcuXIOY97e3goODlZSUtJl583IyFBGRob9dVpamgujBgAAQF5kGRZlGRbT5kYhruwnJibqrbfe0syZM2WxuPY/Vnx8vKxWq32LiIhw6fwAAABAYVBok/1Vq1bpxIkTioyMlLe3t7y9vXXo0CE999xzqlChgiQpNDRUJ06ccHjfxYsXlZKSotDQ0MvOPXToUKWmptq3I0eOmH4+AAAAcJR1qfWmWRsK8TKezp07KyYmxmEsNjZWnTt3Vrdu3SRJ0dHROnPmjBITE1WvXj1J0vLly2Wz2dSwYcPLzu3r6ytfX1+TzwAAAABwL7cm++np6dq/f7/99YEDB7RlyxYFBwcrMjJSpUuXdji+ePHiCg0NVZUqVSRJ1apVU/PmzdWjRw9NmzZNmZmZ6t27tzp16kQnHgAAgELOZnjJZlI/fJtBNx65exnPxo0bVbduXdWtW1eSNGDAANWtW1fDhw/P8xyzZ89W1apV1bRpU7Vs2VKNGjXS9OnTTYwaAAAAKBrcWtlv0qSJDCd+6zp48GCOseDgYM2ZM8fFkQEAAMBsZq6tz6LPvuTuyj4AAAAA8xTaG3QBAADg2Wwm9sO3mTJr0UNlHwAAAPBQVPYBAADgFjZ5yWZS7dmseYsargIAAADgoajsAwAAwC2yDC9lmdRn36x5ixquAgAAAOChqOwDAADALWyyyCazuvGYM29RQ2UfAAAA8FBU9gEAAOAWrNk3H1cBAAAA8FBU9gEAAOAWWfJSlkm1Z7PmLWq4CgAAAICHorIPAAAAt7AZFtkMk7rxmDRvUUNlHwAAAPBQVPYBAADgFjYT1+zbqGlLVPYBAAAAz0VlHwAAAG5hM7xkM6kfvlnzFjVcBQAAAMBDUdkHAACAW2TJoiyZ0zXHrHmLGir7AAAAgIeisg8AAAC3YM2++bgKAAAAgIeisg8AAAC3yDJxbX2WKbMWPVT2AQAAAA9FZR8AAABuwZp983EVAAAAAA9FZR8AAABukWV4KcukCrxZ8xY1XAUAAADAQ1HZBwAAgFsYsshmUjcegyfoSlT2AQAAAM9FZR8AAABuwZp983EVAAAAAA9Fsg8AAAC3sBkWUzdnrFy5Uq1bt1Z4eLgsFosWLFhw2WOffvppWSwWTZw40WE8JSVFcXFxCgwMVFBQkLp376709PR8Xx9XINkHAADAde/s2bOqXbu2pkyZcsXj5s+fr3Xr1ik8PDzHvri4OO3YsUMJCQn65ptvtHLlSvXs2dPEqK+ONfsAAABwiyx5Kcuk2rOz87Zo0UItWrS44jFHjx7Vs88+qyVLlqhVq1YO+3bt2qXFixdrw4YNql+/viRp8uTJatmypV5//fVcfzm4FqjsAwAAAFdhs9nUuXNnDRw4UDVq1Mixf+3atQoKCrIn+pIUExMjLy8vrV+//hpH+/+o7AMAAMAt8rO23pm5JSktLc1h3NfXV76+vk7PN27cOHl7e6tPnz657k9KSlK5cuUcxry9vRUcHKykpCSnP89VqOwDAADAY0VERMhqtdq3+Ph4p+dITEzUW2+9pZkzZ8piKVoP66KyDwAAALewyUs2k2rP2fMeOXJEgYGB9vH8VPVXrVqlEydOKDIy0j6WlZWl5557ThMnTtTBgwcVGhqqEydOOLzv4sWLSklJUWhoaIHOpSBI9gEAAOCxAgMDHZL9/OjcubNiYmIcxmJjY9W5c2d169ZNkhQdHa0zZ84oMTFR9erVkyQtX75cNptNDRs2LNDnFwTJPgAAANwiy7Aoy6Q1+87Om56erv3799tfHzhwQFu2bFFwcLAiIyNVunRph+OLFy+u0NBQValSRZJUrVo1NW/eXD169NC0adOUmZmp3r17q1OnTm7rxCPW7AMAAADSxo0bVbduXdWtW1eSNGDAANWtW1fDhw/P8xyzZ89W1apV1bRpU7Vs2VKNGjXS9OnTTYz66qjsAwAAwC2uRTeevGrSpIkMw8jz8QcPHswxFhwcrDlz5jj1uWajsg8AAAB4KCr7AAAAcAvD8JLNMKf2bJg0b1HDVQAAAAA8FJV9AAAAuEWWLMqSSd14TJq3qKGyDwAAAHgoKvsAAABwC5vhfNccZ+YGlX0AAADAY1HZBwAAgFvYTOzGY9a8RY1br8LKlSvVunVrhYeHy2KxaMGCBfZ9mZmZGjx4sGrVqqWAgACFh4fr8ccf17FjxxzmSElJUVxcnAIDAxUUFKTu3bsrPT3dDWcDAAAAFC5uTfbPnj2r2rVra8qUKTn2nTt3Tps2bdKwYcO0adMmzZs3T3v27NGDDz7ocFxcXJx27NihhIQEffPNN1q5cqV69ux5Dc8CAAAA+WGTxdQNbl7G06JFC7Vo0SLXfVarVQkJCQ5jb7/9tm6//XYdPnxYkZGR2rVrlxYvXqwNGzaofv36kqTJkyerZcuWev311xUeHn5NzgMAAADOyzIsyjLpBl2z5i1qitRiptTUVFksFgUFBUmS1q5dq6CgIHuiL0kxMTHy8vLS+vXr3RgpAAAA4H5F5gbd8+fPa/DgwXrkkUcUGBgoSUpKSlK5cuUcjvP29lZwcLCSkpIuO1dGRoYyMjLsr9PS0kyMHAAAALnhBl3zFYmrkJmZqY4dO8owDE2dOrXA88XHx8tqtdq3iIgIl8QJAAAAFCaFPtnPTvQPHTqkhIQEe1VfkkJDQ3XixAmH4y9evKiUlBSFhoZeds6hQ4cqNTXVvh05csTUcwAAAEBONllkM0zauEFXKuzLeLIT/X379mnFihUqXbq0w/7o6GidOXNGiYmJqlevniRp+fLlstlsatiw4WXn9fX1la+vr+nxAwAAAO7k1mQ/PT1d+/fvt78+cOCAtmzZouDgYIWFhalDhw7atGmTvvnmG2VlZdnX4QcHB8vHx0fVqlVT8+bN1aNHD02bNk2ZmZnq3bu3OnXqRCceAACAQs4wsUWmQWVfcneyv3HjRt1777321wMGDJAkdenSRSNHjtTXX38tSapTp47D+1asWKEmTZpIkmbPnq3evXuradOm8vLyUvv27TVp0qRreh4AAABAYeTW
ZL9JkyYyDOOy+6+0L1twcLDmzJnj4sgAAABgtuz19WbNjSJwgy4AAACA/CnUN+gCAADAc9Fn33xcBQAAAMBDUdkHAACAW7Bm33xU9gEAAAAPRWUfAAAAbmEzsc8+T9D9G5V9AAAAwENR2QcAAIBbsGbffFT2AQAAAA9FZR8AAABuQWXffFT2AQAAAA9FZR8AAABuQWXffFT2AQAAAA9FZR8AAABuQWXffFT2AQAAAA9FZR8AAABuYZj4pFvDlFmLHir7AAAAgIeisg8AAAC3YM2++ajsAwAAAB6Kyj4AAADcgsq++ajsAwAAAB6Kyj4AAADcgsq++ajsAwAAAB6Kyj4AAADcgsq++ajsAwAAAB6Kyj4AAADcwjAsMkyqwJs1b1FDZR8AAADwUCT7AAAAcAubLKZuzli5cqVat26t8PBwWSwWLViwwL4vMzNTgwcPVq1atRQQEKDw8HA9/vjjOnbsmMMcKSkpiouLU2BgoIKCgtS9e3elp6e77Hrlh1PJ/q5duzRixAjdd999qlixosLCwnTrrbeqS5cumjNnjjIyMsyLFAAAADDJ2bNnVbt2bU2ZMiXHvnPnzmnTpk0aNmyYNm3apHnz5mnPnj168MEHHY6Li4vTjh07lJCQoG+++UYrV65Uz549r+FZ5GQxDMO42kGbNm3SoEGDtHr1at111126/fbbFR4eLn9/f6WkpGj79u1atWqV0tLSNGjQIPXr10++vr7X5gxcIC0tTVarVU3URt6W4u4OBwAAwKUuGpn6QV8pNTVVgYGB7g7Hnns1XNBH3gHm5IwXz2ZofdtJ+Tpni8Wi+fPnq23btpc9ZsOGDbr99tt16NAhRUZGateuXapevbo2bNig+vXrS5IWL16sli1b6vfff1d4eHiBzyk/8nSDbvv27TVw4EB9+eWXCgoKuuxxa9eu1VtvvaU33nhDL7zwgivjBAAAAAqN1NRUWSwWe268du1aBQUF2RN9SYqJiZGXl5fWr1+vhx56yC1x5inZ37t3r4oXv3rFOzo6WtHR0crMzHRFbAAAAPBg16IbT1pamsO4r69vgVegnD9/XoMHD9Yjjzxi/6tBUlKSypUr53Cct7e3goODlZSUVKDPK4g8rdn/Z6L/22+/OXU8AAAA4C4RERGyWq32LT4+vkDzZWZmqmPHjjIMQ1OnTnVZnGZxus9+pUqVdM8996h79+7q0KGD/Pz8zIkMAAAAHu1aPEH3yJEjDmv2C1LVz070Dx06pOXLlzvMGxoaqhMnTjgcf/HiRaWkpCg0NDTfn1lQTrfe3LRpk2699VYNGDBAoaGheuqpp/Tzzz+bEx0AAABQAIGBgQ5bfpP97ER/3759Wrp0qUqXLu2wPzo6WmfOnFFiYqJ9bPny5bLZbGrYsGGBzyO/nE7269Spo7feekvHjh3TBx98oOPHj6tRo0aqWbOmJkyYoJMnT5oTKQAAADxK9pp9szZnpKena8uWLdqyZYsk6cCBA9qyZYsOHz6szMxMdejQQRs3btTs2bOVlZWlpKQkJSUl6cKFC5KkatWqqXnz5urRo4d+/vln/fTTT+rdu7c6derktk48KshDtby9vdWuXTt98cUXGjdunPbv36/nn39eERERevzxx3X8+HHXRgoAAACYZOPGjapbt67q1q0rSRowYIDq1q2r4cOH6+jRo/r666/1+++/q06dOgoLC7Nva9assc8xe/ZsVa1aVU2bNlXLli3VqFEjTZ8+3Y1nlY81+9k2btyoDz74QJ9++qkCAgL0/PPPq3v37vr99981atQotWnThuU9AAAAuCzDxDX7zlb2mzRpois9fioPj6ZScHCw5syZ49Tnms3pZH/ChAmaMWOG9uzZo5YtW2rWrFlq2bKlvLz+/iNBVFSUZs6cqQoVKpgRLwAAAIA8cjrZnzp1qp544gl17dpVYWFhuR5Trlw5vf/++66IDwAAAB7KkJSHgnm+50Y+kv19+/Zd9RgfHx916dIlvzEBAAAAcIE83aB7+PBhpyY9evRofuMBAADAdcImi6kb8pjsN2jQQE899ZQ2bNhw2WNSU1P13nvvqWbNmpo7d64rYwQAAACQD3laxrNz50698soruv/+++Xn56d69eopPDxcfn5+On36tHbu3KkdO3botttu0/jx49WyZUvzIwcAAECRlp9++M7MjTxW9kuXLq0JEybo+PHjevvtt1W5cmX98ccf9vX7cXFxSkxM1Nq1a0n0AQAAgELCqRt0/f391aFDB3Xo0MG8iAAAAHBdsBkWWUyqwJvVv7+oyfcTdAEAAAAUbvl+gi4AAABQEIZhYp99Gu1LVPYBAAAAz0VlHwAAAG5BNx7zOV3ZX7lypS5evJhj/OLFi1q5cqWr4gIAAABQQE4n+/fee69SUlJyjKempuree+91VVwAAADwcNmVfbM25CPZNwxDFkvOi3fq1CkFBAQ4NdfKlSvVunVrhYeHy2KxaMGCBTk+a/jw4QoLC5O/v79iYmLsvf2zpaSkKC4uToGBgQoKClL37t2Vnp7u7GkBAAAAHifPa/bbtWsnSbJYLOratat8fX3t+7KysrR161bdeeedTn342bNnVbt2bT3xxBP2+f9p/PjxmjRpkj788ENFRUVp2LBhio2N1c6dO+Xn5yddeqDX8ePHlZCQoMzMTHXr1k09e/bUnDlznIoFAAAA1xZ99s2X52TfarVKl6rtJUuWlL+/v32fj4+P7rjjDvXo0cOpD2/RooVatGiR6z7DMDRx4kS99NJLatOmjSRp1qxZCgkJ0YIFC9SpUyft2rVLixcv1oYNG1S/fn1J0uTJk9WyZUu9/vrrCg8PdyoeAAAAwJPkOdmfMWOGJKlChQp6/vnnnV6y46wDBw4oKSlJMTEx9jGr1aqGDRtq7dq16tSpk9auXaugoCB7oi9JMTEx8vLy0vr16/XQQw+ZGiMAAADyjz775nO69eaIESPMieRfkpKSJEkhISEO4yEhIfZ9SUlJKleunMN+b29vBQcH24/JTUZGhjIyMuyv09LSXBw9AAAA4H5OJ/tRUVG53qCb7bfffitoTKaLj4/XqFGj3B0GAADAde3vyr5ZffZNmbbIcTrZ79evn8PrzMxMbd68WYsXL9bAgQNdFlhoaKgkKTk5WWFhYfbx5ORk1alTx37MiRMnHN538eJFpaSk2N+fm6FDh2rAgAH212lpaYqIiHBZ7AAAALg6HqplPqeT/b59++Y6PmXKFG3cuNEVMUmX/oIQGhqqZcuW2ZP7tLQ0rV+/Xv/9738lSdHR0Tpz5owSExNVr149SdLy5ctls9nUsGHDy87t6+vr0E0IAAAA8ERO99m/nBYtWmju3LlOvSc9PV1btmzRli1bpEs35W7ZskWHDx+WxWJRv379NGbMGH399dfatm2bHn/8cYWHh6tt27aSpGrVqql58+bq0aOHfv75Z/3000/q3bu3OnXqRCceAACAQs4weUM+KvuX8+WXXyo4ONip92zcuNHhqbvZS2u6dOmimTNnatCgQTp79qx69uypM2fOqFGjRlq8eLG9x74kzZ49W71791bTpk3
l5eWl9u3ba9KkSa46LQAAAKDIcjrZr1u3rsMNuoZhKCkpSSdPntQ777zj1FxNmjSRcYW7JywWi0aPHq3Ro0df9pjg4GAeoAUAAFAEsWbffE4n+9lLaLJ5eXmpbNmyatKkiapWrerK2AAAAAAUQKHtsw8AAAAPZ+biehbtS/lds5+VlaX58+dr165dkqTq1aurTZs28vZ22S0AAAAAAArI6ex8x44dat26tZKTk1WlShVJ0rhx41S2bFktXLhQNWvWNCNOAAAAeBoT1+yLNftSflpvPvnkk6pZs6Z+//13bdq0SZs2bdKRI0d06623qmfPnuZECQAAAMBpTlf2t2zZoo0bN6pUqVL2sVKlSumVV15RgwYNXB0fAAAAPJRh/L2ZNTfyUdm/5ZZblJycnGP8xIkTqlSpkqviAgAAAFBATlf24+Pj1adPH40cOVJ33HGHJGndunUaPXq0xo0bp7S0NPuxgYGBro0WAAAAHoM+++ZzOtl/4IEHJEkdO3a0P1wr+8FYrVu3tr+2WCzKyspybbQAAAAA8szpZH/FihXmRAIAAIDri2Exr2sOlX0pP8l+VFSUIiIi7FX9bIZh6MiRI4qMjHRlfAAAAADyyekbdKOionTy5Mkc4ykpKYqKinJVXAAAAPBw2d14zNqQj2Q/ez3+v6Wnp8vPz89VcQEAAAAooDwv4xkwYIAkyWKxaNiwYSpRooR9X1ZWltavX686deqYEyUAAAA8j3FpM2tu5D3Z37x5s3Spsr9t2zb5+PjY9/n4+Kh27dp6/vnnzYkSAAAAgNPynOxnd+Hp1q2b3nrrLXroAwAAoEDos28+p7vxzJgxw5xIAAAAALiU08n+fffdd8X9y5cvL0g8AAAAuJ6wtt5UTif7tWvXdnidmZmpLVu2aPv27erSpYsrYwMAAABQAE4n+2+++Wau4yNHjlR6erorYgIAAMB1gDX75nO6z/7lPPbYY/rggw9cNR0AAACAAnJZsr927VoeqgUAAIC8M0zenLBy5Uq1bt1a4eHhslgsWrBggWOohqHhw4crLCxM/v7+iomJ0b59+xyOSUlJUVxcnAIDAxUUFKTu3bu7feWL08t42rVr5/DaMAwdP35cGzdu1LBhw1wZGwAAAHBNnD17VrVr19YTTzyRI9+VpPHjx2vSpEn68MMPFRUVpWHDhik2NlY7d+60F7zj4uJ0/PhxJSQkKDMzU926dVPPnj01Z84cN5zR35xO9q1Wq8NrLy8vValSRaNHj1azZs1cGRsAAAA8muXSZtbcedeiRQu1aNEi132GYWjixIl66aWX1KZNG0nSrFmzFBISogULFqhTp07atWuXFi9erA0bNqh+/fqSpMmTJ6tly5Z6/fXXFR4e7oJzch599gEAAIArOHDggJKSkhQTE2Mfs1qtatiwodauXatOnTpp7dq1CgoKsif6khQTEyMvLy+tX79eDz30kFtidzrZ/+uvv5SQkKC9e/dKkqpUqaKYmBj5+/ubER8AAAA8VT7W1js1t6S0tDSHYV9fX/n6+jo1VVJSkiQpJCTEYTwkJMS+LykpSeXKlXPY7+3treDgYPsx7uBUsv/111/rySef1B9//OEwXqZMGb3//vtq3bq1q+MDAAAA8i0iIsLh9YgRIzRy5Ei3xXOt5bkbz5o1a9ShQwc1btxYP/30k1JSUpSSkqLVq1fr7rvvVocOHbRu3TpzowUAAIDnuAbdeI4cOaLU1FT7NnToUKfDDA0NlSQlJyc7jCcnJ9v3hYaG6sSJEw77L168qJSUFPsx7pDnZH/MmDHq1q2bvvzyS0VHRysoKEhBQUG68847NXfuXHXt2lWjR482N1oAAADACYGBgQ6bs0t4JCkqKkqhoaFatmyZfSwtLU3r169XdHS0JCk6OlpnzpxRYmKi/Zjly5fLZrOpYcOGLjob5+V5Gc+6des0bty4y+7v1auX7rnnHlfFBQAAAE9nWP7ezJrbCenp6dq/f7/99YEDB7RlyxYFBwcrMjJS/fr105gxY1S5cmV7683w8HC1bdtWklStWjU1b95cPXr00LRp05SZmanevXurU6dObuvEI2eS/b/++kuBgYGX3W+1WnX+/HlXxQUAAABcMxs3btS9995rfz1gwABJUpcuXTRz5kwNGjRIZ8+eVc+ePXXmzBk1atRIixcvdnio7OzZs9W7d281bdpUXl5eat++vSZNmuSW88mW52S/cuXKWr58ubp165br/mXLlqly5cqujA0AAAAezDD+3sya2xlNmjSRcYU3WSwWjR49+orL1oODg936AK3c5HnNfrdu3fT888/ru+++y7Hv22+/1aBBg9S1a1dXxwcAAAAgn/Jc2e/bt6/WrFmjBx54QFWqVFG1atVkGIZ27dqlffv2qW3bturXr5+50QIAAMBzXIM++9e7PFf2vby89MUXX+iTTz5RlSpVtHv3bu3Zs0dVq1bV7NmzNXfuXHl55Xk6AAAAACZz+gm6Dz/8sB5++GFzogEAAMD1oxB14/FUlOIBAAAAD+V0ZR8AAABwBYvx92bW3KCyDwAAAHgsKvsAAABwD7rxmI7KPgAAAFBI3HzzzTp16lSO8TNnzujmm292er48VfbbtWuX5wnnzZvndBAAAAC4DtGNJ4eDBw8qKysrx3hGRoaOHj3q9Hx5SvatVqv934ZhaP78+bJarapfv74kKTExUWfOnHHqlwIAAAAAf/v666/t/16yZIlD/p2VlaVly5apQoUKTs+bp2R/xowZ9n8PHjxYHTt21LRp01SsWDF7AM8884wCAwOdDgAAAADXKdbs27Vt21aSZLFY1KVLF4d9xYsXV4UKFfTGG284Pa/TN+h+8MEHWr16tT3Rl6RixYppwIABuvPOO/Xaa685HQQAAABwPbPZbJKkqKgobdiwQWXKlHHJvE7foHvx4kXt3r07x/ju3bvtQQIAAABXZZi8FUEHDhxwWaKv/FT2u3Xrpu7du+vXX3/V7bffLklav369Xn31VXXr1s1lgQEAAADXo2XLlmnZsmU6ceJEjmL6Bx984NRcTif7r7/+ukJDQ/XGG2/o+PHjkqSwsDANHDhQzz33nLPTAQAA4HrFmv0cRo0apdGjR6t+/foKCwuTxVKwrkJOJ/teXl4aNGiQBg0apLS0NEnixlwAAADABaZNm6aZM2eqc+fOLpkvXw/VunjxopYuXapPPvnE/tvGsWPHlJ6e7pKgAAAAcB3I7rNv1lYEXbhwQXfeeafL5nM62T906JBq1aqlNm3aqFevXjp58qQkady4cXr++eddFhgAAABwvXnyySc1Z84cl83n9DKevn37qn79+vrll19UunRp+/hDDz2kHj16uCwwAAAAeDaL8fdm1txF0fnz5zV9+nQtXbpUt956q4oXL+6wf8KECU7N53Rlf9WqVXrppZfk4+PjMF6hQoV8PcL3SrKysjRs2DBFRUXJ399fFStW1MsvvyzD+P//eoZhaPjw4QoLC5O/v79iYmK0b98+l8YBAAAAXAtbt25VnTp15OXlpe3bt2vz5s32bcuWLU7P53Rl32azKSsrK8f477
//rpIlSzodwJWMGzdOU6dO1YcffqgaNWpo48aN6tatm6xWq/r06SNJGj9+vCZNmqQPP/xQUVFRGjZsmGJjY7Vz5075+fm5NB4AAAC4EN14clixYoVL53O6st+sWTNNnDjR/tpisSg9PV0jRoxQy5YtXRrcmjVr1KZNG7Vq1UoVKlRQhw4d1KxZM/3888/Spar+xIkT9dJLL6lNmza69dZbNWvWLB07dkwLFixwaSwAAABAUeN0Zf+NN95QbGysqlevrvPnz+vRRx/Vvn37VKZMGX3yyScuDe7OO+/U9OnTtXfvXt1yyy365ZdftHr1avtapQMHDigpKUkxMTH291itVjVs2FBr165Vp06dXBoPAAAAYKZ77733ir31ly9f7tR8Tif7N910k3755Rd99tln+uWXX5Senq7u3bsrLi5O/v7+zk53RUOGDFFaWpqqVq2qYsWKKSsrS6+88ori4uIkSUlJSZKkkJAQh/eFhITY9+UmIyNDGRkZ9tfZzwsAAAAA3KlOnToOrzMzM7VlyxZt375dXbp0cXo+p5P9lStX6s4771RcXJw96dal3vsrV65U48aNnQ7icj7//HPNnj1bc+bMUY0aNbRlyxb169dP4eHh+TrZbPHx8Ro1apTL4gQAAIDzLCZ2zSmaXfalN998M9fxkSNH5uuZVk6v2b/33nuVkpKSYzw1NVX33nuv0wFcycCBAzVkyBB16tRJtWrVUufOndW/f3/Fx8dLkkJDQyVJycnJDu9LTk6278vN0KFDlZqaat+OHDni0rgBAAAAV3rsscf0wQcfOP0+p5N9wzByXUd06tQpBQQEOB3AlZw7d05eXo4hFitWTDabTZIUFRWl0NBQLVu2zL4/LS1N69evV3R09GXn9fX1VWBgoMMGAACAa4wn6ObZ2rVr89VpMs/LeNq1aydd6r7TtWtX+fr62vdlZWVp69atLn20ryS1bt1ar7zyiiIjI1WjRg1t3rxZEyZM0BNPPGGPpV+/fhozZowqV65sb70ZHh6utm3bujQWAAAAuBitN3PIzrmzGYah48ePa+PGjRo2bJjT8+U52bdarfYPLFmypMPNuD4+Prrjjjtc/gTdyZMna9iwYXrmmWd04sQJhYeH66mnntLw4cPtxwwaNEhnz55Vz549debMGTVq1EiLFy+mxz4AAACKnOycO5uXl5eqVKmi0aNHq1mzZk7PZzH++TjaPBg1apQGDhyoEiVKOP1hhVVaWpqsVquaqI28LcXz8A4AAICi46KRqR/0lVJTUwvF8uXs3Kv82FfkZVKB1nb+vA698GKhOWd3cbobz+OPP66jR4+qcuXKDuP79u1T8eLFVaFCBVfGBwAAAFx3EhMTtWvXLklSjRo1VLdu3XzN4/QNul27dtWaNWtyjK9fv15du3bNVxAAAAC4/lgMc7ei6MSJE7rvvvvUoEED9enTR3369FG9evXUtGlTnTx50un5nE72N2/erLvuuivH+B133KEtW7Y4HQAAAACAvz377LP6888/tWPHDqWkpCglJUXbt29XWlqa+vTp4/R8Ti/jsVgs+vPPP3OMp6amKisry+kAAAAAcJ2iG08Oixcv1tKlS1WtWjX7WPXq1TVlypR83aDrdGW/cePGio+Pd0jss7KyFB8fr0aNGjkdAAAAAIC/2Ww2FS+es2FM8eLF7c+acobTlf1x48apcePGqlKliu6++25J0qpVq5SWlqbly5c7HQAAAACuU1T2c7jvvvvUt29fffLJJwoPD5ckHT16VP3791fTpk2dns/pyn716tW1detWdezYUSdOnNCff/6pxx9/XLt371bNmjWdDgAAAADA395++22lpaWpQoUKqlixoipWrKioqCilpaVp8uTJTs/ndGVfksLDwzV27Nj8vBUAAACQZG7XnKLajSciIkKbNm3S0qVLtXv3bklStWrVFBMTk6/58pTsb926VTVr1pSXl5e2bt16xWNvvfXWfAUCAAAAXK+WL1+u3r17a926dQoMDNT999+v+++/X7rUCKdGjRqaNm2afRl9XuUp2a9Tp46SkpJUrlw51alTRxaLRbk9eNdisdCRBwAAAHljWP7ezJq7CJk4caJ69OiR69N+rVarnnrqKU2YMMGcZP/AgQMqW7as/d8AAAAAXOeXX37RuHHjLru/WbNmev31152eN0/Jfvny5XP9NwAAAJBvdOOxS05OzrXlZjZvb+98PUE3T8n+119/necJH3zwQaeDAAAAAK5nN954o7Zv365KlSrlun/r1q0KCwtzet48Jftt27Z1eP3vNfsWy/+viWLNPgAAAPKCbjz/r2XLlho2bJiaN28uPz8/h31//fWXRowYoQceeMDpefPUZ99ms9m377//XnXq1NGiRYt05swZnTlzRt99951uu+02LV682OkAAAAAgOvdSy+9pJSUFN1yyy0aP368vvrqK3311VcaN26cqlSpopSUFL344otOz+v0Q7X69eunt956S7GxsQoMDFRgYKBiY2M1YcIE9enTx+kAAAAAcJ0yTN6ckJWVpWHDhikqKkr+/v6qWLGiXn75ZYfVLIZhaPjw4QoLC5O/v79iYmK0b98+l1yKkJAQrVmzRjVr1tTQoUP10EMP6aGHHtILL7ygmjVravXq1QoJCXF6XqcfqvXrr78qKCgox7jVatXBgwedDgAAAABwt3Hjxmnq1Kn68MMPVaNGDW3cuFHdunWT1Wq1F7THjx+vSZMm6cMPP1RUVJSGDRum2NhY7dy5M8fSm/woX768vvvuO50+fVr79++XYRiqXLmySpUqle85nU72GzRooAEDBuijjz6y/3aRnJysgQMH6vbbb893IAAAALjOmLhm39nK/po1a9SmTRu1atVKklShQgV98skn+vnnn/+ezjA0ceJEvfTSS2rTpo0kadasWQoJCdGCBQvUqVMnl4VeqlQpNWjQwCVzOb2M54MPPtDx48cVGRmpSpUqqVKlSoqMjNTRo0f1/vvvuyQoAAAA4Fq68847tWzZMu3du1e61Pd+9erVatGihXTpWVNJSUmKiYmxv8dqtaphw4Zau3at2+K+Gqcr+5UqVdLWrVuVkJCg3bt3S5KqVaummJgYh648AAAAwBVdgz77aWlpDsO+vr7y9fXNcfiQIUOUlpamqlWrqlixYsrKytIrr7yiuLg4SVJSUpJ0aW39P4WEhNj3FUZOJ/u61GqzWbNmaty4sXx9fUnyAQAAUChFREQ4vB4xYoRGjhyZ47jPP/9cs2fP1pw5c1SjRg1t2bJF/fr1U3h4uLp06XINI3Ytp5N9m82mV155RdOmTVNycrL27t2rm2++WcOGDVOFChXUvXt3cyIFAACAZ7kGlf0jR44oMDDQPpxbVV+SBg4cqCFDhtjX3teqVUuHDh1SfHy8unTpotDQUOnSvar/fLhVcnKy6tSpY9JJFJzTa/bHjBmjmTNnavz48fLx8bGP16xZU//73/9cHR8AAACQb9mt4rO3yyX7586dk5eXY2pcrFgx2Ww2SVJUVJRCQ0O1bNky+/60tDStX79e0dHRJp9F/jld2Z81a
5amT5+upk2b6umnn7aP165d276GHwAAALiawvQE3datW+uVV15RZGSkatSooc2bN2vChAl64okn/p7PYlG/fv00ZswYVa5c2d56Mzw8XG3btjXnJFzA6WT/6NGjqlSpUo5xm82mzMxMV8UFAC5XrHSwu0NwkHUqxd0hAAAumTx5soYNG6ZnnnlGJ06cUHh4uJ566ikNHz7cfsygQYN09uxZ9ezZU2fOnFGjRo20ePFil/TYN4vTyX716tW1atUqlS9f3mH8yy+/VN26dV0ZGwAAAHBNlCxZUhMnTtTEiRMve4zFYtHo0aM1evToaxpbQTid7A8fPlxdunTR0aNHZbPZNG/ePO3Zs0ezZs3SN998Y06UAAAAAJzmdLLfpk0bLVy4UKNHj1ZAQICGDx+u2267TQsXLtT9999vTpQA4AIsmwGAQuYadOO53jmV7F+8eFFjx47VE088oYSEBPOiAgAAAFBgTrXe9Pb21vjx43Xx4kXzIgIAAMB1Ibsbj1kb8tFnv2nTpvrxxx/NiQYAAACAyzi9Zr9FixYaMmSItm3bpnr16ikgIMBh/4MPPujK+AAAAODJqMCbyulk/5lnnpEkTZgwIcc+i8WirKws10QGAAAAoECcTvazHxkMAAAAFAjdeEznVLJ/8OBBJSQkKDMzU/fcc49q1KhhXmQAAAAACiTPyf6KFSv0wAMP6K+//vr7jd7e+uCDD/TYY4+ZGR8AAAA8lJldc+jG87c8d+MZNmyY7r//fh09elSnTp1Sjx49NGjQIHOjAwAAAJBveU72t2/frrFjxyosLEylSpXSa6+9phMnTujUqVPmRggAAADPZJi8Ie/JflpamsqUKWN/XaJECfn7+ys1NdWs2AAAAAAUgFM36C5ZskRWq9X+2mazadmyZdq+fbt9jD77AAAAyAvW7JvPqWS/S5cuOcaeeuop+7/psw8AAAAUHnlO9umvDwAAAJeiz77p8rxmHwAAAEDR4vQTdAEAAACXoLJvOir7AAAAgIeisg8AAAC3oBuP+ajsAwAAAB6Kyj4AAADcgzX7pstTsl+qVClZLJY8TZiSklLQmAAAAAC4QJ6S/YkTJ9r/ferUKY0ZM0axsbGKjo6WJK1du1ZLlizRsGHDzIsUAAAAnoXKvunylOz/88m57du31+jRo9W7d2/7WJ8+ffT2229r6dKl6t+/vzmRAgAAAHCK0zfoLlmyRM2bN88x3rx5cy1dutRVcQEAAMDDZXfjMWtDPpL90qVL66uvvsox/tVXX6l06dKuigsAAABAATmd7I8aNUqDBw9W69atNWbMGI0ZM0atW7fWkCFDNGrUKJcHePToUT322GMqXbq0/P39VatWLW3cuNG+3zAMDR8+XGFhYfL391dMTIz27dvn8jgAAADgYobJG5xP9rt27aqffvpJgYGBmjdvnubNm6fAwECtXr1aXbt2dWlwp0+f1l133aXixYtr0aJF2rlzp9544w2VKlXKfsz48eM1adIkTZs2TevXr1dAQIBiY2N1/vx5l8YCAAAAFDX56rPfsGFDzZ492/XR/Mu4ceMUERGhGTNm2MeioqLs/zYMQxMnTtRLL72kNm3aSJJmzZqlkJAQLViwQJ06dTI9RgCXV6xSVB6Ounay9h9wdwgAgH/gCbrmy9cTdH/99Ve99NJLevTRR3XixAlJ0qJFi7Rjxw6XBvf111+rfv36+s9//qNy5cqpbt26eu+99+z7Dxw4oKSkJMXExNjHrFarGjZsqLVr11523oyMDKWlpTlsAAAAgKdxOtn/8ccfVatWLa1fv15z585Venq6JOmXX37RiBEjXBrcb7/9pqlTp6py5cpasmSJ/vvf/6pPnz768MMPJUlJSUmSpJCQEIf3hYSE2PflJj4+Xlar1b5FRES4NG4AAADkAWv2Ted0sj9kyBCNGTNGCQkJ8vHxsY/fd999WrdunUuDs9lsuu222zR27FjVrVtXPXv2VI8ePTRt2rQCzTt06FClpqbatyNHjrgsZgAAAKCwcHrN/rZt2zRnzpwc4+XKldMff/zhqrgkSWFhYapevbrDWLVq1TR37lxJUmhoqCQpOTlZYWFh9mOSk5NVp06dy87r6+srX19fl8YKICfWyAMArogn6JrO6cp+UFCQjh8/nmN88+bNuvHGG10VlyTprrvu0p49exzG9u7dq/Lly0uXbtYNDQ3VsmXL7PvT0tK0fv16RUdHuzQWAAAAoKhxOtnv1KmTBg8erKSkJFksFtlsNv300096/vnn9fjjj7s0uP79+2vdunUaO3as9u/frzlz5mj69Onq1auXJMlisahfv34aM2aMvv76a23btk2PP/64wsPD1bZtW5fGAgAAANeymLwhH8t4xo4dq169eikiIkJZWVmqXr26srKy9Oijj+qll15yaXANGjTQ/PnzNXToUI0ePVpRUVGaOHGi4uLi7McMGjRIZ8+eVc+ePXXmzBk1atRIixcvlp+fn0tjAQAAAIoai2EY+VrRdPjwYW3fvl3p6emqW7euKleu7ProrpG0tDRZrVY1URt5W4q7OxwAAACXumhk6gd9pdTUVAUGBro7HHvuVf2/Y1XM15wCbVbGee2c+kKhOWd3yddDtSQpMjJSkZGRro0GAAAA1w0eqmW+PCX7AwYMyPOEEyZMKEg8AAAAAFwkT8n+5s2b8zSZxcKtEAAAAMgjWm+aLk/J/ooVK8yPBAAAAIBL5XvNPgAAAFBgVOBNla9kf+PGjfr88891+PBhXbhwwWHfvHnzXBUbAAAAgAJw+qFan376qe68807t2rVL8+fPV2Zmpnbs2KHly5fLarWaEyUAAAA8TnY3HrM25CPZHzt2rN58800tXLhQPj4+euutt7R792517NiRVpwAAABAIeJ0sv/rr7+qVatWkiQfHx+dPXtWFotF/fv31/Tp082IEQAAAJ7IMHlz0tGjR/XYY4+pdOnS8vf3V61atbRx48b/D9cwNHz4cIWFhcnf318xMTHat2+fa6+Jizmd7JcqVUp//vmnJOnGG2/U9u3bJUlnzpzRuXPnXB8hAAAAYLLTp0/rrrvuUvHixbVo0SLt3LlTb7zxhkqVKmU/Zvz48Zo0aZKmTZum9evXKyAgQLGxsTp//rxbY78Sp2/Qbdy4sRISElSrVi395z//Ud++fbV8+XIlJCSoadOm5kQJAAAAj1OYnqA7btw4RUREaMaMGfaxqKgo+78Nw9DEiRP10ksvqU2bNpKkWbNmKSQkRAsWLFCnTp1cF7wLOV3Zf/vtt+0n8+KLL2rAgAFKTk5W+/bt9f7775sRIwAAAGCqr7/+WvXr19d//vMflStXTnXr1tV7771n33/gwAElJSUpJibGPma1WtWwYUOtXbvWTVFfndOV/eDgYPu/vby8NGTIEFfHBAAAgOvBNXiCblpamsOwr6+vfH19cxz+22+/aerUqRowYIBeeOEFbdiwQX369JGPj4+6dOmipKQkSVJISIjD+0JCQuz7CiOnK/vfffedlixZkmP8+++/16JFi1wVFwAAAFBgERERslqt9i0+Pj7X42w2m267
7TaNHTtWdevWVc+ePdWjRw9NmzbtmsfsSk4n+0OGDFFWVlaOcZvNRpUfAAAAeXYt+uwfOXJEqamp9m3o0KG5xhIWFqbq1as7jFWrVk2HDx+WJIWGhkqSkpOTHY5JTk627yuMnE729+3bl+NCSFLVqlW1f/9+V8UFAAAAFFhgYKDDltsSHkm66667tGfPHoexvXv3qnz58tKlm3VDQ0O1bNky+/60tDStX79e0dHRJp9F/jmd7FutVv322285xvfv36+AgABXxQUAAABPV4j67Pfv31/r1q3T2LFjtX//fs2ZM0fTp09Xr169JEkWi0X9+vXTmDFj9PXXX2vbtm16/PHHFR4errZt25pzfVzA6WS/TZs26tevn3799Vf72P79+/Xcc8/pwQcfdHV8AAAAgOkaNGig+fPn65NPPlHNmjX18ssva+LEiYqLi7MfM2jQID377LPq2bOnGjRooPT0dC1evFh+fn5ujf1KLIZhOPV7T2pqqpo3b66NGzfqpptukiT9/vvvuvvuuzVv3jwFBQWZFatp0tLSZLVa1URt5G0p7u5wAAAAXOqikakf9JVSU1MVGBjo7nDsudetXceqmI85iXLWhfPaOvOFQnPO7uJ0602r1ao1a9YoISFBv/zyi/z9/XXrrbeqcePG5kQIAAAAIF+cTvZ1ac1Ss2bN1KxZM9dHBAAAgOtCYXqCrqfK85r9tWvX6ptvvnEYmzVrlqKiolSuXDn17NlTGRkZZsQIAAAAIB/ynOyPHj1aO3bssL/etm2bunfvrpiYGA0ZMkQLFy687EMKAAAAgBwKUTceT5XnZTxbtmzRyy+/bH/96aefqmHDhnrvvfekS08nGzFihEaOHGlOpADyJO2RO9wdgl3gJ+vcHQIAANe1PCf7p0+fVkhIiP31jz/+qBYtWthfN2jQQEeOHHF9hAAAAPBIFsOQxbnGkE7NDSeW8YSEhOjAgQOSpAsXLmjTpk26447/ryD++eefKl6ctpUAAABAYZHnyn7Lli01ZMgQjRs3TgsWLFCJEiV099132/dv3bpVFStWNCtOAHnE0hkAQJFh5tp6CvuSM8n+yy+/rHbt2umee+7RDTfcoA8//FA+Pj72/R988AGtOAEAAIBCJM/JfpkyZbRy5UqlpqbqhhtuULFixRz2f/HFF7rhhhvMiBEAAAAeiD775svXE3RzExwc7Ip4AAAAALhIvp6gCwAAABQYa/ZNl+duPAAAAACKFir7AAAAcAvW7JuPyj4AAADgoajsAwAAwD1Ys286KvsAAACAh6KyDwAAALdgzb75qOwDAAAAHorKPgAAANyDNfumo7IPAAAAeCgq+wAAAHAb1tabi8o+AAAA4KGo7AMAAMA9DOPvzay5QWUfAAAA8FRU9gEAAOAW9Nk3H5V9AAAAwENR2QcAAIB70GffdFT2AQAAAA9FZR8AAABuYbH9vZk1N6jsAwAAAB6Lyj4AAADcgzX7pqOyDwAAAHioIpXsv/rqq7JYLOrXr5997Pz58+rVq5dKly6tG264Qe3bt1dycrJb4wQAAMDVZffZN2tDEUr2N2zYoHfffVe33nqrw3j//v21cOFCffHFF/rxxx917NgxtWvXzm1xAgAAAIVFkUj209PTFRcXp/fee0+lSpWyj6empur999/XhAkTdN9996levXqaMWOG1qxZo3Xr1rk1ZgAAAFyFYZi7oWjcoNurVy+1atVKMTExGjNmjH08MTFRmZmZiomJsY9VrVpVkZGRWrt2re64445c58vIyFBGRob9dVpamslnAE925Mua7g7BQUSH7e4OAQAAFBKFPtn/9NNPtWnTJm3YsCHHvqSkJPn4+CgoKMhhPCQkRElJSZedMz4+XqNGjTIlXgAAAOSNmWvrWbP/t0K9jOfIkSPq27evZs+eLT8/P5fNO3ToUKWmptq3I0eOuGxuAAAAoLAo1JX9xMREnThxQrfddpt9LCsrSytXrtTbb7+tJUuW6MKFCzpz5oxDdT85OVmhoaGXndfX11e+vr6mx4/rA8tmAADIJ/rsm65QJ/tNmzbVtm3bHMa6deumqlWravDgwYqIiFDx4sW1bNkytW/fXpK0Z88eHT58WNHR0W6KGgAAACgcCnWyX7JkSdWs6XjzY0BAgEqXLm0f7969uwYMGKDg4GAFBgbq2WefVXR09GVvzgUAAEDhwJp98xXqZD8v3nzzTXl5eal9+/bKyMhQbGys3nnnHXeHBQAAALhdkUv2f/jhB4fXfn5+mjJliqZMmeK2mAAAAJAPZvbDp8++VNi78QAAAADIvyJX2QcAAIBnYM2++ajsAwAAAP/w6quvymKxqF+/fvax8+fPq1evXipdurRuuOEGtW/fXsnJyW6NMy9I9gEAAOAehslbPmzYsEHvvvuubr31Vofx/v37a+HChfriiy/0448/6tixY2rXrp1rroOJSPYBAADgFtnLeMzanJWenq64uDi99957KlWqlH08NTVV77//viZMmKD77rtP9erV04wZM7RmzRqtW7fOtRfFxUj2AQAAAEm9evVSq1atFBMT4zCemJiozMxMh/GqVasqMjJSa9eudUOkeccNugAAAHAPm/H3ZtbcktLS0hyGfX195evrm+PwTz/9VJs2bdKGDRty7EtKSpKPj4+CgoIcxkNCQpSUlOTy0F2Jyj4AAAA8VkREhKxWq32Lj4/PccyRI0fUt29fzZ49W35+fm6J0yxU9gEAAOAeBbiRNk9zX0rkAwMD7cO5VfUTExN14sQJ3XbbbfaxrKwsrVy5Um+//baWLFmiCxcu6MyZMw7V/eTkZIWGhpp0Aq5Bsg8AAACPFRgY6JDs56Zp06batm2bw1i3bt1UtWpVDR48WBERESpevLiWLVum9u3bS5L27Nmjw4cPKzo62tT4C4pkHwAAAG5hMfHhVxYnji1ZsqRq1qzpMBYQEKDSpUvbx7t3764BAwYoODhYgYGBevbZZxUdHa077rjDxZG7Fsk+AAAAcBVvvvmmvLy81L59e2VkZCg2NlbvvPOOu8O6KpJ9AAAAuIdh/L2ZNXcB/PDDDw6v/fz8NGXKFE2ZMqWAgV1bdOMBAAAAPBSVfQAAALhFfp90m9e5QWUfAAAA8FhU9gEAAOAe16DP/vWOyj4AAADgoajsAwAAwC0shiGLSd14zJq3qKGyDwAAAHgoKvsAAABwD9ulzay5QWUfAAAA8FRU9gEAAOAWrNk3H8k+ipx6mwvX3+US6/IHMgAAUDiR7AMAAMA96LNvOkqSAAAAgIeiso8ih2UzAAB4CMP4ezNrblDZBwAAADwVlX0AAAC4hcX4ezNrblDZBwAAADwWlX0AAAC4B2v2TUdlHwAAAPBQVPYBAADgFhbb35tZc4PKPgAAAOCxqOwDAADAPVizbzoq+wAAAICHorIPAAAA9zAubWbNDSr7AAAAgKeisg8AAAC3sBiGLCatrTdr3qKGyj4AAADgoajsAwAAwD3oxmM6KvsAAACAh6KyDwAAAPcwJJn1pFsK+xKVfQAAAMBzUdkHAACAW9CNx3xU9gEAAAAPRWUfAAAA7mG
Y2DWHwr5EZR8AAADwXFT2AQAA4B702TcdlX0AAADAQ1HZBwAAgHvYJFlMnBtU9gEAAABPVaiT/fj4eDVo0EAlS5ZUuXLl1LZtW+3Zs8fhmPPnz6tXr14qXbq0brjhBrVv317JycluixkAAAB5k91n36wNhTzZ//HHH9WrVy+tW7dOCQkJyszMVLNmzXT27Fn7Mf3799fChQv1xRdf6Mcff9SxY8fUrl07t8YNAAAAFAaFes3+4sWLHV7PnDlT5cqVU2Jioho3bqzU1FS9//77mjNnju677z5J0owZM1StWjWtW7dOd9xxh5siBwAAwFXRjcd0hTrZ/7fU1FRJUnBwsCQpMTFRmZmZiomJsR9TtWpVRUZGau3atZdN9jMyMpSRkWF/nZaWZnrsRdmSY1vcHYKD2PA67g4BAACgSCjUy3j+yWazqV+/frrrrrtUs2ZNSVJSUpJ8fHwUFBTkcGxISIiSkpIuO1d8fLysVqt9i4iIMD1+AAAA/Et2Zd+sDUUn2e/Vq5e2b9+uTz/9tMBzDR06VKmpqfbtyJEjLokRAAAAKEyKxDKe3r1765tvvtHKlSt100032cdDQ0N14cIFnTlzxqG6n5ycrNDQ0MvO5+vrK19fX9Pj9hQsmwEAAKZgzb7pCnVl3zAM9e7dW/Pnz9fy5csVFRXlsL9evXoqXry4li1bZh/bs2ePDh8+rOjoaDdEDAAAgKLIU1u+F+pkv1evXvr44481Z84clSxZUklJSUpKStJff/0lSbJarerevbsGDBigFStWKDExUd26dVN0dDSdeAAAAAo7m8mbEzy15bvFMArv3zgsltyfnzxjxgx17dpVuvQb1nPPPadPPvlEGRkZio2N1TvvvHPFZTz/lpaWJqvVqiZqI29LcZfFDwAAUBhcNDL1g75SamqqAgMD3R2OPfdqWuU5eRczZ2n1xawMLdvzRr7P+eTJkypXrpx+/PFHe8v3smXLas6cOerQoYMkaffu3apWrdoVu0C6W6Fes5+X30P8/Pw0ZcoUTZky5ZrEBAAAANcw80m3BZ3XVS3f3a1QJ/sAAABAQfz7eUp5adTiypbv7lao1+wDAADAg12DPvsREREOz1eKj4+/aliubPnublT2AQAA4LGOHDnisGb/alV9V7d8dzcq+wAAAHAPm2HuJikwMNBhu1yy76kt36nsAwAA4LrXq1cvzZkzR1999ZW95bsutXr39/d3aPkeHByswMBAPfvss4W+5TvJPgAAANyjED1Bd+rUqZKkJk2aOIz/s+X7m2++KS8vL7Vv396h5XthRrIPAAAANzEx2Zdz83pqy3fW7AMAAAAeiso+AAAA3KMQLePxVFT2AQAAAA9FZR8AAADuYTOcXlvv3Nygsg8AAAB4KCr7AAAAcA/D9vdm1tygsg8AAAB4Kir7AAAAcA+68ZiOyj4AAADgoajsAwAAwD3oxmM6KvsAAACAh6KyDwAAAPdgzb7pqOwDAAAAHorKPgAAANzDMLECT2FforIPAAAAeC4q+wAAAHAP1uybjmS/kDq+oJq7Q7ALa7vL3SEAAAAgH0j2AQAA4B42mySbiXODNfsAAACAh6KyX0ixdAYAAHg81uybjso+AAAA4KGo7AMAAMA9qOybjso+AAAA4KGo7AMAAMA9bIZ5j7q1UdkXlX0AAADAc1HZBwAAgFsYhk2GYU4/fLPmLWqo7AMAAAAeiso+AAAA3MMwzFtbTzceico+AAAA4Lmo7AMAAMA9DBO78VDZl6jsAwAAAJ6Lyj4AAADcw2aTLCZ1zaEbj0RlHwAAAPBcVPYBAADgHqzZNx2VfQAAAMBDUdkHAACAWxg2mwyT1uzzBN2/UdkHAAAAPBSVfQAAALgHa/ZNR2UfAAAA8FBU9gEAAOAeNkOyUNk3E5V9AAAAwENR2QcAAIB7GIYks56gS2VfVPYBAAAAz0VlHwAAAG5h2AwZJq3ZN6jsS1T2AQAAAM/lMcn+lClTVKFCBfn5+alhw4b6+eef3R0SAAAArsSwmbvBM5L9zz77TAMGDNCIESO0adMm1a5dW7GxsTpx4oS7QwMAAEAR4mkFZI9I9idMmKAePXqoW7duql69uqZNm6YSJUrogw8+cHdoAAAAuAzDZpi6OcsTC8hFPtm/cOGCEhMTFRMTYx/z8vJSTEyM1q5dm+t7MjIylJaW5rABAADg+uaJBeQin+z/8ccfysrKUkhIiMN4SEiIkpKScn1PfHy8rFarfYuIiLhG0QIAAMCuEK3Zz08BuSi4LltvDh06VAMGDLC/Tk1NVWRkpC4qU6JLEwAA8DAXlSkVwnaUZuZe2ef87xUcvr6+8vX1zXH8lQrIu3fvNifIa6DIJ/tlypRRsWLFlJyc7DCenJys0NDQXN/z7//I2V8Eq/WdydECAAC4z59//imr1eruMOTj46PQ0FCtTjI397rhhhtyrOAYMWKERo4caernFiZFPtn38fFRvXr1tGzZMrVt21aSZLPZtGzZMvXu3TtPc4SHh+vIkSMqWbKkLBZLvmNJS0tTRESEjhw5osDAwHzP44m4NpfHtbkyrs/lcW2ujOtzeVybK/PE62MYhv7880+Fh4e7OxRJkp+fnw4cOKALFy6Y+jmGYeTI7XKr6iufBeSioMgn+5I0YMAAdenSRfXr19ftt9+uiRMn6uzZs+rWrVue3u/l5aWbbrrJZfEEBgZ6zA8HV+PaXB7X5sq4PpfHtbkyrs/lcW2uzNOuT2Go6P+Tn5+f/Pz83B2GnSsKyIWRRyT7Dz/8sE6ePKnhw4crKSlJderU0eLFi3OsuQIAAAAup6AF5MLII5J9Serdu3eR/q0LAAAA7uWJBWSPSfYLA19fX40YMeKya8GuZ1yby+PaXBnX5/K4NlfG9bk8rs2VcX2ub55WQLYYha0HEwAAAACXKPIP1QIAAACQO5J9AAAAwEOR7AMAAAAeimTfRaZMmaIKFSrIz89PDRs21M8//+zukAqF+Ph4NWjQQCVLllS5cuXUtm1b7dmzx91hFUqvvvqqLBaL+vXr5+5QCoWjR4/qscceU+nSpeXv769atWpp48aN7g6rUMjKytKwYcMUFRUlf39/VaxYUS+//LKux1uwVq5cqdatWys8PFwWi0ULFixw2G8YhoYPH66wsDD5+/srJiZG+/btc1u819qVrk9mZqYGDx6sWrVqKSAgQOHh4Xr88cd17Ngxt8Z8rVzta+efnn76aVksFk2cOPGaxgi4Asm+C3z22WcaMGCARowYoU2bNql27dqKjY3ViRMn3B2a2/3444/q1auX1q1bp4SEBGVmZqpZs2Y6e/asu0MrVDZs2KB3331Xt956q7tDKRROnz6tu+66S8WLF9eiRYu0c+dOvfHGGypVqpS7QysUxo0bp6lTp+rtt9/Wrl27NG7cOI0fP16TJ092d2jX3NmzZ1W7dm1NmTIl1/3jx4/XpEmTNG3aNK1fv14BAQGKjY3V+fPnr3ms7nCl63Pu3Dlt2rRJw4YN06ZNmzRv3jzt2bNHDz74oFtivdau9rWTbf78+Vq3bl2hefIs4DQDBXb77bcbvX
r1sr/OysoywsPDjfj4eLfGVRidOHHCkGT8+OOP7g6l0Pjzzz+NypUrGwkJCcY999xj9O3b190hud3gwYONRo0auTuMQqtVq1bGE0884TDWrl07Iy4uzm0xFQaSjPnz59tf22w2IzQ01HjttdfsY2fOnDF8fX2NTz75xE1Rus+/r09ufv75Z0OScejQoWsWV2FwuWvz+++/GzfeeKOxfft2o3z58sabb77plviAgqCyX0AXLlxQYmKiYmJi7GNeXl6KiYnR2rVr3RpbYZSamipJCg4OdncohUavXr3UqlUrh6+h693XX3+t+vXr6z//+Y/KlSununXr6r333nN3WIXGnXfeqWXLlmnv3r2SpF9++UWrV69WixYt3B1aoXLgwAElJSU5fG9ZrVY1bNiQn8+XkZqaKovFoqCgIHeH4nY2m02dO3fWwIEDVaNGDXeHA+QbD9UqoD/++ENZWVk5nqwWEhKi3bt3uy2uwshms6lfv3666667VLNmTXeHUyh8+umn2rRpkzZs2ODuUAqV3377TVOnTtWAAQP0wgsvaMOGDerTp498fHzUpUsXd4fndkOGDFFaWpqqVq2qYsWKKSsrS6+88ori4uLcHVqhkpSUJF36efxPISEh9n34f+fPn9fgwYP1yCOPKDAw0N3huN24cePk7e2tPn36uDsUoEBI9nHN9OrVS9u3b9fq1avdHUqhcOTIEfXt21cJCQny8/NzdziFis1mU/369TV27FhJUt26dbV9+3ZNmzaNZF/S559/rtmzZ2vOnDmqUaOGtmzZon79+ik8PJzrg3zJzMxUx44dZRiGpk6d6u5w3C4xMVFvvfWWNm3aJIvF4u5wgAJhGU8BlSlTRsWKFVNycrLDeHJyskJDQ90WV2HTu3dvffPNN1qxYoVuuukmd4dTKCQmJurEiRO67bbb5O3tLW9vb/3444+aNGmSvL29lZWV5e4Q3SYsLEzVq1d3GKtWrZoOHz7stpgKk4EDB2rIkCHq1KmTatWqpc6dO6t///6Kj493d2iFSvbPYH4+X1l2on/o0CElJCRQ1Ze0atUqnThxQpGRkfafz4cOHdJzzz2nChUquDs8wCkk+wXk4+OjevXqadmyZfYxm82mZcuWKTo62q2xFQaGYah3796aP3++li9frqioKHeHVGg0bdpU27Zt05YtW+xb/fr1FRcXpy1btqhYsWLuDtFt7rrrrhwtWvfu3avy5cu7LabC5Ny5c/LycvzxXaxYMdlsNrfFVBhFRUUpNDTU4edzWlqa1q9fz8/nS7IT/X379mnp0qUqXbq0u0MqFDp37qytW7c6/HwODw/XwIEDtWTJEneHBziFZTwuMGDAAHXp0kX169fX7bffrokTJ+rs2bPq1q2bu0Nzu169emnOnDn66quvVLJkSfs6WavVKn9/f3eH51YlS5bMce9CQECASpcufd3f09C/f3/deeedGjt2rDp27Kiff/5Z06dP1/Tp090dWqHQunVrvfLKK4qMjFSNGjW0efNmTZgwQU888YS7Q7vm0tPTtX//fvvrAwcOaMuWLQoODlZkZKT69eunMWPGqHLlyoqKitKwYcMUHh6utm3bujXua+VK1ycsLEwdOnTQpk2b9M033ygrK8v+Mzo4OFg+Pj5ujNx8V/va+fcvPsWLF1doaKiqVKnihmiBAnB3OyBPMXnyZCMyMtLw8fExbr/9dmPdunXuDqlQkJTrNmPGDHeHVijRevP/LVy40KhZs6bh6+trVK1a1Zg+fbq7Qyo00tLSjL59+xqRkZGGn5+fcfPNNxsvvviikZGR4e7QrrkVK1bk+jOmS5cuhnGp/eawYcOMkJAQw9fX12jatKmxZ88ed4d9zVzp+hw4cOCyP6NXrFjh7tBNd7WvnX+j9SaKKotxPT5yEQAAALgOsGYfAAAA8FAk+wAAAICHItkHAAAAPBTJPgAAAOChSPYBAAAAD0WyDwAAAHgokn0AAADAQ5HsAwAAAB6KZB8A3GDkyJGqU6dOgeY4ePCgLBaLtmzZku85Tp06pXLlyungwYN5Ov7ChQuqUKGCNm7cmO/PBABcOyT7AIoMi8VyxW3kyJHXLJYmTZqoX79+1+zzzPLKK6+oTZs2qlChQp6O9/Hx0fPPP6/BgwebHhsAoOC83R0AAOTV8ePH7f/+7LPPNHz4cO3Zs8c+dsMNN9j/bRiGsrKy5O3Nj7nLOXfunN5//30tWbLEqffFxcXpueee044dO1SjRg3T4gMAFByVfQBFRmhoqH2zWq2yWCz217t371bJkiW1aNEi1atXT76+vlq9erW6du2qtm3bOszTr18/NWnSxP7aZrMpPj5eUVFR8vf3V+3atfXll18WKNbBgwfrlltuUYkSJXTzzTdr2LBhyszMzHHcu+++q4iICJUoUUIdO3ZUamqqw/7//e9/qlatmvz8/FS1alW98847l/3M06dPKy4uTmXLlpW/v78qV66sGTNmXPb47777Tr6+vrrjjjvsY6NHj1Z4eLhOnTplH2vVqpXuvfde2Ww2SVKpUqV011136dNPP3X6ugAAri1KXgA8ypAhQ/T666/r5ptvVqlSpfL0nvj4eH388ceaNm2aKleurJUrV+qxxx5T2bJldc899+QrjpIlS2rmzJkKDw/Xtm3b1KNHD5UsWVKDBg2yH7N//359/vnnWrhwodLS0tS9e3c988wzmj17tiRp9uzZGj58uN5++23VrVtXmzdvVo8ePRQQEKAuXbrk+Mxhw4Zp586dWrRokcqUKaP9+/frr7/+umyMq1atUr169RzGXnzxRS1evFhPPvmk5s+frylTpmjNmjX65Zdf5OX1//Wh22+/XatWrcrXtQEAXDsk+wA8yujRo3X//ffn+fiMjAyNHTtWS5cuVXR0tCTp5ptv1urVq/Xuu+/mO9l/6aWX7P+uUKGCnn/+eX366acOyf758+c1a9Ys3XjjjZKkyZMnq1WrVnrjjTcUGhqqESNG6I033lC7du0kSVFRUdq5c6fefffdXJP9w4cPq27duqpfv779c6/k0KFDCg8PdxgrVqyYPv74Y9WpU0dDhgzRpEmT9L///U+RkZEOx4WHh+vQoUP5ujYAgGuHZB+AR8lOdPNq//79OnfuXI5fEC5cuKC6devmO47PPvtMkyZN0q+//qr09HRdvHhRgYGBDsdERkbaE31Jio6Ols1m0549e1SyZEn9+uuv6t69u3r06GE/5uLFi7Jarbl+5n//+1+1b99emzZtUrNmzdS2bVvdeeedl43xr7/+kp+fX47xm2++Wa+//rqeeuopPfzww3r00UdzHOPv769z587l+XoAANyDZB+ARwkICHB47eXlJcMwHMb+uXY+PT1dkvTtt986JN6S5Ovrm68Y1q5dq7i4OI0aNUqxsbGyWq369NNP9cYbb+R5juy43nvvPTVs2NBhX7FixXJ9T4sWLXTo0CF99913SkhIUNOmTdWrVy+9/vrruR5fpkwZnT59Otd9K1euVLFixXTw4EFdvHgxx43OKSkpKlu2bJ7PBwDgHtygC8CjlS1b1qGLjySHvvTVq
1eXr6+vDh8+rEqVKjlsERER+frMNWvWqHz58nrxxRdVv359Va5cOdclL4cPH9axY8fsr9etWycvLy9VqVJFISEhCg8P12+//ZYjrqioqCueb5cuXfTxxx9r4sSJmj59+mWPrVu3rnbu3Jlj/LPPPtO8efP0ww8/6PDhw3r55ZdzHLN9+/YC/eUDAHBtUNkH4NHuu+8+vfbaa5o1a5aio6P18ccfOySqJUuW1PPPP6/+/fvLZrOpUaNGSk1N1U8//aTAwMBc18ZnO3nyZI4HWoWFhaly5co6fPiwPv30UzVo0EDffvut5s+fn+P9fn5+6tKli15//XWlpaWpT58+6tixo0JDQyVJo0aNUp8+fWS1WtW8eXNlZGRo48aNOn36tAYMGJBjvuHDh6tevXqqUaOGMjIy9M0336hatWqXjT82NlZDhw7V6dOn7Tcz//777/rvf/+rcePGqVGjRpoxY4YeeOABtWjRwqFrz6pVq3L9JQAAUMgYAFAEzZgxw7BarfbXK1asMCQZp0+fznHs8OHDjZCQEMNqtRr9+/c3evfubdxzzz32/TabzZg4caJRpUoVo3jx4kbZsmWN2NhY48cff7zs599zzz2GpBzbyy+/bBiGYQwcONAoXbq0ccMNNxgPP/yw8eabbzrEO2LECKN27drGO++8Y4SHhxt+fn5Ghw4djJSUFIfPmT17tlGnTh3Dx8fHKFWqlNG4cWNj3rx5hmEYxoEDBwxJxubNmw3DMIyXX37ZqFatmuHv728EBwcbbdq0MX777bcrXsfbb7/dmDZtmv06NG3a1IiNjTVsNpv9mGeffdaoWLGi8eeffxqGYRhr1qwxgoKCjHPnzl1xbgCA+1mMfy9mBQBcN7799lsNHDhQ27dvd2iteSUPP/ywateurRdeeMH0+AAABcMyHgC4jrVq1Ur79u3T0aNH83SPwoULF1SrVi3179//msQHACgYKvsAAACAh6IbDwAAAOChSPYBAAAAD0WyDwAAAHgokn0AAADAQ5HsAwAAAB6KZB8AAADwUCT7AAAAgIci2QcAAAA8FMk+AAAA4KFI9gEAAAAP9X/HgTwVmN8g9gAAAABJRU5ErkJggg==",
+ "text/plain": [
+ "<Figure size 800x600 with 2 Axes>"
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "import numpy as np\n",
+ "import matplotlib.pyplot as plt\n",
+ "batch_src, batch_labels = map(lambda x: x.to(device), mkbatch(1<<10))\n",
+ "model.eval()\n",
+ "with torch.no_grad():\n",
+ " output = model(batch_src)\n",
+ "\n",
+ "# Flatten the arrays to 1D\n",
+ "x = batch_labels.detach().to(torch.float16).cpu().numpy().flatten()\n",
+ "y = output.detach().to(torch.float16).cpu().numpy().flatten()\n",
+ "\n",
+ "# Define the number of vertices and number of bins per dimension\n",
+ "bins_y = 10 * NVTXS # 10 * nvtxs for y-bin size\n",
+ "\n",
+ "# Initialize the 2D array (matrix) to store the counts\n",
+ "count_matrix = np.zeros((NVTXS, bins_y), dtype=int)\n",
+ "\n",
+ "# Process the data: Map x to rows and floor(y*10) to columns\n",
+ "for xi, yi in zip(x, y):\n",
+ " row = int(xi) # Use integer value of x for row index\n",
+ " col = int(np.floor(yi * 10)) # Map y values to column by flooring and scaling by 10\n",
+ " if 0 <= row < NVTXS and 0 <= col < bins_y: # Ensure valid indices\n",
+ " count_matrix[row, col] += 1\n",
+ "\n",
+ "# Transpose the matrix\n",
+ "count_matrix = count_matrix.T\n",
+ "\n",
+ "# Plot the heatmap\n",
+ "plt.figure(figsize=(8, 6))\n",
+ "plt.imshow(count_matrix, cmap='viridis', origin='lower', interpolation='nearest', aspect='auto')\n",
+ "\n",
+ "# Set the labels and title\n",
+ "plt.ylabel('Scaled Predicted Output (y)')\n",
+ "plt.xlabel('True Labels (x)')\n",
+ "plt.title('True Labels vs Scaled Predicted Output (Heatmap)')\n",
+ "\n",
+ "# Add a colorbar for reference\n",
+ "plt.colorbar(label='Count')\n",
+ "\n",
+ "# Show the plot\n",
+ "plt.tight_layout()\n",
+ "plt.show()"
]
}
],
diff --git a/training-2d-histogram.png b/training-2d-histogram.png
index 45d53a2..31c3340 100644
--- a/training-2d-histogram.png
+++ b/training-2d-histogram.png
Binary files differ
diff --git a/training-loss.png b/training-loss.png
index 98a2a20..9cdc225 100644
--- a/training-loss.png
+++ b/training-loss.png
Binary files differ
diff --git a/transformer_shortest_paths.ipynb b/transformer_shortest_paths.ipynb
index 3949fd5..c9ff777 100644
--- a/transformer_shortest_paths.ipynb
+++ b/transformer_shortest_paths.ipynb
@@ -86,7 +86,7 @@
},
{
"cell_type": "code",
- "execution_count": 18,
+ "execution_count": 3,
"execution_state": "idle",
"metadata": {
"colab": {
@@ -391,7 +391,7 @@
},
{
"cell_type": "code",
- "execution_count": 5,
+ "execution_count": 8,
"execution_state": "idle",
"metadata": {
"id": "tLOWhg_CeWzH"
@@ -432,7 +432,7 @@
},
{
"cell_type": "code",
- "execution_count": 6,
+ "execution_count": 9,
"execution_state": "idle",
"metadata": {
"colab": {
@@ -446,8 +446,8 @@
"name": "stdout",
"output_type": "stream",
"text": [
- "Training data: 1048M\n",
- "Trainable parameters in the model: 200K\n"
+ "Training data: 104M\n",
+ "Trainable parameters in the model: 200545\n"
]
}
],
@@ -455,7 +455,7 @@
"# PARAMS\n",
"VOCAB_SIZE = 1 + MAX_VTXS + 1 # pad plus max number of vertices plus target token\n",
"MODEL_DIM = 64 # Dimension of model (embedding and transformer)\n",
- "NEPOCHS = 1000\n",
+ "NEPOCHS = 100\n",
"BSZ = 2**17 # Batch size\n",
"BPE = 8 # Batches per epoch\n",
"NHEADS = 2\n",
@@ -469,9 +469,13 @@
"\n",
"trainable_params = sum(p.numel() for p in model.parameters() if p.requires_grad)\n",
"print(f\"Training data: {NEPOCHS*BPE*BSZ//10**6}M\")\n",
- "print(f\"Trainable parameters in the model: {trainable_params//1000}K\")\n",
+ "print(f\"Trainable parameters in the model: {trainable_params}\")\n",
"\n",
"train_err = []\n",
+ "len1 = []\n",
+ "len2 = []\n",
+ "len3 = []\n",
+ "len15 = []\n",
"epoch = 0\n",
"\n",
"# clear loss file\n",
@@ -495,17 +499,17 @@
},
{
"cell_type": "code",
- "execution_count": 8,
+ "execution_count": 10,
"execution_state": "idle",
"metadata": {},
"outputs": [],
"source": [
- "model = TransformerModel(input_dim=VOCAB_SIZE, model_dim=MODEL_DIM,\n",
- " output_dim=1, num_heads=NHEADS,\n",
- " num_layers=NLAYERS, seq_len=SEQ_LEN,\n",
- " dropout=DROPOUT).to(device)\n",
- "model = torch.compile(model)\n",
- "model.load_state_dict(torch.load('model.pth', weights_only=True))\n",
+ "# model = TransformerModel(input_dim=VOCAB_SIZE, model_dim=MODEL_DIM,\n",
+ "# output_dim=1, num_heads=NHEADS,\n",
+ "# num_layers=NLAYERS, seq_len=SEQ_LEN,\n",
+ "# dropout=DROPOUT).to(device)\n",
+ "# model = torch.compile(model)\n",
+ "# model.load_state_dict(torch.load('model.pth', weights_only=True))\n",
"\n",
"LR = 8e-4\n",
"WD = 0 # 1e-5\n",
@@ -526,7 +530,7 @@
},
{
"cell_type": "code",
- "execution_count": 12,
+ "execution_count": 11,
"execution_state": "idle",
"metadata": {},
"outputs": [],
@@ -553,8 +557,8 @@
},
{
"cell_type": "code",
- "execution_count": null,
- "execution_state": "running",
+ "execution_count": 12,
+ "execution_state": "idle",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
@@ -570,298 +574,768 @@
"text": [
"/home/sipb/.venv/lib64/python3.12/site-packages/torch/nn/functional.py:6278: UserWarning: Memory Efficient attention on Navi31 GPU is still experimental. Enable it with TORCH_ROCM_AOTRITON_ENABLE_EXPERIMENTAL=1. (Triggered internally at ../aten/src/ATen/native/transformers/hip/sdp_utils.cpp:269.)\n",
" attn_output = scaled_dot_product_attention(\n",
- "/home/sipb/.venv/lib64/python3.12/site-packages/torch/_inductor/compile_fx.py:167: UserWarning: TensorFloat32 tensor cores for float32 matrix multiplication available but not enabled. Consider setting `torch.set_float32_matmul_precision('high')` for better performance.\n",
- " warnings.warn(\n",
- "/tmp/torchinductor_sipb/nj/cnjfg6sudczhbwjig6u6ixumyik7x7ugjn4x43lbushjy4vv4pwz.py:883: UserWarning: Attempting to use hipBLASLt on an unsupported architecture! Overriding blas backend to hipblas (Triggered internally at ../aten/src/ATen/Context.cpp:296.)\n",
- " extern_kernels.mm(reinterpret_tensor(buf1, (1048576, 64), (64, 1), 0), reinterpret_tensor(primals_5, (64, 192), (1, 64), 0), out=buf2)\n"
+ "/tmp/torchinductor_sipb/bn/cbngaobakjqlwlijvkqph5lgddb2z2kzjaln3b2g2j75b6snskdn.py:859: UserWarning: Attempting to use hipBLASLt on an unsupported architecture! Overriding blas backend to hipblas (Triggered internally at ../aten/src/ATen/Context.cpp:296.)\n",
+ " extern_kernels.mm(reinterpret_tensor(buf1, (2097152, 64), (64, 1), 0), reinterpret_tensor(primals_5, (64, 192), (1, 64), 0), out=buf2)\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
- "Epoch 0/1000 \t Train Err: 85.0000\n",
- "Epoch 0/1000 \t Train Err: 72.0000\n",
- "Epoch 0/1000 \t Train Err: 63.5000\n",
- "Epoch 0/1000 \t Train Err: 58.0000\n",
- "Epoch 0/1000 \t Train Err: 53.7500\n",
- "Epoch 0/1000 \t Train Err: 51.0000\n",
- "Epoch 0/1000 \t Train Err: 49.2500\n",
- "Epoch 0/1000 \t Train Err: 48.0000\n",
- "Epoch 0/1000 \t Train Err: 47.2500\n",
- "Epoch 0/1000 \t Train Err: 46.2500\n",
- "Epoch 0/1000 \t Train Err: 45.5000\n",
- "Epoch 0/1000 \t Train Err: 45.2500\n",
- "Epoch 0/1000 \t Train Err: 44.5000\n",
- "Epoch 0/1000 \t Train Err: 44.2500\n",
- "Epoch 0/1000 \t Train Err: 44.2500\n",
- "Epoch 0/1000 \t Train Err: 44.2500\n",
- "Epoch 1/1000 \t Train Err: 43.5000\n",
- "Epoch 1/1000 \t Train Err: 43.5000\n",
- "Epoch 1/1000 \t Train Err: 43.5000\n",
- "Epoch 1/1000 \t Train Err: 43.5000\n",
- "Epoch 1/1000 \t Train Err: 43.2500\n",
- "Epoch 1/1000 \t Train Err: 43.2500\n",
- "Epoch 1/1000 \t Train Err: 43.0000\n",
- "Epoch 1/1000 \t Train Err: 43.0000\n",
- "Epoch 1/1000 \t Train Err: 42.7500\n",
- "Epoch 1/1000 \t Train Err: 42.5000\n",
- "Epoch 1/1000 \t Train Err: 42.5000\n",
- "Epoch 1/1000 \t Train Err: 42.7500\n",
- "Epoch 1/1000 \t Train Err: 42.7500\n",
- "Epoch 1/1000 \t Train Err: 42.5000\n",
- "Epoch 1/1000 \t Train Err: 42.2500\n",
- "Epoch 1/1000 \t Train Err: 42.2500\n",
- "Epoch 2/1000 \t Train Err: 42.2500\n",
- "Epoch 2/1000 \t Train Err: 42.5000\n",
- "Epoch 2/1000 \t Train Err: 42.0000\n",
- "Epoch 2/1000 \t Train Err: 42.0000\n",
- "Epoch 2/1000 \t Train Err: 42.0000\n",
- "Epoch 2/1000 \t Train Err: 42.0000\n",
- "Epoch 2/1000 \t Train Err: 42.0000\n",
- "Epoch 2/1000 \t Train Err: 42.2500\n",
- "Epoch 2/1000 \t Train Err: 41.7500\n",
- "Epoch 2/1000 \t Train Err: 41.7500\n",
- "Epoch 2/1000 \t Train Err: 41.2500\n",
- "Epoch 2/1000 \t Train Err: 41.5000\n",
- "Epoch 2/1000 \t Train Err: 41.5000\n",
- "Epoch 2/1000 \t Train Err: 41.7500\n",
- "Epoch 2/1000 \t Train Err: 41.2500\n",
- "Epoch 2/1000 \t Train Err: 41.5000\n",
- "Epoch 3/1000 \t Train Err: 41.5000\n",
- "Epoch 3/1000 \t Train Err: 41.2500\n",
- "Epoch 3/1000 \t Train Err: 41.5000\n",
- "Epoch 3/1000 \t Train Err: 41.2500\n",
- "Epoch 3/1000 \t Train Err: 41.2500\n",
- "Epoch 3/1000 \t Train Err: 41.0000\n",
- "Epoch 3/1000 \t Train Err: 41.0000\n",
- "Epoch 3/1000 \t Train Err: 40.7500\n",
- "Epoch 3/1000 \t Train Err: 40.7500\n",
- "Epoch 3/1000 \t Train Err: 40.5000\n",
- "Epoch 3/1000 \t Train Err: 40.5000\n",
- "Epoch 3/1000 \t Train Err: 40.2500\n",
- "Epoch 3/1000 \t Train Err: 40.0000\n",
- "Epoch 3/1000 \t Train Err: 39.7500\n",
- "Epoch 3/1000 \t Train Err: 39.2500\n",
- "Epoch 3/1000 \t Train Err: 38.7500\n",
- "Epoch 4/1000 \t Train Err: 38.0000\n",
- "Epoch 4/1000 \t Train Err: 37.2500\n",
- "Epoch 4/1000 \t Train Err: 36.5000\n",
- "Epoch 4/1000 \t Train Err: 35.5000\n",
- "Epoch 4/1000 \t Train Err: 35.0000\n",
- "Epoch 4/1000 \t Train Err: 34.7500\n",
- "Epoch 4/1000 \t Train Err: 34.7500\n",
- "Epoch 4/1000 \t Train Err: 34.7500\n",
- "Epoch 4/1000 \t Train Err: 34.5000\n",
- "Epoch 4/1000 \t Train Err: 34.2500\n",
- "Epoch 4/1000 \t Train Err: 33.7500\n",
- "Epoch 4/1000 \t Train Err: 33.7500\n",
- "Epoch 4/1000 \t Train Err: 33.5000\n",
- "Epoch 4/1000 \t Train Err: 33.5000\n",
- "Epoch 4/1000 \t Train Err: 33.0000\n",
- "Epoch 4/1000 \t Train Err: 33.0000\n",
- "Epoch 5/1000 \t Train Err: 33.0000\n",
- "Epoch 5/1000 \t Train Err: 32.7500\n",
- "Epoch 5/1000 \t Train Err: 32.7500\n",
- "Epoch 5/1000 \t Train Err: 32.7500\n",
- "Epoch 5/1000 \t Train Err: 32.5000\n",
- "Epoch 5/1000 \t Train Err: 32.0000\n",
- "Epoch 5/1000 \t Train Err: 32.5000\n",
- "Epoch 5/1000 \t Train Err: 32.2500\n",
- "Epoch 5/1000 \t Train Err: 32.5000\n",
- "Epoch 5/1000 \t Train Err: 31.8750\n",
- "Epoch 5/1000 \t Train Err: 31.6250\n",
- "Epoch 5/1000 \t Train Err: 31.6250\n",
- "Epoch 5/1000 \t Train Err: 31.6250\n",
- "Epoch 5/1000 \t Train Err: 31.8750\n",
- "Epoch 5/1000 \t Train Err: 31.5000\n",
- "Epoch 5/1000 \t Train Err: 31.2500\n",
- "Epoch 6/1000 \t Train Err: 31.1250\n",
- "Epoch 6/1000 \t Train Err: 31.1250\n",
- "Epoch 6/1000 \t Train Err: 31.2500\n",
- "Epoch 6/1000 \t Train Err: 31.2500\n",
- "Epoch 6/1000 \t Train Err: 31.0000\n",
- "Epoch 6/1000 \t Train Err: 30.8750\n",
- "Epoch 6/1000 \t Train Err: 31.0000\n",
- "Epoch 6/1000 \t Train Err: 30.8750\n",
- "Epoch 6/1000 \t Train Err: 30.8750\n",
- "Epoch 6/1000 \t Train Err: 30.8750\n",
- "Epoch 6/1000 \t Train Err: 30.7500\n",
- "Epoch 6/1000 \t Train Err: 30.6250\n",
- "Epoch 6/1000 \t Train Err: 30.5000\n",
- "Epoch 6/1000 \t Train Err: 30.7500\n",
- "Epoch 6/1000 \t Train Err: 30.3750\n",
- "Epoch 6/1000 \t Train Err: 30.5000\n",
- "Epoch 7/1000 \t Train Err: 30.6250\n",
- "Epoch 7/1000 \t Train Err: 30.5000\n",
- "Epoch 7/1000 \t Train Err: 30.3750\n",
- "Epoch 7/1000 \t Train Err: 30.5000\n",
- "Epoch 7/1000 \t Train Err: 30.5000\n",
- "Epoch 7/1000 \t Train Err: 30.5000\n",
- "Epoch 7/1000 \t Train Err: 30.3750\n",
- "Epoch 7/1000 \t Train Err: 30.2500\n",
- "Epoch 7/1000 \t Train Err: 30.2500\n",
- "Epoch 7/1000 \t Train Err: 30.2500\n",
- "Epoch 7/1000 \t Train Err: 30.1250\n",
- "Epoch 7/1000 \t Train Err: 30.0000\n",
- "Epoch 7/1000 \t Train Err: 30.2500\n",
- "Epoch 7/1000 \t Train Err: 30.1250\n",
- "Epoch 7/1000 \t Train Err: 30.1250\n",
- "Epoch 7/1000 \t Train Err: 30.0000\n",
- "Epoch 8/1000 \t Train Err: 30.0000\n",
- "Epoch 8/1000 \t Train Err: 29.8750\n",
- "Epoch 8/1000 \t Train Err: 30.0000\n",
- "Epoch 8/1000 \t Train Err: 30.0000\n",
- "Epoch 8/1000 \t Train Err: 29.7500\n",
- "Epoch 8/1000 \t Train Err: 30.0000\n",
- "Epoch 8/1000 \t Train Err: 29.8750\n",
- "Epoch 8/1000 \t Train Err: 29.8750\n",
- "Epoch 8/1000 \t Train Err: 29.8750\n",
- "Epoch 8/1000 \t Train Err: 29.6250\n",
- "Epoch 8/1000 \t Train Err: 29.6250\n",
- "Epoch 8/1000 \t Train Err: 29.8750\n",
- "Epoch 8/1000 \t Train Err: 29.8750\n",
- "Epoch 8/1000 \t Train Err: 29.5000\n",
- "Epoch 8/1000 \t Train Err: 29.8750\n",
- "Epoch 8/1000 \t Train Err: 29.6250\n",
- "Epoch 9/1000 \t Train Err: 29.7500\n",
- "Epoch 9/1000 \t Train Err: 29.7500\n",
- "Epoch 9/1000 \t Train Err: 29.5000\n",
- "Epoch 9/1000 \t Train Err: 29.6250\n",
- "Epoch 9/1000 \t Train Err: 29.6250\n",
- "Epoch 9/1000 \t Train Err: 29.6250\n",
- "Epoch 9/1000 \t Train Err: 29.6250\n",
- "Epoch 9/1000 \t Train Err: 29.6250\n",
- "Epoch 9/1000 \t Train Err: 29.5000\n",
- "Epoch 9/1000 \t Train Err: 29.3750\n",
- "Epoch 9/1000 \t Train Err: 29.5000\n",
- "Epoch 9/1000 \t Train Err: 29.5000\n",
- "Epoch 9/1000 \t Train Err: 29.5000\n",
- "Epoch 9/1000 \t Train Err: 29.3750\n",
- "Epoch 9/1000 \t Train Err: 29.5000\n",
- "Epoch 9/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.3750\n",
- "Epoch 10/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.5000\n",
- "Epoch 10/1000 \t Train Err: 29.3750\n",
- "Epoch 10/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.3750\n",
- "Epoch 10/1000 \t Train Err: 29.3750\n",
- "Epoch 10/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.2500\n",
- "Epoch 10/1000 \t Train Err: 29.1250\n",
- "Epoch 11/1000 \t Train Err: 29.2500\n",
- "Epoch 11/1000 \t Train Err: 29.2500\n",
- "Epoch 11/1000 \t Train Err: 29.2500\n",
- "Epoch 11/1000 \t Train Err: 29.1250\n",
- "Epoch 11/1000 \t Train Err: 29.0000\n",
- "Epoch 11/1000 \t Train Err: 29.2500\n",
- "Epoch 11/1000 \t Train Err: 29.1250\n",
- "Epoch 11/1000 \t Train Err: 29.0000\n",
- "Epoch 11/1000 \t Train Err: 29.0000\n",
- "Epoch 11/1000 \t Train Err: 29.0000\n",
- "Epoch 11/1000 \t Train Err: 29.0000\n",
- "Epoch 11/1000 \t Train Err: 29.1250\n",
- "Epoch 11/1000 \t Train Err: 29.1250\n",
- "Epoch 11/1000 \t Train Err: 29.2500\n",
- "Epoch 11/1000 \t Train Err: 29.1250\n",
- "Epoch 11/1000 \t Train Err: 29.1250\n",
- "Epoch 12/1000 \t Train Err: 29.1250\n",
- "Epoch 12/1000 \t Train Err: 29.0000\n",
- "Epoch 12/1000 \t Train Err: 29.0000\n",
- "Epoch 12/1000 \t Train Err: 29.0000\n",
- "Epoch 12/1000 \t Train Err: 28.8750\n",
- "Epoch 12/1000 \t Train Err: 29.0000\n",
- "Epoch 12/1000 \t Train Err: 29.1250\n",
- "Epoch 12/1000 \t Train Err: 28.8750\n",
- "Epoch 12/1000 \t Train Err: 29.0000\n",
- "Epoch 12/1000 \t Train Err: 29.0000\n",
- "Epoch 12/1000 \t Train Err: 29.0000\n",
- "Epoch 12/1000 \t Train Err: 28.8750\n",
- "Epoch 12/1000 \t Train Err: 28.7500\n",
- "Epoch 12/1000 \t Train Err: 28.8750\n",
- "Epoch 12/1000 \t Train Err: 28.8750\n",
- "Epoch 12/1000 \t Train Err: 28.8750\n",
- "Epoch 13/1000 \t Train Err: 29.0000\n",
- "Epoch 13/1000 \t Train Err: 28.8750\n",
- "Epoch 13/1000 \t Train Err: 29.1250\n",
- "Epoch 13/1000 \t Train Err: 29.0000\n",
- "Epoch 13/1000 \t Train Err: 29.0000\n",
- "Epoch 13/1000 \t Train Err: 28.8750\n",
- "Epoch 13/1000 \t Train Err: 28.8750\n",
- "Epoch 13/1000 \t Train Err: 29.0000\n",
- "Epoch 13/1000 \t Train Err: 28.8750\n",
- "Epoch 13/1000 \t Train Err: 28.8750\n",
- "Epoch 13/1000 \t Train Err: 28.7500\n",
- "Epoch 13/1000 \t Train Err: 28.6250\n",
- "Epoch 13/1000 \t Train Err: 28.6250\n",
- "Epoch 13/1000 \t Train Err: 28.8750\n",
- "Epoch 13/1000 \t Train Err: 28.6250\n",
- "Epoch 13/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.8750\n",
- "Epoch 14/1000 \t Train Err: 28.5000\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.8750\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.8750\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.8750\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 14/1000 \t Train Err: 28.7500\n",
- "Epoch 15/1000 \t Train Err: 28.7500\n",
- "Epoch 15/1000 \t Train Err: 28.7500\n",
- "Epoch 15/1000 \t Train Err: 28.6250\n",
- "Epoch 15/1000 \t Train Err: 28.7500\n",
- "Epoch 15/1000 \t Train Err: 28.6250\n",
- "Epoch 15/1000 \t Train Err: 28.7500\n",
- "Epoch 15/1000 \t Train Err: 28.7500\n",
- "Epoch 15/1000 \t Train Err: 28.6250\n",
- "Epoch 15/1000 \t Train Err: 28.7500\n",
- "Epoch 15/1000 \t Train Err: 28.6250\n",
- "Epoch 15/1000 \t Train Err: 28.7500\n",
- "Epoch 15/1000 \t Train Err: 28.5000\n",
- "Epoch 15/1000 \t Train Err: 28.6250\n",
- "Epoch 15/1000 \t Train Err: 28.6250\n",
- "Epoch 15/1000 \t Train Err: 28.5000\n",
- "Epoch 15/1000 \t Train Err: 28.6250\n",
- "Epoch 16/1000 \t Train Err: 28.3750\n",
- "Epoch 16/1000 \t Train Err: 28.2500\n",
- "Epoch 16/1000 \t Train Err: 28.1250\n",
- "Epoch 16/1000 \t Train Err: 27.8750\n",
- "Epoch 16/1000 \t Train Err: 28.0000\n",
- "Epoch 16/1000 \t Train Err: 27.6250\n",
- "Epoch 16/1000 \t Train Err: 27.5000\n",
- "Epoch 16/1000 \t Train Err: 27.2500\n",
- "Epoch 16/1000 \t Train Err: 27.1250\n",
- "Epoch 16/1000 \t Train Err: 27.0000\n",
- "Epoch 16/1000 \t Train Err: 26.5000\n",
- "Epoch 16/1000 \t Train Err: 27.0000\n",
- "Epoch 16/1000 \t Train Err: 26.5000\n",
- "Epoch 16/1000 \t Train Err: 26.3750\n",
- "Epoch 16/1000 \t Train Err: 25.6250\n",
- "Epoch 16/1000 \t Train Err: 25.8750\n",
- "Epoch 17/1000 \t Train Err: 25.2500\n",
- "Epoch 17/1000 \t Train Err: 25.1250\n",
- "Epoch 17/1000 \t Train Err: 24.8750\n",
- "Epoch 17/1000 \t Train Err: 24.7500\n",
- "Epoch 17/1000 \t Train Err: 24.1250\n",
- "Epoch 17/1000 \t Train Err: 23.8750\n",
- "Epoch 17/1000 \t Train Err: 23.7500\n",
- "Epoch 17/1000 \t Train Err: 23.5000\n",
- "Epoch 17/1000 \t Train Err: 23.1250\n",
- "Epoch 17/1000 \t Train Err: 22.8750\n"
+ "Epoch 0/100 \t Train Err: 87.5000 0.62109375 3.28125 8.125 222.0\n",
+ "Epoch 0/100 \t Train Err: 70.5000 0.5078125 0.173828125 1.953125 182.0\n",
+ "Epoch 0/100 \t Train Err: 59.7500 2.828125 0.4140625 0.134765625 154.0\n",
+ "Epoch 0/100 \t Train Err: 54.0000 5.5 1.734375 0.1279296875 137.0\n",
+ "Epoch 0/100 \t Train Err: 50.7500 7.9375 3.21875 0.6953125 126.0\n",
+ "Epoch 0/100 \t Train Err: 48.5000 10.0 4.625 1.40625 118.0\n",
+ "Epoch 0/100 \t Train Err: 46.7500 11.75 5.84375 2.109375 111.5\n",
+ "Epoch 0/100 \t Train Err: 45.7500 13.125 6.90625 2.75 107.5\n",
+ "Epoch 1/100 \t Train Err: 44.7500 14.25 7.75 3.28125 104.0\n",
+ "Epoch 1/100 \t Train Err: 44.5000 15.1875 8.4375 3.71875 102.0\n",
+ "Epoch 1/100 \t Train Err: 44.2500 15.875 9.0 4.09375 100.0\n",
+ "Epoch 1/100 \t Train Err: 43.7500 16.5 9.4375 4.34375 98.5\n",
+ "Epoch 1/100 \t Train Err: 43.7500 16.875 9.8125 4.59375 97.5\n",
+ "Epoch 1/100 \t Train Err: 43.5000 17.25 10.1875 4.8125 96.5\n",
+ "Epoch 1/100 \t Train Err: 43.2500 17.625 10.4375 5.0 95.0\n",
+ "Epoch 1/100 \t Train Err: 43.2500 18.0 10.6875 5.1875 95.0\n",
+ "Epoch 2/100 \t Train Err: 43.0000 18.5 11.0 5.34375 94.0\n",
+ "Epoch 2/100 \t Train Err: 42.5000 18.75 11.25 5.5625 92.5\n",
+ "Epoch 2/100 \t Train Err: 42.7500 19.125 11.5625 5.75 92.5\n",
+ "Epoch 2/100 \t Train Err: 42.5000 19.5 11.8125 5.9375 91.5\n",
+ "Epoch 2/100 \t Train Err: 42.0000 19.875 12.1875 6.1875 90.0\n",
+ "Epoch 2/100 \t Train Err: 42.2500 20.25 12.5 6.40625 90.0\n",
+ "Epoch 2/100 \t Train Err: 42.0000 20.625 12.6875 6.59375 89.0\n",
+ "Epoch 2/100 \t Train Err: 41.7500 21.0 13.0625 6.84375 88.0\n",
+ "Epoch 3/100 \t Train Err: 42.2500 21.375 13.375 7.0625 88.0\n",
+ "Epoch 3/100 \t Train Err: 41.7500 21.75 13.6875 7.28125 86.0\n",
+ "Epoch 3/100 \t Train Err: 41.5000 22.125 14.0 7.5625 85.5\n",
+ "Epoch 3/100 \t Train Err: 41.7500 22.5 14.3125 7.75 85.5\n",
+ "Epoch 3/100 \t Train Err: 41.2500 22.875 14.5625 7.9375 84.5\n",
+ "Epoch 3/100 \t Train Err: 41.2500 23.25 14.875 8.1875 83.5\n",
+ "Epoch 3/100 \t Train Err: 41.5000 23.5 15.1875 8.4375 83.5\n",
+ "Epoch 3/100 \t Train Err: 41.2500 23.75 15.4375 8.625 82.0\n",
+ "Epoch 4/100 \t Train Err: 41.0000 24.125 15.75 8.8125 81.0\n",
+ "Epoch 4/100 \t Train Err: 40.7500 24.375 16.0 9.0625 81.0\n",
+ "Epoch 4/100 \t Train Err: 40.7500 24.5 16.25 9.25 80.5\n",
+ "Epoch 4/100 \t Train Err: 40.7500 24.625 16.5 9.4375 79.5\n",
+ "Epoch 4/100 \t Train Err: 40.5000 24.75 16.75 9.625 79.0\n",
+ "Epoch 4/100 \t Train Err: 40.5000 24.625 16.875 9.75 79.0\n",
+ "Epoch 4/100 \t Train Err: 40.2500 24.375 17.125 9.875 78.5\n",
+ "Epoch 4/100 \t Train Err: 40.0000 23.75 17.125 10.0 78.0\n",
+ "Epoch 5/100 \t Train Err: 39.7500 23.0 17.125 10.0625 77.5\n",
+ "Epoch 5/100 \t Train Err: 39.5000 21.5 17.0 10.0 78.0\n",
+ "Epoch 5/100 \t Train Err: 38.7500 19.375 16.75 9.875 78.0\n",
+ "Epoch 5/100 \t Train Err: 38.5000 16.5 16.25 9.6875 78.5\n",
+ "Epoch 5/100 \t Train Err: 37.5000 12.9375 15.625 9.375 79.0\n",
+ "Epoch 5/100 \t Train Err: 36.5000 8.875 14.9375 9.125 80.0\n",
+ "Epoch 5/100 \t Train Err: 35.5000 5.09375 14.6875 9.25 79.5\n",
+ "Epoch 5/100 \t Train Err: 34.5000 2.390625 15.5 10.0 78.0\n",
+ "Epoch 6/100 \t Train Err: 33.5000 0.9140625 17.5 11.3125 75.0\n",
+ "Epoch 6/100 \t Train Err: 33.0000 0.38671875 19.875 12.4375 72.5\n",
+ "Epoch 6/100 \t Train Err: 32.7500 0.4921875 21.0 12.9375 71.5\n",
+ "Epoch 6/100 \t Train Err: 33.0000 0.85546875 21.375 13.0 71.0\n",
+ "Epoch 6/100 \t Train Err: 33.0000 1.1328125 21.5 13.125 70.5\n",
+ "Epoch 6/100 \t Train Err: 32.7500 1.1875 21.875 13.4375 69.5\n",
+ "Epoch 6/100 \t Train Err: 32.5000 1.0234375 22.5 13.9375 68.5\n",
+ "Epoch 6/100 \t Train Err: 32.2500 0.73828125 23.125 14.5 67.5\n",
+ "Epoch 7/100 \t Train Err: 31.8750 0.451171875 23.875 15.0625 66.0\n",
+ "Epoch 7/100 \t Train Err: 31.6250 0.251953125 24.625 15.625 64.5\n",
+ "Epoch 7/100 \t Train Err: 31.5000 0.2060546875 25.25 16.125 63.75\n",
+ "Epoch 7/100 \t Train Err: 31.2500 0.2734375 25.625 16.5 63.0\n",
+ "Epoch 7/100 \t Train Err: 31.1250 0.37109375 26.125 17.0 62.25\n",
+ "Epoch 7/100 \t Train Err: 30.8750 0.400390625 26.625 17.25 61.5\n",
+ "Epoch 7/100 \t Train Err: 30.8750 0.353515625 26.875 17.5 61.0\n",
+ "Epoch 7/100 \t Train Err: 30.7500 0.275390625 27.25 17.75 60.5\n",
+ "Epoch 8/100 \t Train Err: 30.6250 0.18359375 27.625 18.125 59.75\n",
+ "Epoch 8/100 \t Train Err: 30.5000 0.10986328125 28.125 18.625 59.0\n",
+ "Epoch 8/100 \t Train Err: 30.3750 0.06640625 28.625 19.0 58.5\n",
+ "Epoch 8/100 \t Train Err: 30.3750 0.04931640625 29.125 19.375 57.75\n",
+ "Epoch 8/100 \t Train Err: 30.1250 0.048583984375 29.75 19.875 57.0\n",
+ "Epoch 8/100 \t Train Err: 30.0000 0.054443359375 30.25 20.25 56.0\n",
+ "Epoch 8/100 \t Train Err: 29.8750 0.0576171875 30.875 20.875 55.25\n",
+ "Epoch 8/100 \t Train Err: 29.8750 0.056884765625 31.5 21.25 54.5\n",
+ "Epoch 9/100 \t Train Err: 29.7500 0.051025390625 32.0 21.75 53.75\n",
+ "Epoch 9/100 \t Train Err: 29.5000 0.04296875 32.75 22.25 53.0\n",
+ "Epoch 9/100 \t Train Err: 29.5000 0.03369140625 33.0 22.625 52.25\n",
+ "Epoch 9/100 \t Train Err: 29.5000 0.0260009765625 33.75 23.125 51.75\n",
+ "Epoch 9/100 \t Train Err: 29.3750 0.02197265625 34.25 23.5 51.25\n",
+ "Epoch 9/100 \t Train Err: 29.3750 0.0216064453125 35.0 24.125 50.25\n",
+ "Epoch 9/100 \t Train Err: 29.2500 0.0238037109375 35.25 24.375 50.0\n",
+ "Epoch 9/100 \t Train Err: 29.1250 0.02734375 35.75 24.75 49.5\n",
+ "Epoch 10/100 \t Train Err: 29.1250 0.0301513671875 36.0 25.0 49.0\n",
+ "Epoch 10/100 \t Train Err: 29.1250 0.032470703125 36.75 25.625 48.25\n",
+ "Epoch 10/100 \t Train Err: 29.0000 0.03271484375 37.25 26.125 47.5\n",
+ "Epoch 10/100 \t Train Err: 28.8750 0.03125 37.5 26.25 47.25\n",
+ "Epoch 10/100 \t Train Err: 29.0000 0.027587890625 38.0 26.75 46.5\n",
+ "Epoch 10/100 \t Train Err: 28.8750 0.023193359375 38.25 26.875 46.5\n",
+ "Epoch 10/100 \t Train Err: 28.8750 0.0196533203125 38.25 26.875 46.5\n",
+ "Epoch 10/100 \t Train Err: 28.7500 0.0172119140625 38.75 27.375 45.75\n",
+ "Epoch 11/100 \t Train Err: 28.7500 0.0166015625 39.0 27.5 45.5\n",
+ "Epoch 11/100 \t Train Err: 28.8750 0.0169677734375 39.0 27.5 45.5\n",
+ "Epoch 11/100 \t Train Err: 28.7500 0.0172119140625 39.0 27.5 45.5\n",
+ "Epoch 11/100 \t Train Err: 28.7500 0.017578125 39.75 28.25 44.75\n",
+ "Epoch 11/100 \t Train Err: 28.7500 0.017578125 39.75 28.25 44.75\n",
+ "Epoch 11/100 \t Train Err: 28.7500 0.017333984375 39.75 28.25 44.75\n",
+ "Epoch 11/100 \t Train Err: 28.7500 0.016845703125 39.75 28.25 44.75\n",
+ "Epoch 11/100 \t Train Err: 28.7500 0.016357421875 39.75 28.25 44.75\n",
+ "Epoch 12/100 \t Train Err: 28.7500 0.015869140625 40.0 28.5 44.25\n",
+ "Epoch 12/100 \t Train Err: 28.7500 0.01513671875 40.75 28.875 44.0\n",
+ "Epoch 12/100 \t Train Err: 28.7500 0.01483154296875 40.75 28.875 44.0\n",
+ "Epoch 12/100 \t Train Err: 28.8750 0.01416015625 40.75 28.875 44.0\n",
+ "Epoch 12/100 \t Train Err: 28.7500 0.0140380859375 40.75 28.875 44.0\n",
+ "Epoch 12/100 \t Train Err: 28.7500 0.01397705078125 40.75 28.875 44.0\n",
+ "Epoch 12/100 \t Train Err: 28.7500 0.0140380859375 40.75 28.875 44.0\n",
+ "Epoch 12/100 \t Train Err: 28.6250 0.01422119140625 40.75 29.0 43.75\n",
+ "Epoch 13/100 \t Train Err: 28.6250 0.01422119140625 41.0 29.375 43.25\n",
+ "Epoch 13/100 \t Train Err: 28.7500 0.01416015625 41.5 29.5 43.0\n",
+ "Epoch 13/100 \t Train Err: 28.7500 0.0142822265625 41.5 29.625 43.0\n",
+ "Epoch 13/100 \t Train Err: 28.6250 0.01446533203125 41.5 29.625 43.0\n",
+ "Epoch 13/100 \t Train Err: 28.6250 0.01422119140625 41.5 29.625 43.0\n",
+ "Epoch 13/100 \t Train Err: 28.6250 0.013916015625 41.5 29.625 43.0\n",
+ "Epoch 13/100 \t Train Err: 28.6250 0.01373291015625 41.5 29.625 43.0\n",
+ "Epoch 13/100 \t Train Err: 28.6250 0.0135498046875 41.5 29.625 43.0\n",
+ "Epoch 14/100 \t Train Err: 28.6250 0.01318359375 41.5 29.625 43.0\n",
+ "Epoch 14/100 \t Train Err: 28.5000 0.012939453125 41.5 29.625 42.75\n",
+ "Epoch 14/100 \t Train Err: 28.6250 0.01275634765625 41.75 29.875 42.5\n",
+ "Epoch 14/100 \t Train Err: 28.6250 0.012451171875 42.0 30.125 42.5\n",
+ "Epoch 14/100 \t Train Err: 28.6250 0.01220703125 42.25 30.25 42.25\n",
+ "Epoch 14/100 \t Train Err: 28.6250 0.01226806640625 42.25 30.25 42.25\n",
+ "Epoch 14/100 \t Train Err: 28.6250 0.01190185546875 42.25 30.25 42.25\n",
+ "Epoch 14/100 \t Train Err: 28.6250 0.01190185546875 42.25 30.25 42.25\n",
+ "Epoch 15/100 \t Train Err: 28.7500 0.0118408203125 42.25 30.25 42.25\n",
+ "Epoch 15/100 \t Train Err: 28.6250 0.0115966796875 42.25 30.25 42.25\n",
+ "Epoch 15/100 \t Train Err: 28.6250 0.0115966796875 42.25 30.25 42.25\n",
+ "Epoch 15/100 \t Train Err: 28.6250 0.01141357421875 42.25 30.25 42.25\n",
+ "Epoch 15/100 \t Train Err: 28.6250 0.011474609375 42.25 30.25 42.0\n",
+ "Epoch 15/100 \t Train Err: 28.6250 0.01123046875 42.25 30.375 42.0\n",
+ "Epoch 15/100 \t Train Err: 28.5000 0.0111083984375 42.5 30.625 41.75\n",
+ "Epoch 15/100 \t Train Err: 28.6250 0.010986328125 42.5 30.75 41.75\n",
+ "Epoch 16/100 \t Train Err: 28.6250 0.01104736328125 42.75 30.875 41.5\n",
+ "Epoch 16/100 \t Train Err: 28.6250 0.01092529296875 42.75 30.875 41.5\n",
+ "Epoch 16/100 \t Train Err: 28.6250 0.0107421875 43.0 31.0 41.5\n",
+ "Epoch 16/100 \t Train Err: 28.6250 0.0107421875 43.0 31.0 41.5\n",
+ "Epoch 16/100 \t Train Err: 28.6250 0.01068115234375 43.0 31.0 41.5\n",
+ "Epoch 16/100 \t Train Err: 28.6250 0.01043701171875 43.0 31.0 41.5\n",
+ "Epoch 16/100 \t Train Err: 28.6250 0.0103759765625 43.0 31.0 41.5\n",
+ "Epoch 16/100 \t Train Err: 28.6250 0.01025390625 43.0 31.0 41.5\n",
+ "Epoch 17/100 \t Train Err: 28.6250 0.0101318359375 43.0 31.0 41.5\n",
+ "Epoch 17/100 \t Train Err: 28.6250 0.0098876953125 43.0 31.0 41.5\n",
+ "Epoch 17/100 \t Train Err: 28.6250 0.00982666015625 43.0 31.0 41.5\n",
+ "Epoch 17/100 \t Train Err: 28.6250 0.009765625 43.0 31.0 41.5\n",
+ "Epoch 17/100 \t Train Err: 28.6250 0.00958251953125 43.0 31.0 41.5\n",
+ "Epoch 17/100 \t Train Err: 28.6250 0.00946044921875 43.0 31.0 41.5\n",
+ "Epoch 17/100 \t Train Err: 28.5000 0.0093994140625 43.0 31.0 41.5\n",
+ "Epoch 17/100 \t Train Err: 28.6250 0.0091552734375 43.0 31.0 41.5\n",
+ "Epoch 18/100 \t Train Err: 28.6250 0.00897216796875 43.0 31.0 41.25\n",
+ "Epoch 18/100 \t Train Err: 28.5000 0.0089111328125 43.0 31.0 41.25\n",
+ "Epoch 18/100 \t Train Err: 28.3750 0.00885009765625 43.0 31.0 41.25\n",
+ "Epoch 18/100 \t Train Err: 28.3750 0.0087890625 43.0 31.125 41.25\n",
+ "Epoch 18/100 \t Train Err: 28.6250 0.0086669921875 43.0 31.125 41.25\n",
+ "Epoch 18/100 \t Train Err: 28.5000 0.008544921875 43.0 31.125 41.25\n",
+ "Epoch 18/100 \t Train Err: 28.5000 0.00836181640625 43.0 31.125 41.25\n",
+ "Epoch 18/100 \t Train Err: 28.5000 0.0081787109375 43.0 31.125 41.25\n",
+ "Epoch 19/100 \t Train Err: 28.3750 0.0079345703125 43.0 31.125 41.25\n",
+ "Epoch 19/100 \t Train Err: 28.5000 0.0078125 43.0 31.125 41.25\n",
+ "Epoch 19/100 \t Train Err: 28.5000 0.007781982421875 43.0 31.0 41.25\n",
+ "Epoch 19/100 \t Train Err: 28.5000 0.00750732421875 43.0 31.0 41.25\n",
+ "Epoch 19/100 \t Train Err: 28.5000 0.00738525390625 42.75 30.875 41.25\n",
+ "Epoch 19/100 \t Train Err: 28.5000 0.00714111328125 42.5 30.75 41.5\n",
+ "Epoch 19/100 \t Train Err: 28.3750 0.006866455078125 42.25 30.5 41.5\n",
+ "Epoch 19/100 \t Train Err: 28.3750 0.0067138671875 41.75 30.125 42.0\n",
+ "Epoch 20/100 \t Train Err: 28.2500 0.006591796875 40.5 29.25 42.5\n",
+ "Epoch 20/100 \t Train Err: 28.1250 0.00634765625 37.5 27.125 44.5\n",
+ "Epoch 20/100 \t Train Err: 27.8750 0.0067138671875 27.75 19.875 52.0\n",
+ "Epoch 20/100 \t Train Err: 27.8750 0.0040283203125 25.875 18.5 53.5\n",
+ "Epoch 20/100 \t Train Err: 27.7500 0.011962890625 34.0 24.5 46.5\n",
+ "Epoch 20/100 \t Train Err: 27.8750 0.0240478515625 36.5 26.125 44.75\n",
+ "Epoch 20/100 \t Train Err: 27.6250 0.0267333984375 35.5 25.5 45.0\n",
+ "Epoch 20/100 \t Train Err: 27.2500 0.016357421875 30.125 21.5 48.5\n",
+ "Epoch 21/100 \t Train Err: 27.5000 0.005279541015625 19.5 13.5 57.5\n",
+ "Epoch 21/100 \t Train Err: 26.8750 0.00982666015625 28.875 20.875 48.25\n",
+ "Epoch 21/100 \t Train Err: 26.7500 0.01019287109375 32.5 23.875 45.0\n",
+ "Epoch 21/100 \t Train Err: 26.5000 0.0057373046875 27.75 20.625 47.5\n",
+ "Epoch 21/100 \t Train Err: 26.5000 0.0111083984375 14.0 10.375 58.5\n",
+ "Epoch 21/100 \t Train Err: 25.7500 0.007110595703125 27.625 21.875 45.0\n",
+ "Epoch 21/100 \t Train Err: 25.3750 0.0081787109375 27.625 22.25 44.25\n",
+ "Epoch 21/100 \t Train Err: 24.7500 0.0101318359375 11.4375 9.5 55.5\n",
+ "Epoch 22/100 \t Train Err: 23.7500 0.0091552734375 14.8125 12.625 50.0\n",
+ "Epoch 22/100 \t Train Err: 23.3750 0.0196533203125 18.5 16.5 45.5\n",
+ "Epoch 22/100 \t Train Err: 22.8750 0.0205078125 9.5625 8.25 52.0\n",
+ "Epoch 22/100 \t Train Err: 22.3750 0.045654296875 9.1875 7.90625 50.75\n",
+ "Epoch 22/100 \t Train Err: 22.2500 0.1318359375 15.375 13.8125 45.0\n",
+ "Epoch 22/100 \t Train Err: 21.6250 0.150390625 11.4375 9.5625 47.25\n",
+ "Epoch 22/100 \t Train Err: 21.3750 0.126953125 8.4375 6.34375 49.75\n",
+ "Epoch 22/100 \t Train Err: 20.8750 0.1455078125 11.0625 8.75 46.0\n",
+ "Epoch 23/100 \t Train Err: 20.6250 0.125 13.6875 11.4375 43.0\n",
+ "Epoch 23/100 \t Train Err: 20.3750 0.04931640625 11.625 9.625 44.0\n",
+ "Epoch 23/100 \t Train Err: 20.0000 0.033935546875 9.3125 7.6875 45.25\n",
+ "Epoch 23/100 \t Train Err: 19.6250 0.07275390625 10.0625 8.875 43.25\n",
+ "Epoch 23/100 \t Train Err: 19.5000 0.1181640625 11.5625 10.9375 41.0\n",
+ "Epoch 23/100 \t Train Err: 19.0000 0.1787109375 11.0 11.1875 40.25\n",
+ "Epoch 23/100 \t Train Err: 18.7500 0.25 8.1875 8.9375 41.75\n",
+ "Epoch 23/100 \t Train Err: 18.5000 0.2216796875 8.1875 9.9375 40.25\n",
+ "Epoch 24/100 \t Train Err: 18.1250 0.1513671875 10.0625 13.4375 37.0\n",
+ "Epoch 24/100 \t Train Err: 17.6250 0.12890625 7.6875 11.6875 37.75\n",
+ "Epoch 24/100 \t Train Err: 17.3750 0.1201171875 6.28125 10.875 38.0\n",
+ "Epoch 24/100 \t Train Err: 17.0000 0.126953125 7.53125 14.5625 35.0\n",
+ "Epoch 24/100 \t Train Err: 16.7500 0.11181640625 7.3125 15.6875 33.75\n",
+ "Epoch 24/100 \t Train Err: 16.5000 0.08203125 4.75 13.3125 35.5\n",
+ "Epoch 24/100 \t Train Err: 16.3750 0.068359375 5.75 17.125 32.75\n",
+ "Epoch 24/100 \t Train Err: 15.9375 0.057861328125 6.34375 19.25 30.75\n",
+ "Epoch 25/100 \t Train Err: 15.7500 0.051025390625 3.578125 14.5 33.5\n",
+ "Epoch 25/100 \t Train Err: 15.2500 0.04248046875 5.0 18.625 29.625\n",
+ "Epoch 25/100 \t Train Err: 15.0000 0.040771484375 5.53125 21.125 27.875\n",
+ "Epoch 25/100 \t Train Err: 14.8125 0.033935546875 3.171875 16.0 30.375\n",
+ "Epoch 25/100 \t Train Err: 14.6250 0.0322265625 3.734375 18.875 28.5\n",
+ "Epoch 25/100 \t Train Err: 14.3750 0.03369140625 5.09375 23.0 25.5\n",
+ "Epoch 25/100 \t Train Err: 14.1250 0.028076171875 2.046875 14.3125 30.125\n",
+ "Epoch 25/100 \t Train Err: 13.8125 0.023681640625 3.234375 19.375 26.625\n",
+ "Epoch 26/100 \t Train Err: 13.6875 0.023681640625 4.75 24.875 23.125\n",
+ "Epoch 26/100 \t Train Err: 13.5625 0.0245361328125 1.515625 13.6875 29.0\n",
+ "Epoch 26/100 \t Train Err: 13.0625 0.0179443359375 2.875 20.5 24.25\n",
+ "Epoch 26/100 \t Train Err: 13.0000 0.016845703125 3.5 24.0 22.25\n",
+ "Epoch 26/100 \t Train Err: 12.8750 0.02197265625 1.46875 15.625 26.5\n",
+ "Epoch 26/100 \t Train Err: 12.5000 0.0174560546875 2.03125 19.5 23.5\n",
+ "Epoch 26/100 \t Train Err: 12.4375 0.014404296875 3.0 24.75 20.625\n",
+ "Epoch 26/100 \t Train Err: 12.1250 0.0230712890625 1.46875 17.625 23.875\n",
+ "Epoch 27/100 \t Train Err: 11.9375 0.022705078125 1.421875 17.75 23.125\n",
+ "Epoch 27/100 \t Train Err: 11.7500 0.0150146484375 2.09375 22.625 20.0\n",
+ "Epoch 27/100 \t Train Err: 11.6250 0.01531982421875 1.6796875 20.875 20.75\n",
+ "Epoch 27/100 \t Train Err: 11.3750 0.0177001953125 1.0546875 17.25 22.0\n",
+ "Epoch 27/100 \t Train Err: 11.0625 0.0128173828125 1.359375 20.375 19.875\n",
+ "Epoch 27/100 \t Train Err: 11.0000 0.0128173828125 1.5078125 22.0 18.875\n",
+ "Epoch 27/100 \t Train Err: 10.8125 0.01190185546875 1.03125 18.125 20.125\n",
+ "Epoch 27/100 \t Train Err: 10.7500 0.01165771484375 0.99609375 18.125 20.25\n",
+ "Epoch 28/100 \t Train Err: 10.5625 0.012451171875 1.328125 21.125 18.125\n",
+ "Epoch 28/100 \t Train Err: 10.3750 0.01104736328125 1.15625 19.375 18.625\n",
+ "Epoch 28/100 \t Train Err: 10.3125 0.01025390625 0.953125 17.25 19.5\n",
+ "Epoch 28/100 \t Train Err: 10.1250 0.010498046875 1.171875 19.875 17.75\n",
+ "Epoch 28/100 \t Train Err: 10.0625 0.0101318359375 1.109375 20.0 17.5\n",
+ "Epoch 28/100 \t Train Err: 10.0000 0.0111083984375 0.7578125 16.75 18.875\n",
+ "Epoch 28/100 \t Train Err: 9.8125 0.0093994140625 0.87109375 18.375 17.5\n",
+ "Epoch 28/100 \t Train Err: 9.7500 0.01043701171875 1.0390625 20.625 16.375\n",
+ "Epoch 29/100 \t Train Err: 9.4375 0.00921630859375 0.828125 18.5 16.75\n",
+ "Epoch 29/100 \t Train Err: 9.5000 0.00836181640625 0.59375 16.5 17.75\n",
+ "Epoch 29/100 \t Train Err: 9.4375 0.0115966796875 0.796875 20.375 15.8125\n",
+ "Epoch 29/100 \t Train Err: 9.3125 0.010986328125 0.72265625 20.25 15.625\n",
+ "Epoch 29/100 \t Train Err: 9.1875 0.00762939453125 0.51953125 16.75 16.875\n",
+ "Epoch 29/100 \t Train Err: 9.0625 0.00799560546875 0.56640625 18.375 15.8125\n",
+ "Epoch 29/100 \t Train Err: 9.0625 0.00946044921875 0.66796875 20.625 14.625\n",
+ "Epoch 29/100 \t Train Err: 9.0000 0.00665283203125 0.46484375 16.125 16.5\n",
+ "Epoch 30/100 \t Train Err: 8.8750 0.008056640625 0.5234375 18.25 15.3125\n",
+ "Epoch 30/100 \t Train Err: 8.7500 0.0111083984375 0.59375 20.0 14.1875\n",
+ "Epoch 36/100 \t Train Err: 7.3750 0.00799560546875 0.302734375 13.5 12.9375\n",
+ "Epoch 36/100 \t Train Err: 7.2188 0.00799560546875 0.369140625 15.5625 11.375\n",
+ "Epoch 36/100 \t Train Err: 7.2188 0.00823974609375 0.4296875 17.375 10.375\n",
+ "Epoch 36/100 \t Train Err: 7.2500 0.00860595703125 0.412109375 18.0 10.125\n",
+ "Epoch 36/100 \t Train Err: 7.1875 0.01171875 0.33984375 15.625 11.1875\n",
+ "Epoch 36/100 \t Train Err: 7.0625 0.0177001953125 0.2890625 12.875 12.1875\n",
+ "Epoch 36/100 \t Train Err: 7.1562 0.01806640625 0.271484375 11.8125 13.0\n",
+ "Epoch 36/100 \t Train Err: 7.1875 0.0120849609375 0.24609375 11.5625 13.0625\n",
+ "Epoch 37/100 \t Train Err: 7.0625 0.007171630859375 0.2431640625 12.375 12.3125\n",
+ "Epoch 37/100 \t Train Err: 7.0625 0.0101318359375 0.2490234375 14.0 11.5\n",
+ "Epoch 37/100 \t Train Err: 7.0625 0.0181884765625 0.28125 15.4375 10.875\n",
+ "Epoch 37/100 \t Train Err: 7.0938 0.0244140625 0.287109375 15.8125 10.6875\n",
+ "Epoch 37/100 \t Train Err: 6.9375 0.0230712890625 0.27734375 15.1875 10.625\n",
+ "Epoch 37/100 \t Train Err: 6.8750 0.01556396484375 0.255859375 13.5625 11.4375\n",
+ "Epoch 37/100 \t Train Err: 6.9375 0.0091552734375 0.220703125 12.4375 12.0625\n",
+ "Epoch 37/100 \t Train Err: 6.9375 0.006011962890625 0.2158203125 12.4375 12.0625\n",
+ "Epoch 38/100 \t Train Err: 6.8438 0.004791259765625 0.232421875 12.9375 11.625\n",
+ "Epoch 38/100 \t Train Err: 6.8750 0.004486083984375 0.2421875 14.3125 10.875\n",
+ "Epoch 38/100 \t Train Err: 6.8438 0.00433349609375 0.28125 15.0625 10.1875\n",
+ "Epoch 38/100 \t Train Err: 6.9375 0.004241943359375 0.25 14.9375 10.4375\n",
+ "Epoch 38/100 \t Train Err: 6.7500 0.004241943359375 0.23828125 13.75 10.625\n",
+ "Epoch 38/100 \t Train Err: 6.7812 0.0042724609375 0.2109375 12.25 11.5625\n",
+ "Epoch 38/100 \t Train Err: 6.8438 0.003692626953125 0.1943359375 11.9375 11.875\n",
+ "Epoch 38/100 \t Train Err: 6.6875 0.004119873046875 0.197265625 11.5 11.5\n",
+ "Epoch 39/100 \t Train Err: 6.6562 0.007720947265625 0.193359375 12.1875 11.0625\n",
+ "Epoch 39/100 \t Train Err: 6.6250 0.01318359375 0.2080078125 13.25 10.4375\n",
+ "Epoch 39/100 \t Train Err: 6.6562 0.016357421875 0.224609375 13.9375 10.3125\n",
+ "Epoch 39/100 \t Train Err: 6.6562 0.0159912109375 0.2021484375 13.75 10.375\n",
+ "Epoch 39/100 \t Train Err: 6.5312 0.0126953125 0.19140625 12.9375 10.5\n",
+ "Epoch 39/100 \t Train Err: 6.5938 0.0081787109375 0.1796875 11.9375 11.0625\n",
+ "Epoch 39/100 \t Train Err: 6.6250 0.005401611328125 0.1796875 11.875 11.375\n",
+ "Epoch 39/100 \t Train Err: 6.5000 0.0040283203125 0.1787109375 12.125 10.9375\n",
+ "Epoch 40/100 \t Train Err: 6.5312 0.0031890869140625 0.1962890625 12.8125 10.5625\n",
+ "Epoch 40/100 \t Train Err: 6.5625 0.0029296875 0.2080078125 13.25 10.3125\n",
+ "Epoch 40/100 \t Train Err: 6.5625 0.0026702880859375 0.189453125 13.5 10.25\n",
+ "Epoch 40/100 \t Train Err: 6.5312 0.002685546875 0.177734375 12.5625 10.4375\n",
+ "Epoch 40/100 \t Train Err: 6.4375 0.0027008056640625 0.169921875 11.625 10.8125\n",
+ "Epoch 40/100 \t Train Err: 6.5000 0.0026092529296875 0.1630859375 11.6875 11.0625\n",
+ "Epoch 40/100 \t Train Err: 6.5000 0.0030670166015625 0.162109375 11.9375 10.875\n",
+ "Epoch 40/100 \t Train Err: 6.5000 0.004486083984375 0.1630859375 12.4375 10.5625\n",
+ "Epoch 41/100 \t Train Err: 6.4375 0.006011962890625 0.1875 13.3125 9.9375\n",
+ "Epoch 41/100 \t Train Err: 6.4688 0.005706787109375 0.1708984375 12.75 10.25\n",
+ "Epoch 41/100 \t Train Err: 6.4688 0.00445556640625 0.15234375 12.25 10.625\n",
+ "Epoch 41/100 \t Train Err: 6.4688 0.0032501220703125 0.166015625 11.8125 10.875\n",
+ "Epoch 41/100 \t Train Err: 6.3750 0.0027008056640625 0.166015625 12.0625 10.5\n",
+ "Epoch 41/100 \t Train Err: 6.3125 0.0023040771484375 0.158203125 12.0 10.25\n",
+ "Epoch 41/100 \t Train Err: 6.4062 0.002227783203125 0.1640625 12.4375 10.125\n",
+ "Epoch 41/100 \t Train Err: 6.3438 0.002227783203125 0.171875 12.6875 9.9375\n",
+ "Epoch 42/100 \t Train Err: 6.3125 0.002197265625 0.1591796875 12.0625 10.1875\n",
+ "Epoch 42/100 \t Train Err: 6.2500 0.0021209716796875 0.1513671875 11.4375 10.3125\n",
+ "Epoch 42/100 \t Train Err: 6.2812 0.0022430419921875 0.1396484375 11.5 10.5\n",
+ "Epoch 42/100 \t Train Err: 6.1875 0.002838134765625 0.146484375 11.8125 9.9375\n",
+ "Epoch 42/100 \t Train Err: 6.3125 0.0037078857421875 0.150390625 12.125 10.0625\n",
+ "Epoch 42/100 \t Train Err: 6.2812 0.004425048828125 0.1591796875 12.375 9.875\n",
+ "Epoch 42/100 \t Train Err: 6.2188 0.004150390625 0.1357421875 11.625 10.0625\n",
+ "Epoch 42/100 \t Train Err: 6.2188 0.0035858154296875 0.1416015625 11.4375 10.25\n",
+ "Epoch 43/100 \t Train Err: 6.2500 0.0028839111328125 0.1328125 11.1875 10.4375\n",
+ "Epoch 43/100 \t Train Err: 6.1562 0.0025482177734375 0.13671875 11.1875 10.125\n",
+ "Epoch 43/100 \t Train Err: 6.0938 0.002227783203125 0.142578125 11.625 9.75\n",
+ "Epoch 43/100 \t Train Err: 6.1875 0.002105712890625 0.1435546875 12.0625 9.8125\n",
+ "Epoch 43/100 \t Train Err: 6.2812 0.001983642578125 0.150390625 12.125 9.9375\n",
+ "Epoch 43/100 \t Train Err: 6.0938 0.0019683837890625 0.1396484375 11.5 9.8125\n",
+ "Epoch 43/100 \t Train Err: 6.2188 0.00191497802734375 0.1337890625 11.5 10.125\n",
+ "Epoch 43/100 \t Train Err: 6.0938 0.00201416015625 0.1337890625 11.4375 9.875\n",
+ "Epoch 44/100 \t Train Err: 6.0938 0.0023040771484375 0.140625 11.375 9.9375\n",
+ "Epoch 44/100 \t Train Err: 6.0938 0.002960205078125 0.1298828125 11.125 10.0\n",
+ "Epoch 44/100 \t Train Err: 6.1562 0.003662109375 0.1357421875 11.375 10.0\n",
+ "Epoch 44/100 \t Train Err: 6.0625 0.003997802734375 0.130859375 11.4375 9.75\n",
+ "Epoch 44/100 \t Train Err: 6.0312 0.003997802734375 0.134765625 11.4375 9.5625\n",
+ "Epoch 44/100 \t Train Err: 6.0000 0.003265380859375 0.1337890625 11.4375 9.625\n",
+ "Epoch 44/100 \t Train Err: 6.0000 0.0024871826171875 0.1337890625 11.5 9.625\n",
+ "Epoch 44/100 \t Train Err: 6.0312 0.0020904541015625 0.1376953125 11.0625 9.8125\n",
+ "Epoch 45/100 \t Train Err: 5.9688 0.0020294189453125 0.125 10.8125 9.6875\n",
+ "Epoch 45/100 \t Train Err: 6.0000 0.0019683837890625 0.1318359375 10.75 9.75\n",
+ "Epoch 45/100 \t Train Err: 6.0312 0.002044677734375 0.12890625 11.0625 9.6875\n",
+ "Epoch 45/100 \t Train Err: 5.9688 0.002197265625 0.12353515625 11.0625 9.6875\n",
+ "Epoch 45/100 \t Train Err: 5.8750 0.0026397705078125 0.1318359375 11.3125 9.375\n",
+ "Epoch 45/100 \t Train Err: 5.9375 0.003204345703125 0.1201171875 11.25 9.4375\n",
+ "Epoch 45/100 \t Train Err: 5.8125 0.003326416015625 0.115234375 11.0 9.375\n",
+ "Epoch 45/100 \t Train Err: 5.9062 0.0030975341796875 0.111328125 11.0 9.5625\n",
+ "Epoch 46/100 \t Train Err: 5.9062 0.0026702880859375 0.10498046875 10.5 9.75\n",
+ "Epoch 46/100 \t Train Err: 5.8125 0.0024566650390625 0.1044921875 10.375 9.625\n",
+ "Epoch 46/100 \t Train Err: 5.8438 0.0024566650390625 0.11474609375 10.875 9.5\n",
+ "Epoch 46/100 \t Train Err: 5.8438 0.0023956298828125 0.11962890625 11.375 9.1875\n",
+ "Epoch 46/100 \t Train Err: 5.7812 0.0023651123046875 0.12060546875 11.125 9.125\n",
+ "Epoch 46/100 \t Train Err: 5.9062 0.0023193359375 0.11767578125 10.875 9.5625\n",
+ "Epoch 46/100 \t Train Err: 5.7500 0.002349853515625 0.09912109375 10.25 9.5625\n",
+ "Epoch 46/100 \t Train Err: 5.7812 0.0024871826171875 0.10986328125 9.9375 9.8125\n",
+ "Epoch 47/100 \t Train Err: 5.7812 0.002960205078125 0.107421875 10.5 9.375\n",
+ "Epoch 47/100 \t Train Err: 5.7812 0.0032501220703125 0.1123046875 10.875 9.1875\n",
+ "Epoch 47/100 \t Train Err: 5.7188 0.0033111572265625 0.11767578125 10.5625 9.125\n",
+ "Epoch 47/100 \t Train Err: 5.7812 0.0030517578125 0.10986328125 10.4375 9.5\n",
+ "Epoch 47/100 \t Train Err: 5.6562 0.002899169921875 0.1181640625 10.625 9.0\n",
+ "Epoch 47/100 \t Train Err: 5.6562 0.0026702880859375 0.12109375 11.0 8.875\n",
+ "Epoch 47/100 \t Train Err: 5.7812 0.00262451171875 0.10302734375 10.4375 9.375\n",
+ "Epoch 47/100 \t Train Err: 5.7812 0.0026702880859375 0.1015625 9.8125 9.75\n",
+ "Epoch 48/100 \t Train Err: 5.7188 0.002655029296875 0.09814453125 9.9375 9.4375\n",
+ "Epoch 48/100 \t Train Err: 5.6562 0.003143310546875 0.111328125 10.75 8.875\n",
+ "Epoch 48/100 \t Train Err: 5.5625 0.00335693359375 0.111328125 10.6875 8.75\n",
+ "Epoch 48/100 \t Train Err: 5.5625 0.003326416015625 0.1044921875 9.8125 9.1875\n",
+ "Epoch 48/100 \t Train Err: 5.6562 0.003265380859375 0.099609375 10.125 9.25\n",
+ "Epoch 48/100 \t Train Err: 5.6875 0.0030670166015625 0.10888671875 10.875 8.875\n",
+ "Epoch 48/100 \t Train Err: 5.5938 0.0027923583984375 0.09619140625 10.25 8.9375\n",
+ "Epoch 48/100 \t Train Err: 5.5938 0.0027313232421875 0.10400390625 9.4375 9.4375\n",
+ "Epoch 49/100 \t Train Err: 5.5312 0.002777099609375 0.09326171875 10.0 8.9375\n",
+ "Epoch 49/100 \t Train Err: 5.5938 0.0031890869140625 0.1015625 10.9375 8.5\n",
+ "Epoch 49/100 \t Train Err: 5.5000 0.00341796875 0.08984375 10.25 8.8125\n",
+ "Epoch 49/100 \t Train Err: 5.5938 0.003662109375 0.07666015625 8.8125 9.75\n",
+ "Epoch 49/100 \t Train Err: 5.5312 0.004547119140625 0.09375 9.6875 9.0625\n",
+ "Epoch 49/100 \t Train Err: 5.5000 0.004638671875 0.1103515625 11.125 8.125\n",
+ "Epoch 49/100 \t Train Err: 5.4375 0.0031890869140625 0.08447265625 8.875 9.3125\n",
+ "Epoch 49/100 \t Train Err: 5.5312 0.0031890869140625 0.0908203125 9.3125 9.375\n",
+ "Epoch 50/100 \t Train Err: 5.5625 0.003265380859375 0.10693359375 11.75 7.625\n",
+ "Epoch 50/100 \t Train Err: 5.5000 0.00341796875 0.07763671875 7.5625 10.1875\n",
+ "Epoch 50/100 \t Train Err: 5.3750 0.00323486328125 0.08056640625 8.8125 9.125\n",
+ "Epoch 50/100 \t Train Err: 5.4688 0.00433349609375 0.11767578125 11.875 7.46875\n",
+ "Epoch 50/100 \t Train Err: 5.5312 0.00433349609375 0.09130859375 8.625 9.75\n",
+ "Epoch 50/100 \t Train Err: 5.4062 0.0047607421875 0.087890625 9.3125 9.0\n",
+ "Epoch 50/100 \t Train Err: 5.4375 0.00457763671875 0.11767578125 11.875 7.4375\n",
+ "Epoch 50/100 \t Train Err: 5.5312 0.003387451171875 0.06640625 6.75 10.8125\n",
+ "Epoch 51/100 \t Train Err: 5.4062 0.003448486328125 0.08984375 9.5 8.75\n",
+ "Epoch 51/100 \t Train Err: 5.4062 0.003662109375 0.11279296875 12.0 7.40625\n",
+ "Epoch 51/100 \t Train Err: 5.3125 0.003692626953125 0.087890625 8.4375 9.1875\n",
+ "Epoch 51/100 \t Train Err: 5.3750 0.003692626953125 0.08349609375 8.5625 9.3125\n",
+ "Epoch 51/100 \t Train Err: 5.3125 0.0037994384765625 0.09765625 10.8125 7.9375\n",
+ "Epoch 51/100 \t Train Err: 5.3750 0.0036773681640625 0.0791015625 9.6875 8.4375\n",
+ "Epoch 51/100 \t Train Err: 5.3125 0.0037078857421875 0.06787109375 8.375 9.25\n",
+ "Epoch 51/100 \t Train Err: 5.2500 0.003936767578125 0.076171875 9.25 8.625\n",
+ "Epoch 52/100 \t Train Err: 5.3125 0.00433349609375 0.08349609375 10.375 8.0\n",
+ "Epoch 52/100 \t Train Err: 5.1875 0.00457763671875 0.0732421875 8.8125 8.6875\n",
+ "Epoch 52/100 \t Train Err: 5.2188 0.0047607421875 0.06787109375 8.25 8.9375\n",
+ "Epoch 52/100 \t Train Err: 5.2812 0.00518798828125 0.08056640625 9.5 8.3125\n",
+ "Epoch 52/100 \t Train Err: 5.0938 0.0047607421875 0.08154296875 9.5 7.9375\n",
+ "Epoch 52/100 \t Train Err: 5.2188 0.00396728515625 0.06591796875 8.0625 9.1875\n",
+ "Epoch 52/100 \t Train Err: 5.1250 0.004180908203125 0.07421875 9.5 8.0\n",
+ "Epoch 52/100 \t Train Err: 5.1250 0.004150390625 0.078125 9.5 8.0625\n",
+ "Epoch 53/100 \t Train Err: 5.1562 0.004180908203125 0.064453125 8.0 8.875\n",
+ "Epoch 53/100 \t Train Err: 5.0312 0.004608154296875 0.0703125 8.3125 8.4375\n",
+ "Epoch 53/100 \t Train Err: 5.0625 0.00531005859375 0.07861328125 8.625 8.1875\n",
+ "Epoch 53/100 \t Train Err: 5.0312 0.005340576171875 0.07763671875 9.0 8.0625\n",
+ "Epoch 53/100 \t Train Err: 5.0312 0.004791259765625 0.07421875 8.4375 8.25\n",
+ "Epoch 53/100 \t Train Err: 5.0312 0.00445556640625 0.0673828125 8.3125 8.375\n",
+ "Epoch 53/100 \t Train Err: 4.9688 0.004486083984375 0.06591796875 8.625 8.0\n",
+ "Epoch 53/100 \t Train Err: 5.0312 0.004486083984375 0.06396484375 8.25 8.4375\n",
+ "Epoch 54/100 \t Train Err: 4.9688 0.004425048828125 0.06689453125 8.3125 8.3125\n",
+ "Epoch 54/100 \t Train Err: 5.0000 0.00457763671875 0.07470703125 8.75 7.96875\n",
+ "Epoch 54/100 \t Train Err: 4.9375 0.004669189453125 0.07080078125 7.96875 8.25\n",
+ "Epoch 54/100 \t Train Err: 4.8750 0.004852294921875 0.07275390625 7.90625 8.1875\n",
+ "Epoch 54/100 \t Train Err: 4.9062 0.00494384765625 0.0791015625 8.5625 7.8125\n",
+ "Epoch 54/100 \t Train Err: 4.9688 0.0045166015625 0.0732421875 7.90625 8.3125\n",
+ "Epoch 54/100 \t Train Err: 4.9375 0.0045166015625 0.06689453125 7.625 8.4375\n",
+ "Epoch 54/100 \t Train Err: 4.8438 0.004608154296875 0.0791015625 8.625 7.5\n",
+ "Epoch 55/100 \t Train Err: 4.9062 0.0045166015625 0.0693359375 7.3125 8.5625\n",
+ "Epoch 55/100 \t Train Err: 4.9062 0.00494384765625 0.07958984375 8.3125 7.875\n",
+ "Epoch 55/100 \t Train Err: 4.8750 0.00531005859375 0.0849609375 8.6875 7.6875\n",
+ "Epoch 55/100 \t Train Err: 4.7812 0.004638671875 0.06494140625 6.8125 8.5625\n",
+ "Epoch 55/100 \t Train Err: 4.8438 0.004791259765625 0.08203125 9.0 7.1875\n",
+ "Epoch 55/100 \t Train Err: 4.9688 0.004241943359375 0.055419921875 5.40625 10.0\n",
+ "Epoch 55/100 \t Train Err: 6.1562 0.005401611328125 0.19140625 20.125 4.1875\n",
+ "Epoch 55/100 \t Train Err: 14.0000 0.005706787109375 0.107421875 0.53515625 36.0\n",
+ "Epoch 56/100 \t Train Err: 11.1875 0.314453125 0.1689453125 0.56640625 29.0\n",
+ "Epoch 56/100 \t Train Err: 7.6250 1.75 3.984375 17.0 9.1875\n",
+ "Epoch 56/100 \t Train Err: 12.3125 2.03125 11.1875 47.25 3.21875\n",
+ "Epoch 56/100 \t Train Err: 7.0625 0.88671875 1.78125 10.1875 12.5625\n",
+ "Epoch 56/100 \t Train Err: 8.7500 0.05078125 0.1953125 2.484375 21.875\n",
+ "Epoch 56/100 \t Train Err: 8.8750 0.1328125 0.0419921875 2.5625 22.0\n",
+ "Epoch 56/100 \t Train Err: 6.6562 0.32421875 0.0888671875 9.6875 12.8125\n",
+ "Epoch 56/100 \t Train Err: 7.5000 0.349609375 0.51171875 28.375 5.90625\n",
+ "Epoch 57/100 \t Train Err: 8.4375 0.337890625 0.66796875 35.75 4.53125\n",
+ "Epoch 57/100 \t Train Err: 7.1562 0.3125 0.265625 25.375 6.375\n",
+ "Epoch 57/100 \t Train Err: 6.3750 0.259765625 0.095703125 10.75 11.5\n",
+ "Epoch 57/100 \t Train Err: 7.0625 0.125 0.08837890625 5.75 15.75\n",
+ "Epoch 57/100 \t Train Err: 7.1562 0.022705078125 0.06005859375 5.75 16.0\n",
+ "Epoch 57/100 \t Train Err: 6.5625 0.1787109375 0.2412109375 8.625 13.0625\n",
+ "Epoch 57/100 \t Train Err: 6.4375 0.443359375 0.59765625 14.5 9.375\n",
+ "Epoch 57/100 \t Train Err: 6.5625 0.408203125 0.6484375 18.875 7.34375\n",
+ "Epoch 58/100 \t Train Err: 6.4062 0.150390625 0.337890625 19.25 6.9375\n",
+ "Epoch 58/100 \t Train Err: 6.1875 0.0218505859375 0.11865234375 16.5 7.65625\n",
+ "Epoch 58/100 \t Train Err: 5.9688 0.1796875 0.30078125 11.4375 9.4375\n",
+ "Epoch 58/100 \t Train Err: 6.1562 0.35546875 0.5703125 8.625 11.0625\n",
+ "Epoch 58/100 \t Train Err: 6.0312 0.31640625 0.55859375 8.125 11.375\n",
+ "Epoch 58/100 \t Train Err: 5.8125 0.130859375 0.333984375 9.375 10.125\n",
+ "Epoch 58/100 \t Train Err: 5.5938 0.01422119140625 0.138671875 12.125 8.4375\n",
+ "Epoch 58/100 \t Train Err: 5.6562 0.06884765625 0.1396484375 14.4375 7.28125\n",
+ "Epoch 59/100 \t Train Err: 5.6875 0.1884765625 0.20703125 14.5625 7.28125\n",
+ "Epoch 59/100 \t Train Err: 5.6562 0.24609375 0.216796875 12.9375 7.96875\n",
+ "Epoch 59/100 \t Train Err: 5.4688 0.2109375 0.162109375 10.3125 9.0\n",
+ "Epoch 59/100 \t Train Err: 5.4688 0.134765625 0.10888671875 8.6875 9.9375\n",
+ "Epoch 59/100 \t Train Err: 5.4375 0.06689453125 0.0966796875 8.125 10.25\n",
+ "Epoch 59/100 \t Train Err: 5.2812 0.0262451171875 0.11279296875 8.3125 9.6875\n",
+ "Epoch 59/100 \t Train Err: 5.2812 0.0106201171875 0.1416015625 10.375 8.5\n",
+ "Epoch 59/100 \t Train Err: 5.3125 0.0084228515625 0.177734375 12.5 7.46875\n",
+ "Epoch 60/100 \t Train Err: 5.3125 0.0126953125 0.1875 12.1875 7.28125\n",
+ "Epoch 60/100 \t Train Err: 5.2188 0.019775390625 0.1982421875 11.1875 7.5625\n",
+ "Epoch 60/100 \t Train Err: 5.1250 0.0240478515625 0.203125 9.3125 8.375\n",
+ "Epoch 60/100 \t Train Err: 5.1250 0.019775390625 0.1875 8.25 9.1875\n",
+ "Epoch 60/100 \t Train Err: 5.1250 0.010986328125 0.1572265625 7.71875 9.1875\n",
+ "Epoch 60/100 \t Train Err: 5.0312 0.007171630859375 0.1259765625 8.375 8.625\n",
+ "Epoch 60/100 \t Train Err: 4.9688 0.0150146484375 0.10400390625 9.125 8.0\n",
+ "Epoch 60/100 \t Train Err: 4.9375 0.0308837890625 0.09033203125 9.625 7.625\n",
+ "Epoch 61/100 \t Train Err: 4.9688 0.046630859375 0.08447265625 9.5625 7.65625\n",
+ "Epoch 61/100 \t Train Err: 4.9688 0.0546875 0.08154296875 9.125 7.875\n",
+ "Epoch 61/100 \t Train Err: 4.9688 0.054443359375 0.0712890625 8.1875 8.5\n",
+ "Epoch 61/100 \t Train Err: 4.9062 0.049072265625 0.07275390625 7.875 8.4375\n",
+ "Epoch 61/100 \t Train Err: 4.8125 0.040771484375 0.0693359375 7.71875 8.25\n",
+ "Epoch 61/100 \t Train Err: 4.8750 0.031494140625 0.06884765625 7.9375 8.4375\n",
+ "Epoch 61/100 \t Train Err: 4.8125 0.0228271484375 0.072265625 8.5625 7.8125\n",
+ "Epoch 61/100 \t Train Err: 4.8125 0.01611328125 0.07568359375 9.0 7.59375\n",
+ "Epoch 62/100 \t Train Err: 4.7500 0.010986328125 0.07861328125 8.5625 7.625\n",
+ "Epoch 62/100 \t Train Err: 4.6875 0.0079345703125 0.08203125 7.9375 7.625\n",
+ "Epoch 62/100 \t Train Err: 4.7500 0.00665283203125 0.0810546875 7.5 8.3125\n",
+ "Epoch 62/100 \t Train Err: 4.7188 0.00634765625 0.07861328125 7.09375 8.3125\n",
+ "Epoch 62/100 \t Train Err: 4.6562 0.006317138671875 0.08349609375 7.40625 8.0\n",
+ "Epoch 62/100 \t Train Err: 4.6562 0.006500244140625 0.0791015625 7.9375 7.625\n",
+ "Epoch 62/100 \t Train Err: 4.7188 0.006500244140625 0.0732421875 8.1875 7.6875\n",
+ "Epoch 62/100 \t Train Err: 4.6875 0.00677490234375 0.0771484375 8.5 7.53125\n",
+ "Epoch 63/100 \t Train Err: 4.6875 0.00732421875 0.07421875 7.9375 7.78125\n",
+ "Epoch 63/100 \t Train Err: 4.6562 0.00836181640625 0.0703125 7.84375 7.8125\n",
+ "Epoch 63/100 \t Train Err: 4.6875 0.0093994140625 0.0654296875 7.65625 7.90625\n",
+ "Epoch 63/100 \t Train Err: 4.6875 0.010498046875 0.0654296875 7.75 7.96875\n",
+ "Epoch 63/100 \t Train Err: 4.6250 0.01116943359375 0.0634765625 7.6875 7.71875\n",
+ "Epoch 63/100 \t Train Err: 4.6562 0.01153564453125 0.064453125 7.75 7.78125\n",
+ "Epoch 63/100 \t Train Err: 4.6562 0.0118408203125 0.060302734375 7.875 7.78125\n",
+ "Epoch 63/100 \t Train Err: 4.5938 0.01171875 0.0634765625 7.625 7.71875\n",
+ "Epoch 64/100 \t Train Err: 4.6562 0.010986328125 0.060791015625 7.53125 7.90625\n",
+ "Epoch 64/100 \t Train Err: 4.5312 0.01019287109375 0.05908203125 7.15625 7.6875\n",
+ "Epoch 64/100 \t Train Err: 4.5312 0.009033203125 0.064453125 7.28125 7.75\n",
+ "Epoch 64/100 \t Train Err: 4.5625 0.00836181640625 0.06298828125 7.5 7.75\n",
+ "Epoch 64/100 \t Train Err: 4.5625 0.007598876953125 0.059326171875 7.5625 7.6875\n",
+ "Epoch 64/100 \t Train Err: 4.5938 0.007110595703125 0.0654296875 7.5625 7.75\n",
+ "Epoch 64/100 \t Train Err: 4.5312 0.00689697265625 0.06396484375 7.53125 7.625\n",
+ "Epoch 64/100 \t Train Err: 4.5625 0.006591796875 0.06494140625 7.59375 7.59375\n",
+ "Epoch 65/100 \t Train Err: 4.5000 0.0064697265625 0.061767578125 7.40625 7.625\n",
+ "Epoch 65/100 \t Train Err: 4.5938 0.006439208984375 0.061767578125 7.625 7.71875\n",
+ "Epoch 65/100 \t Train Err: 4.5625 0.006500244140625 0.0615234375 7.5 7.6875\n",
+ "Epoch 65/100 \t Train Err: 4.5625 0.0064697265625 0.055908203125 7.25 7.75\n",
+ "Epoch 65/100 \t Train Err: 4.5625 0.00640869140625 0.056640625 7.34375 7.75\n",
+ "Epoch 65/100 \t Train Err: 4.4688 0.0064697265625 0.0625 7.375 7.5625\n",
+ "Epoch 65/100 \t Train Err: 4.4688 0.00640869140625 0.060302734375 7.34375 7.53125\n",
+ "Epoch 65/100 \t Train Err: 4.4375 0.00640869140625 0.059814453125 7.03125 7.6875\n",
+ "Epoch 66/100 \t Train Err: 4.5000 0.00628662109375 0.06005859375 6.96875 7.71875\n",
+ "Epoch 66/100 \t Train Err: 4.5312 0.006622314453125 0.058349609375 7.125 7.71875\n",
+ "Epoch 66/100 \t Train Err: 4.4688 0.006500244140625 0.05908203125 7.34375 7.53125\n",
+ "Epoch 66/100 \t Train Err: 4.5000 0.006683349609375 0.06298828125 7.46875 7.4375\n",
+ "Epoch 66/100 \t Train Err: 4.3750 0.006744384765625 0.057861328125 7.0625 7.3125\n",
+ "Epoch 66/100 \t Train Err: 4.4688 0.006683349609375 0.060791015625 7.125 7.5625\n",
+ "Epoch 66/100 \t Train Err: 4.5000 0.006744384765625 0.060302734375 7.25 7.65625\n",
+ "Epoch 66/100 \t Train Err: 4.5000 0.006744384765625 0.056884765625 7.125 7.625\n",
+ "Epoch 67/100 \t Train Err: 4.4688 0.006683349609375 0.05712890625 6.96875 7.65625\n",
+ "Epoch 67/100 \t Train Err: 4.3750 0.006683349609375 0.060546875 6.875 7.4375\n",
+ "Epoch 67/100 \t Train Err: 4.3438 0.0067138671875 0.06298828125 7.21875 7.25\n",
+ "Epoch 67/100 \t Train Err: 4.4688 0.00665283203125 0.06201171875 7.25 7.5625\n",
+ "Epoch 67/100 \t Train Err: 4.4375 0.00628662109375 0.057861328125 6.96875 7.5625\n",
+ "Epoch 67/100 \t Train Err: 4.2500 0.00640869140625 0.055419921875 6.53125 7.4375\n",
+ "Epoch 67/100 \t Train Err: 4.3750 0.0062255859375 0.051513671875 6.96875 7.46875\n",
+ "Epoch 67/100 \t Train Err: 4.4062 0.006195068359375 0.055419921875 6.96875 7.46875\n",
+ "Epoch 68/100 \t Train Err: 4.3438 0.00604248046875 0.052734375 6.71875 7.34375\n",
+ "Epoch 68/100 \t Train Err: 4.4062 0.005950927734375 0.055419921875 6.65625 7.59375\n",
+ "Epoch 68/100 \t Train Err: 4.3750 0.005859375 0.0546875 6.90625 7.34375\n",
+ "Epoch 68/100 \t Train Err: 4.3125 0.005950927734375 0.057373046875 6.6875 7.375\n",
+ "Epoch 68/100 \t Train Err: 4.3750 0.005889892578125 0.05517578125 6.53125 7.625\n",
+ "Epoch 68/100 \t Train Err: 4.3438 0.00592041015625 0.053955078125 6.53125 7.65625\n",
+ "Epoch 68/100 \t Train Err: 4.3438 0.005889892578125 0.056640625 6.65625 7.34375\n",
+ "Epoch 68/100 \t Train Err: 4.3125 0.005950927734375 0.05419921875 7.21875 7.25\n",
+ "Epoch 69/100 \t Train Err: 4.3438 0.005859375 0.057373046875 6.9375 7.25\n",
+ "Epoch 69/100 \t Train Err: 4.4062 0.00592041015625 0.0576171875 6.875 7.4375\n",
+ "Epoch 69/100 \t Train Err: 4.3125 0.00592041015625 0.0556640625 6.65625 7.375\n",
+ "Epoch 69/100 \t Train Err: 4.3438 0.00592041015625 0.0556640625 7.03125 7.40625\n",
+ "Epoch 69/100 \t Train Err: 4.2812 0.005889892578125 0.058837890625 6.53125 7.34375\n",
+ "Epoch 69/100 \t Train Err: 4.2500 0.00579833984375 0.05419921875 6.46875 7.25\n",
+ "Epoch 69/100 \t Train Err: 4.3125 0.005828857421875 0.051513671875 6.625 7.34375\n",
+ "Epoch 69/100 \t Train Err: 4.3125 0.005859375 0.052978515625 6.78125 7.21875\n",
+ "Epoch 70/100 \t Train Err: 4.2812 0.005859375 0.04931640625 6.625 7.25\n",
+ "Epoch 70/100 \t Train Err: 4.1875 0.005767822265625 0.051025390625 6.21875 7.1875\n",
+ "Epoch 70/100 \t Train Err: 4.2500 0.005828857421875 0.05126953125 6.34375 7.4375\n",
+ "Epoch 70/100 \t Train Err: 4.2812 0.005706787109375 0.05126953125 6.28125 7.5\n",
+ "Epoch 70/100 \t Train Err: 4.3438 0.0057373046875 0.05078125 6.40625 7.5\n",
+ "Epoch 70/100 \t Train Err: 4.3125 0.0057373046875 0.05517578125 6.875 7.1875\n",
+ "Epoch 70/100 \t Train Err: 4.1562 0.005828857421875 0.052734375 6.78125 6.875\n",
+ "Epoch 70/100 \t Train Err: 4.2188 0.00579833984375 0.05419921875 6.59375 7.15625\n",
+ "Epoch 71/100 \t Train Err: 4.3125 0.00567626953125 0.052490234375 6.5 7.46875\n",
+ "Epoch 71/100 \t Train Err: 4.2500 0.00567626953125 0.05126953125 6.125 7.5\n",
+ "Epoch 71/100 \t Train Err: 4.2500 0.005706787109375 0.05224609375 6.5 7.4375\n",
+ "Epoch 71/100 \t Train Err: 4.1875 0.0057373046875 0.053466796875 6.65625 6.96875\n",
+ "Epoch 71/100 \t Train Err: 4.2812 0.005767822265625 0.0546875 6.875 7.09375\n",
+ "Epoch 71/100 \t Train Err: 4.1562 0.005645751953125 0.054443359375 6.40625 6.96875\n",
+ "Epoch 71/100 \t Train Err: 4.2188 0.005615234375 0.0498046875 6.3125 7.4375\n",
+ "Epoch 71/100 \t Train Err: 4.2500 0.00555419921875 0.05224609375 6.40625 7.40625\n",
+ "Epoch 72/100 \t Train Err: 4.2500 0.005615234375 0.052978515625 6.15625 7.375\n",
+ "Epoch 72/100 \t Train Err: 4.2500 0.00537109375 0.0517578125 6.5 7.25\n",
+ "Epoch 72/100 \t Train Err: 4.1562 0.00543212890625 0.05029296875 6.4375 7.0\n",
+ "Epoch 72/100 \t Train Err: 4.1875 0.005401611328125 0.04638671875 6.3125 7.09375\n",
+ "Epoch 72/100 \t Train Err: 4.1562 0.00537109375 0.048828125 6.34375 7.09375\n",
+ "Epoch 72/100 \t Train Err: 4.2500 0.00537109375 0.0498046875 6.21875 7.3125\n",
+ "Epoch 72/100 \t Train Err: 4.2188 0.00531005859375 0.05078125 6.46875 7.21875\n",
+ "Epoch 72/100 \t Train Err: 4.1250 0.00537109375 0.05078125 6.46875 6.90625\n",
+ "Epoch 73/100 \t Train Err: 4.2500 0.005340576171875 0.053955078125 6.40625 7.1875\n",
+ "Epoch 73/100 \t Train Err: 4.1875 0.005279541015625 0.0517578125 6.59375 6.96875\n",
+ "Epoch 73/100 \t Train Err: 4.2500 0.00537109375 0.0556640625 6.40625 7.21875\n",
+ "Epoch 73/100 \t Train Err: 4.1250 0.00531005859375 0.051513671875 6.34375 6.96875\n",
+ "Epoch 73/100 \t Train Err: 4.2188 0.005279541015625 0.04833984375 6.0 7.46875\n",
+ "Epoch 73/100 \t Train Err: 4.1250 0.00537109375 0.050537109375 6.0625 7.1875\n",
+ "Epoch 73/100 \t Train Err: 4.1250 0.00531005859375 0.048095703125 5.90625 7.09375\n",
+ "Epoch 73/100 \t Train Err: 4.1562 0.0052490234375 0.05029296875 6.375 6.90625\n",
+ "Epoch 74/100 \t Train Err: 4.0625 0.005523681640625 0.049560546875 6.4375 6.8125\n",
+ "Epoch 74/100 \t Train Err: 4.1875 0.005279541015625 0.049072265625 5.90625 7.34375\n",
+ "Epoch 74/100 \t Train Err: 4.2188 0.00531005859375 0.04931640625 6.09375 7.375\n",
+ "Epoch 74/100 \t Train Err: 4.0938 0.00531005859375 0.05029296875 6.21875 6.96875\n",
+ "Epoch 74/100 \t Train Err: 4.1562 0.0052490234375 0.053955078125 6.3125 7.09375\n",
+ "Epoch 74/100 \t Train Err: 4.1250 0.00518798828125 0.05126953125 6.3125 7.0625\n",
+ "Epoch 74/100 \t Train Err: 4.1562 0.0052490234375 0.051513671875 6.21875 7.0625\n",
+ "Epoch 74/100 \t Train Err: 4.0938 0.005218505859375 0.049560546875 6.1875 7.0\n",
+ "Epoch 75/100 \t Train Err: 4.1250 0.005218505859375 0.0517578125 6.3125 7.0625\n",
+ "Epoch 75/100 \t Train Err: 4.0312 0.005218505859375 0.050048828125 6.03125 6.875\n",
+ "Epoch 75/100 \t Train Err: 4.0938 0.005157470703125 0.0498046875 5.84375 7.15625\n",
+ "Epoch 75/100 \t Train Err: 4.1250 0.005126953125 0.04541015625 5.875 7.21875\n",
+ "Epoch 75/100 \t Train Err: 4.0625 0.005126953125 0.05126953125 6.09375 6.9375\n",
+ "Epoch 75/100 \t Train Err: 4.0000 0.005218505859375 0.050537109375 6.40625 6.53125\n",
+ "Epoch 75/100 \t Train Err: 4.1250 0.00518798828125 0.049560546875 6.09375 6.9375\n",
+ "Epoch 75/100 \t Train Err: 4.0625 0.005126953125 0.046875 5.59375 7.1875\n",
+ "Epoch 76/100 \t Train Err: 4.0000 0.00506591796875 0.045654296875 5.5 7.15625\n",
+ "Epoch 76/100 \t Train Err: 4.1562 0.005157470703125 0.0458984375 6.0 7.25\n",
+ "Epoch 76/100 \t Train Err: 4.0625 0.00518798828125 0.048095703125 6.5 6.59375\n",
+ "Epoch 76/100 \t Train Err: 4.0625 0.005096435546875 0.0458984375 6.0 6.875\n",
+ "Epoch 76/100 \t Train Err: 4.0312 0.0052490234375 0.044921875 5.75 7.03125\n",
+ "Epoch 76/100 \t Train Err: 4.1562 0.00518798828125 0.043701171875 5.75 7.28125\n",
+ "Epoch 76/100 \t Train Err: 4.0625 0.005096435546875 0.0498046875 5.84375 7.09375\n",
+ "Epoch 76/100 \t Train Err: 4.0625 0.005126953125 0.044189453125 6.25 6.8125\n",
+ "Epoch 77/100 \t Train Err: 3.9688 0.005126953125 0.0478515625 6.15625 6.59375\n",
+ "Epoch 77/100 \t Train Err: 4.0312 0.005126953125 0.046142578125 5.84375 6.90625\n",
+ "Epoch 77/100 \t Train Err: 4.0000 0.00506591796875 0.04541015625 5.71875 6.96875\n",
+ "Epoch 77/100 \t Train Err: 3.9531 0.004974365234375 0.046142578125 5.46875 7.0\n",
+ "Epoch 77/100 \t Train Err: 4.0312 0.004974365234375 0.044677734375 6.0625 6.8125\n",
+ "Epoch 77/100 \t Train Err: 4.0312 0.005035400390625 0.047607421875 6.125 6.6875\n",
+ "Epoch 77/100 \t Train Err: 4.0312 0.004852294921875 0.047607421875 6.09375 6.84375\n",
+ "Epoch 77/100 \t Train Err: 3.9844 0.0048828125 0.043701171875 5.53125 6.96875\n",
+ "Epoch 78/100 \t Train Err: 3.9844 0.004852294921875 0.045654296875 5.875 6.875\n",
+ "Epoch 78/100 \t Train Err: 3.9844 0.004913330078125 0.0458984375 5.8125 6.78125\n",
+ "Epoch 78/100 \t Train Err: 3.9375 0.0048828125 0.046142578125 5.6875 6.65625\n",
+ "Epoch 78/100 \t Train Err: 3.9531 0.004974365234375 0.04541015625 5.71875 6.78125\n",
+ "Epoch 78/100 \t Train Err: 4.0000 0.004913330078125 0.0458984375 5.5625 7.0\n",
+ "Epoch 78/100 \t Train Err: 3.9375 0.0048828125 0.041748046875 5.46875 6.90625\n",
+ "Epoch 78/100 \t Train Err: 3.9688 0.00494384765625 0.043701171875 5.71875 6.875\n",
+ "Epoch 78/100 \t Train Err: 3.9375 0.005035400390625 0.048095703125 5.96875 6.46875\n",
+ "Epoch 79/100 \t Train Err: 3.9844 0.0050048828125 0.049072265625 6.03125 6.625\n",
+ "Epoch 79/100 \t Train Err: 3.9688 0.0050048828125 0.040283203125 4.90625 7.25\n",
+ "Epoch 79/100 \t Train Err: 4.0312 0.004974365234375 0.04345703125 5.4375 7.1875\n",
+ "Epoch 79/100 \t Train Err: 3.9688 0.005035400390625 0.045166015625 6.03125 6.5625\n",
+ "Epoch 79/100 \t Train Err: 4.0000 0.005035400390625 0.0478515625 6.46875 6.28125\n",
+ "Epoch 79/100 \t Train Err: 3.9219 0.005035400390625 0.043701171875 5.03125 7.03125\n",
+ "Epoch 79/100 \t Train Err: 4.0312 0.004974365234375 0.044921875 4.90625 7.375\n",
+ "Epoch 79/100 \t Train Err: 3.9062 0.004974365234375 0.04296875 5.8125 6.625\n",
+ "Epoch 80/100 \t Train Err: 3.9219 0.0050048828125 0.04638671875 6.40625 6.25\n",
+ "Epoch 80/100 \t Train Err: 3.8750 0.004974365234375 0.045654296875 5.65625 6.5\n",
+ "Epoch 80/100 \t Train Err: 3.9531 0.0048828125 0.04345703125 5.15625 7.15625\n",
+ "Epoch 80/100 \t Train Err: 3.9688 0.004913330078125 0.043701171875 5.25 7.0\n",
+ "Epoch 80/100 \t Train Err: 3.8906 0.004974365234375 0.04345703125 5.875 6.53125\n",
+ "Epoch 80/100 \t Train Err: 3.8281 0.00494384765625 0.044677734375 5.65625 6.4375\n",
+ "Epoch 80/100 \t Train Err: 3.9531 0.004852294921875 0.041259765625 5.4375 6.96875\n",
+ "Epoch 80/100 \t Train Err: 3.9531 0.00494384765625 0.04345703125 5.15625 7.0\n",
+ "Epoch 81/100 \t Train Err: 3.9844 0.0048828125 0.0419921875 5.5625 6.8125\n",
+ "Epoch 81/100 \t Train Err: 3.9219 0.0048828125 0.04443359375 5.71875 6.46875\n",
+ "Epoch 81/100 \t Train Err: 3.9375 0.00482177734375 0.046630859375 5.71875 6.625\n",
+ "Epoch 81/100 \t Train Err: 3.9219 0.00469970703125 0.042236328125 5.03125 6.90625\n",
+ "Epoch 81/100 \t Train Err: 3.9219 0.0047607421875 0.042236328125 5.0625 7.03125\n",
+ "Epoch 81/100 \t Train Err: 3.9844 0.004791259765625 0.044189453125 5.5625 6.90625\n",
+ "Epoch 81/100 \t Train Err: 3.8750 0.0048828125 0.044677734375 5.875 6.375\n",
+ "Epoch 81/100 \t Train Err: 3.8438 0.00482177734375 0.045166015625 5.40625 6.5625\n",
+ "Epoch 82/100 \t Train Err: 3.8281 0.0048828125 0.044189453125 5.0625 6.8125\n",
+ "Epoch 82/100 \t Train Err: 3.8906 0.004913330078125 0.042724609375 5.15625 7.03125\n",
+ "Epoch 82/100 \t Train Err: 3.8750 0.00482177734375 0.048583984375 5.90625 6.34375\n",
+ "Epoch 82/100 \t Train Err: 3.8594 0.004913330078125 0.045166015625 5.6875 6.28125\n",
+ "Epoch 82/100 \t Train Err: 3.8594 0.004852294921875 0.043701171875 5.28125 6.78125\n",
+ "Epoch 82/100 \t Train Err: 3.8594 0.004852294921875 0.042236328125 4.84375 6.9375\n",
+ "Epoch 82/100 \t Train Err: 3.7969 0.0048828125 0.0400390625 5.15625 6.625\n",
+ "Epoch 82/100 \t Train Err: 3.8281 0.004730224609375 0.04736328125 5.53125 6.28125\n",
+ "Epoch 83/100 \t Train Err: 3.8438 0.0047607421875 0.044921875 5.3125 6.53125\n",
+ "Epoch 83/100 \t Train Err: 3.7969 0.00469970703125 0.043212890625 5.09375 6.625\n",
+ "Epoch 83/100 \t Train Err: 3.8125 0.004669189453125 0.043701171875 5.1875 6.71875\n",
+ "Epoch 83/100 \t Train Err: 3.8281 0.004669189453125 0.042236328125 5.09375 6.6875\n",
+ "Epoch 83/100 \t Train Err: 3.8125 0.0047607421875 0.042236328125 5.40625 6.46875\n",
+ "Epoch 83/100 \t Train Err: 3.8594 0.00482177734375 0.04150390625 5.34375 6.59375\n",
+ "Epoch 83/100 \t Train Err: 3.7500 0.00482177734375 0.041748046875 5.3125 6.375\n",
+ "Epoch 83/100 \t Train Err: 3.7812 0.004791259765625 0.040771484375 4.75 6.75\n",
+ "Epoch 84/100 \t Train Err: 3.7188 0.0047607421875 0.0390625 5.21875 6.34375\n",
+ "Epoch 84/100 \t Train Err: 3.7656 0.0047607421875 0.040771484375 5.09375 6.40625\n",
+ "Epoch 84/100 \t Train Err: 3.7969 0.004852294921875 0.04248046875 5.25 6.34375\n",
+ "Epoch 84/100 \t Train Err: 3.7344 0.0047607421875 0.03955078125 5.125 6.3125\n",
+ "Epoch 84/100 \t Train Err: 3.7500 0.004791259765625 0.03857421875 4.5 6.90625\n",
+ "Epoch 84/100 \t Train Err: 3.7656 0.00482177734375 0.04248046875 5.1875 6.40625\n",
+ "Epoch 84/100 \t Train Err: 3.7656 0.00482177734375 0.042724609375 5.3125 6.21875\n",
+ "Epoch 84/100 \t Train Err: 3.7188 0.004730224609375 0.04345703125 5.21875 6.28125\n",
+ "Epoch 85/100 \t Train Err: 3.7500 0.004608154296875 0.039794921875 4.65625 6.625\n",
+ "Epoch 85/100 \t Train Err: 3.7344 0.004608154296875 0.04150390625 4.71875 6.53125\n",
+ "Epoch 85/100 \t Train Err: 3.7344 0.004730224609375 0.042236328125 5.375 6.3125\n",
+ "Epoch 85/100 \t Train Err: 3.7656 0.004730224609375 0.040283203125 5.4375 6.28125\n",
+ "Epoch 85/100 \t Train Err: 3.6875 0.00469970703125 0.0400390625 5.0 6.28125\n",
+ "Epoch 85/100 \t Train Err: 3.7031 0.00469970703125 0.037109375 4.8125 6.53125\n",
+ "Epoch 85/100 \t Train Err: 3.7500 0.00469970703125 0.038330078125 4.78125 6.625\n",
+ "Epoch 85/100 \t Train Err: 3.6875 0.004669189453125 0.0400390625 5.34375 6.0625\n",
+ "Epoch 86/100 \t Train Err: 3.7344 0.004638671875 0.037841796875 5.40625 6.25\n",
+ "Epoch 86/100 \t Train Err: 3.7031 0.004638671875 0.0380859375 4.875 6.46875\n",
+ "Epoch 86/100 \t Train Err: 3.7344 0.00457763671875 0.0390625 4.59375 6.6875\n",
+ "Epoch 86/100 \t Train Err: 3.7031 0.004638671875 0.036865234375 5.09375 6.21875\n",
+ "Epoch 86/100 \t Train Err: 3.7031 0.004638671875 0.0390625 5.28125 6.15625\n",
+ "Epoch 86/100 \t Train Err: 3.7500 0.004669189453125 0.037841796875 4.875 6.59375\n",
+ "Epoch 86/100 \t Train Err: 3.7188 0.004638671875 0.0390625 4.625 6.5625\n",
+ "Epoch 86/100 \t Train Err: 3.6406 0.004730224609375 0.039306640625 5.125 5.9375\n",
+ "Epoch 87/100 \t Train Err: 3.6875 0.00469970703125 0.03759765625 4.9375 6.28125\n",
+ "Epoch 87/100 \t Train Err: 3.6875 0.004730224609375 0.039794921875 4.78125 6.25\n",
+ "Epoch 87/100 \t Train Err: 3.7031 0.004791259765625 0.0361328125 4.65625 6.53125\n",
+ "Epoch 87/100 \t Train Err: 3.6719 0.0047607421875 0.03662109375 4.625 6.5\n",
+ "Epoch 87/100 \t Train Err: 3.6094 0.004791259765625 0.037109375 5.125 6.0\n",
+ "Epoch 87/100 \t Train Err: 3.7344 0.00482177734375 0.036865234375 5.375 6.15625\n",
+ "Epoch 87/100 \t Train Err: 3.6719 0.0047607421875 0.037353515625 4.65625 6.53125\n",
+ "Epoch 87/100 \t Train Err: 3.6562 0.004791259765625 0.03662109375 4.59375 6.4375\n",
+ "Epoch 88/100 \t Train Err: 3.6562 0.004638671875 0.03857421875 5.28125 6.0\n",
+ "Epoch 88/100 \t Train Err: 3.5938 0.004730224609375 0.040283203125 4.875 6.03125\n",
+ "Epoch 88/100 \t Train Err: 3.6875 0.004608154296875 0.03955078125 4.5625 6.5\n",
+ "Epoch 88/100 \t Train Err: 3.6875 0.004730224609375 0.0380859375 4.71875 6.46875\n",
+ "Epoch 88/100 \t Train Err: 3.5469 0.004608154296875 0.037353515625 4.53125 6.21875\n",
+ "Epoch 88/100 \t Train Err: 3.6250 0.004608154296875 0.040771484375 4.9375 6.125\n",
+ "Epoch 88/100 \t Train Err: 3.6094 0.004486083984375 0.038330078125 4.8125 6.21875\n",
+ "Epoch 88/100 \t Train Err: 3.5938 0.0045166015625 0.0400390625 4.75 6.21875\n",
+ "Epoch 89/100 \t Train Err: 3.6406 0.0045166015625 0.038330078125 4.875 6.21875\n",
+ "Epoch 89/100 \t Train Err: 3.5469 0.00457763671875 0.041748046875 5.21875 5.6875\n",
+ "Epoch 89/100 \t Train Err: 3.6250 0.004608154296875 0.03759765625 4.46875 6.40625\n",
+ "Epoch 89/100 \t Train Err: 3.5469 0.004638671875 0.035400390625 4.625 6.15625\n",
+ "Epoch 89/100 \t Train Err: 3.6094 0.00469970703125 0.0341796875 4.40625 6.375\n",
+ "Epoch 89/100 \t Train Err: 3.5781 0.00469970703125 0.0361328125 4.75 6.0625\n",
+ "Epoch 89/100 \t Train Err: 3.4688 0.0047607421875 0.0341796875 4.53125 5.96875\n",
+ "Epoch 89/100 \t Train Err: 3.5781 0.00469970703125 0.033935546875 4.40625 6.34375\n",
+ "Epoch 90/100 \t Train Err: 3.5625 0.0047607421875 0.03369140625 4.53125 6.25\n",
+ "Epoch 90/100 \t Train Err: 3.6094 0.004791259765625 0.033447265625 4.875 6.0625\n",
+ "Epoch 90/100 \t Train Err: 3.6250 0.0048828125 0.034423828125 4.40625 6.34375\n",
+ "Epoch 90/100 \t Train Err: 3.5469 0.004913330078125 0.03662109375 4.71875 6.0\n",
+ "Epoch 90/100 \t Train Err: 3.5000 0.004913330078125 0.035400390625 4.40625 6.125\n",
+ "Epoch 90/100 \t Train Err: 3.5469 0.00494384765625 0.03369140625 4.125 6.375\n",
+ "Epoch 90/100 \t Train Err: 3.5156 0.004852294921875 0.035400390625 4.53125 5.96875\n",
+ "Epoch 90/100 \t Train Err: 3.5156 0.00469970703125 0.036376953125 4.71875 5.90625\n",
+ "Epoch 91/100 \t Train Err: 3.5312 0.0047607421875 0.033447265625 4.375 6.28125\n",
+ "Epoch 91/100 \t Train Err: 3.5469 0.004730224609375 0.032958984375 4.34375 6.3125\n",
+ "Epoch 91/100 \t Train Err: 3.6094 0.004669189453125 0.0390625 5.34375 5.65625\n",
+ "Epoch 91/100 \t Train Err: 3.5156 0.004608154296875 0.032958984375 4.28125 6.0625\n",
+ "Epoch 91/100 \t Train Err: 3.5469 0.00457763671875 0.03125 4.03125 6.4375\n",
+ "Epoch 91/100 \t Train Err: 3.5312 0.004608154296875 0.03564453125 4.84375 5.71875\n",
+ "Epoch 91/100 \t Train Err: 3.4844 0.004547119140625 0.0322265625 4.5 5.90625\n",
+ "Epoch 91/100 \t Train Err: 3.5312 0.004608154296875 0.0299072265625 3.953125 6.625\n",
+ "Epoch 92/100 \t Train Err: 3.5000 0.004547119140625 0.03515625 5.21875 5.46875\n",
+ "Epoch 92/100 \t Train Err: 3.4688 0.004486083984375 0.034423828125 4.53125 5.9375\n",
+ "Epoch 92/100 \t Train Err: 3.4531 0.00457763671875 0.032470703125 3.875 6.375\n",
+ "Epoch 92/100 \t Train Err: 3.5938 0.004669189453125 0.038330078125 4.9375 5.84375\n",
+ "Epoch 92/100 \t Train Err: 3.5156 0.004730224609375 0.03369140625 5.03125 5.6875\n",
+ "Epoch 92/100 \t Train Err: 3.5625 0.004791259765625 0.029052734375 3.890625 6.65625\n",
+ "Epoch 92/100 \t Train Err: 3.4531 0.004730224609375 0.032470703125 4.5 5.90625\n",
+ "Epoch 92/100 \t Train Err: 3.4531 0.0047607421875 0.030517578125 4.71875 5.6875\n",
+ "Epoch 93/100 \t Train Err: 3.5156 0.00482177734375 0.0281982421875 3.78125 6.53125\n",
+ "Epoch 93/100 \t Train Err: 3.4531 0.00482177734375 0.03173828125 4.6875 5.5625\n",
+ "Epoch 93/100 \t Train Err: 3.4531 0.004791259765625 0.03271484375 4.6875 5.6875\n",
+ "Epoch 93/100 \t Train Err: 3.4375 0.00469970703125 0.0279541015625 3.96875 6.25\n",
+ "Epoch 93/100 \t Train Err: 3.3594 0.004730224609375 0.03076171875 4.125 5.9375\n",
+ "Epoch 93/100 \t Train Err: 3.4688 0.004730224609375 0.0301513671875 4.96875 5.625\n",
+ "Epoch 93/100 \t Train Err: 3.3906 0.004730224609375 0.0296630859375 4.03125 6.0625\n",
+ "Epoch 93/100 \t Train Err: 3.4688 0.0047607421875 0.0294189453125 4.25 6.0625\n",
+ "Epoch 94/100 \t Train Err: 3.3906 0.0047607421875 0.031005859375 4.375 5.6875\n",
+ "Epoch 94/100 \t Train Err: 3.4219 0.004791259765625 0.031494140625 4.53125 5.8125\n",
+ "Epoch 94/100 \t Train Err: 3.4375 0.004791259765625 0.0281982421875 4.25 6.03125\n",
+ "Epoch 94/100 \t Train Err: 3.4219 0.004791259765625 0.02978515625 4.34375 5.8125\n",
+ "Epoch 94/100 \t Train Err: 3.4219 0.00469970703125 0.0308837890625 4.375 5.78125\n",
+ "Epoch 94/100 \t Train Err: 3.4375 0.004669189453125 0.028564453125 4.25 6.0\n",
+ "Epoch 94/100 \t Train Err: 3.3594 0.00469970703125 0.0283203125 4.25 5.875\n",
+ "Epoch 94/100 \t Train Err: 3.3594 0.004852294921875 0.03125 4.28125 5.6875\n",
+ "Epoch 95/100 \t Train Err: 3.3750 0.004791259765625 0.0284423828125 4.25 5.6875\n",
+ "Epoch 95/100 \t Train Err: 3.3750 0.0047607421875 0.0279541015625 3.90625 5.96875\n",
+ "Epoch 95/100 \t Train Err: 3.3594 0.004730224609375 0.029052734375 4.28125 5.71875\n",
+ "Epoch 95/100 \t Train Err: 3.3906 0.004791259765625 0.0296630859375 4.3125 5.78125\n",
+ "Epoch 95/100 \t Train Err: 3.3750 0.004791259765625 0.03076171875 4.125 5.78125\n",
+ "Epoch 95/100 \t Train Err: 3.3594 0.0047607421875 0.0291748046875 4.25 5.8125\n",
+ "Epoch 95/100 \t Train Err: 3.3438 0.004669189453125 0.02783203125 4.15625 5.875\n",
+ "Epoch 95/100 \t Train Err: 3.3281 0.0047607421875 0.0299072265625 4.375 5.5\n",
+ "Epoch 96/100 \t Train Err: 3.3438 0.0048828125 0.02880859375 3.84375 5.90625\n",
+ "Epoch 96/100 \t Train Err: 3.3750 0.0047607421875 0.030029296875 4.0625 5.8125\n",
+ "Epoch 96/100 \t Train Err: 3.3438 0.00482177734375 0.0308837890625 3.9375 5.71875\n",
+ "Epoch 96/100 \t Train Err: 3.3125 0.004730224609375 0.028564453125 4.21875 5.59375\n",
+ "Epoch 96/100 \t Train Err: 3.2969 0.004638671875 0.0291748046875 3.765625 5.90625\n",
+ "Epoch 96/100 \t Train Err: 3.3750 0.004638671875 0.034912109375 4.375 5.75\n",
+ "Epoch 96/100 \t Train Err: 3.2656 0.004638671875 0.029296875 4.125 5.53125\n",
+ "Epoch 96/100 \t Train Err: 3.2500 0.004638671875 0.0286865234375 3.984375 5.625\n",
+ "Epoch 97/100 \t Train Err: 3.2656 0.00457763671875 0.028564453125 4.125 5.59375\n",
+ "Epoch 97/100 \t Train Err: 3.3438 0.004547119140625 0.02587890625 3.859375 5.9375\n",
+ "Epoch 97/100 \t Train Err: 3.3125 0.00457763671875 0.02783203125 4.03125 5.6875\n",
+ "Epoch 97/100 \t Train Err: 3.3438 0.004608154296875 0.02783203125 4.125 5.6875\n",
+ "Epoch 97/100 \t Train Err: 3.3125 0.004730224609375 0.0279541015625 3.875 5.75\n",
+ "Epoch 97/100 \t Train Err: 3.2500 0.004730224609375 0.0289306640625 3.78125 5.6875\n",
+ "Epoch 97/100 \t Train Err: 3.2812 0.004669189453125 0.030517578125 4.03125 5.5625\n",
+ "Epoch 97/100 \t Train Err: 3.3281 0.004638671875 0.029296875 4.1875 5.625\n",
+ "Epoch 98/100 \t Train Err: 3.2969 0.004669189453125 0.0264892578125 3.734375 6.0\n",
+ "Epoch 98/100 \t Train Err: 3.2031 0.0047607421875 0.0262451171875 3.921875 5.5\n",
+ "Epoch 98/100 \t Train Err: 3.2969 0.004791259765625 0.02685546875 4.59375 5.1875\n",
+ "Epoch 98/100 \t Train Err: 3.3750 0.0047607421875 0.02392578125 3.34375 6.5\n",
+ "Epoch 98/100 \t Train Err: 3.4688 0.004791259765625 0.033447265625 5.625 4.625\n",
+ "Epoch 98/100 \t Train Err: 4.5625 0.00482177734375 0.01953125 1.4296875 11.0625\n",
+ "Epoch 98/100 \t Train Err: 10.6875 0.00567626953125 0.44140625 46.5 1.109375\n",
+ "Epoch 98/100 \t Train Err: 11.5625 0.0096435546875 0.0252685546875 0.322265625 29.875\n",
+ "Epoch 99/100 \t Train Err: 12.5000 0.1318359375 0.0341796875 0.1640625 32.25\n",
+ "Epoch 99/100 \t Train Err: 7.3125 0.71484375 0.66796875 2.953125 17.125\n",
+ "Epoch 99/100 \t Train Err: 9.1250 1.265625 3.046875 38.0 3.265625\n",
+ "Epoch 99/100 \t Train Err: 9.4375 1.078125 3.578125 40.5 2.859375\n",
+ "Epoch 99/100 \t Train Err: 5.9688 0.419921875 1.484375 14.3125 8.125\n",
+ "Epoch 99/100 \t Train Err: 6.6250 0.01513671875 0.3125 4.5 15.25\n",
+ "Epoch 99/100 \t Train Err: 7.5000 0.326171875 0.042236328125 2.78125 18.0\n",
+ "Epoch 99/100 \t Train Err: 6.6250 0.765625 0.087890625 5.0 14.375\n"
]
}
],
@@ -889,9 +1363,14 @@
" \n",
" # test_err.append(test_loss)\n",
" train_err.append(train_loss)\n",
+ " len1.append(criterion(output[batch_labels == 1].squeeze(1), batch_labels[batch_labels==1]))\n",
+ " len2.append(criterion(output[batch_labels == 2].squeeze(1), batch_labels[batch_labels==2]))\n",
+ " len3.append(criterion(output[batch_labels == 3].squeeze(1), batch_labels[batch_labels==3]))\n",
+ " len15.append(criterion(output[batch_labels == 15].squeeze(1), batch_labels[batch_labels==15]))\n",
+ " \n",
" with open('loss', 'a') as f:\n",
" f.write(f\"{train_loss}\\n\")\n",
- " print(f\"Epoch {epoch}/{NEPOCHS} \\t Train Err: {train_loss:.4f}\")\n",
+ " print(f\"Epoch {epoch}/{NEPOCHS} \\t Train Err: {train_loss:.4f} {len1[-1]} {len2[-1]} {len3[-1]} {len15[-1]}\")\n",
"\n",
" epoch += 1\n",
" if epoch % 100 == 0:\n",
@@ -900,7 +1379,8 @@
},
{
"cell_type": "code",
- "execution_count": 125,
+ "execution_count": 16,
+ "execution_state": "idle",
"metadata": {},
"outputs": [],
"source": [
@@ -929,21 +1409,10 @@
},
{
"cell_type": "code",
- "execution_count": 21,
+ "execution_count": null,
"execution_state": "idle",
"metadata": {},
- "outputs": [
- {
- "data": {
-           "image/png": "...(base64-encoded PNG data omitted)...",
- "text/plain": [
- "<Figure size 640x480 with 1 Axes>"
- ]
- },
- "metadata": {},
- "output_type": "display_data"
- }
- ],
+ "outputs": [],
"source": [
"plt.suptitle('MSE vs Epochs')\n",
"plt.plot(train_err, label='Train', color='blue')\n",
@@ -954,7 +1423,7 @@
},
{
"cell_type": "code",
- "execution_count": 13,
+ "execution_count": 41,
"execution_state": "idle",
"metadata": {
"id": "LoGEmM5lH7_A"
@@ -962,46 +1431,9 @@
"outputs": [
{
"data": {
+           "image/png": "...(base64-encoded PNG data omitted)...",
"text/plain": [
- "(array([[3.1870e+04, 4.5000e+01, 0.0000e+00, ..., 0.0000e+00, 0.0000e+00,\n",
- " 0.0000e+00],\n",
- " [0.0000e+00, 0.0000e+00, 0.0000e+00, ..., 0.0000e+00, 0.0000e+00,\n",
- " 0.0000e+00],\n",
- " [0.0000e+00, 0.0000e+00, 0.0000e+00, ..., 0.0000e+00, 0.0000e+00,\n",
- " 0.0000e+00],\n",
- " ...,\n",
- " [0.0000e+00, 0.0000e+00, 0.0000e+00, ..., 0.0000e+00, 0.0000e+00,\n",
- " 0.0000e+00],\n",
- " [0.0000e+00, 0.0000e+00, 0.0000e+00, ..., 0.0000e+00, 0.0000e+00,\n",
- " 0.0000e+00],\n",
- " [0.0000e+00, 0.0000e+00, 0.0000e+00, ..., 1.3300e+02, 2.9200e+02,\n",
- " 4.8201e+04]]),\n",
- " array([ 1. , 1.28 , 1.561, 1.84 , 2.121, 2.4 , 2.68 , 2.96 ,\n",
- " 3.24 , 3.52 , 3.8 , 4.08 , 4.36 , 4.64 , 4.92 , 5.2 ,\n",
- " 5.48 , 5.76 , 6.04 , 6.32 , 6.6 , 6.88 , 7.16 , 7.44 ,\n",
- " 7.72 , 8. , 8.28 , 8.56 , 8.84 , 9.12 , 9.4 , 9.68 ,\n",
- " 9.96 , 10.24 , 10.52 , 10.805, 11.08 , 11.36 , 11.64 , 11.92 ,\n",
- " 12.2 , 12.484, 12.76 , 13.04 , 13.32 , 13.6 , 13.88 , 14.164,\n",
- " 14.44 , 14.72 , 15. ], dtype=float16),\n",
- " array([ 0.824, 1.1 , 1.376, 1.652, 1.928, 2.203, 2.48 , 2.756,\n",
- " 3.031, 3.307, 3.582, 3.86 , 4.133, 4.41 , 4.688, 4.96 ,\n",
- " 5.24 , 5.516, 5.79 , 6.066, 6.34 , 6.617, 6.895, 7.168,\n",
- " 7.445, 7.723, 7.996, 8.27 , 8.55 , 8.83 , 9.09 , 9.375,\n",
- " 9.66 , 9.92 , 10.2 , 10.484, 10.75 , 11.03 , 11.31 , 11.58 ,\n",
- " 11.86 , 12.14 , 12.41 , 12.69 , 12.97 , 13.234, 13.516, 13.8 ,\n",
- " 14.06 , 14.34 , 14.625], dtype=float16),\n",
- " <matplotlib.collections.QuadMesh at 0x7fe60c0a49b0>)"
- ]
- },
- "execution_count": 13,
- "metadata": {},
- "output_type": "execute_result"
- },
- {
- "data": {
- "image/png": "iVBORw0KGgoAAAANSUhEUgAAAh8AAAGdCAYAAACyzRGfAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjkuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8hTgPZAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAhqUlEQVR4nO3df3CU5b338c/m1yalyUJSIYkkGC1HFBDxSJ1HnBbGKM0gyukRRwcpgzNt1ShEHAppja0/IEZbiz8YEP8QOiNaO8egZYo+eSiCTgWBiJWxw4+BYpQfsefgbhLMstm9nz8eydNglr2j917XveH9mtk/dvMl3+/chOTDlb2uO+A4jiMAAABDsmwPAAAAzi2EDwAAYBThAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYRfgAAABG5dge4EyJREJHjhxRYWGhAoGA7XEAAIALjuOoo6ND5eXlyso6+9qG78LHkSNHVFFRYXsMAADwNbS1tWnkyJFnrfFd+CgsLJS+HL6oqMj2OL7ynxMfdFX3X+8/7Gnfm6ofT1nz2v/5uac93Zpxw29d1f1pw/2Doq9bP/pfj7iqe/XdhrTPciZbX8f/Odrd38V/7Xf3d+u1m0I/dlX3Wvj3aZ8FOFPi+MSUNZHOhEZd8Y/en+Nn47vwcfpXLUVFRYSPM+RkB13VeX3dcnLyjfd0y81ssnRN0tHXLVtfK25Y+zrOyrPS162cQK6rOr4vwobEyWzXtW7eMsEbTgEAgFGEDwAAYBThAwAAGEX4AAAARvnuDac4i45OK21zDh210teNnI5TVvpmdcet9HUr9p0htkdIztLX8cajK6z0dasl8UfbIwBJXdj805Q1iS+6JbnbQcfKBwAAMIrwAQAAjCJ8AAAAowgfAADAqAGHj61bt2rGjBkqLy9XIBDQ+vXrk9beeeedCgQCWr58+TedEwAADBIDDh9dXV2aMGGCVqw4+zvHm5ubtW3bNpWXl3+T+QAAwCAz4K22NTU1qqmpOWvNp59+qnvvvVdvvvmmpk+f/k3mAwAAg4zn53wkEgnNmTNHixYt0tixY1PWR6NRRaPR3ueRSMTrkQaNnuPtVvr6+XyErPBJK339fPaJJOV98rntEZIr/LaVttdlzXJVZ+u8Db/Ph3Nb3onUN5aLd7u/+ZznbzhtampSTk6O5s+f76q+sbFRoVCo91FRUeH1SAAAwEc8DR+7du3SU089pTVr1ri6pa4k1dfXKxwO9z7a2tq8HAkAAPiMp+Hj7bffVnt7uyorK5WTk6OcnBwdPnxY999/vy644IJ+/0wwGFRRUVGfBwAAGLw8fc/HnDlzVF1d3ee1adOmac6cOZo3b56XrQAAQIYacPjo7OzUgQMHep8fOnRIu3fvVnFxsSorK1VSUtKnPjc3V6Wlpbr44ou9mRgAAGS0AYePnTt3aurUqb3PFy5cKEmaO3eu1qxZ4+10AABg0Blw+JgyZYocx3Fd/49//GOgLZCErS12NWW1KWusbce1dHt257xhVvq61fOdQtsjJOXk59oewZfYQgs/y+1IXZMVTV3TW/uNpgEAABggwgcAADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMMrzu9oCJtna8hrojlnp65af77pr69rljBhupS8wGDgubljrpuY0Vj4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGMU5HxnkuqxZruq8vjV3z/F2Tz+flwKfnbDS99TIoVb6DgYb9zVZ6evnr2PA73I7UtdkRd1/PlY+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAUW22BryHvk89tj3BWiZH+vX18zb8tdlXn9ZZcr7egA+cSx8VShZua01j5AAAARhE+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAU53xkEFvnFPj6fITCb1tpGysLWenrVlb4pO0RknLyc22PAGCAclx8Swmccv/5WPkAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFFstc0g12XNclXn9dbYmrLalDUbj67wtKdbPQcOWumbezRspa9bidC3bI+QVHzPXtsjALCMlQ8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGDXgrbZbt27VE088oV27duno0aNqbm7WzJkzJUmxWEwPPPCA/vznP+vgwYMKhUKqrq7WY489pvLy8nTMj3Nc1r+Ps9K35zuFVvq6Ff+Wf+8cm/PdC630tbVV3S2/z4dzW8LFt5SE4/7zDXjlo6urSxMmTNCKFV891+HkyZNqbW1VQ0ODWltb9eqrr2rv3r268cYbB9oGAAAMUgNe+aipqVFNTU2/HwuFQmppaenz2rPPPqvvfe97+vjjj1VZWfn1JwUAAINC2k84DYfDCgQCGjp0aL8fj0ajikajvc8jkUi6RwIAABal9Q2n3d3dWrx4sW677TYVFRX1W9PY2KhQKNT7qKioSOdIAADAsrSFj1gspltuuUWO42jlypVJ6+rr6xUOh3sfbW1t6RoJAAD4QFp+7XI6eBw+fFh/+ctfkq56SFIwGFQwGEzHGAAAwIc8Dx+ng8f+/fu1efNmlZSUeN0CAABksAGHj87OTh04cKD3+aFDh7R7924VFxerrKxMN998s1pbW7VhwwbF43EdO3ZMklRcXKy8vDxvp8c5Lyt80krf7M5uK33dyj4Zsz1CUhv3NVnp6/fzMfw+H85t+ScSKWt6YqlrThtw+Ni5c6emTp3a+3zhwoWSpLlz5+rXv/61Xn/9dUnS5Zdf3ufPbd68WVOmTBloOwAAMMgMOHxMmTJFjpP8GLOzfQwAAIB7uwAAAKMIHwAAwCjCBwAAMIrwAQAAjEr7vV2Q+XqOt9sewXdi3xlie4SMxa3jgcwT+1bqtYr4KffrGax8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAottpmEFtbD3O+e6GVvm7EykJW+ub+s8tKXwCwIfeL1PdtC8Tc39uNlQ8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAAARnHOBzJaIJaw0rdn2Les9HXr1LB82yMklT3uYit9r8ua5arO1nk6gK85Ls7wcFPzJVY+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAUW22R0sZ9TbZHSOrU0DwrfePBbCt93Qoe67A9QlJv/O1R2yMAGCAnK5C6KOCi5kusfAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwinM+kNIPL3sgZY2tsxvyP/vCTt9//I+Vvm5FSwttj5DUtEkPuap7c8ev0j4LADtY+QAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAAARhE+AACAUWy1zSA1ZbWu6jYeXZH2Wfyi+7wCK31jZSErfd1K5Li/tTUApBJIOKmLHBc1XxrwysfWrVs1Y8YMlZeXKxAIaP369Wf0dvTggw+qrKxMBQUFqq6
u1v79+wfaBgAADFIDDh9dXV2aMGGCVqzo/3/Xjz/+uJ5++mmtWrVK27dv15AhQzRt2jR1d3d7MS8AAMhwA/61S01NjWpqavr9mOM4Wr58uR544AHddNNNkqTf//73GjFihNavX69bb731m08MAAAymqdvOD106JCOHTum6urq3tdCoZCuuuoqvfvuu162AgAAGcrTN5weO3ZMkjRixIg+r48YMaL3Y2eKRqOKRqO9zyORiJcjAQAAn7G+1baxsVGhUKj3UVFRYXskAACQRp6Gj9LSUknS8ePH+7x+/Pjx3o+dqb6+XuFwuPfR1tbm5UgAAMBnPP21S1VVlUpLS7Vp0yZdfvnl0pe/Rtm+fbvuuuuufv9MMBhUMBj0coxBy9b5HYHPTljp64arvefnoKwe/16XrE/abY/gS9dlzXJV15L4Y9pnAc4USHhTc9qAw0dnZ6cOHDjQ+/zQoUPavXu3iouLVVlZqbq6Oj366KMaPXq0qqqq1NDQoPLycs2cOXOgrQAAwCA04PCxc+dOTZ06tff5woULJUlz587VmjVr9POf/1xdXV366U9/qs8//1zXXHON3njjDeXn53s7OQAAyEgDDh9TpkyRc5YjVAOBgB5++GE9/PDD33Q2AAAwCFnf7QIAAM4thA8AAGAU4QMAABjl6VZbDE7OecNsj5BUzhdxK32jxXlW+roVPNZhe4SkYv92vu0RfIkttDiXsPIBAACMInwAAACjCB8AAMAowgcAADCK8AEAAIwifAAAAKPYaptBaspqXdV5fffbQHfM08/npVhRrpW+2Za2+LoV+84Q2yMkFXj7fSt92coKfH25J1PfsjYQc39bW1Y+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABjFOR8ZxOvzO1zr6LTT14Wcrh4rfRNBcvvXZeu8jeuyZrmqYz7gq74ozk5ZEz+VuuY0voMCAACjCB8AAMAowgcAADCK8AEAAIwifAAAAKMIHwAAwCi22mYQW1vxEiOHe/r5vBQdlmulbyInYKWvW1ndcdsjJGXr69jvW1T9Ph/Obdmx1DXOAE4+YOUDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEax1RYpJXbtsT1CUsETLvZ/pUH+P09Z6etWdme37RGSyvnuhVb6/vCyB1zVvfG3R9M+C5BpAnHHk5rTWPkAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBTnfCCl7HEX2x4hqVOhXCt9u8qDVvq69cWokO0RknLy7fydxffstdIXGAyCJ3pS1mT3pK45jZUPAABgFOEDAAAY5Xn4iMfjamhoUFVVlQoKCnTRRRfpkUcekeO4P3YVAAAMXp6/56OpqUkrV67U2rVrNXbsWO3cuVPz5s1TKBTS/PnzvW4HAAAyjOfh469//atuuukmTZ8+XZJ0wQUX6KWXXtJ7773ndSsAAJCBPP+1y9VXX61NmzZp3759kqQPPvhA77zzjmpqavqtj0ajikQifR4AAGDw8nzlY8mSJYpEIhozZoyys7MVj8e1dOlSzZ49u9/6xsZGPfTQQ16PMSi1JP5opW/gsxNW+roRzw3YHsGXciMx2yMkFej272wA+pcIpl6rSGS5X8/wfOXjlVde0Ysvvqh169aptbVVa9eu1W9+8xutXbu23/r6+nqFw+HeR1tbm9cjAQAAH/F85WPRokVasmSJbr31VknS+PHjdfjwYTU2Nmru3LlfqQ8GgwoG/X1gEwAA8I7nKx8nT55U1hlLL9nZ2UokEl63AgAAGcjzlY8ZM2Zo6dKlqqys1NixY/X+++/rySef1B133OF1KwAAkIE8Dx/PPPOMGhoadPfdd6u9vV3l5eX62c9+pgcffNDrVgAAIAN5Hj4KCwu1fPlyLV++3OtPDQAABgHu7QIAAIzyfOUDg09PVZntEZI6VcQ5H/2JFufZHiG5jk4rbW2dkwMMBlnR1JtGsnrcbyxh5QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAAARrHVNoPUlNW6qtt4dIWnfZ1c/2bU7FN2+iZy/b3F1822uHPNdVmzXNWxJRdIP//+VAEAAIMS4QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGMVW2wzSc7zdSt9YUa6Vvm5kRx0rfWND/L3VNl7g3/9X2Po6Zgst4B/+/Q4FAAAGJcIHAAAwivABAACMInwAAACjCB8AAMAowgcAADCK8AEAAIzinA+kFrdzloYbpwotnbfh8zvWB3w+H4DM0jMkO3VNLHXNaax8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAottpmEGu3BM/27+3jbd3aPh600ta1rB7/bo/m1vb9uy5rlqs6rh9syP/nqZQ1PT2pa05j5QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAAARhE+AACAUZzzkUFqympd1W08usLTvrEC/2bUnnw7fR2f/8vJ+++o7RGS+uFlD7iqe+Nvj6Z9Fj/h/A74WSCWSF3Tk7rmNP/+VAEAAIMS4QMAABhF+AAAAEalJXx8+umnuv3221VSUqKCggKNHz9eO3fuTEcrAACQYTx/29yJEyc0efJkTZ06VRs3btR5552n/fv3a9iwYV63AgAAGcjz8NHU1KSKigq98MILva9VVVV53QYAAGQoz8PH66+/rmnTpmnWrFnasmWLzj//fN199936yU9+4nUrGJLItT1BcnFLW23jQTt93eqqKLA9QlLxPXttjwDAMs/f83Hw4EGtXLlSo0eP1ptvvqm77rpL8+fP19q1a/utj0ajikQifR4AAGDw8nzlI5FI6Morr9SyZcskSRMnTtSePXu0atUqzZ079yv1jY2Neuihh7weAwAA+JTnKx9lZWW69NJL+7x2ySWX6OOPP+63vr6+XuFwuPfR1tbm9UgAAMBHPF/5mDx5svbu7fs73X379mnUqFH91geDQQWDPv8FOgAA8IznKx/33Xeftm3bpmXLlunAgQNat26dVq9erdpad/clAQAAg5vn4WPSpElqbm7WSy+9pHHjxumRRx7R8uXLNXv2bK9bAQCADJSWe3PecMMNuuGGG9Lxqc9thd+20jYa8u8p/Ik8S319flfbnvyA7RGSyhkx3PYIAAbo1NDU32x7XNz59jT//lQBAACDEuEDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEb5/LQC+EHAsT1BcrEi9/vKvWTrfBG3CtuitkdIauPRFVb6Xpc1y1VdS+KPaZ8FyDR5n59KWZPVk7qmt/YbzgMAADAghA8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBRbbTNI9wXFVvqe9PEd0J1Qj+0RfCmrO257hKRsbXllCy3gH6x8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAottpmkJzOmJW+p4b597a2uQV2rkn+P620dS0r5t+ttmx5BTKPk5t6rcIJuF/PYOUDAAAYRfgAAA
BGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFGc85FBYkW5Vvomhto5S8ON2Bd2rkn+Cf+efSJJWZ+02x4BwCCSezScsiYQj7r+fKx8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAottpmEDe3NE6Hoed1WunrRl5bnpW+w3b/j5W+bm08usL2CAAGkZ4DB1PXOO6PZWDlAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYRfgAAABGpT18PPbYYwoEAqqrq0t3KwAAkAHSes7Hjh079Nxzz+myyy5LZ5tzRqTSzrEsI77dYaWvGyV7Elb6Brrd72cHgEzXkvhjyppIJKJQKOTq86Vt5aOzs1OzZ8/W888/r2HDhqWrDQAAyDBpCx+1tbWaPn26qqurz1oXjUYViUT6PAAAwOCVlnX8l19+Wa2trdqxY0fK2sbGRj300EPpGAMAAPiQ5ysfbW1tWrBggV588UXl5+enrK+vr1c4HO59tLW1eT0SAADwEc9XPnbt2qX29nZdccUVva/F43Ft3bpVzz77rKLRqLKzs3s/FgwGFQwGvR4DAAD4lOfh49prr9WHH37Y57V58+ZpzJgxWrx4cZ/gAQAAzj2eh4/CwkKNGzeuz2tDhgxRSUnJV17HwJwstdP3i55cO41dePel+6303bivyUpfABgMOOEUAAAYZeTUqrfeestEGwAAkAFY+QAAAEYRPgAAgFGEDwAAYBThAwAAGGXnNqn4WnIuC1vpe+S/3d2lEAAAN1j5AAAARhE+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAU53xkkB9d+IGVvnm7h6QuusXEJACAwYCVDwAAYBThAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYxVbbf3Hd1Y+6qmv56wNpn6U/2YGElb5l27qt9AUADE6sfAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwinM+/sVn/+7i1vEW/WH/Fa7qfj3O276b/lLv7ScEAJzTWPkAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFFstf0Xn18Rsz3CWRX87yJ3hf+R7kkAAPj6WPkAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFEBx3Ec20P8q0gkolAopHA4rKIil1tLPfJR2/mu6i6t+DTtswAAkEkG8vOblQ8AAGCU5+GjsbFRkyZNUmFhoYYPH66ZM2dq7969XrcBAAAZyvPwsWXLFtXW1mrbtm1qaWlRLBbT9ddfr66uLq9bAQCADOT58epvvPFGn+dr1qzR8OHDtWvXLn3/+9/3uh0AAMgwab+3SzgcliQVFxf3+/FoNKpoNNr7PBKJpHskAABgUVrfcJpIJFRXV6fJkydr3Lhx/dY0NjYqFAr1PioqKtI5EgAAsCyt4aO2tlZ79uzRyy+/nLSmvr5e4XC499HW1pbOkQAAgGVp+7XLPffcow0bNmjr1q0aOXJk0rpgMKhgMJiuMQbk5l0/cVX3EYszAAB8bZ6HD8dxdO+996q5uVlvvfWWqqqqvG4BAAAymOfho7a2VuvWrdNrr72mwsJCHTt2TJIUCoVUUFDgdTsAAJBhPH/Px8qVKxUOhzVlyhSVlZX1Pv7whz943QoAAGSgtPzaBQAAIBnu7QIAAIwifAAAAKPSfsJpJvlo5q9tjwAAwKDHygcAADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAo3x7zsfVP1+h7Lz8s9Z88PR9xuYBAADeYOUDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEb5dqttc91TKixMlY3YagsAQKZh5QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGOW73S6O40iSOjsTKWu/FYkYmAgAAKQS+fJn8umf42fju/DR0dEhSbpi0mcuqkNpnwcAALjX0dGhUOjsP58DjpuIYlAikdCRI0dUWFioQCBgexxFIhFVVFSora1NRUVFtsfxFa5N/7guyXFtkuPaJMe1Sc5P18ZxHHV0dKi8vFxZWWd/V4fvVj6ysrI0cuRI22N8RVFRkfW/WL/i2vSP65Ic1yY5rk1yXJvk/HJtUq14nMYbTgEAgFGEDwAAYBThI4VgMKhf/epXCgaDtkfxHa5N/7guyXFtkuPaJMe1SS5Tr43v3nAKAAAGN1Y+AACAUYQPAABgFOEDAAAYRfgAAABGET760djYqEmTJqmwsFDDhw/XzJkztXfvXttj+dJjjz2mQCCguro626P4wqeffqrbb79dJSUlKigo0Pjx47Vz507bY1kXj8fV0NCgqqoqFRQU6KKLLtIjjzzi6h4Qg83WrVs1Y8YMlZeXKxAIaP369X0+7jiOHnzwQZWVlamgoEDV1dXav3+/tXlNOtu1icViWrx4scaPH68hQ4aovLxcP/7xj3XkyBGrM5uS6uvmX915550KBAJavny50RkHgvDRjy1btqi2tlbbtm1TS0uLYrGYrr/+enV1ddkezVd27Nih5557TpdddpntUXzhxIkTmjx5snJzc7Vx40Z99NFH+u1vf6thw4bZHs26pqYmrVy5Us8++6z+/ve/q6mpSY8//rieeeYZ26MZ19XVpQkTJmjFihX9fvzxxx/X008/rVWrVmn79u0aMmSIpk2bpu7ubuOzmna2a3Py5Em1traqoaFBra2tevXVV7V3717deOONVmY1LdXXzWnNzc3atm2bysvLjc32tThIqb293ZHkbNmyxfYovtHR0eGMHj3aaWlpcX7wgx84CxYssD2SdYsXL3auueYa22P40vTp05077rijz2s/+tGPnNmzZ1ubyQ8kOc3Nzb3PE4mEU1pa6jzxxBO9r33++edOMBh0XnrpJUtT2nHmtenPe++950hyDh8+bGwuP0h2bT755BPn/PPPd/bs2eOMGjXK+d3vfmdlPjdY+XAhHA5LkoqLi22P4hu1tbWaPn26qqurbY/iG6+//rquvPJKzZo1S8OHD9fEiRP1/PPP2x7LF66++mpt2rRJ+/btkyR98MEHeuedd1RTU2N7NF85dOiQjh071uffVSgU0lVXXaV3333X6mx+FA6HFQgENHToUNujWJdIJDRnzhwtWrRIY8eOtT1OSr67sZzfJBIJ1dXVafLkyRo3bpztcXzh5ZdfVmtrq3bs2GF7FF85ePCgVq5cqYULF+oXv/iFduzYofnz5ysvL09z5861PZ5VS5YsUSQS0ZgxY5Sdna14PK6lS5dq9uzZtkfzlWPHjkmSRowY0ef1ESNG9H4M/093d7cWL16s2267zRc3VLOtqalJOTk5mj9/vu1RXCF8pFBbW6s9e/bonXfesT2KL7S1tWnBggVqaWlRfn6+7XF8JZFI6Morr9SyZcskSRMnTtSePXu0atWqcz58vPLKK3rxxRe1bt06jR07Vrt371ZdXZ3Ky8vP+WuDgYvFYrrlllvkOI5Wrlxpexzrdu3apaeeekqtra0KBAK2x3GFX7ucx
T333KMNGzZo8+bNGjlypO1xfGHXrl1qb2/XFVdcoZycHOXk5GjLli16+umnlZOTo3g8bntEa8rKynTppZf2ee2SSy7Rxx9/bG0mv1i0aJGWLFmiW2+9VePHj9ecOXN03333qbGx0fZovlJaWipJOn78eJ/Xjx8/3vuxc93p4HH48GG1tLSw6iHp7bffVnt7uyorK3u/Lx8+fFj333+/LrjgAtvj9YuVj344jqN7771Xzc3Neuutt1RVVWV7JN+49tpr9eGHH/Z5bd68eRozZowWL16s7Oxsa7PZNnny5K9syd63b59GjRplbSa/OHnypLKy+v5fJzs7W4lEwtpMflRVVaXS0lJt2rRJl19+uSQpEolo+/btuuuuu2yPZ93p4LF//35t3rxZJSUltkfyhTlz5nzl/XfTpk3TnDlzNG/ePGtznQ3hox+1tbVat26dXnvtNRUWFvb+rjUUCqmgoMD2eFYVFhZ+5b0vQ4YMUUlJyTn/npj77rtPV199tZYtW6ZbbrlF7733nlavXq3Vq1fbHs26GTNmaOnSpaqsrNTYsWP1/vvv68knn9Qdd9xhezTjOjs7deDAgd7nhw4d0u7du1VcXKzKykrV1dXp0Ucf1ejRo1VVVaWGhgaVl5dr5syZVuc24WzXpqysTDfffLNaW1u1YcMGxePx3u/NxcXFysvLszh5+qX6ujkziOXm5qq0tFQXX3yxhWldsL3dxo8k9ft44YUXbI/mS2y1/f/+9Kc/OePGjXOCwaAzZswYZ/Xq1bZH8oVIJOIsWLDAqaysdPLz850LL7zQ+eUvf+lEo1Hboxm3efPmfr+/zJ0713G+3G7b0NDgjBgxwgkGg861117r7N271/bYRpzt2hw6dCjp9+bNmzfbHj3tUn3dnMnvW20Dzrl4xCAAALCGN5wCAACjCB8AAMAowgcAADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwivABAACM+r9SXh0TglR/aQAAAABJRU5ErkJggg==",
- "text/plain": [
- "<Figure size 640x480 with 1 Axes>"
+ "<Figure size 640x480 with 2 Axes>"
]
},
"metadata": {},
@@ -1013,29 +1445,22 @@
"model.eval()\n",
"with torch.no_grad():\n",
" output = model(batch_src, batch_padding_mask)\n",
- "batch_src[0], batch_labels[0], output[0]\n",
- "x = batch_labels.detach().to(torch.float16).cpu().numpy().flatten()\n",
- "y = output.detach().to(torch.float16).cpu().numpy().flatten()\n",
- "plt.hist2d(x, y, bins=50, norm=mpl.colors.LogNorm())"
+ "x = batch_labels.detach().to(torch.uint8)\n",
+ "y = output.detach()\n",
+ "cnts = torch.bincount(x)\n",
+ "weights = [1/cnts[i.item()].item() for i in x] # normalize by label count\n",
+ "fig, ax = plt.subplots()\n",
+ "h = ax.hist2d(x.cpu().numpy().flatten(), y.to(torch.float16).cpu().numpy().flatten(), weights=weights, bins=[15,50], norm=mpl.colors.LogNorm())\n",
+ "fig.colorbar(h[3], ax=ax)\n",
+ "plt.show()"
]
},
{
"cell_type": "code",
- "execution_count": 14,
+ "execution_count": null,
"execution_state": "idle",
"metadata": {},
- "outputs": [
- {
- "data": {
- "text/plain": [
- "0.353515625"
- ]
- },
- "execution_count": 14,
- "metadata": {},
- "output_type": "execute_result"
- }
- ],
+ "outputs": [],
"source": [
"evaluate()"
]
@@ -1051,7 +1476,7 @@
},
{
"cell_type": "code",
- "execution_count": 19,
+ "execution_count": null,
"execution_state": "idle",
"metadata": {},
"outputs": [],
@@ -1071,7 +1496,7 @@
},
{
"cell_type": "code",
- "execution_count": 20,
+ "execution_count": null,
"execution_state": "idle",
"metadata": {},
"outputs": [],
@@ -1088,117 +1513,10 @@
},
{
"cell_type": "code",
- "execution_count": 23,
+ "execution_count": null,
"execution_state": "idle",
"metadata": {},
- "outputs": [
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "Epoch 1/100 \t Train Err: 2.8906\n",
- "Epoch 2/100 \t Train Err: 0.3340\n",
- "Epoch 3/100 \t Train Err: 0.1709\n",
- "Epoch 4/100 \t Train Err: 0.2373\n",
- "Epoch 5/100 \t Train Err: 0.2520\n",
- "Epoch 6/100 \t Train Err: 0.1953\n",
- "Epoch 7/100 \t Train Err: 0.1963\n",
- "Epoch 8/100 \t Train Err: 0.2236\n",
- "Epoch 9/100 \t Train Err: 0.2119\n",
- "Epoch 10/100 \t Train Err: 0.1777\n",
- "Epoch 11/100 \t Train Err: 0.1660\n",
- "Epoch 12/100 \t Train Err: 0.1787\n",
- "Epoch 13/100 \t Train Err: 0.1816\n",
- "Epoch 14/100 \t Train Err: 0.1562\n",
- "Epoch 15/100 \t Train Err: 0.1377\n",
- "Epoch 16/100 \t Train Err: 0.1377\n",
- "Epoch 17/100 \t Train Err: 0.1387\n",
- "Epoch 18/100 \t Train Err: 0.1289\n",
- "Epoch 19/100 \t Train Err: 0.1162\n",
- "Epoch 20/100 \t Train Err: 0.1079\n",
- "Epoch 21/100 \t Train Err: 0.1108\n",
- "Epoch 22/100 \t Train Err: 0.1099\n",
- "Epoch 23/100 \t Train Err: 0.1021\n",
- "Epoch 24/100 \t Train Err: 0.0918\n",
- "Epoch 25/100 \t Train Err: 0.0913\n",
- "Epoch 26/100 \t Train Err: 0.0913\n",
- "Epoch 27/100 \t Train Err: 0.0859\n",
- "Epoch 28/100 \t Train Err: 0.0820\n",
- "Epoch 29/100 \t Train Err: 0.0767\n",
- "Epoch 30/100 \t Train Err: 0.0776\n",
- "Epoch 31/100 \t Train Err: 0.0747\n",
- "Epoch 32/100 \t Train Err: 0.0713\n",
- "Epoch 33/100 \t Train Err: 0.0698\n",
- "Epoch 34/100 \t Train Err: 0.0679\n",
- "Epoch 35/100 \t Train Err: 0.0664\n",
- "Epoch 36/100 \t Train Err: 0.0669\n",
- "Epoch 37/100 \t Train Err: 0.0645\n",
- "Epoch 38/100 \t Train Err: 0.0601\n",
- "Epoch 39/100 \t Train Err: 0.0583\n",
- "Epoch 40/100 \t Train Err: 0.0569\n",
- "Epoch 41/100 \t Train Err: 0.0564\n",
- "Epoch 42/100 \t Train Err: 0.0554\n",
- "Epoch 43/100 \t Train Err: 0.0532\n",
- "Epoch 44/100 \t Train Err: 0.0520\n",
- "Epoch 45/100 \t Train Err: 0.0500\n",
- "Epoch 46/100 \t Train Err: 0.0483\n",
- "Epoch 47/100 \t Train Err: 0.0457\n",
- "Epoch 48/100 \t Train Err: 0.0452\n",
- "Epoch 49/100 \t Train Err: 0.0444\n",
- "Epoch 50/100 \t Train Err: 0.0430\n",
- "Epoch 51/100 \t Train Err: 0.0422\n",
- "Epoch 52/100 \t Train Err: 0.0405\n",
- "Epoch 53/100 \t Train Err: 0.0408\n",
- "Epoch 54/100 \t Train Err: 0.0378\n",
- "Epoch 55/100 \t Train Err: 0.0378\n",
- "Epoch 56/100 \t Train Err: 0.0369\n",
- "Epoch 57/100 \t Train Err: 0.0354\n",
- "Epoch 58/100 \t Train Err: 0.0344\n",
- "Epoch 59/100 \t Train Err: 0.0337\n",
- "Epoch 60/100 \t Train Err: 0.0334\n",
- "Epoch 61/100 \t Train Err: 0.0322\n",
- "Epoch 62/100 \t Train Err: 0.0312\n",
- "Epoch 63/100 \t Train Err: 0.0304\n",
- "Epoch 64/100 \t Train Err: 0.0310\n",
- "Epoch 65/100 \t Train Err: 0.0304\n",
- "Epoch 66/100 \t Train Err: 0.0297\n",
- "Epoch 67/100 \t Train Err: 0.0283\n",
- "Epoch 68/100 \t Train Err: 0.0281\n",
- "Epoch 69/100 \t Train Err: 0.0280\n",
- "Epoch 70/100 \t Train Err: 0.0273\n",
- "Epoch 71/100 \t Train Err: 0.0267\n",
- "Epoch 72/100 \t Train Err: 0.0277\n",
- "Epoch 73/100 \t Train Err: 0.0269\n",
- "Epoch 74/100 \t Train Err: 0.0258\n",
- "Epoch 75/100 \t Train Err: 0.0249\n",
- "Epoch 76/100 \t Train Err: 0.0254\n",
- "Epoch 77/100 \t Train Err: 0.0245\n",
- "Epoch 78/100 \t Train Err: 0.0244\n",
- "Epoch 79/100 \t Train Err: 0.0242\n",
- "Epoch 80/100 \t Train Err: 0.0237\n",
- "Epoch 81/100 \t Train Err: 0.0243\n",
- "Epoch 82/100 \t Train Err: 0.0225\n",
- "Epoch 83/100 \t Train Err: 0.0225\n",
- "Epoch 84/100 \t Train Err: 0.0221\n",
- "Epoch 85/100 \t Train Err: 0.0227\n",
- "Epoch 86/100 \t Train Err: 0.0222\n",
- "Epoch 87/100 \t Train Err: 0.0219\n",
- "Epoch 88/100 \t Train Err: 0.0220\n",
- "Epoch 89/100 \t Train Err: 0.0210\n",
- "Epoch 90/100 \t Train Err: 0.0210\n",
- "Epoch 91/100 \t Train Err: 0.0211\n",
- "Epoch 92/100 \t Train Err: 0.0208\n",
- "Epoch 93/100 \t Train Err: 0.0205\n",
- "Epoch 94/100 \t Train Err: 0.0200\n",
- "Epoch 95/100 \t Train Err: 0.0208\n",
- "Epoch 96/100 \t Train Err: 0.0198\n",
- "Epoch 97/100 \t Train Err: 0.0195\n",
- "Epoch 98/100 \t Train Err: 0.0197\n",
- "Epoch 99/100 \t Train Err: 0.0190\n",
- "Epoch 100/100 \t Train Err: 0.0192\n"
- ]
- }
- ],
+ "outputs": [],
"source": [
"for epoch in range(N_TUNE_EPOCHS):\n",
" model.train()\n",
@@ -1222,21 +1540,10 @@
},
{
"cell_type": "code",
- "execution_count": 24,
+ "execution_count": null,
"execution_state": "idle",
"metadata": {},
- "outputs": [
- {
- "data": {
- "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAHgCAYAAABZ+0ykAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjkuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8hTgPZAAAACXBIWXMAAA9hAAAPYQGoP6dpAAA97ElEQVR4nO3deXxU1f3/8feEkEkgC2DIAoRFQVRQoCAQaAVrEAKlYlGRokTEBQGFUtuvaAXbfm1UiktdWL4tUquIwk9RqcsjBYGCgGyx4IILslRIEJFsSAKZ8/tjmEmGhJjInTnJ8Ho+HueRzL3nznzufVDz7rnnnnEZY4wAAADCRITtAgAAAJxEuAEAAGGFcAMAAMIK4QYAAIQVwg0AAAgrhBsAABBWCDcAACCsEG4AAEBYIdwAAICwQrgBgCByuVyaPHmy7TKAswrhBmjAFi5cKJfLJZfLpbVr11bZb4xRWlqaXC6XfvaznwXsKy4u1syZM9W1a1c1bdpU55xzjrp3764pU6Zo//79/n4PPPCA/zOqa3l5eSE519OpqbYJEyZYrQ2AHZG2CwBw5qKjo7Vo0SL9+Mc/Dti+evVq/fe//5Xb7Q7Yfvz4cV122WX65JNPlJWVpTvvvFPFxcX68MMPtWjRIl199dVq1apVwDFz5sxRbGxslc9u1qxZkM6q9gYNGqSxY8dW2X7++edbqQeAXYQbIAwMHTpUS5Ys0V/+8hdFRlb8z3rRokXq2bOnDh06FNB/2bJl2rZtm1544QX98pe/DNh37NgxlZWVVfmMa665RomJiUE8ix/u/PPP1w033GC7DAD1BLelgDAwevRoffPNN8rJyfFvKysr09KlS6uEF0n64osvJEn9+/evsi86Olrx8fGO1NW1a1ddfvnlVbZ7PB61bt1a11xzjX/b4sWL1bNnT8XFxSk+Pl4XX3yxnnjiCUfqkKSBAweqa9eu2rJli/r166eYmBh16NBBc+fOrdL34MGDGj9+vJKTkxUdHa1u3brp73//e7Xn8cQTT+jiiy9WdHS0WrZsqSFDhmjz5s1V+i5btkxdu3aV2+1Wly5d9PbbbwfsLyoq0tSpU9W+fXu53W4lJSVp0KBB2rp1q2PXADhbEG6AMNC+fXulp6frxRdf9G976623VFBQoOuvv75K/3bt2kmSnnvuORljavUZhw8f1qFDhwLakSNHajxm1KhRWrNmTZV5OWvXrtX+/fv9teXk5Gj06NFq3ry5Hn74YT300EMaOHCg1q1bV6vajh07VqW2Q4cOVRmB+vbbbzV06FD17NlTjzzyiNq0aaM77rhDCxYs8Pf57rvvNHDgQP3jH//QmDFjNGvWLCUkJOimm26qErbGjx+vqVOnKi0tTQ8//LDuueceRUdHa8OGDVXOd+LEibr++uv1yCOP6NixYxo5cqS++eYbf58JEyZozpw5GjlypJ555hndfffdiomJ0ccff1yrawCgEgOgwXr22WeNJLNp0ybz1FNPmbi4OHP06FFjjDHXXnutufzyy40xxrRr184MGzbMf9zRo0dN586djSTTrl07c9NNN5m//e1vJj8/v8pnzJw500iqtnXu3LnG+nbu3GkkmSeffDJg+8SJE01sbKy/1ilTppj4+Hhz4sSJOl+D09Umybz44ov+fgMGDDCSzOzZs/3bSktLTffu3U1SUpIpKyszxhjz+OOPG0nm+eef9/crKysz6enpJjY21hQWFhpjjFm5cqWRZO66664qNXk8noD6oqKizOeff+7f9sEHH1S5LgkJCWbSpEl1Pn8AVTFyA4SJ6667Tt99952WL1+uoqIiLV++vNpbUpIUExOjjRs36je/+Y108qmr8ePHKzU1VXfeeadKS0urHPP//t//U05OTkB79tlna6zp/PPPV/fu3fXSSy/5t5WXl2vp0qUaPny4YmJipJOTkktKSgJuq9XFVVddVaW2nJycKrfEIiMjdfvtt/tfR0VF6fbbb9fBgwe1ZcsWSdKbb76plJQUjR492t+vcePGuuuuu1RcXKzVq1f7r4fL5dLMmTOr1ONyuQJeZ2Rk6LzzzvO/vuSSSxQfH69du3b5tzVr1kwbN24MeFINwA/DhGIgTLRs2VIZGRlatGiRjh49qvLy8oA5LadKSEjQI488okceeUR79uzRihUr9Oc//1lPPfWUEhIS9L//+78B/S+77LIfNKF41KhRuvfee/XVV1+pdevWWrVqlQ4ePKhRo0b5+0ycOFEvv/yyMjMz1bp1a1155ZW67rrrNGTIkFp9Rps2bZSRkfG9/Vq1aqWmTZsGbPM9UbV792717dtXe/bsUadOnRQREfj//S688EJJ0p49e6ST85ZatWqlFi1afO/ntm3btsq25s2b69tvv/W/fuSRR5SVlaW0tDT17NlTQ4cO1dixY3Xuued+7/sDCMTIDRBGfvnLX+qtt97S3LlzlZmZWevHtNu1a6ebb75Z69atU7NmzfTCCy84VtOoUaNkjNGSJUskSS+//LISEhICgktSUpJyc3P1+uuv6+c//7neffddZWZmKisry7E6bGrUqFG12yvPd7ruuuu0a9cuPfnkk2rVqpVmzZqlLl266K233gphpUB4INwAYeTqq69WRESENmzYcNpbUjVp3ry5zjvvPB04cMCxmjp06KDevXvrpZde0okTJ/TKK69oxIgRVdbeiYqK0vDhw/XMM8/oiy++0O23367nnntOn3/+uWO17N+/XyUlJQHbPv30U+nkpGydDHqfffaZPB5PQL9PPvnEv1+SzjvvPO3fv1+HDx92rL7U1FRNnDhRy5Yt05dffqlzzjlHDz74oGPvD5wtCDdAGImNjdWcOXP0wAMPaPjw4aft98EHH1RZ+0Ynb7l89NFH6ty5s6N1jRo1Shs2bNCCBQt06NChgFtSkgKeGpKkiIgIXXLJJZJU7fyfH+rEiROaN2+e/3VZWZnmzZunli1bqmfPntLJNYPy8vIC5gmdOHFCTz75pGJjYzVgwABJ0siRI2WM0e9///sqn1PbJ9B8ysvLVVBQELAtKSlJrVq1cvT8gbMFc26AMFObWzk5OTmaOXOmfv7zn6tv376KjY3Vrl27tGDBApWWluqBBx6ocszSpUurXaF40KBBSk5OrvHzrrvuOt199926++671aJFiyrzY2655RYdPnxYP/3pT9WmTRvt2bNHTz75pLp37+6f61KTTz/9VM8//3yV7cnJyRo0aJD/datWrfTwww9r9+7dOv/88/XSSy8pNzdX8+fPV+PGjSVJt912m+bNm6ebbrpJW7ZsUfv27bV06VKtW7dOjz/+uOLi4iRJl19+uW688Ub95S9/0WeffaYhQ4bI4/Ho3//+ty6//PI6fZ9UUVGR2rRpo2uuuUbdunVTbGys/vWvf2nTpk2aPXt2rd8HwEm2H9cC8MNVfhS8Jqc+Cr5r1y4zY8YM07dvX5OUlGQiIyNNy5YtzbBhw8zKlSsDjq3pUXBJ5t13361Vrf379zeSzC233FJl39KlS82VV15pkpKSTFRUlGnbtq25/fbbzYE
DB773fWuqbcCAAf5+AwYMMF26dDGbN2826enpJjo62rRr18489dRTVd4zPz/fjBs3ziQmJpqoqChz8cUXm2effbZKvxMnTphZs2aZCy64wERFRZmWLVuazMxMs2XLloD6qnvEu127diYrK8uYk4+k/+Y3vzHdunUzcXFxpmnTpqZbt27mmWee+d7zB1CVy9R1/BQAGqCBAwfq0KFD2rFjh+1SAAQZc24AAEBYIdwAAICwQrgBAABhhTk3AAAgrDByAwAAwgrhBgAAhBXCDQAACCuEGwAAEFYINwAAIKwQbgAAQFgh3AAAgLBCuAEAAGGFcAMAAMIK4QYAAIQVwg0AAAgrhBsAABBWCDcAACCsEG4AAEBYIdwAAICwQrgBAABhhXADAADCCuEGAACEFcINAAAIK4QbAAAQVgg3AAAgrBBuAABAWCHcAACAsEK4AQAAYYVwAwAAwgrhBgAAhBXCDQAACCuRtgsINY/Ho/379ysuLk4ul8t2OQAAoBaMMSoqKlKrVq0UEVHz2MxZF27279+vtLQ022UAAIAfYN++fWrTpk2NfayGmzlz5mjOnDnavXu3JKlLly6aMWOGMjMzT3vMkiVLdP/992v37t3q1KmTHn74YQ0dOrTWnxkXFyedvDjx8fEOnAUAAAi2wsJCpaWl+f+O18RquGnTpo0eeughderUScYY/f3vf9dVV12lbdu2qUuXLlX6v/feexo9erSys7P1s5/9TIsWLdKIESO0detWde3atVaf6bsVFR8fT7gBAKCBqc2UEpcxxoSkmlpq0aKFZs2apfHjx1fZN2rUKJWUlGj58uX+bX379lX37t01d+7cat+vtLRUpaWl/te+5FdQUEC4AQCggSgsLFRCQkKt/n7Xm6elysvLtXjxYpWUlCg9Pb3aPuvXr1dGRkbAtsGDB2v9+vWnfd/s7GwlJCT4G/NtAAAIb9bDzfbt2xUbGyu3260JEybo1Vdf1UUXXVRt37y8PCUnJwdsS05OVl5e3mnff/r06SooKPC3ffv2OX4OAACg/rD+tFTnzp2Vm5urgoICLV26VFlZWVq9evVpA05dud1uud1uR94LAADUf9bDTVRUlDp27ChJ6tmzpzZt2qQnnnhC8+bNq9I3JSVF+fn5Advy8/OVkpISsnoBAED9Zv221Kk8Hk/ABODK0tPTtWLFioBtOTk5p52jAwAAzj5WR26mT5+uzMxMtW3bVkVFRVq0aJFWrVqld955R5I0duxYtW7dWtnZ2ZKkKVOmaMCAAZo9e7aGDRumxYsXa/PmzZo/f77N0wAAAPWI1XBz8OBBjR07VgcOHFBCQoIuueQSvfPOOxo0aJAkae/evQFLLPfr10+LFi3S7373O917773q1KmTli1bVus1bgAAQPird+vcBFtdnpMHAAD1Q4Nc5wYAAMAJhBsAABBWCDcAACCsWF/nJlyUlkr5+ZLLJfENDwAA2MPIjUO2bJHatZN++lPblQAAcHYj3Dgk8uQY2IkTtisBAODsRrhxCOEGAID6gXDjkEaNvD8JNwAA2EW4cQgjNwAA1A+EG4cQbgAAqB8INw7xhZvyctuVAABwdiPcOISRGwAA6gfCjUMINwAA1A+EG4cQbgAAqB8INw7xhRtjJI/HdjUAAJy9CDcOiaz0LV2M3gAAYA/hxiGEGwAA6gfCjUN8KxSLcAMAgFWEG4cwcgMAQP1AuHEIIzcAANQPhBuHuFwVAYdVigEAsIdw4yDWugEAwD7CjYMINwAA2Ee4cRDhBgAA+wg3DiLcAABgH+HGQYQbAADsI9w4iHADAIB9hBsH+R4FJ9wAAGAP4cZBjNwAAGAf4cZBhBsAAOwj3DjIF25YoRgAAHsINw5i5AYAAPsINw4i3AAAYB/hxkGEGwAA7CPcOIhwAwCAfYQbBxFuAACwj3DjIMINAAD2EW4cxArFAADYR7hxECM3AADYR7hxEIv4AQBgH+HGQYzcAABgH+HGQYQbAADsI9w4iHADAIB9hBsHEW4AALCPcOMgwg0AAPYRbhxEuAEAwD7CjYNYxA8AAPushpvs7GxdeumliouLU1JSkkaMGKGdO3fWeMzChQvlcrkCWnR0dMhqrgkjNwAA2Gc13KxevVqTJk3Shg0blJOTo+PHj+vKK69USUlJjcfFx8frwIED/rZnz56Q1VwTwg0AAPZF2vzwt99+O+D1woULlZSUpC1btuiyyy477XEul0spKSkhqLBuWKEYAAD76tWcm4KCAklSixYtauxXXFysdu3aKS0tTVdddZU+/PDD0/YtLS1VYWFhQAsWRm4AALCv3oQbj8ejqVOnqn///uratetp+3Xu3FkLFizQa6+9pueff14ej0f9+vXTf//732r7Z2dnKyEhwd/S0tKCdg6EGwAA7Ks34WbSpEnasWOHFi9eXGO/9PR0jR07Vt27d9eAAQP0yiuvqGXLlpo3b161/adPn66CggJ/27dvX5DOgHADAEB9YHXOjc/kyZO1fPlyrVmzRm3atKnTsY0bN1aPHj30+eefV7vf7XbL7XY7VGnNCDcAANhndeTGGKPJkyfr1Vdf1cqVK9WhQ4c6v0d5ebm2b9+u1NTUoNRYF4QbAADsszpyM2nSJC1atEivvfaa4uLilJeXJ0lKSEhQTEyMJGns2LFq3bq1srOzJUl/+MMf1LdvX3Xs2FFHjhzRrFmztGfPHt1yyy02T0Ui3AAAUC9YDTdz5syRJA0cODBg+7PPPqubbrpJkrR3715FRFQMMH377be69dZblZeXp+bNm6tnz5567733dNFFF4W4+qpYoRgAAPushhtjzPf2WbVqVcDrxx57TI899lgQq/rhGLkBAMC+evO0VDgg3AAAYB/hxkGsUAwAgH2EGwcxcgMAgH2EGwcRbgAAsI9w4yDCDQAA9hFuHES4AQDAPsKNgwg3AADYR7hxEOEGAAD7CDcOYoViAADsI9w4iJEbAADsI9w4iHADAIB9hBsHsUIxAAD2EW4cxMgNAAD2EW4cRLgBAMA+wo2DCDcAANhHuHEQ4QYAAPsINw4i3AAAYB/hxkGEGwAA7CPcOIgVigEAsI9w4yBGbgAAsI9w4yDCDQAA9hFuHFR5hWJjbFcDAMDZiXDjIF+4kSSPx2YlAACcvQg3Dqocbrg1BQCAHYQbBxFuAACwj3DjIMINAAD2EW4cRLgBAMA+wo2DIipdTcINAAB2EG4cxlo3AADYRbhxGOEGAAC7CDcOI9wAAGAX4cZhlVcpBgAAoUe4cRgjNwAA2EW4cRjhBgAAuwg3DiPcAABgF+HGYYQbAADsItw4jHADAIBdhBuHNWrk/Um4AQDADsKNwxi5AQDALsKNwwg3AADYRbhxGOEGAAC7CDcOY4ViAADsItw4jJEbAADsItw4jHADAIBdhBuHEW4AALCLcOMwwg0AAHYRbhxGuAEAwC7CjcNYoRgAALushpvs7GxdeumliouLU1JSkkaMGKGdO3d+73FLlizRBRdcoOjoaF188cV68803Q1JvbTByAwCAXVbDzerVqzVp0iRt2LBBOTk5On78uK688kqVlJ
Sc9pj33ntPo0eP1vjx47Vt2zaNGDFCI0aM0I4dO0Ja++kQbgAAsMtljDG2i/D5+uuvlZSUpNWrV+uyyy6rts+oUaNUUlKi5cuX+7f17dtX3bt319y5c6v0Ly0tVWlpqf91YWGh0tLSVFBQoPj4eMfPYcwYadEi6dFHpV/9yvG3BwDgrFRYWKiEhIRa/f2uV3NuCgoKJEktWrQ4bZ/169crIyMjYNvgwYO1fv36avtnZ2crISHB39LS0hyuOhArFAMAYFe9CTcej0dTp05V//791bVr19P2y8vLU3JycsC25ORk5eXlVdt/+vTpKigo8Ld9+/Y5Xntl3JYCAMCuSNsF+EyaNEk7duzQ2rVrHX1ft9stt9vt6HvWhHADAIBd9SLcTJ48WcuXL9eaNWvUpk2bGvumpKQoPz8/YFt+fr5SUlKCXGXtEG4AALDL6m0pY4wmT56sV199VStXrlSHDh2+95j09HStWLEiYFtOTo7S09ODWGntEW4AALDL6sjNpEmTtGjRIr322muKi4vzz5tJSEhQTEyMJGns2LFq3bq1srOzJUlTpkzRgAEDNHv2bA0bNkyLFy/W5s2bNX/+fJun4ke4AQDALqsjN3PmzFFBQYEGDhyo1NRUf3vppZf8ffbu3asDBw74X/fr10+LFi3S/Pnz1a1bNy1dulTLli2rcRJyKLFCMQAAdlkduanNEjurVq2qsu3aa6/VtddeG6SqzgwjNwAA2FVvHgUPF4QbAADsItw4jHADAIBdhBuHsUIxAAB2EW4cxsgNAAB2EW4cRrgBAMAuwo3DCDcAANhFuHEY4QYAALsINw4j3AAAYBfhxmGsUAwAgF2EG4cxcgMAgF2EG4cRbgAAsItw4zDCDQAAdhFuHMYKxQAA2EW4cRgjNwAA2EW4cRjhBgAAuwg3DiPcAABgF+HGYYQbAADsItw4jEX8AACwi3DjMEZuAACwi3DjMMINAAB2EW4cRrgBAMAuwo3DWMQPAAC7CDcOY+QGAAC7CDcOI9wAAGAX4cZhhBsAAOwi3DiMcAMAgF2EG4cRbgAAsItw4zBWKAYAwC7CjcN8Izcej7cBAIDQItw4zBduxFo3AABYQbhxWOVww60pAABCj3DjMEZuAACwi3DjMEZuAACwi3DjMN/TUiLcAABgBeHGYRER3ibCDQAAVhBugoCF/AAAsIdwEwSEGwAA7CHcBAGrFAMAYA/hJggYuQEAwB7CTRAQbgAAsIdwEwSEGwAA7CHcBIEv3LBCMQAAoUe4CQJGbgAAsIdwEwSEGwAA7CHcBAHhBgAAewg3QUC4AQDAnjqFm0ceeUTfffed//W6detUWlrqf11UVKSJEyc6W2EDRLgBAMCeOoWb6dOnq6ioyP86MzNTX331lf/10aNHNW/evFq/35o1azR8+HC1atVKLpdLy5Ytq7H/qlWr5HK5qrS8vLy6nEbQsUIxAAD21CncGGNqfF1XJSUl6tatm55++uk6Hbdz504dOHDA35KSks6oDqcxcgMAgD2RNj88MzNTmZmZdT4uKSlJzZo1C0pNTiDcAABgT4OcUNy9e3elpqZq0KBBWrduXY19S0tLVVhYGNCCjXADAIA9dR65+etf/6rY2FhJ0okTJ7Rw4UIlJiZKJycUB1Nqaqrmzp2rXr16qbS0VH/96181cOBAbdy4UT/60Y+qPSY7O1u///3vg1rXqVihGAAAe1ymDhNn2rdvL5fL9b39vvzyy7oX4nLp1Vdf1YgRI+p03IABA9S2bVv94x//qHZ/aWlpwBNdhYWFSktLU0FBgeLj4+tcZ21kZkpvvy39/e/S2LFB+QgAAM4qhYWFSkhIqNXf7zqN3OzevftMa3Nc7969tXbt2tPud7vdcrvdIa2J21IAANjTIOfcVJabm6vU1FTbZQQg3AAAYE+dws369eu1fPnygG3PPfecOnTooKSkJN12220Bt4C+T3FxsXJzc5WbmyudvJ2Vm5urvXv3SifX1Rlb6b7O448/rtdee02ff/65duzYoalTp2rlypWaNGlSXU4j6Ag3AADYU6dw84c//EEffvih//X27ds1fvx4ZWRk6J577tEbb7yh7OzsWr/f5s2b1aNHD/Xo0UOSNG3aNPXo0UMzZsyQJB04cMAfdCSprKxMv/71r3XxxRdrwIAB+uCDD/Svf/1LV1xxRV1OI+hYxA8AAHvqNKE4NTVVb7zxhnr16iVJuu+++7R69Wr/nJclS5Zo5syZ+uijj4JX8Rmqy4SkH+qGG6QXXpBmz5amTQvKRwAAcFapy9/vOo3cfPvtt0pOTva/Xr16dcAifJdeeqn27dv3Q2oOK9yWAgDAnjqFm+TkZP9j3mVlZdq6dav69u3r319UVKTGjRs7X2UDQ7gBAMCeOoWboUOH6p577tG///1vTZ8+XU2aNNFPfvIT//7//Oc/Ou+884JRZ4NCuAEAwJ46rXPzxz/+Ub/4xS80YMAAxcbGauHChYqKivLvX7Bgga688spg1NmgsEIxAAD21CncJCYmas2aNSooKFBsbKwa+R4LOmnJkiWKi4tzusYGh5EbAADsqVO4ufnmm2vVb8GCBT+0nrBAuAEAwJ46hZuFCxeqXbt26tGjh+rwBPlZh3ADAIA9dQo3d9xxh1588UV9+eWXGjdunG644Qa1aNEieNU1UIQbAADsqdPTUk8//bQOHDig3/72t3rjjTeUlpam6667Tu+88w4jOZWwQjEAAPbU+Ysz3W63Ro8erZycHH300Ufq0qWLJk6cqPbt26u4uDg4VTYwjNwAAGDPGX0reEREhFwul4wxKue5Zz/CDQAA9tQ53JSWlurFF1/UoEGDdP7552v79u166qmntHfvXsXGxganygaGcAMAgD11mlA8ceJELV68WGlpabr55pv14osvKjExMXjVNVCEGwAA7KlTuJk7d67atm2rc889V6tXr9bq1aur7ffKK684VV+DxArFAADYU6dwM3bsWLlcruBVEyYYuQEAwJ46L+KH70e4AQDAnjN6WgrVI9wAAGAP4SYICDcAANhDuAkCVigGAMAewk0QMHIDAIA9hJsgINwAAGAP4SYICDcAANhDuAkCwg0AAPYQboKAFYoBALCHcBMEjNwAAGAP4SYICDcAANhDuAkCwg0AAPYQboKAcAMAgD2EmyBghWIAAOwh3AQBIzcAANhDuAkCwg0AAPYQboKAcAMAgD2EmyAg3AAAYA/hJghYoRgAAHsIN0HAyA0AAPYQboKAcAMAgD2EmyDwhRtjJI/HdjUAAJxdCDdB4FvET4zeAAAQcoSbIPCN3IhwAwBAyBFugoBwAwCAPYSbICDcAABgD+EmCJhzAwCAPYSbIHC5+GZwAABsIdwECasUAwBgB+EmSFjIDwAAOwg3QUK4AQDADsJNkBBuAACwg3ATJEwoBgDADqvhZs2aNRo+fLhatWoll8ulZcuWfe8xq1at0o9+9CO53W517NhRCxcuDEmtdcXIDQAAdlgNNyUlJerWrZuefvrpWvX/8ssvNWzYMF1++eXKzc3V1KlTdcstt+idd94Jeq11RbgBAMCOy
Fr0CZrMzExlZmbWuv/cuXPVoUMHzZ49W5J04YUXau3atXrsscc0ePDgao8pLS1VaWmp/3VhYaEDlX8/wg0AAHY0qDk369evV0ZGRsC2wYMHa/369ac9Jjs7WwkJCf6WlpYWgkoJNwAA2NKgwk1eXp6Sk5MDtiUnJ6uwsFDfffddtcdMnz5dBQUF/rZv376Q1Eq4AQDADqu3pULB7XbL7XaH/HNZoRgAADsa1MhNSkqK8vPzA7bl5+crPj5eMTEx1uqqDiM3AADY0aDCTXp6ulasWBGwLScnR+np6dZqOh3CDQAAdlgNN8XFxcrNzVVubq508lHv3Nxc7d27Vzo5X2bs2LH+/hMmTNCuXbv029/+Vp988omeeeYZvfzyy/rVr35l7RxOh3ADAIAdVsPN5s2b1aNHD/Xo0UOSNG3aNPXo0UMzZsyQJB04cMAfdCSpQ4cO+uc//6mcnBx169ZNs2fP1l//+tfTPgZuEysUAwBgh9UJxQMHDpQx5rT7q1t9eODAgdq2bVuQKztzjNwAAGBHg5pz05AQbgAAsINwEySEGwAA7CDcBAnhBgAAOwg3QcIifgAA2EG4CRJGbgAAsINwEySEGwAA7CDcBAnhBgAAOwg3QUK4AQDADsJNkLBCMQAAdhBugoSRGwAA7CDcBAnhBgAAOwg3QUK4AQDADsJNkBBuAACwg3ATJKxQDACAHYSbIGHkBgAAOwg3QUK4AQDADsJNkBBuAACwg3ATJCziBwCAHYSbIGHkBgAAOwg3QUK4AQDADsJNkBBuAACwg3ATJIQbAADsINwECeEGAAA7CDdBwgrFAADYQbgJEkZuAACwg3ATJIQbAADsINwECeEGAAA7CDdBwgrFAADYQbgJEkZuAACwg3ATJIQbAADsINwECeEGAAA7CDdBQrgBAMAOwk2QEG4AALCDcBMkrFAMAIAdhJsgYeQGAAA7CDdBQrgBAMAOwk2QEG4AALCDcBMkrFAMAIAdhJsgYeQGAAA7CDdBQrgBAMAOwk2QEG4AALCDcBMkhBsAAOwg3ARJ5UX8jLFdDQAAZw/CTZD4wo0keTw2KwEA4OxCuAmSyuGGW1MAAIQO4SZICDcAANhBuAkSwg0AAHbUi3Dz9NNPq3379oqOjlafPn30/vvvn7bvwoUL5XK5Alp0dHRI660N3wrFItwAABBS1sPNSy+9pGnTpmnmzJnaunWrunXrpsGDB+vgwYOnPSY+Pl4HDhzwtz179oS05tqIqHRlCTcAAISO9XDz6KOP6tZbb9W4ceN00UUXae7cuWrSpIkWLFhw2mNcLpdSUlL8LTk5OaQ114bLxVo3AADYYDXclJWVacuWLcrIyKgoKCJCGRkZWr9+/WmPKy4uVrt27ZSWlqarrrpKH3744Wn7lpaWqrCwMKCFCuEGAIDQsxpuDh06pPLy8iojL8nJycrLy6v2mM6dO2vBggV67bXX9Pzzz8vj8ahfv37673//W23/7OxsJSQk+FtaWlpQzqU6hBsAAELP+m2pukpPT9fYsWPVvXt3DRgwQK+88opatmypefPmVdt/+vTpKigo8Ld9+/aFrFbCDQAAoRdZiz5Bk5iYqEaNGik/Pz9ge35+vlJSUmr1Ho0bN1aPHj30+eefV7vf7XbL7XY7Um9dVf4KBgAAEBpWR26ioqLUs2dPrVixwr/N4/FoxYoVSk9Pr9V7lJeXa/v27UpNTQ1ipT8MIzcAAISe1ZEbSZo2bZqysrLUq1cv9e7dW48//rhKSko0btw4SdLYsWPVunVrZWdnS5L+8Ic/qG/fvurYsaOOHDmiWbNmac+ePbrlllssn0lVhBsAAELPergZNWqUvv76a82YMUN5eXnq3r273n77bf8k47179yqi0qIx3377rW699Vbl5eWpefPm6tmzp9577z1ddNFFFs+ier6F/Ag3AACEjssYY2wXEUqFhYVKSEhQQUGB4uPjg/pZHTtKX3whrVsn9esX1I8CACCs1eXvd4N7Wqoh4bYUAAChR7gJIsINAAChR7gJIsINAAChR7gJIsINAAChR7gJIsINAAChR7gJIlYoBgAg9Ag3QcTIDQAAoUe4CSLCDQAAoUe4CSJWKAYAIPQIN0HEyA0AAKFHuAkiwg0AAKFHuAkiX7g5ftx2JQAAnD0IN0F08ovN9cUXtisBAODsQbgJov79vT///W/blQAAcPYg3ATRT37i/bl1q1RcbLsaAADODoSbIGrb1tvKy6UNG2xXAwDA2YFwE2S+0RtuTQEAEBqEmyDzhZs1a2xXAgDA2YFwE2S+cLNhg1RWZrsaAADCH+EmyC68UDrnHOnYMWnLFtvVAAAQ/gg3QeZyST/+sff32sy7MUZav17KzQ16aQAAhCXCTQjUZlLxnj3SH/8odewo9esn9eolbdoUshIBAAgbkbYLOBtcdpn357p1kscjRVSKlJ9/Lt1xh7RihXfUxqe8XLrxRmnbNikmJvQ1AwDQUDFyEwI9ekhNm0rffit9+GHFdo9HGjNG+te/vMHmpz+VnntO2rtXSk2Vdu6Upk+v/ecYExiQAAA4GxFuQiAyUkpP9/5e+dbUc89J778vxcVJn3ziHb258UYpLU3629+8fZ54Qlq5sub393ikRx/1Tlw+91zprruknByezgIAnJ0INyFy6rybggLpf/7H+/uMGVLnzoH9MzOl22/3/n7TTd7+1dm3Txo0SPr1r70jQ7t3S08+KV15pZSYKN1wg3TwYNBOCwCAeodwEyKVw40x0h/+4A0d55/vHWmpzp//7B2J2bdPmjKl6v6XXpIuucQ7stOkiTRnjrRsmTR+vPcbyYuKpBde8I4affppcM8PAID6wmXM2TVLo7CwUAkJCSooKFB8fHzIPvfoUalZM+n4cemf/5Suuko6cUJ66y1pyJDTH7dunTcYGSP17u2daFxW5n2/L77w9rn0Uun5571Bycfj8T5SfuON0pdfem9Zvf6690ksAAAamrr8/WbkJkSaNJF69vT+/stfeoPN8OE1BxtJ6t+/4vbV++97FwLcvt0bbCIivLe01q0LDDaSd1///t6A06uX9M030hVXSK+8Ure6y8ulQ4ek/Py6HQcAgC2M3ITQb38rzZrl/T0qyvvkVMeO339cebn0zjtSaankdnuPdbuldu283zr+fUpKpOuvl5Yv9y4qeNdd0rXXSn37So0aVfT7+mvvba1ly6TPPvOGmiNHKp7AGjFCeuYZ75NcAACEUl3+fhNuQuiNN6Sf/9z7+/Tp0p/+FLrPPnHCG2rmzKnYlpgoDR0qde3qvT22erX3dlZ1XC5vyGnWzPtk1k03ebcBABAKhJsa2Aw3BQVSp05SQoJ3cb7Y2JB+vIyRXntNevllb5g5cqRqn549pZEjvXNzWrb0ztVp0cL7qPrNN0ubN3v7XXmlNG+e1L59aM8BAHB2ItzUwGa40cmA06hR6IPNqY4f987VeeMN72KBl18u/eIXUocOpz/mxAnpsce883yOHfPeGhs3znu7rabjAAA4U4SbGtgON+Hg00+9a/CsWuV93aiRNHq0dM890kUXBfb1Pd1V
Wur9efy4d85O5bk+AAB8H8JNDQg3zjDGu2bPn/7knexcF/Hx3u/bGjjQO2LUrRthBwBQM8JNDQg3ztuyRcrO9j5m/n3/mho18o7mVNa8uXfdn5Ejvastu91BLRcA0AARbmpAuAmeoiLvXJzKIiK8YcXt9n7Hlscj5eZ6b2m9+6539KewsKJ/fLz0s59JgwdLffp4J2BHsBoTAJz1CDc1INzULydOSO+9Jy1d6h35+eqrwP3NmnlXYL7oIqm42Lv2zjffeFt5ecWaP1FR3oUSu3b1PvHVq5d3YUNudwFAeCDc1IBwU395PNLGjd5FBNet897uOnUkqC6aNvV+ZcUVV3hbr17e0SMAQMNDuKkB4abhOH5c2rHDG3i++MI7inPOORUtKsr7FJbvSaxvv/Xe8tqyxbuO0NGjge/nm8icmOgd0fG1iIiqCxKmpkrnnVfRmjcP6akDAE5BuKkB4ebsUF4uffyxtGaNtGKF95vTq1u0sLaaNZNatQps7dp5g0/HjlJaGqNCABBMhJsaEG7OTuXl3tGc9eul777zvj5xwvvz1K+cKC+X/vtf72jR55/X7ktDIyO9oz2nrusTHe0dZUpMrPiZlORtycnen82be+cL+VpMjHeRx8aNg3Y5AKDBIdzUgHCDuioulvbskQ4ckPbv9/786ivpyy+9AWjXLm+YcZrbLcXFeVtsrDf0VA5AxnjDlK9FRHi/2qNZs4qWkuIdVUpLk1q39oYtAGiI6vL3m4F04HvExkpdunhbdTweb9jZv9872uJ7eisqyjtK5Hu669Ahbzt4sKLl53u/kuPoUW/fo0e9I0pSxXyiQ4ecO5cWLSrCUtOmFe3U4BQZGTgv6dTz8gWvhATvXKaEBO97RkdXNG7TAbCF//wAZygiomJ0xAllZd7RoqKiwJ++8OP76XIFBpDycm9QOnLE2w4f9o4y7dvnbceOebcdPuxMnd+nUSNvcPKNPvla5VDVtKl3hKlFC+/tuRYtvEGpcriKjPS2xo29LTLSG7B8gYwQBeBU/GcBqGeiorx/5Fu0cO49jfGOHuXleYNSSUlFO3o0MDSdOifpxAlvqzyX6Ngxb+gqLPQGqsJC72vfqJNOzl0qLAxcpDEYGjf2Bh23uyIA+VqTJoGjVG63d6TN13TyKTrfE3iJid7Xpz495wtUvgUp3e6qo12+0apTjwUQeoQb4Czgcnn/cCcmBvdzysu9AejYMW9IKinxhp7KrXKwKi72hiPfiNLhw94w5AtWvnbihHdpgOPHK4KWj297feByVdy6i4qqGrYqLz0QEVGxgnfl23nR0YG3/yq/l+9no0YVodN3rVyuilGu6ka7fLcWY2IqWnS0970q1+MLZy5XRTv1fU9tvnoqB+Dy8sCafedBAEQoEG4AOKZRo4rRjGAyxvtH1DfSVFJS8a3zvlZW5t1fOUyVlQX+ITfGG64qz4sqKqr6Wb7Q5vvjXVoaONJ1al3BmGAeLip/JUvl8OYLQVLgyKExFSNnvuZbebxyAKscGn0/K9/a9N26rRwKXS7vyF7lVt1tTt97Vf53c2q4rO7zfP19v+vkvxHfYzwREVVD7KlfN+PrX/k43y3pyu9f+d915e2V1/Oq/H8Sjh+v/vMrv5fv2p76+ZVvF0dGVlwT335jvMG5detg/AuqnXoRbp5++mnNmjVLeXl56tatm5588kn17t37tP2XLFmi+++/X7t371anTp308MMPa+jQoSGtGYA9LlfFKIdtxngDTuXwU1ZWNWwdP17xx9rjqQhNZWXe448dq2i+4yu/ny+wVR4VqTxycuof3Mqf63vte//vvqv4zFPrUTV/UE8NBZVvP57K5fL+sYyIqH5UzeOp+HyEr/R071fr2GI93Lz00kuaNm2a5s6dqz59+ujxxx/X4MGDtXPnTiUlJVXp/95772n06NHKzs7Wz372My1atEgjRozQ1q1b1bVrVyvnAODs5XKFZrSqPvEFosphx3fryff//Cv3rRzMjh2rGN0qLa3Y5/spBY44uFwV4cz3Hr4wdmrzbfd4Am9r+tqpIw4eT8XtUV/zBbxTz9XXfKM0p45c+K6HLwhW7u/7WXmkyeWqCLeVQ2x1i7P4+vt+P7UeX2iurtbK63lVvlXZuHHFSGPlGk69jsYE1ixVnKsvOFd3brb/92B9nZs+ffro0ksv1VNPPSVJ8ng8SktL05133ql77rmnSv9Ro0appKREy5cv92/r27evunfvrrlz51bpX1paqtJKY8SFhYVKS0tjnRsAABqQuqxzE1Hj3iArKyvTli1blJGRUVFQRIQyMjK0fv36ao9Zv359QH9JGjx48Gn7Z2dnKyEhwd/SnHpeFwAA1EtWw82hQ4dUXl6u5OTkgO3JycnKy8ur9pi8vLw69Z8+fboKCgr8bd++fQ6eAQAAqG+sz7kJNrfbLbfbbbsMAAAQIlZHbhITE9WoUSPln/LNhPn5+UpJSan2mJSUlDr1BwAAZxer4SYqKko9e/bUihUr/Ns8Ho9WrFih9PT0ao9JT08P6C9JOTk5p+0PAADOLtZvS02bNk1ZWVnq1auXevfurccff1wlJSUaN26cJGns2LFq3bq1srOzJUlTpkzRgAEDNHv2bA0bNkyLFy/W5s2bNX/+fMtnAgAA6gPr4WbUqFH6+uuvNWPGDOXl5al79+56++23/ZOG9+7dq4hKSzb269dPixYt0u9+9zvde++96tSpk5YtW8YaNwAAQKoP69yEWl2ekwcAAPVDg1nnBgAAwGmEGwAAEFYINwAAIKwQbgAAQFgh3AAAgLBCuAEAAGHF+jo3oeZ78r2wsNB2KQAAoJZ8f7drs4LNWRduioqKJElpaWm2SwEAAHVUVFSkhISEGvucdYv4eTwe7d+/X3FxcXK5XI6+d2FhodLS0rRv3z4WCAwyrnXocK1Dh2sdOlzr0HHqWhtjVFRUpFatWgV8c0F1zrqRm4iICLVp0yaonxEfH8//WEKEax06XOvQ4VqHDtc6dJy41t83YuPDhGIAABBWCDcAACCsEG4c5Ha7NXPmTLndbtulhD2udehwrUOHax06XOvQsXGtz7oJxQAAILwxcgMAAMIK4QYAAIQVwg0AAAgrhBsAABBWCDcOefrpp9W+fXtFR0erT58+ev/9922X1OBlZ2fr0ksvVVxcnJKSkjRixAjt3LkzoM+xY8c0adIknXPOOYqNjdXIkSOVn59vreZw8dBDD8nlcmnq1Kn+bVxr53z11Ve64YYbdM455ygmJkYXX3yxNm/e7N9vjNGMGTOUmpqqmJgYZWRk6LPPPrNac0NUXl6u+++/Xx06dFBMTIzOO+88/fGPfwz4biKu9Q+3Zs0aDR8+XK1atZLL5dKyZcsC9tfm2h4+fFhjxoxRfHy8mjVrpvHjx6u4uPjMizM4Y4sXLzZRUVFmwYIF5sMPPzS33nqradasmcnPz7ddWoM2ePBg8+yzz5odO3aY3NxcM3ToUNO2bVtTXFzs7zNhwgSTlpZmVqxYYTZv3mz69u1r+vXrZ7Xuhu7999837du3N5dccomZMmWKfzvX2hmHDx8
27dq1MzfddJPZuHGj2bVrl3nnnXfM559/7u/z0EMPmYSEBLNs2TLzwQcfmJ///OemQ4cO5rvvvrNae0Pz4IMPmnPOOccsX77cfPnll2bJkiUmNjbWPPHEE/4+XOsf7s033zT33XefeeWVV4wk8+qrrwbsr821HTJkiOnWrZvZsGGD+fe//206duxoRo8efca1EW4c0Lt3bzNp0iT/6/LyctOqVSuTnZ1tta5wc/DgQSPJrF692hhjzJEjR0zjxo3NkiVL/H0+/vhjI8msX7/eYqUNV1FRkenUqZPJyckxAwYM8IcbrrVz/ud//sf8+Mc/Pu1+j8djUlJSzKxZs/zbjhw5Ytxut3nxxRdDVGV4GDZsmLn55psDtv3iF78wY8aMMYZr7ahTw01tru1HH31kJJlNmzb5+7z11lvG5XKZr7766ozq4bbUGSorK9OWLVuUkZHh3xYREaGMjAytX7/eam3hpqCgQJLUokULSdKWLVt0/PjxgGt/wQUXqG3btlz7H2jSpEkaNmxYwDUV19pRr7/+unr16qVrr71WSUlJ6tGjh/7v//7Pv//LL79UXl5ewLVOSEhQnz59uNZ11K9fP61YsUKffvqpJOmDDz7Q2rVrlZmZKXGtg6o213b9+vVq1qyZevXq5e+TkZGhiIgIbdy48Yw+/6z74kynHTp0SOXl5UpOTg7YnpycrE8++cRaXeHG4/Fo6tSp6t+/v7p27SpJysvLU1RUlJo1axbQNzk5WXl5eZYqbbgWL16srVu3atOmTVX2ca2ds2vXLs2ZM0fTpk3Tvffeq02bNumuu+5SVFSUsrKy/Nezuv+mcK3r5p577lFhYaEuuOACNWrUSOXl5XrwwQc1ZswY6eS/a3Gtg6I21zYvL09JSUkB+yMjI9WiRYszvv6EGzQIkyZN0o4dO7R27VrbpYSlffv2acqUKcrJyVF0dLTtcsKax+NRr1699Kc//UmS1KNHD+3YsUNz585VVlaW7fLCyssvv6wXXnhBixYtUpcuXZSbm6upU6eqVatWXOswx22pM5SYmKhGjRpVeWokPz9fKSkp1uoKJ5MnT9by5cv17rvvqk2bNv7tKSkpKisr05EjRwL6c+3rbsuWLTp48KB+9KMfKTIyUpGRkVq9erX+8pe/KDIyUsnJyVxrh6Smpuqiiy4K2HbhhRdq79690sl/1zp5bSvjWtfdb37zG91zzz26/vrrdfHFF+vGG2/Ur371K2VnZ0tc66CqzbVNSUnRwYMHA/afOHFChw8fPuPrT7g5Q1FRUerZs6dWrFjh3+bxeLRixQqlp6dbra2hM8Zo8uTJevXVV7Vy5Up16NAhYH/Pnj3VuHHjgGu/c+dO7d27l2tfR1dccYW2b9+u3Nxcf+vVq5fGjBnj/51r7Yz+/ftXWdLg008/Vbt27SRJHTp0UEpKSsC1Liws1MaNG7nWdXT06FFFRAT+mWvUqJE8Ho/EtQ6q2lzb9PR0HTlyRFu2bPH3WblypTwej/r06XNmBZzRdGQYc/JRcLfbbRYuXGg++ugjc9ttt5lmzZqZvLw826U1aHfccYdJSEgwq1atMgcOHPC3o0eP+vtMmDDBtG3b1qxcudJs3rzZpKenm/T0dKt1h4vKT0sZrrVj3n//fRMZGWkefPBB89lnn5kXXnjBNGnSxDz//PP+Pg899JBp1qyZee2118x//vMfc9VVV/F48g+QlZVlWrdu7X8U/JVXXjGJiYnmt7/9rb8P1/qHKyoqMtu2bTPbtm0zksyjjz5qtm3bZvbs2WNMLa/tkCFDTI8ePczGjRvN2rVrTadOnXgUvD558sknTdu2bU1UVJTp3bu32bBhg+2SGjxJ1bZnn33W3+e7774zEydONM2bNzdNmjQxV199tTlw4IDVusPFqeGGa+2cN954w3Tt2tW43W5zwQUXmPnz5wfs93g85v777zfJycnG7XabK664wuzcudNavQ1VYWGhmTJlimnbtq2Jjo425557rrnvvvtMaWmpvw/X+od79913q/1vdFZWljG1vLbffPONGT16tImNjTXx8fFm3Lhxpqio6Ixrc5nKSzUCAAA0cMy5AQAAYYVwAwAAwgrhBgAAhBXCDQAACCuEGwAAEFYINwAAIKwQbgAAQFgh3AAAgLBCuAFwVnK5XFq2bJntMgAEAeEGQMjddNNNcrlcVdqQIUNslwYgDETaLgDA2WnIkCF69tlnA7a53W5r9QAIH4zcALDC7XYrJSUloDVv3lw6ectozpw5yszMVExMjM4991wtXbo04Pjt27frpz/9qWJiYnTOOefotttuU3FxcUCfBQsWqEuXLnK73UpNTdXkyZMD9h86dEhXX321mjRpok6dOun111/37/v22281ZswYtWzZUjExMerUqVOVMAagfiLcAKiX7r//fo0cOVIffPCBxowZo+uvv14ff/yxJKmkpESDBw9W8+bNtWnTJi1ZskT/+te/AsLLnDlzNGnSJN12223avn27Xn/9dXXs2DHgM37/+9/ruuuu03/+8x8NHTpUY8aM0eHDh/2f/9FHH+mtt97Sxx9/rDlz5igxMTHEVwHAD3LG3ysOAHWUlZVlGjVqZJo2bRrQHnzwQWOMMZLMhAkTAo7p06ePueOOO4wxxsyfP980b97cFBcX+/f/85//NBERESYvL88YY0yrVq3Mfffdd9oaJJnf/e53/tfFxcVGknnrrbeMMcYMHz7cjBs3zuEzBxAKzLkBYMXll1+uOXPmBGxr0aKF//f09PSAfenp6crNzZUkffzxx+rWrZuaNm3q39+/f395PB7t3LlTLpdL+/fv1xVXXFFjDZdccon/96ZNmyo+Pl4HDx6UJN1xxx0aOXKktm7dqiuvvFIjRoxQv379zvCsAYQC4QaAFU2bNq1ym8gpMTExterXuHHjgNcul0sej0eSlJmZqT179ujNN99UTk6OrrjiCk2aNEl//vOfg1IzAOcw5wZAvbRhw4Yqry+88EJJ0oUXXqgPPvhAJSUl/v3r1q1TRESEOnfurLi4OLVv314rVqw4oxpatmyprKwsPf/883r88cc1f/78M3o/AKHByA0AK0pLS5WXlxewLTIy0j9pd8mSJerVq5d+/OMf64UXXtD777+vv/3tb5KkMWPGaObMmcrKytIDDzygr7/+WnfeeaduvPFGJScnS5IeeOABTZgwQUlJScrMzFRRUZHWrVunO++8s1b1zZgxQz179lSXLl1UWlqq5cuX+8MVgPqNcAPAirffflupqakB2zp37qxPPvlEOvkk0+LFizVx4kSlpqbqxRdf1EUXXSRJatKkid555x1NmTJFl156qZo0aaKRI0fq0Ucf9b9XVlaWjh07pscee0x33323EhMTdc0119S6vqioKE2fPl27d+9WTEyMfvKTn2jx4sWOnT+A4HEZ71MDAFBvuFwuvfrqqxoxYoTtUgA0QMy5AQAAYYVwAwAAwgpzbgDUO9wtB3AmGLkBAABhhXADAADCCuEGAACEFcINAAAIK4QbAAAQVgg3AAAgrBBuAABAWCHcAACAsPL/AR7PgkVWvP
S6AAAAAElFTkSuQmCC",
- "text/plain": [
- "<Figure size 640x480 with 1 Axes>"
- ]
- },
- "metadata": {},
- "output_type": "display_data"
- }
- ],
+ "outputs": [],
"source": [
"plt.suptitle('MSE vs Epochs')\n",
"plt.plot(tune_train_err, label='Train', color='blue')\n",
@@ -1247,78 +1554,20 @@
},
{
"cell_type": "code",
- "execution_count": 26,
+ "execution_count": null,
"execution_state": "idle",
"metadata": {},
- "outputs": [
- {
- "data": {
- "text/plain": [
- "0.0189208984375"
- ]
- },
- "execution_count": 26,
- "metadata": {},
- "output_type": "execute_result"
- }
- ],
+ "outputs": [],
"source": [
"tune_evaluate()"
]
},
{
"cell_type": "code",
- "execution_count": 25,
+ "execution_count": null,
"execution_state": "idle",
"metadata": {},
- "outputs": [
- {
- "data": {
- "text/plain": [
- "(array([[2.6100e+02, 8.9530e+03, 8.2329e+04, ..., 0.0000e+00, 0.0000e+00,\n",
- " 0.0000e+00],\n",
- " [0.0000e+00, 0.0000e+00, 0.0000e+00, ..., 0.0000e+00, 0.0000e+00,\n",
- " 0.0000e+00],\n",
- " [0.0000e+00, 0.0000e+00, 0.0000e+00, ..., 0.0000e+00, 0.0000e+00,\n",
- " 0.0000e+00],\n",
- " ...,\n",
- " [0.0000e+00, 0.0000e+00, 0.0000e+00, ..., 0.0000e+00, 0.0000e+00,\n",
- " 0.0000e+00],\n",
- " [0.0000e+00, 0.0000e+00, 0.0000e+00, ..., 0.0000e+00, 0.0000e+00,\n",
- " 0.0000e+00],\n",
- " [0.0000e+00, 0.0000e+00, 0.0000e+00, ..., 0.0000e+00, 1.0000e+00,\n",
- " 0.0000e+00]]),\n",
- " array([1. , 1.1 , 1.2 , 1.3 , 1.4 , 1.5 , 1.6 , 1.699, 1.8 ,\n",
- " 1.9 , 2. , 2.1 , 2.2 , 2.3 , 2.398, 2.5 , 2.6 , 2.7 ,\n",
- " 2.8 , 2.898, 3. , 3.1 , 3.2 , 3.299, 3.398, 3.5 , 3.6 ,\n",
- " 3.7 , 3.799, 3.898, 4. , 4.1 , 4.2 , 4.297, 4.4 , 4.5 ,\n",
- " 4.6 , 4.7 , 4.797, 4.9 , 5. , 5.098, 5.2 , 5.3 , 5.4 ,\n",
- " 5.5 , 5.598, 5.7 , 5.797, 5.9 , 6. ], dtype=float16),\n",
- " array([0.8477, 0.913 , 0.9785, 1.044 , 1.109 , 1.176 , 1.241 , 1.307 ,\n",
- " 1.372 , 1.4375, 1.503 , 1.568 , 1.635 , 1.699 , 1.766 , 1.831 ,\n",
- " 1.896 , 1.962 , 2.027 , 2.094 , 2.158 , 2.225 , 2.29 , 2.355 ,\n",
- " 2.422 , 2.486 , 2.55 , 2.617 , 2.684 , 2.75 , 2.814 , 2.879 ,\n",
- " 2.945 , 3.012 , 3.076 , 3.143 , 3.207 , 3.273 , 3.338 , 3.404 ,\n",
- " 3.469 , 3.535 , 3.602 , 3.666 , 3.732 , 3.797 , 3.863 , 3.928 ,\n",
- " 3.994 , 4.062 , 4.125 ], dtype=float16),\n",
- " <matplotlib.collections.QuadMesh at 0x7fe6040e22a0>)"
- ]
- },
- "execution_count": 25,
- "metadata": {},
- "output_type": "execute_result"
- },
- {
- "data": {
- "image/png": "iVBORw0KGgoAAAANSUhEUgAAAigAAAGdCAYAAAA44ojeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjkuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8hTgPZAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAgkUlEQVR4nO3dfVBU5+H28Wt5W4jCRjMBVNDHVIMvCEFjk8VpNInGUMvIPzTj2GLSJDPJg63GNungk8mbTdZOxrF2TFDyIm1TholJxTbVGJoEHUdMBcMM6NTUmARiAPtMDCskILD7/PFDKo8CnkU89+5+PzPnjz3cZ/faWWUv7j33WYff7/cLAADAIBF2BwAAAPj/UVAAAIBxKCgAAMA4FBQAAGAcCgoAADAOBQUAABiHggIAAIxDQQEAAMaJsjvAlfD5fPrqq68UHx8vh8NhdxwAAHAF/H6/zp07p4kTJyoiwtqcSFAUlK+++kqpqal2xwAAAAFoampSSkqKpWOCoqDEx8dLfU8wISHB7jgAAMMtdxUMO2Z32x+vSZZw5vV6lZqa2v8+bkVQFJQLH+skJCRQUAAAw4pyRA87hveTayeQ0zM4SRYAABiHggIAAIxDQQEAAMahoAAAAOMExUmyAGC6JRH5VzSu0rdz1LMA18KV/Jvv8XcHfP/MoAAAAONQUAAAgHFGVFA2btwoh8OhtWvXDjlu586dmjFjhmJjYzVnzhzt2bNnJA8LAABCXMAF5ciRI9q+fbsyMjKGHHfo0CGtWLFCDz74oD7++GPl5eUpLy9PDQ0NgT40AAAIcQEVlPb2dq1cuVKvvPKKxo0bN+TYLVu26N5779Xjjz+umTNnasOGDZo7d662bt0aaGYAABDiAioohYWFWrZsmRYvXjzs2Orq6kvGLV26VNXV1YMe09XVJa/XO2ADAADhw/Iy4/Lych09elRHjhy5ovEtLS1KSkoasC8pKUktLS2DHuPxePTss89ajQYAtomcfbPdEXCRiKxZdkcIeVeyZN7r9crlcgV0/5ZmUJqamrRmzRr9+c9/VmxsbEAPeCWKiorU1tbWvzU1NY3aYwEAAPNYmkGpra3VmTNnNHfu3P59vb29OnDggLZu3aquri5FRkYOOCY5OVmtra0D9rW2tio5OXnQx3E6nXI6nVaiAQCAEGJpBuXuu+9WfX296urq+rdbb71VK1euVF1d3SXlRJLcbrfef//9AfsqKyvldrtHnh4AAIQkSzMo8fHxSk9PH7BvzJgxuuGGG/r3FxQUaNKkSfJ4PJKkNWvWaOHChdq0aZOWLVum8vJy1dTUqKSk5Go+DwAAEEKu+pVkGxsb1dzc3H87OztbZWVlKikpUWZmpt566y1VVFRcUnQAAAAuGPGXBVZVVQ15W5Ly8/OVn39lX6QFAADAd/EAAADjjHgGBQAgOb47b3cEXMRxvsfuCBghZlAAAIBxKCgAAMA4FBQAAGAcCgoAADAOBQUAABiHggIAAIzDMmMgiC2JuLILIF7J16JjZM6njLM7Ai7SNSHB7ggh70p+//T4uwO+f2ZQAACAcSgoAADAOBQUAABgHAoKAAAwDgUFAAAYh4ICAACMQ0EBAADG4TooQBCLnH2z3RHQ5/z10XZHwEWivu2xOwJGiBkUAABgHAoKAAAwDgUFAAAYh4ICAACMQ0EBAADGoaAAAADjsMwYCGK+MU67I6DP+YRIuyMA11Slb+ewY7xer1wuV0D3zwwKAAAwDgUFAAAYh4ICAACMQ0EBAADGoaAAAADjUFAAAIBxWGYMBLHvkuPsjoA+vXyZsVEcPb12Rwh5SyLyhx3T4+8O+P6ZQQEAAMahoAAAAONQUAAAgHEoKAAAwDgUFAAAYBwKCgAAMA4FBQAAGIfroABBrHN8pN0R0Kc73mF3BFyEawQFP2ZQAACAcSgoAADAOBQUAABgHAoKAAAwDgUFAAAYh4ICAACMY2mZcXFxsYqLi/X5559LkmbPnq2nnnpKOTk5lx1fWlqqBx54YMA+p9Opzs7OkWQG0Oe7G1jaaoru6+xOgIv5ovi/MdoqfTuHHeP1euVyuQK6f0sFJSUlRRs3btT06dPl9/v1hz/8QcuXL9fHH3+s2bNnX/aYhIQEnThxov+2w8E/GgAAMDRLBSU3N3fA7eeff17FxcU6fPjwoAXF4XAoOTl5ZCkBAEBYCfgclN7eXpWXl6ujo0Nut3vQce3t7ZoyZYpSU1O1fPlyHTt2bNj77urqktfrHbABAIDwYbmg1NfXa+zYsXI6nXrkkUe0a9cuzZo167Jj09LS9Prrr2v37t1644035PP5lJ2drS+//HLIx/B4PHK5XP1bamqq1ZgAACCIWS4oaWlpqqur00cffaRHH31Uq1at0vHjxy871u12q6CgQLfccosWLlyov/zlL7rxxhu1ffv2IR+jqKhIbW1t/VtTU5PVmAAAIIhZ/rLAmJgYTZs2TZI0b948HTlyRFu2bBm2dEhSdHS0srKydPLkySHHOZ1OOZ1Oq9EAAECIGPF1UHw+n7q6uq5obG9vr+rr6zVhwoSRPiwAAAhhlmZQioqKlJOTo8mTJ+vcuXMqKytTVVWV9u3bJ0kqKCjQpEmT5PF4JEnPPfecbr/9dk2bNk3ffPONXnzxRX3xxRd66KGHRufZAGHGb3kOFKPF4bc7AS429rNzdkfACFn69XbmzBkVFBSoublZLpdLGRkZ2rdvn5YsWSJJamxsVETEfydlzp49q4cfflgtLS0aN26c5s2bp0OHDg16Ui0AAICsFpTXXnttyJ9XVVUNuL1582Zt3rw5sGQAACBs8V08AADAOBQUAABgHAoKAAAwDgUFAAAYh0WKQBDrHmN3Alzg47epUSJO/8fuCBghZlAAAIBxKCgAAMA4FBQAAGAcCgoAADAOBQUAABiHggIAAIzDwjggiPVex1fomiLqW7sT4GL+G8fZHQEjxAwKAAAwDgUFAAAYh4ICAACMQ0EBAADGoaAAAADjUFAAAIBxKCgAAMA4XAcFCGJ+/sQwRq/T7gRAaOHXGwAAMA4FBQAAGIeCAgAAjENBAQAAxqGgAAAA41BQAACAcVhmDMvunfN/rmjcu/XPj3qWcOeL89kdAX3GfuW3OwIu4o/h7S3YMYMCAACMQ0EBAADGoaAAAADjUFAAAIBxKCgAAMA4FBQAAGAc1mHBMt8YvrbVFP5olhmb4vxYh90RcJF9tc/aHQEjxAwKAAAwDgUFAAAYh4ICAACMQ0EBAADGoaAAAADjUFAAAIBxKCgAAMA4XAcFlvVeF213BPRxRHEdFFPEnuW1MElO8v8edszelpevSRYEhhkUAABgHAoKAAAwDgUFAAAYx1JBKS4uVkZGhhISEpSQkCC32629e/cOeczOnTs1Y8YMxcbGas6cOdqzZ89IMwMAgBBnqaCkpKRo48aNqq2tVU1Nje666y4tX75cx44du+z4Q4cOacWKFXrwwQf18ccfKy8vT3l5eWpoaLha+QEAQAiyVFByc3P1wx/+UNOnT9fNN9+s559/XmPHjtXhw4cvO37
Lli2699579fjjj2vmzJnasGGD5s6dq61bt16t/AAAIAQFvMy4t7dXO3fuVEdHh9xu92XHVFdXa926dQP2LV26VBUVFYE+LAwQ+W233RHQx9/DaWSmiO5gmbFJWEIc/CwXlPr6erndbnV2dmrs2LHatWuXZs2addmxLS0tSkpKGrAvKSlJLS0tQz5GV1eXurq6+m97vV6rMQEAQBCz/OdXWlqa6urq9NFHH+nRRx/VqlWrdPz48asayuPxyOVy9W+pqalX9f4BAIDZLBeUmJgYTZs2TfPmzZPH41FmZqa2bNly2bHJyclqbW0dsK+1tVXJyclDPkZRUZHa2tr6t6amJqsxAQBAEBvxB9g+n2/AxzEXc7vdev/99wfsq6ysHPSclQucTmf/UuYLGwAACB+WzkEpKipSTk6OJk+erHPnzqmsrExVVVXat2+fJKmgoECTJk2Sx+ORJK1Zs0YLFy7Upk2btGzZMpWXl6umpkYlJSWj82wAAEBIsFRQzpw5o4KCAjU3N8vlcikjI0P79u3TkiVLJEmNjY2KiPjvpEx2drbKysr05JNPav369Zo+fboqKiqUnp5+9Z8JAAAIGZYKymuvvTbkz6uqqi7Zl5+fr/z8fOvJAABA2Ar4OigIX98lx9kdAX0cnVwHxRQRvXYnwMWWRAz/h3Glb+c1yYLA8NsNAAAYh4ICAACMQ0EBAADGoaAAAADjUFAAAIBxKCgAAMA4LDOGZT1x9FpTRJzntTBGr9/uBEBI4bcbAAAwDgUFAAAYh4ICAACMQ0EBAADGoaAAAADjUFAAAIBxWGYMyzrH02tN4Yvz2R0BfSJ6WGZskqjEG+2OgBHinQYAABiHggIAAIxDQQEAAMahoAAAAONQUAAAgHEoKAAAwDgUFAAAYByugwIEsch2/sYwhS/KYXcEXKTnzH/sjoAR4rcbAAAwDgUFAAAYh4ICAACMQ0EBAADGoaAAAADjUFAAAIBxWGYMy3ri7E6AC6K9LG01hfPrTrsj4CKVvp12R8AIMYMCAACMQ0EBAADGoaAAAADjUFAAAIBxKCgAAMA4FBQAAGAcCgoAADAO10GBZb2xdidAP/7EMEZvLL9OgauJX28AAMA4FBQAAGAcCgoAADAOBQUAABiHggIAAIxDQQEAAMZhXRws642xOwEuiGq3OwH6RTjsTgCEFGZQAACAcSwVFI/Ho/nz5ys+Pl6JiYnKy8vTiRMnhjymtLRUDodjwBYby5W+AADA4CwVlP3796uwsFCHDx9WZWWluru7dc8996ijo2PI4xISEtTc3Ny/ffHFFyPNDQAAQpilc1DefffdAbdLS0uVmJio2tpa3XHHHYMe53A4lJycHHhKAAAQVkZ0DkpbW5skafz48UOOa29v15QpU5Samqrly5fr2LFjQ47v6uqS1+sdsAEAgPARcEHx+Xxau3atFixYoPT09EHHpaWl6fXXX9fu3bv1xhtvyOfzKTs7W19++eWgx3g8Hrlcrv4tNTU10JgAACAIBbzMuLCwUA0NDTp48OCQ49xut9xud//t7OxszZw5U9u3b9eGDRsue0xRUZHWrVvXf9vr9VJSDNLt8tsdAX34ZmlzRH7bbXcEIKQEVFBWr16td955RwcOHFBKSoqlY6Ojo5WVlaWTJ08OOsbpdMrpdAYSDQAAhABLH/H4/X6tXr1au3bt0gcffKCpU6dafsDe3l7V19drwoQJlo8FAADhwdIMSmFhocrKyrR7927Fx8erpaVFkuRyuRQXFydJKigo0KRJk+TxeCRJzz33nG6//XZNmzZN33zzjV588UV98cUXeuihh0bj+QAAgBBgqaAUFxdLkhYtWjRg/44dO3T//fdLkhobGxUR8d+JmbNnz+rhhx9WS0uLxo0bp3nz5unQoUOaNWvW1XkGAAAg5FgqKH7/8CdHVlVVDbi9efNmbd682XoyAAAQtvguHgAAYBwKCgAAME7A10FB+PJFcR0UU0QP/TVYuIYc3b12RwBCCjMoAADAOBQUAABgHAoKAAAwDgUFAAAYh4ICAACMQ0EBAADGYZkxrIvvsTsB+vTE2p0AF0S0fWt3BCCkMIMCAACMQ0EBAADGoaAAAADjUFAAAIBxKCgAAMA4FBQAAGAcCgoAADAO10GBZZFOroNiitiv7U6AC3pOfW53BCCkMIMCAACMQ0EBAADGoaAAAADjUFAAAIBxKCgAAMA4FBQAAGAclhnDstjYbrsjoE9vjN0JcEHk7JvtjgCEFGZQAACAcSgoAADAOBQUAABgHAoKAAAwDgUFAAAYh4ICAACMwzJjWJac4LU7AmCc3mOf2B0BCCnMoAAAAONQUAAAgHEoKAAAwDgUFAAAYBwKCgAAMA4FBQAAGIeCAgAAjBNU10FZ7ipQlCN6yDGVvp3XLE+4Ght93u4I6BP3td/uCOgTddP/sjsCEFKYQQEAAMahoAAAAONQUAAAgHEoKAAAwDgUFAAAYBxLBcXj8Wj+/PmKj49XYmKi8vLydOLEiWGP27lzp2bMmKHY2FjNmTNHe/bsGUlmAAAQ4iwtM96/f78KCws1f/589fT0aP369brnnnt0/PhxjRkz5rLHHDp0SCtWrJDH49GPfvQjlZWVKS8vT0ePHlV6evrVeh64hqbH/8fuCOjTHeewOwL69Jz63O4IQEixVFDefffdAbdLS0uVmJio2tpa3XHHHZc9ZsuWLbr33nv1+OOPS5I2bNigyspKbd26Vdu2bRtJdgAAEKJGdA5KW1ubJGn8+PGDjqmurtbixYsH7Fu6dKmqq6sHPaarq0ter3fABgAAwkfABcXn82nt2rVasGDBkB/VtLS0KCkpacC+pKQktbS0DHqMx+ORy+Xq31JTUwONCQAAglDABaWwsFANDQ0qLy+/uokkFRUVqa2trX9ramq66o8BAADMFdB38axevVrvvPOODhw4oJSUlCHHJicnq7W1dcC+1tZWJScnD3qM0+mU0+kMJBoAAAgBlmZQ/H6/Vq9erV27dumDDz7Q1KlThz3G7Xbr/fffH7CvsrJSbrfbeloAABAWLM2gFBYWqqysTLt371Z8fHz/eSQul0txcXGSpIKCAk2aNEkej0eStGbNGi1cuFCbNm3SsmXLVF5erpqaGpWUlIzG8wEAACHAUkEpLi6WJC1atGjA/h07duj++++XJDU2Nioi4r8TM9nZ2SorK9OTTz6p9evXa/r06aqoqAjoGihRUycrKoKPfux2+juX3RHQJ/Zsr90R0Ccq8Ua7IwAhxVJB8fv9w46pqqq6ZF9+fr7y8/OtJQMAAGGL7+IBAADGoaAAAADjUFAAAIBxKCgAAMA4FBQAAGCcgK4ka5fe8WPliIq1O0bY++QsyylNEXl++JV1ABCMmEEBAADGoaAAAADjUFAAAIBxKCgAAMA4FBQAAGAcCgoAADBOUC0zjvy6XZER3XbHCHvfdsbYHQF9vk2MtDsC+vSc+Y/dEYCQwgwKAAAwDgUFAAAYh4ICAACMQ0EBAADGoaAAAADjUFAAAIBxKCgAAMA4QXUdlPMp4+SLirU7Rtj79us4uyOgT4zXb3
cEABgVzKAAAADjUFAAAIBxKCgAAMA4FBQAAGAcCgoAADAOBQUAABgnqJYZdyTHKCo6xu4YYS/qbFD9swlpMd4euyMAwKhgBgUAABiHggIAAIxDQQEAAMahoAAAAONQUAAAgHEoKAAAwDgUFAAAYJyguqBF5/gIRTrpVMAFked9dkdAn0rfTrsjACGFd3sAAGAcCgoAADAOBQUAABiHggIAAIxDQQEAAMahoAAAAOME1TJjSZLf7gBwfu2wOwL6xDR+bXcE9FkSkX9F41iODFwZZlAAAIBxKCgAAMA4FBQAAGAcywXlwIEDys3N1cSJE+VwOFRRUTHk+KqqKjkcjku2lpaWkeQGAAAhzHJB6ejoUGZmpl566SVLx504cULNzc39W2JiotWHBgAAYcLyKp6cnBzl5ORYfqDExERdf/31lo8DAADh55otM77lllvU1dWl9PR0PfPMM1qwYMGgY7u6utTV1dV/2+v1SpJ64iS/85rExVBYZQxcguXDwNU16ifJTpgwQdu2bdPbb7+tt99+W6mpqVq0aJGOHj066DEej0cul6t/S01NHe2YAADAIKM+g5KWlqa0tLT+29nZ2fr000+1efNm/elPf7rsMUVFRVq3bl3/ba/XS0kBACCM2HIl2e9///s6ePDgoD93Op1yOvksBwCAcGXLdVDq6uo0YcIEOx4aAAAEAcszKO3t7Tp58mT/7c8++0x1dXUaP368Jk+erKKiIp0+fVp//OMfJUm/+93vNHXqVM2ePVudnZ169dVX9cEHH+i99967us8EAACEDMsFpaamRnfeeWf/7QvniqxatUqlpaVqbm5WY2Nj/8/Pnz+vX/7ylzp9+rSuu+46ZWRk6B//+MeA+wAAALiY5YKyaNEi+f2Df6VwaWnpgNtPPPGEnnjiicDSAQCAsGTLSbIBc3ANDhNEdl3BIFwTPtd1dkcAgFHBlwUCAADjUFAAAIBxKCgAAMA4FBQAAGAcCgoAADAOBQUAABgnqJYZd18n+WLtToExzT67I6BPxOn/2B0BAEYFMygAAMA4FBQAAGAcCgoAADAOBQUAABiHggIAAIxDQQEAAMahoAAAAOME1XVQfHF+KdZvd4yw54t22B0Bffa2vGx3BAAYFcygAAAA41BQAACAcSgoAADAOBQUAABgHAoKAAAwDgUFAAAYJ6iWGffG+eSP89kdI+yNq2+zOwIAIMQxgwIAAIxDQQEAAMahoAAAAONQUAAAgHEoKAAAwDgUFAAAYJygWmY8dmK7Iq/rtjtG2NtX+6zdEQAAIY4ZFAAAYBwKCgAAMA4FBQAAGIeCAgAAjENBAQAAxqGgAAAA41BQAACAcYLqOiiZiacVMzbG7hgAAGCUMYMCAACMQ0EBAADGoaAAAADjUFAAAIBxKCgAAMA4FBQAAGCcoFpmXJx6SAnxkXbHAAAAo8zyDMqBAweUm5uriRMnyuFwqKKiYthjqqqqNHfuXDmdTk2bNk2lpaWB5gUAAGHAckHp6OhQZmamXnrppSsa/9lnn2nZsmW68847VVdXp7Vr1+qhhx7Svn37AskLAADCgOWPeHJycpSTk3PF47dt26apU6dq06ZNkqSZM2fq4MGD2rx5s5YuXWr14QEAQBgY9ZNkq6urtXjx4gH7li5dqurq6kGP6erqktfrHbABAIDwMeoFpaWlRUlJSQP2JSUlyev16rvvvrvsMR6PRy6Xq39LTU0d7ZgAAMAgRi4zLioqUltbW//W1NRkdyQAAHANjfoy4+TkZLW2tg7Y19raqoSEBMXFxV32GKfTKafTOdrRAACAoUa9oLjdbu3Zs2fAvsrKSrndbsv39cL/nSlnZ/SQYzYkW75bAABgGMsf8bS3t6uurk51dXVS3zLiuro6NTY2Sn0fzxQUFPSPf+SRR3Tq1Ck98cQT+te//qWXX35Zb775ph577LGr+TwAAEAIsVxQampqlJWVpaysLEnSunXrlJWVpaeeekqS1Nzc3F9WJGnq1Kn6+9//rsrKSmVmZmrTpk169dVXWWIMAAAG5fD7/X67QwzH6/XK5XLp8UPL5Bw7zEc8c3Zds1wAAGBwF96/29ralJCQYOlYI1fxAACA8EZBAQAAxqGgAAAA44z6MuOr6c2qbEXExg45ZsOcaxYHAACMEmZQAACAcSgoAADAOBQUAABgHAoKAAAwDgUFAAAYJyhW8Vy42K2vs3PYsV6v9xokAgAAw7nwnhzIReuD4lL3p06d0ve+9z27YwAAgAB8+umnuummmywdExQzKOPHj5ckNTY2yuVy2R0nrHm9XqWmpqqpqcny9yrg6uK1MAevhVl4PczR1tamyZMn97+PWxEUBSUi4n9OlXG5XPxjM0RCQgKvhSF4LczBa2EWXg9zXHgft3TMqCQBAAAYAQoKAAAwTlAUFKfTqaefflpOp9PuKGGP18IcvBbm4LUwC6+HOUbyWgTFKh4AABBegmIGBQAAhBcKCgAAMA4FBQAAGIeCAgAAjGN0QTlw4IByc3M1ceJEORwOVVRU2B0pbHk8Hs2fP1/x8fFKTExUXl6eTpw4YXessFRcXKyMjIz+i1C53W7t3bvX7liQtHHjRjkcDq1du9buKGHnmWeekcPhGLDNmDHD7lhh6/Tp0/rJT36iG264QXFxcZozZ45qamos3YfRBaWjo0OZmZl66aWX7I4S9vbv36/CwkIdPnxYlZWV6u7u1j333KOOjg67o4WdlJQUbdy4UbW1taqpqdFdd92l5cuX69ixY3ZHC2tHjhzR9u3blZGRYXeUsDV79mw1Nzf3bwcPHrQ7Ulg6e/asFixYoOjoaO3du1fHjx/Xpk2bNG7cOEv3Y/Sl7nNycpSTk2N3DEh69913B9wuLS1VYmKiamtrdccdd9iWKxzl5uYOuP3888+ruLhYhw8f1uzZs23LFc7a29u1cuVKvfLKK/rNb35jd5ywFRUVpeTkZLtjhL3f/va3Sk1N1Y4dO/r3TZ061fL9GD2DAnO1tbVJF32RI+zR29ur8vJydXR0yO122x0nbBUWFmrZsmVavHix3VHC2r///W9NnDhRN910k1auXKnGxka7I4Wlv/71r7r11luVn5+vxMREZWVl6ZVXXrF8P0bPoMBMPp9Pa9eu1YIFC5Senm53nLBUX18vt9utzs5OjR07Vrt27dKsWbPsjhWWysvLdfToUR05csTuKGHttttuU2lpqdLS0tTc3Kxnn31WP/jBD9TQ0KD4+Hi744WVU6dOqbi4WOvWrdP69et15MgR/eIXv1BMTIxWrVp1xfdDQYFlhYWFamho4PNdG6Wlpamurk5tbW166623tGrVKu3fv5+Sco01NTVpzZo1qqysVGxsrN1xwtrFpwNkZGTotttu05QpU/Tmm2/qwQcftDVbuPH5fLr11lv1wgsvSJKysrLU0NCgbdu2WSoofMQDS1avXq133nlHH374oVJSUuyOE7ZiYmI0bdo0zZs3Tx6PR5mZmdqyZYvdscJObW2tzpw5o7lz5yoqKkpRUVHav3+/fv/73ysqKkq9vb12Rwxb119/vW6++WadPHnS7ihhZ
8KECZf8sTRz5kzLH7kxg4Ir4vf79fOf/1y7du1SVVVVQCc8YfT4fD51dXXZHSPs3H333aqvrx+w74EHHtCMGTP061//WpGRkbZlC3ft7e369NNP9dOf/tTuKGFnwYIFl1yG4pNPPtGUKVMs3Y/RBaW9vX1A+/3ss89UV1en8ePHa/LkybZmCzeFhYUqKyvT7t27FR8fr5aWFkmSy+VSXFyc3fHCSlFRkXJycjR58mSdO3dOZWVlqqqq0r59++yOFnbi4+MvOQ9rzJgxuuGGGzg/6xr71a9+pdzcXE2ZMkVfffWVnn76aUVGRmrFihV2Rws7jz32mLKzs/XCCy/oxz/+sf75z3+qpKREJSUl1u7Ib7APP/zQL+mSbdWqVXZHCzuXex0k+Xfs2GF3tLDzs5/9zD9lyhR/TEyM/8Ybb/Tffffd/vfee8/uWOizcOFC/5o1a+yOEXbuu+8+/4QJE/wxMTH+SZMm+e+77z7/yZMn7Y4Vtv72t7/509PT/U6n0z9jxgx/SUmJ5ftw+P/nzQcAAMAYnCQLAACMQ0EBAADGoaAAAADjUFAAAIBxKCgAAMA4FBQAAGAcCgoAADAOBQUAABiHggIAAIxDQQEAAMahoAAAAONQUAAAgHH+H+VXw4oAI7ElAAAAAElFTkSuQmCC",
- "text/plain": [
- "<Figure size 640x480 with 1 Axes>"
- ]
- },
- "metadata": {},
- "output_type": "display_data"
- }
- ],
+ "outputs": [],
"source": [
"batch_src, batch_labels, batch_padding_mask = mktunebatch(BSZ)\n",
"model.eval()\n",
@@ -1340,51 +1589,30 @@
},
{
"cell_type": "code",
- "execution_count": 28,
+ "execution_count": null,
+ "execution_state": "idle",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "batch_src, batch_labels, batch_padding_mask = mktunebatch(BSZ, test=True)\n",
+ "model.eval()\n",
+ "with torch.no_grad():\n",
+ " output = model(batch_src, batch_padding_mask)\n",
+ "print(criterion(output.squeeze(1), batch_labels).item())\n",
+ "x = batch_labels.detach().to(torch.float16).cpu().numpy().flatten()\n",
+ "y = output.detach().to(torch.float16).cpu().numpy().flatten()\n",
+ "plt.hist2d(x, y, bins=50, norm=mpl.colors.LogNorm())"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 14,
"execution_state": "idle",
"metadata": {},
"outputs": [
{
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "0.1767578125\n"
- ]
- },
- {
"data": {
- "text/plain": [
- "(array([[ 241., 824., 9690., ..., 0., 0., 0.],\n",
- " [ 0., 0., 0., ..., 0., 0., 0.],\n",
- " [ 0., 0., 0., ..., 0., 0., 0.],\n",
- " ...,\n",
- " [ 0., 0., 0., ..., 0., 0., 0.],\n",
- " [ 0., 0., 0., ..., 0., 0., 0.],\n",
- " [ 0., 0., 0., ..., 0., 0., 0.]]),\n",
- " array([ 1. , 1.18 , 1.36 , 1.54 , 1.721, 1.9 , 2.08 , 2.262,\n",
- " 2.441, 2.621, 2.8 , 2.98 , 3.16 , 3.34 , 3.521, 3.701,\n",
- " 3.88 , 4.062, 4.242, 4.42 , 4.6 , 4.78 , 4.96 , 5.14 ,\n",
- " 5.32 , 5.5 , 5.68 , 5.863, 6.043, 6.223, 6.402, 6.582,\n",
- " 6.76 , 6.94 , 7.12 , 7.3 , 7.48 , 7.66 , 7.844, 8.02 ,\n",
- " 8.2 , 8.38 , 8.56 , 8.74 , 8.92 , 9.1 , 9.28 , 9.46 ,\n",
- " 9.64 , 9.82 , 10. ], dtype=float16),\n",
- " array([0.7344, 0.818 , 0.9014, 0.9844, 1.068 , 1.151 , 1.234 , 1.318 ,\n",
- " 1.402 , 1.485 , 1.568 , 1.652 , 1.735 , 1.819 , 1.902 , 1.986 ,\n",
- " 2.07 , 2.152 , 2.236 , 2.32 , 2.402 , 2.486 , 2.57 , 2.652 ,\n",
- " 2.736 , 2.82 , 2.904 , 2.986 , 3.07 , 3.154 , 3.238 , 3.32 ,\n",
- " 3.404 , 3.488 , 3.57 , 3.654 , 3.738 , 3.822 , 3.904 , 3.988 ,\n",
- " 4.07 , 4.156 , 4.24 , 4.32 , 4.406 , 4.49 , 4.57 , 4.656 ,\n",
- " 4.74 , 4.824 , 4.906 ], dtype=float16),\n",
- " <matplotlib.collections.QuadMesh at 0x7fe607ee0110>)"
- ]
- },
- "execution_count": 28,
- "metadata": {},
- "output_type": "execute_result"
- },
- {
- "data": {
- "image/png": "iVBORw0KGgoAAAANSUhEUgAAAiwAAAGdCAYAAAAxCSikAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjkuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8hTgPZAAAACXBIWXMAAA9hAAAPYQGoP6dpAAApjElEQVR4nO3dfXBU12H38d9qJa1kkGRwrRdAYDBYvAoDdu0VqSEBjBWGQdNn1JShFQnGM2nFFFktTpTUcWLiiNihlBTCi99o7CjEOAVa15goOIKhCEdg1ArSEGP7QdiWRJsCi4QRYvc+f9RWogcEWu1qz7m738/M/WOvztX+roW1P92956zHcRxHAAAAFksyHQAAAOBmKCwAAMB6FBYAAGA9CgsAALAehQUAAFiPwgIAAKxHYQEAANajsAAAAOslmw7QF6FQSB999JEyMjLk8XhMxwEAAH3gOI4uXryoYcOGKSkpsmskrigsH330kfLz803HAAAA/XDmzBmNGDEiou/hisKSkZEhfXLCmZmZpuMAcev/THu8T+N+emz1gGeJxKKssj6N233hhwOeBUhkgUBA+fn53a/jkXBFYfn0baDMzEwKCzCAkpN8fRpn+/+HyZ6UPo2z/TyAeBGN2zm46RYAAFiPwgIAAKxHYQEAANajsAAAAOu54qZbuNO8pNI+jasN7RjwLP1VnFfep3F7WjYOeBb0nc3/pgD0D1dYAACA9SgsAADAehQWAABgPQoLAACwHoUFAABYj8ICAACsR2EBAADWYx0WDJh4WAvDuX2I6Qgx1XnHUNMRooL1c4D4wxUWAABgPQoLAACwHoUFAABYj8ICAACsR2EBAADWo7AAAADrMa0ZA2ZeUmmfxtk8/dnzX+dMR4ip1LZ20xGiY/Ag0wliJl6mcMfD7wsMLK6wAAAA61FYAACA9SgsAADAehQWAABgPQoLAACwHoUFAABYj2nNGDDOZ+42HSFil+65w3SEmLp6a7rpCAhTon2iOBIXV1gAAID1Iiosa9askcfjUUVFRa9jtm3bJo/H02NLS0uL5GkBAECC6fdbQg0NDdqyZYsKCwtvOjYzM1MnT57sfuzxePr7tAAAIAH16wpLe3u7lixZomeffVZDhtz8/VOPx6Pc3NzuLScnpz9PCwAAElS/Ckt5ebkWLFiguXPn9ml8e3u7Ro0apfz8fC1atEgnTpzoz9MCAIAEFfZbQtu3b9fbb7+thoaGPo0vKCjQCy+8oMLCQl24cEHf+973VFRUpBMnTmjEiBHXPaazs1OdnZ3djwOBQLgxAQBAHAnrCsuZM2e0cuVK/ehHP+rzjbN+v19lZWW6++67NWvWLP3TP/2Tbr/9dm3ZsqXXY6qrq5WVldW95efnhxMTAADEmbCusBw9elRnz57V9OnTu/cFg0EdOHBAGzZsUGdnp7xe7w2/R0pKiqZNm6ZTp071OqaqqkqVlZXdjwOBAKXFhRyv+2fNXxns/nMIx+U/8JmOEBWddww1HSFm4mXtnNrQDtMRYLmwCsucOXPU1NTUY9+XvvQljR8/Xl/5ylduWlb0ScFpamrS5z//+V7H+Hw++Xzx8YsTAABELqzCkpGRocmTJ/fYN2jQIN12223d+8vKyjR8+HBVV1dLkp588kndf//9Gjt2rM6fP69nnnlGp0+f1vLly6N5HgAAII5FfWn+5uZmJSX97jL6uXPn9Mgjj6i1tVVDhgzRjBkzdOjQIU2cODHaTw0AAOJUxIWlrq7uho/XrVundevWRfo0AAAggSXWHYUAAMCVKCwAAMB6Ub+HBfiUk8xnRrnN1Vvi42+YlHOXTUeImZR3PjIdAWGal1Tap3FM9e4pPn47AQCAuEZhAQAA1qOwAAAA61FYAACA9SgsAADAehQWAABgPQoLAACwHuuwxFCizb0PjEw1HSFil3ISq9N/PDQ+1s7pGDXYdISYuVw40nSEqEik34/xcA4mJNZvYwAA4EoUFgAAYD0KCwAAsB6FBQAAWI/CAgAArEdhAQAA1mNacwwl2lS2rsHunyIbdP/M7PC4/0cmSfJecUxHiBlfy0XTEYCY4AoLAACwHoUFAABYj8ICAACsR2EBAADWo7AAAADrUVgAAID1mNaMgRMHM0udRKv0cfAzk6SkKyHTEWLm6q3ppiNERSIt+5BIn0wdTYn26xgAALgQhQUAAFiPwgIAAKxHYQEAANajsAAAAOtRWAAAgPUoLAAAwHqswxJDxeMe69O4Pe88PeBZYqFzqOkEkevKMJ0gtq7Gx5Ie6shLMR0hZkI+r+kICFNyTrbpCK7EFRYAAGC9iArLmjVr5PF4VFFRccNxO3bs0Pjx45WWlqYpU6bo9ddfj+RpAQBAgul3YWloaNCWLVtUWFh4w3GHDh3S4sWL9fDDD+vYsWMqKSlRSUmJjh8/3t+nBgAACaZfhaW9vV1LlizRs88+qyFDhtxw7Pr16/XQQw9p1apVmjBhglavXq3p06drw4YN/c0MAAASTL8KS3l5uRYsWKC5c+fedGx9ff014+bPn6/6+vpej+ns7FQgEOixAQCAxBX2LKHt27fr7bffVkNDQ5/Gt7a2Kicnp8e+nJwctba29npMdXW1vvWtb4UbDQAAxKmwCsuZM2e0cuVK1dbWKi0tbcBCVVVVqbKysvtxIBBQfn7+gD1frDjpqaYjxFTQZzpB5II+x3SEmHLiZDZwMMVjOkLMOEnxca7FeeV9GrenZeOAZxloV9vOmo7gSmEVlqNHj+rs2bOaPn16975gMKgDBw5ow4YN6uzslNfbc02A3NxctbW19djX1tam3NzcXp/H5/PJ54uDVzsAABAVYd3DMmfOHDU1NamxsbF7u+eee7RkyRI1NjZeU1Ykye/3a9++fT321dbWyu/3R54eAAAkhLCusGRkZGjy5Mk99g0aNEi33XZb9/6ysjINHz5c1dXVkqSVK1dq1qxZWrt2rRYsWKDt27fryJEj2rp1azTPAwAAxLGor3Tb3NyslpaW7sdFRUWqqanR1q1bNXXqVL366qvatWvXNcUHAACgNxF/llBdXd0NH0tSaWmpSktLI30qAACQoPgsIQAAYD0+rTmGQoMSa+ZTMN39U4I97j+FsDj8CeM6qb/92HQEhKk2tMN0BFfi1xMAALAehQUAAFiPwgIAAKxHYQEAANajsAAAAOtRWAAAgPUoLAAAwHqswxJDSR2dpiPElON1/yImoRTTCWLLc9V0gujwXnH/v72+6hqSZjpCVFxtO2s6QswU55X3adyelo0DnsVNuMICAACsR2EBAADWo7AAAADrUVgAAID1KCwAAMB6FBYAAGA9pjXHUPvYW01HiKnQoKDpCBEL+UKmI8RUyGc6QXQ4XtMJYie5vct0hKioDe0wHSFm4mW68ryk0puOuepE798nV1gAAID1KCwAAMB6FBYAAGA9CgsAALAehQUAAFiPwgIAAKxHYQEAANZjHZYYCiV7TEeIKU+K+9cw8QQT62eW3GE6QXQkX3ZMR4gZ7/9tNR0hKvqypocSbL0W2/XlZxEIBJSVlRWV5+M
KCwAAsB6FBQAAWI/CAgAArEdhAQAA1qOwAAAA61FYAACA9ZjWHEOXshOrHyYlu39as+NNnOmxkqQEO914ELwj13QEICYS6xUUAAC4UliFZdOmTSosLFRmZqYyMzPl9/u1Z8+eXsdv27ZNHo+nx5aWlhaN3AAAIIGE9ZbQiBEjtGbNGo0bN06O4+gf//EftWjRIh07dkyTJk267jGZmZk6efJk92OPJ7FWDgUAAJELq7AsXLiwx+OnnnpKmzZt0uHDh3stLB6PR7m5vMcKAAD6r9/3sASDQW3fvl0dHR3y+/29jmtvb9eoUaOUn5+vRYsW6cSJEzf93p2dnQoEAj02AACQuMIuLE1NTRo8eLB8Pp++/OUva+fOnZo4ceJ1xxYUFOiFF17Q7t279fLLLysUCqmoqEgffPDBDZ+jurpaWVlZ3Vt+fn64MQEAQBwJe1pzQUGBGhsbdeHCBb366qtaunSp9u/ff93S4vf7e1x9KSoq0oQJE7RlyxatXr261+eoqqpSZWVl9+NAIBAXpeXqLaYTxFZyatB0BITJ22k6QXSkXnT/lPq++ln946YjADERdmFJTU3V2LFjJUkzZsxQQ0OD1q9fry1bttz02JSUFE2bNk2nTp264TifzyefzxduNAAAEKciXoclFAqps7Nvf5YFg0E1NTUpLy8v0qcFAAAJJKwrLFVVVSouLtbIkSN18eJF1dTUqK6uTnv37pUklZWVafjw4aqurpYkPfnkk7r//vs1duxYnT9/Xs8884xOnz6t5cuXD8zZAACAuBRWYTl79qzKysrU0tKirKwsFRYWau/evZo3b54kqbm5WUlJv7toc+7cOT3yyCNqbW3VkCFDNGPGDB06dKjXm3QBAACuJ6zC8vzzz9/w63V1dT0er1u3TuvWretfMgAAgE/wWUIAAMB6FBYAAGC9sKc1m7Qoq0zJnpQbjqkN7YhZnnAFE2ymthNy/+dGea66/xzCkf7b+Fi/xHs5cdYAmpdU2qdxNv9uBPqCKywAAMB6FBYAAGA9CgsAALAehQUAAFiPwgIAAKxHYQEAANZz1bTm5OzblZyUajpGv4XcG71fHMd0gsh5P06sTh9KTqxp3PEgOSfbdISoYNo1biaxfhsDAABXorAAAADrUVgAAID1KCwAAMB6FBYAAGA9CgsAALCeq6Y1a/AtUpJ7P/K4a3AczPMNQ0pK4nxibtxgVrP7DB5kOgEQE1xhAQAA1qOwAAAA61FYAACA9SgsAADAehQWAABgPQoLAACwHoUFAABYz1XrsFx977TkSTEdo988IdMJYutKp3t/Vp9KvmQ6QWx5r8THWkEp5y6bjhAze9552nQEICa4wgIAAKxHYQEAANajsAAAAOtRWAAAgPUoLAAAwHoUFgAAYD1XTWt2u2B6gs1rjgNJXaYTxJb3cnxMa76a4TMdAWGal1Tap3G1oR0DngV24goLAACwXliFZdOmTSosLFRmZqYyMzPl9/u1Z8+eGx6zY8cOjR8/XmlpaZoyZYpef/31SDMDAIAEE1ZhGTFihNasWaOjR4/qyJEj+tznPqdFixbpxIkT1x1/6NAhLV68WA8//LCOHTumkpISlZSU6Pjx49HKDwAAEkBYhWXhwoX6/Oc/r3Hjxumuu+7SU089pcGDB+vw4cPXHb9+/Xo99NBDWrVqlSZMmKDVq1dr+vTp2rBhQ7TyAwCABNDve1iCwaC2b9+ujo4O+f3+646pr6/X3Llze+ybP3++6uvr+/u0AAAgAYU9S6ipqUl+v1+XL1/W4MGDtXPnTk2cOPG6Y1tbW5WTk9NjX05OjlpbW2/4HJ2dners7Ox+HAgEwo0JAADiSNhXWAoKCtTY2Ki33npLf/EXf6GlS5fqV7/6VVRDVVdXKysrq3vLz8+P6vcHAADuEvYVltTUVI0dO1aSNGPGDDU0NGj9+vXasmXLNWNzc3PV1tbWY19bW5tyc3Nv+BxVVVWqrKzsfhwIBOKjtNwSNJ0gpjxJ7l93xkmwif+O12M6AsLE+iVIFBH/Og6FQj3evvl9fr9f+/bt67Gvtra213tePuXz+bqnTn+6AQCAxBXWFZaqqioVFxdr5MiRunjxompqalRXV6e9e/dKksrKyjR8+HBVV1dLklauXKlZs2Zp7dq1WrBggbZv364jR45o69atA3M2AAAgLoVVWM6ePauysjK1tLQoKytLhYWF2rt3r+bNmydJam5uVlLS7y7aFBUVqaamRn/7t3+rr33taxo3bpx27dqlyZMnR/9MAABA3AqrsDz//PM3/HpdXd01+0pLS1Va2rf3WAEAAK4nwW4pBAAAbkRhAQAA1gt7WrNJSXdPUJLXxR8b73VMJ4iprvZU0xEi5r1sOkFsJXfEx9T75IvXn7kYj5Jzsk1HAGKCKywAAMB6FBYAAGA9CgsAALAehQUAAFiPwgIAAKxHYQEAANZz1bRmt/OmuP/Ti8MScv8n/zoJ9n9I12Cv6QhR4aTEx3n0xZ6WjaYjRAWfJo2b4QoLAACwHoUFAABYj8ICAACsR2EBAADWo7AAAADrUVgAAID1KCwAAMB6rlpl4tKwW5SckmY6Rr/50q6YjoBwJdrSOanuXztHkjxdQdMRAEQZV1gAAID1KCwAAMB6FBYAAGA9CgsAALAehQUAAFiPwgIAAKznqmnNTpJHTpJ7p1163Bu9X5I+dn8fTv7YdILY8v3PVdMRouLKbemmIwCIMve/ogAAgLhHYQEAANajsAAAAOtRWAAAgPUoLAAAwHoUFgAAYD0KCwAAsJ6r1mEZ9EGHkr3uXSciJTmxPvLeE4qDhWcSrNJ3DfaajhAVae/91nQEAFGWYL+OAQCAG4VVWKqrq3XvvfcqIyND2dnZKikp0cmTJ294zLZt2+TxeHpsaWlpkeYGAAAJJKzCsn//fpWXl+vw4cOqra1VV1eXHnzwQXV0dNzwuMzMTLW0tHRvp0+fjjQ3AABIIGHdw/LGG2/0eLxt2zZlZ2fr6NGjeuCBB3o9zuPxKDc3t/8pAQBAQovoHpYLFy5IkoYOHXrDce3t7Ro1apTy8/O1aNEinThx4objOzs7FQgEemwAACBx9buwhEIhVVRUaObMmZo8eXKv4woKCvTCCy9o9+7devnllxUKhVRUVKQPPvig12Oqq6uVlZXVveXn5/c3JgAAiAP9ntZcXl6u48eP6+DBgzcc5/f75ff7ux8XFRVpwoQJ2rJli1avXn3dY6qqqlRZWdn9OBAIKD8/XxfuypA31b037GamXTYdIaacJMd0hIiFEmwenfdKyHSEqLj67vumI8TMvKTSPo2rDe0Y8CzAQOpXYVmxYoVee+01HThwQCNGjAjr2JSUFE2bNk2nTp3qdYzP55PP5+tPNAAAEIfC+vvRcRytWLFCO3fu1JtvvqnRo0eH/YTBYFBNTU3Ky8sL+1gAAJCYwrrCUl5erpqaGu3evVsZGRlqbW2VJGVlZSk9PV2SVFZWpuHDh6u6ulqS9OSTT+r+++/X2LFjdf78eT3zzDM6ffq0li9fPhDnAwAA4lBYhWXTpk2SpNmzZ/fY/+KLL+
qLX/yiJKm5uVlJSb+7cHPu3Dk98sgjam1t1ZAhQzRjxgwdOnRIEydOjM4ZAACAuBdWYXGcm99EWVdX1+PxunXrtG7duvCTAQAAfCLB5kAAAAA3orAAAADr9XsdFhOCqZJSTafovyS5f12ScHiuekxHiJjjqv9DItc12Gs6QlQk52SbjhAziXSuSGxcYQEAANajsAAAAOtRWAAAgPUoLAAAwHoUFgAAYD0KCwAAsJ6rJm16r0hunnTpS75qOkJMeeJgFre303SC2Eq9GDQdAWHa07LRdISomJdU2qdxtaEdA54FduIKCwAAsB6FBQAAWI/CAgAArEdhAQAA1qOwAAAA61FYAACA9Vw1rfnS7R55fe79BOBbUy+bjhBTThzU4ZCb59H3Q9egOPihSdLgQaYTIExMV8bNxMlvJwAAEM8oLAAAwHoUFgAAYD0KCwAAsB6FBQAAWI/CAgAArEdhAQAA1nPVOixOUnys7ZEwHNMBEK7U80HTEaLi6rvvm44QM/OSSvs0jnVO4Ha8/AMAAOtRWAAAgPUoLAAAwHoUFgAAYD0KCwAAsB6FBQAAWM9V05qv3Cp500yn6L9s30XTEWLKEwfTmuPhHMLhpHhMR0CYku8cbToCEBNcYQEAANYLq7BUV1fr3nvvVUZGhrKzs1VSUqKTJ0/e9LgdO3Zo/PjxSktL05QpU/T6669HkhkAACSYsArL/v37VV5ersOHD6u2tlZdXV168MEH1dHR0esxhw4d0uLFi/Xwww/r2LFjKikpUUlJiY4fPx6N/AAAIAGEdQ/LG2+80ePxtm3blJ2draNHj+qBBx647jHr16/XQw89pFWrVkmSVq9erdraWm3YsEGbN2+OJDsAAEgQEd3DcuHCBUnS0KFDex1TX1+vuXPn9tg3f/581dfX93pMZ2enAoFAjw0AACSufheWUCikiooKzZw5U5MnT+51XGtrq3Jycnrsy8nJUWtra6/HVFdXKysrq3vLz8/vb0wAABAH+j2tuby8XMePH9fBgwejm0hSVVWVKisrux8HAgHl5+crlOpIqe6dZ9oeTDUdIaaSOt0/RdYTHx9e3GfeyyHTEaLCO7nAdISY2fPO06YjADHRr8KyYsUKvfbaazpw4IBGjBhxw7G5ublqa2vrsa+trU25ubm9HuPz+eTz+foTDQAAxKGw3hJyHEcrVqzQzp079eabb2r06JsvWOT3+7Vv374e+2pra+X3+8NPCwAAElJYV1jKy8tVU1Oj3bt3KyMjo/s+lKysLKWnp0uSysrKNHz4cFVXV0uSVq5cqVmzZmnt2rVasGCBtm/friNHjmjr1q0DcT4AACAOhXWFZdOmTbpw4YJmz56tvLy87u0nP/lJ95jm5ma1tLR0Py4qKlJNTY22bt2qqVOn6tVXX9WuXbtueKMuAADA7wvrCovj3PyG17q6umv2lZaWqrS0NLxkAAAAn+CzhAAAgPUoLAAAwHr9XofFhJDPkdLcuw7Lu4E/MB0htty/DIuSEmwdlnjxxn9823SEmJmX1Le322tDOwY8SyTi5TwwcLjCAgAArEdhAQAA1qOwAAAA61FYAACA9SgsAADAehQWAABgPVdNa3Z8ITm+kOkY/fbfFweZjhBTSV2mE0QuHs4hHGnv/dZ0hKhIpCmy8XAOiqPzwMDhCgsAALAehQUAAFiPwgIAAKxHYQEAANajsAAAAOtRWAAAgPUoLAAAwHquWodFacH/3Vzq8qVU0xFiyzEdIHLp/+3ef2/90XnHUNMRAOC6uMICAACsR2EBAADWo7AAAADrUVgAAID1KCwAAMB6FBYAAGA9V01rTknvkvcWr+kY/Rb82FX/uSPmvWI6QeSCaR7TEWLKczUO5qIDiEtcYQEAANajsAAAAOtRWAAAgPUoLAAAwHoUFgAAYD0KCwAAsJ6r5tkmeRwledw77dJzNbH6oafLdILIOYk1q1mpv/7AdISoqA3tMB0BQJQl1isoAABwpbALy4EDB7Rw4UINGzZMHo9Hu3btuuH4uro6eTyea7bW1tZIcgMAgAQSdmHp6OjQ1KlTtXHjxrCOO3nypFpaWrq37OzscJ8aAAAkqLDvYSkuLlZxcXHYT5Sdna1bb7017OMAAABidg/L3Xffrby8PM2bN0//9m//dsOxnZ2dCgQCPTYAAJC4Bryw5OXlafPmzfrpT3+qn/70p8rPz9fs2bP19ttv93pMdXW1srKyurf8/PyBjgkAACw24NOaCwoKVFBQ0P24qKhI7777rtatW6eXXnrpusdUVVWpsrKy+3EgEKC0AACQwIysw/KHf/iHOnjwYK9f9/l88vl81+y/Ja1L3jT3zsT2dCXWoh7eOFiHJfO9S6YjoB+K88r7NG5PS3iTBwCYY+TVv7GxUXl5eSaeGgAAuFDYV1ja29t16tSp7sfvv/++GhsbNXToUI0cOVJVVVX68MMP9cMf/lCS9Pd///caPXq0Jk2apMuXL+u5557Tm2++qZ/97GfRPRMAABC3wi4sR44c0Wc/+9nux5/ea7J06VJt27ZNLS0tam5u7v76lStX9Nd//df68MMPdcstt6iwsFA///nPe3wPAACAGwm7sMyePVuO0/vn+Wzbtq3H48cee0yPPfZY/9IBAADwWUIAAMANKCwAAMB6RqY191eq96qSk72mY/RbUmdiTWtODfT+1qFbdA65dnp9PLvadtZ0hKiIl/MA8DtcYQEAANajsAAAAOtRWAAAgPUoLAAAwHoUFgAAYD0KCwAAsB6FBQAAWM9V67D4vFeV7HXvOiwp7Ym1DsugtqDpCBFLO3vJdAT0Q21oh+kIAKKMKywAAMB6FBYAAGA9CgsAALAehQUAAFiPwgIAAKxHYQEAANZz1bTmFG9QyV73TpVNbjedILaCqe6fxr33yDdNR4gp7+QC0xEA4Lq4wgIAAKxHYQEAANajsAAAAOtRWAAAgPUoLAAAwHoUFgAAYD1XTWsuvPUj+QanmI7Rb8mXTSeILd//dJmOELHicY/1adyed54e8Cyx8MZ/fNt0BAC4Lq6wAAAA61FYAACA9SgsAADAehQWAABgPQoLAACwHoUFAABYj8ICAACs56p1WL6V3aTMDK/pGP02+KOg6QgxlfrrD0xHiFi8rK8CAG7HFRYAAGC9sAvLgQMHtHDhQg0bNkwej0e7du266TF1dXWaPn26fD6fxo4dq23btvU3LwAASEBhF5aOjg5NnTpVGzdu7NP4999/XwsWLNBnP/tZNTY2qqKiQsuXL9fevXv7kxcAACSgsO9hKS4uVnFxcZ/Hb968WaNHj9batWslSRMmTNDBgwe1bt06zZ8/P9ynBwAACWjA72Gpr6/X3Llze+ybP3++6uvrez2ms7NTgUCgxwYAABLXgBeW1tZW5eTk9NiXk5OjQCCgjz/++LrHVFdXKysrq3vLz88f6JgAAMBiVk5rrqqqUmVlZffjQCCg/Px8PR8YpvTQjSP/ZW4MAvZT2n91mo4QU1fbzpqOELF5SaV9Glcb2jHgWQAgkQ14YcnNzVVbW1uPfW1tbcrMzFR6evp1j/H5fPL5fAMdD
QAAuMSAvyXk9/u1b9++Hvtqa2vl9/sH+qkBAECcCLuwtLe3q7GxUY2NjdIn05YbGxvV3NwsffJ2TllZWff4L3/5y3rvvff02GOP6de//rV+8IMf6JVXXtGjjz4azfMAAABxLOzCcuTIEU2bNk3Tpk2TJFVWVmratGn6xje+IUlqaWnpLi+SNHr0aP3rv/6ramtrNXXqVK1du1bPPfccU5oBAECfhX0Py+zZs+U4Tq9fv94qtrNnz9axY8fCTwcAAMBnCQEAADewclpzb37wqwfkvSXthmP+siBmccKW8s5HpiPEVDxM9Y2HcwCAeMAVFgAAYD0KCwAAsB6FBQAAWI/CAgAArEdhAQAA1qOwAAAA61FYAACA9Vy1Dkuw5RY5aTdeh8VmV9vOmo4AAIArcYUFAABYj8ICAACsR2EBAADWo7AAAADrUVgAAID1XDFLyHEcSVLo8uWbjg0EAjFI1D9Xna4+jbP5HAAA6KtPX88+fR2PhMeJxncZYO+9957uvPNO0zEAAEA/vPvuuxozZkxE38MVV1iGDh0qSWpublZWVpbpOFEVCASUn5+vM2fOKDMz03ScqOLc3IlzcyfOzb3i+fwuXLigkSNHdr+OR8IVhSUp6X9vtcnKyoq7H+anMjMzOTcX4tzciXNzp3g+N8X5+X36Oh7R94hKEgAAgAFEYQEAANZzRWHx+Xx64okn5PP5TEeJOs7NnTg3d+Lc3Cmez01xfn7RPDdXzBICAACJzRVXWAAAQGKjsAAAAOtRWAAAgPUoLAAAwHpWF5YDBw5o4cKFGjZsmDwej3bt2mU6UlRUV1fr3nvvVUZGhrKzs1VSUqKTJ0+ajhU1mzZtUmFhYfciSH6/X3v27DEdK+rWrFkjj8ejiooK01Gi4pvf/KY8Hk+Pbfz48aZjRc2HH36oP/uzP9Ntt92m9PR0TZkyRUeOHDEdK2J33HHHNT83j8ej8vJy09EiFgwG9fjjj2v06NFKT0/XnXfeqdWrV0flc2lscPHiRVVUVGjUqFFKT09XUVGRGhoaTMcK281eqx3H0Te+8Q3l5eUpPT1dc+fO1TvvvBP281hdWDo6OjR16lRt3LjRdJSo2r9/v8rLy3X48GHV1taqq6tLDz74oDo6OkxHi4oRI0ZozZo1Onr0qI4cOaLPfe5zWrRokU6cOGE6WtQ0NDRoy5YtKiwsNB0lqiZNmqSWlpbu7eDBg6YjRcW5c+c0c+ZMpaSkaM+ePfrVr36ltWvXasiQIaajRayhoaHHz6y2tlaSVFpaajpaxL773e9q06ZN2rBhg/7zP/9T3/3ud/X000/rH/7hH0xHi4rly5ertrZWL730kpqamvTggw9q7ty5+vDDD01HC8vNXquffvppff/739fmzZv11ltvadCgQZo/f74u9+EDjXtwXEKSs3PnTtMxBsTZs2cdSc7+/ftNRxkwQ4YMcZ577jnTMaLi4sWLzrhx45za2lpn1qxZzsqVK01HioonnnjCmTp1qukYA+IrX/mK85nPfMZ0jJhYuXKlc+eddzqhUMh0lIgtWLDAWbZsWY99f/zHf+wsWbLEWKZouXTpkuP1ep3XXnutx/7p06c7X//6143litT//1odCoWc3Nxc55lnnuned/78ecfn8zk//vGPw/reVl9hSRQXLlyQfu9DHuNJMBjU9u3b1dHRIb/fbzpOVJSXl2vBggWaO3eu6ShR984772jYsGEaM2aMlixZoubmZtORouKf//mfdc8996i0tFTZ2dmaNm2ann32WdOxou7KlSt6+eWXtWzZMnk8HtNxIlZUVKR9+/bpN7/5jSTp3//933Xw4EEVFxebjhaxq1evKhgMKi0trcf+9PT0uLmyKUnvv/++Wltbe/y+zMrK0n333af6+vqwvpcrPvwwnoVCIVVUVGjmzJmaPHmy6ThR09TUJL/fr8uXL2vw4MHauXOnJk6caDpWxLZv3663337ble8z38x9992nbdu2qaCgQC0tLfrWt76lP/qjP9Lx48eVkZFhOl5E3nvvPW3atEmVlZX62te+poaGBv3VX/2VUlNTtXTpUtPxombXrl06f/68vvjFL5qOEhVf/epXFQgENH78eHm9XgWDQT311FNasmSJ6WgRy8jIkN/v1+rVqzVhwgTl5OToxz/+serr6zV27FjT8aKmtbVVkpSTk9Njf05OTvfX+orCYlh5ebmOHz8eV41akgoKCtTY2KgLFy7o1Vdf1dKlS7V//35Xl5YzZ85o5cqVqq2tveavonjw+3+1FhYW6r777tOoUaP0yiuv6OGHHzaaLVKhUEj33HOPvvOd70iSpk2bpuPHj2vz5s1xVVief/55FRcXa9iwYaajRMUrr7yiH/3oR6qpqdGkSZPU2NioiooKDRs2LC5+bi+99JKWLVum4cOHy+v1avr06Vq8eLGOHj1qOpqVeEvIoBUrVui1117TL37xC40YMcJ0nKhKTU3V2LFjNWPGDFVXV2vq1Klav3696VgROXr0qM6ePavp06crOTlZycnJ2r9/v77//e8rOTlZwWDQdMSouvXWW3XXXXfp1KlTpqNELC8v75qyPGHChLh5y0uSTp8+rZ///Odavny56ShRs2rVKn31q1/Vn/7pn2rKlCn68z//cz366KOqrq42HS0q7rzzTu3fv1/t7e06c+aMfvnLX6qrq0tjxowxHS1qcnNzJUltbW099re1tXV/ra8oLAY4jqMVK1Zo586devPNNzV69GjTkQZcKBRSZ2en6RgRmTNnjpqamtTY2Ni93XPPPVqyZIkaGxvl9XpNR4yq9vZ2vfvuu8rLyzMdJWIzZ868ZumA3/zmNxo1apSxTNH24osvKjs7WwsWLDAdJWouXbqkpKSeL1Ner1ehUMhYpoEwaNAg5eXl6dy5c9q7d68WLVpkOlLUjB49Wrm5udq3b1/3vkAgoLfeeivs+xqtfkuovb29x19377//vhobGzV06FCNHDnSaLZIlJeXq6amRrt371ZGRkb3+3hZWVlKT083HS9iVVVVKi4u1siRI3Xx4kXV1NSorq5Oe/fuNR0tIhkZGdfcZzRo0CDddtttcXH/0d/8zd9o4cKFGjVqlD766CM98cQT8nq9Wrx4seloEXv00UdVVFSk73znO/qTP/kT/fKXv9TWrVu1detW09GiIhQK6cUXX9TSpUuVnGz1r/WwLFy4UE899ZRGjhypSZMm6dixY/q7v/s7LVu2zHS0qNi7d68cx1FBQYFOnTqlVatWafz48frSl75kOlpYbvZaXVFRoW9/+9saN26cRo8erccff1zDhg1TSUlJeE8U1flMUfaLX/zCkXTNtnTpUtPRInK9c5LkvPjii6ajRcWyZcucUaNGOampqc7tt9/uzJkzx/nZz35mOtaAiKdpzV/4whecvLw8JzU11Rk+fLjzhS98wTl16pTpWFHzL//yL87kyZMdn8/njB8/3tm6davpSFGzd+9eR5Jz8uRJ01GiKhAIOCtXrnRGjhzppKWlOWPGjHG+/vWvO52dnaajRcVPfvITZ8yYMU5qaqqTm5vrlJeXO+fPnzcd
K2w3e60OhULO448/7uTk5Dg+n8+ZM2dOv/6tepx4WTIQAADELe5hAQAA1qOwAAAA61FYAACA9SgsAADAehQWAABgPQoLAACwHoUFAABYj8ICAACsR2EBAADWo7AAAADrUVgAAID1KCwAAMB6/w/wFoD82D83ywAAAABJRU5ErkJggg==",
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAHgCAYAAABZ+0ykAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjkuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8hTgPZAAAACXBIWXMAAA9hAAAPYQGoP6dpAABPsklEQVR4nO3dd3hUZf7+8XtCSAGSkFCSIKGX0BUQCM1CE1ABURFQUBF/KCCsnbWAohusq1/ZRViaigEFBVykCoKr9CbVAEqH0ElIgADJ+f3xkIEhEBKY5Ewm79d1zTVzSmY+c8Jubp/zFIdlWZYAAAC8hI/dBQAAALgT4QYAAHgVwg0AAPAqhBsAAOBVCDcAAMCrEG4AAIBXIdwAAACvQrgBAABehXADAAC8CuEGADzcrl275HA49OGHH9pdCpAvEG4Am0ycOFEOh0OrV6+2uxQNGzZMDodDPj4+2rt3b6bjSUlJCgwMlMPh0IABA1yOHTlyRIMGDVJ0dLQCAwNVunRpNWrUSK+88oqSk5Od5z3++ONyOBxXfQQEBOTJ97yWjPBwrceIESNsrQ9AzvjaXQAAz+Hv76/Jkyfr5Zdfdtn//fffX/X848ePq2HDhkpKStKTTz6p6OhoHTt2TBs2bNCoUaP0zDPPqFixYi7vP3bs2EzvU6hQoVz4NjnXvXt3dejQIdP+2267zZZ6ANwYwg0Apw4dOlw13MTFxaljx4767rvvXPaPGzdOe/bs0W+//aamTZu6HEtKSpKfn5/LPl9fXz366KO5+A1uTv369T26PgDZw20pwMOtW7dO7du3V3BwsIoVK6ZWrVpp+fLlmc7bsGGD7rjjDgUGBqps2bJ65513NGHCBDkcDu3atStbn9WjRw+tX79ef/zxh3NfQkKCFi1apB49emQ6/88//1ShQoXUpEmTTMeCg4Pdcrvp/PnzCgsL0xNPPJHpWFJSkgICAvTiiy8693322WeqVauWihQpotDQUDVs2FBxcXE3XUeGChUq6N5779X8+fN16623KiAgQDVr1rxq69Zff/2lhx56SGFhYSpSpIiaNGmiH3/8MdN5Z8+e1bBhw1StWjUFBAQoMjJSDzzwgP78889M544ZM0aVK1eWv7+/br/9dq1atcrleEJCgp544gmVLVtW/v7+ioyMVKdOnbL9bwDwBrTcAB5s8+bNatGihYKDg/Xyyy+rcOHCGj16tO68804tWbJEjRs3liTt379fd911lxwOh4YMGaKiRYtq7Nix8vf3z9HntWzZUmXLllVcXJzefvttSdI333yjYsWKqWPHjpnOL1++vNLS0vTVV1+pd+/e2fqMo0ePZtrn5+en4ODgq55fuHBhdenSRd9//71Gjx7t0ho0Y8YMpaam6pFHHpEk/ec//9Fzzz2nBx98UIMGDdLZs2e1YcMGrVix4qrh7EqnT5++an3FixeXr++l/7vcvn27unXrpn79+ql3796aMGGCHnroIc2dO1dt2rSRJB06dEhNmzbV6dOn9dxzz6lEiRL64osvdP/992vatGnq0qWLJCktLU333nuvFi5cqEceeUSDBg3SqVOntGDBAm3atEmVK1d2fm5cXJxOnTql//f//p8cDofef/99PfDAA/rrr79UuHBhSVLXrl21efNmDRw4UBUqVNDhw4e1YMEC7dmzRxUqVLjuNQC8ggXAFhMmTLAkWatWrbrmOZ07d7b8/PysP//807nvwIEDVlBQkNWyZUvnvoEDB1oOh8Nat26dc9+xY8essLAwS5K1c+fOLGsZOnSoJck6cuSI9eKLL1pVqlRxHrv99tutJ554wrIsy5Jk9e/f33ksISHBKlWqlCXJio6Otvr162fFxcVZJ0+ezPQZvXv3tiRd9dGuXbss65s3b54lyfrvf//rsr9Dhw5WpUqVnNudOnWyatWqleV7Xc3OnTuvWZska9myZc5zy5cvb0myvvvuO+e+xMREKzIy0rrtttuc+wYPHmxJsv73v/859506dcqqWLGiVaFCBSstLc2yLMsaP368Jcn6+OOPM9WVnp7uUl+JEiWs48ePO4/PnDnT5bqcOHHCkmR98MEHOb4GgDfhthTgodLS0jR//nx17txZlSpVcu6PjIxUjx499OuvvyopKUmSNHfuXMXExOjWW291nhcWFqaePXvm+HN79OihHTt2aNWqVc7na7V6hIeH6/fff1e/fv104sQJff755+rRo4dKly6t4cOHy+ShSwICArRgwYJMj+uNRrr77rtVsmRJffPNN859J06c0IIFC9StWzfnvuLFi2vfvn2ZbtVk19NPP33V+mrWrOlyXpkyZZwtL7p4C65Xr15at26dEhISJEmzZ89Wo0aN1Lx5c+d5xYoV09NPP61du3Zpy5YtkqTvvvtOJUuW1MCBAzPV43A4XLa7deum0NBQ53aLFi2ki7e/JCkwMFB+fn5avHixTpw4cUPXAPAG3JYCPNSRI0d0+vRpVa9ePdOxGjVqKD09XXv37lWtWrW0e/duxcTEZDqvSpUqOf7c2267TdHR0YqLi1Px4sUVERGhu++++5rnR0ZGatSoUfr3v/+t7du3a968eXrvvff05ptvKjIyUk899ZTz3EKFCql169Y5rsnX11ddu3ZVXFycUlNT5e/vr++//17nz593CTevvPKKfvrpJzVq1EhVqlRR27Zt1aNHDzVr1ixbn1O1atVs1VelSpVMwaNatWrSxWHlERER2r17t/O24eVq1KghSdq9e7dq166tP//8U9WrV3e57XUt5cqVc9nOCDoZQcbf31/vvfeeXnjhBYWHh6tJkya699571atXL0VERFz3/QFvQcsNgEx69Oihb775RnFxcerWrZt8fK7/fxUOh0PVqlXTwIED9csvv8jHx0dff/2122p65JFHdOrUKc2ZM0eS9O233yo6Olr16tVznlOjRg3Fx8drypQpat68ub777js1b95cQ4cOdVsddrrWkPnLW8gGDx6sbdu2KTY2VgEBAXrjjTdUo0YNrVu3Lg8rBexFuAE8VKlSpVSkSBHFx8dnOvbHH3/Ix8dHUVFR0sWOvTt27Mh03tX2ZUePHj108OBBbdu2LVsdca9UqVIlhYaG6uDBgzf0+VfTsmVLRUZG6ptvvtHRo0e1aNEil1abDEWLFlW3bt00YcIE7dmzRx07dtS7776rs2fPuq2WHTt2ZLrltm3bNuniaCpd/J1c63eXcVySKleurPj4eJ0/f95t9VWuXFkvvPCC5s+fr02bNuncuXP66KOP3Pb+gKcj3AAeqlChQmrbtq1mzpzpMoz30KFDiouLU/PmzZ0jjNq1a6dly5Zp/fr1zvOOHz9+wy0nlStX1ieffKLY2Fg1atTomuetWLFCKSkpmfavXLlSx44du+ottRvl4+OjBx98UP/973/11Vdf6cKFC5nCzbFjx1y2/fz8VLNmTVmW5dbwcODAAU2fPt25nZSUpC+//FK33nqr8/ZPhw4dtHLlSi1btsx5XkpKisaMGaMKFSo4+/F07dpVR48e1ciRIzN9zpUB6npOnz6dKcR
VrlxZQUFBSk1NzfH3BPIr+twANhs/frzmzp2baf+gQYP0zjvvaMGCBWrevLmeffZZ+fr6avTo0UpNTdX777/vPPfll1/WpEmT1KZNGw0cONA5FLxcuXI6fvx4pv4h2TFo0KDrnvPVV1/p66+/VpcuXdSgQQP5+flp69atGj9+vAICAvT3v//d5fwLFy5o0qRJV32vLl26qGjRoll+Xrdu3fTZZ59p6NChqlOnjrP/Soa2bdsqIiJCzZo1U3h4uLZu3aqRI0eqY8eOCgoKuu73Wbt27VXrq1y5skufpmrVqqlPnz5atWqVwsPDNX78eB06dEgTJkxwnvPqq69q8uTJat++vZ577jmFhYXpiy++0M6dO/Xdd985b/X16tVLX375pZ5//nmtXLlSLVq0UEpKin766Sc9++yz6tSp03XrzrBt2za1atVKDz/8sGrWrClfX19Nnz5dhw4dcg6XBwoEu4drAQVVxlDwaz327t1rWZZlrV271mrXrp1VrFgxq0iRItZdd91lLV26NNP7rVu3zmrRooXl7+9vlS1b1oqNjbX+7//+z5JkJSQkZFnL5UPBs3LlUPANGzZYL730klW/fn0rLCzM8vX1tSIjI62HHnrIWrt2rcvPZjUUPDvD1a2LQ6OjoqIsSdY777yT6fjo0aOtli1bWiVKlLD8/f2typUrWy+99JKVmJiY5ftebyh47969neeWL1/e6tixozVv3jyrbt26lr+/vxUdHW1NnTo10/v++eef1oMPPmgVL17cCggIsBo1amTNmjUr03mnT5+2XnvtNatixYpW4cKFrYiICOvBBx90TgGQUd/VhnhLsoYOHWpZlmUdPXrU6t+/vxUdHW0VLVrUCgkJsRo3bmx9++231722gDdxWDlt9wSQbwwePFijR49WcnKyx6zflN9VqFBBtWvX1qxZs+wuBcA10OcG8BJnzpxx2T527Ji++uorNW/enGADoEChzw3gJWJiYnTnnXeqRo0aOnTokMaNG6ekpCS98cYbdpcGAHmKcAN4iQ4dOmjatGkaM2aMHA6H6tevr3Hjxqlly5Z2lwYAeYo+NwAAwKvQ5wYAAHgVwg0AAPAqhBsAAOBVCDcAAMCrEG4AAIBXIdwAAACvQrgBAABehXADAAC8CuEGAAB4FcINAADwKoQbAADgVQg3AADAqxBuAACAVyHcAAAAr0K4AQAAXoVwAwAAvArhBgAAeBXCDQAA8CqEGwAA4FUINwAAwKsQbgAAgFch3AAAAK9CuAEAAF6FcAMAALwK4QYAAHgVwg0AAPAqhBsAAOBVfO0uIK+lp6frwIEDCgoKksPhsLscAACQDZZl6dSpUypTpox8fLJumylw4ebAgQOKioqyuwwAAHAD9u7dq7Jly2Z5ToELN0FBQdLFixMcHGx3OQAAIBuSkpIUFRXl/DuelQIXbjJuRQUHBxNuAADIZ7LTpYQOxQAAwKsQbgAAgFch3AAAAK9CuAEAAF6FcAMAALwK4QYAAHgVwg0AAPAqhBsAAOBVCDcAAMCrEG4AAIBXIdwAAACv4jHhZsSIEXI4HBo8ePA1z5k4caIcDofLIyAgIE/rBAAAns0jFs5ctWqVRo8erbp161733ODgYMXHxzu3s7OAVl44d046fFhKS5PKl7e7GgAACi7bW26Sk5PVs2dP/ec//1FoaOh1z3c4HIqIiHA+wsPDszw/NTVVSUlJLo/csHy5FBUltWuXK28PAACyyfZw079/f3Xs2FGtW7fO1vnJyckqX768oqKi1KlTJ23evDnL82NjYxUSEuJ8REVFualyV0WLZtSXK28PAACyydZwM2XKFK1du1axsbHZOr969eoaP368Zs6cqUmTJik9PV1NmzbVvn37rvkzQ4YMUWJiovOxd+9eN36DS4oVM88pKbny9gAAIJts63Ozd+9eDRo0SAsWLMh2p+CYmBjFxMQ4t5s2baoaNWpo9OjRGj58+FV/xt/fX/7+/m6r+1pouQEAwDPYFm7WrFmjw4cPq379+s59aWlp+uWXXzRy5EilpqaqUKFCWb5H4cKFddttt2nHjh15UHHWMlpuLlwwnYv9/OyuCACAgsm2cNOqVStt3LjRZd8TTzyh6OhovfLKK9cNNroYhjZu3KgOHTrkYqXZk9Fyo4utN2FhdlYDAEDBZVu4CQoKUu3atV32FS1aVCVKlHDu79Wrl2655RZnn5y3335bTZo0UZUqVXTy5El98MEH2r17t5566ilbvsPlChc2rTXnzpl+N4QbAADs4RHz3FzLnj175ONzqc/ziRMn1LdvXyUkJCg0NFQNGjTQ0qVLVbNmTVvrzFCsmHT8OP1uAACwk8OyLMvuIvJSUlKSQkJClJiYqODgYLe+d7ly0t690qpVUsOGbn1rAAAKtJz8/bZ9nhtvktGpmJYbAADsQ7hxo4xOxcx1AwCAfQg3bkTLDQAA9iPcuBEtNwAA2I9w40a03AAAYD/CjRvRcgMAgP0IN25Eyw0AAPYj3LgRLTcAANiPcONGtNwAAGA/wo0b0XIDAID9CDduRMsNAAD2I9y4EeEGAAD7EW7ciHADAID9CDdulLFI6alTdlcCAEDBRbhxo6Ag85yUZHclAAAUXIQbN8pouSHcAABgH8KNG2W03KSkSGlpdlcDAEDBRLhxo4yWG9GpGAAA2xBu3MjfX/LzM6+5NQUAgD0IN26WcWuKEVMAANiDcONmdCoGAMBehBs3o+UGAAB7EW7cjJYbAADsRbhxMybyAwDAXoQbN2MJBgAA7EW4cTNuSwEAYC/CjZvRoRgAAHsRbtyMlhsAAOxFuHEzOhQDAGAvwo2bhYWZ5+PH7a4EAICCiXDjZqVLm+fDh+2uBACAgolw42alSplnwg0AAPYg3LjZ5S03lmV3NQAAFDyEGzfLaLk5d45OxQAA2IFw42ZFikjFipnX3JoCACDvEW5yAZ2KAQCwD+EmFxBuAACwD+EmFxBuAACwD+EmFxBuAACwD+EmF2SEm4QEuysBAKDgIdzkgvLlzfPu3XZXAgBAwUO4yQUVKpjnXbvsrgQAgILHY8LNiBEj5HA4NHjw4CzPmzp1qqKjoxUQEKA6depo9uzZeVZjdmWEm507maUYAIC85hHhZtWqVRo9erTq1q2b5XlLly5V9+7d1adPH61bt06dO3dW586dtWnTpjyrNTsybkudPi0dPWp3NQAAFCy2h5vk5GT17NlT//nPfxQaGprluZ9++qnuuecevfTSS6pRo4aGDx+u+vXra+TIkXlWb3b4+0tlypjX3JoCACBv2R5u+vfvr44dO6p169bXPXfZsmWZzmvXrp2WLVt2zZ9JTU1VUlKSyyMvVKxonnfuzJOPAwAAF9kabqZMmaK1a9cqNjY2W+cnJCQoPDzcZV94eLgSshhzHRsbq5CQEOcjKirqpuvOjox+N3/9lScfBwAALrIt3Ozdu1eDBg3S119/rYCAgFz7nCFDhigxMdH52Lt3b6591uWqVTPP8fF58nEAAOAiX7s+eM2aNTp8+LDq16/v3JeWlqZffvlFI0eOVGpqqgoVKuTyMxERET
p06JDLvkOHDikiIuKan+Pv7y9/f/9c+AZZq1nTPG/ZkucfDQBAgWZby02rVq20ceNGrV+/3vlo2LChevbsqfXr12cKNpIUExOjhQsXuuxbsGCBYmJi8rDy7KlRwzxv3cpwcAAA8pJtLTdBQUGqXbu2y76iRYuqRIkSzv29evXSLbfc4uyTM2jQIN1xxx366KOP1LFjR02ZMkWrV6/WmDFjbPkOWalaVSpUSDp1Stq/Xypb1u6KAAAoGGwfLZWVPXv26ODBg87tpk2bKi4uTmPGjFG9evU0bdo0zZgxI1NI8gR+fibg6GLrDQAAyBsOyypYN02SkpIUEhKixMREBQcH5+pnPfSQNG2aFBsrvfpqrn4UAABeLSd/vz265Sa/a97cPP/yi92VAABQcBBuctEdd5jnX3+VLlywuxoAAAoGwk0uqlNHCgkxnYrXrLG7GgAACgbCTS4qVEhq1868njjR7moAACgYCDe5rF8/8/zVV1IeLWsFAECBRrjJZXfeKVWvLqWkSP/9r93VAADg/Qg3uczhkB5+2LyeNs3uagAA8H6EmzzQtat5njtX2rXL7moAAPBuhJs8ULeu1KSJdPas9MAD0pkzdlcEAID3ItzkAYdD+uYbqWRJad06qXt36fhxu6sCAMA7EW7ySLly0rffmuHhM2eajsa7d0uJiXZXBgCAdyHc5KG77jKzFYeHSxs3ShUqSKVKSf/+t92VAQDgPQg3eaxJEzNqqlw5c7vq/Hmpf3+zgvh339ldHQAA+R/hxgbNm5tbUmlpUp8+Zt+OHdKDD5qZjHfulNLT7a4SAID8iXBjI4dDGjtW2rbtUsh54gmpUiWpcWMpOdnuCgEAyH8INx6galXp88+lxx6TihY1nY5Xr5bKlDGPQYOkhQtZWRwAgOwg3HgIX1/pyy9Na83PP0tBQWY18YMHpf/7P6l1a6lFC9PKk5ZG0AEA4FoINx6oRQtpzx5p82YzP87dd0v+/tLy5WadKl9f0yF51iy7KwUAwPMQbjxU8eJSzZpmXaqFC6Xt203rTYaDB6X77pMqV5Zef11KTTUtOgcO2Fk1AAD2c1iWZdldRF5KSkpSSEiIEhMTFRwcbHc5OZKebm5dbd8uHT0q/ec/UsZv7447pBMnzPw5Y8dKTz5pd7UAALhPTv5+E27ysT//lObMkQYOdN3v4yMVKyZFR0ujR0v16pllH2rWlAIC7KoWAIAbl5O/39yWyscqV5YGDDDz5mSoW9e08CQlSStXSo0aSTVqSA0aSK1aSfPmSS+8IB06ZGflAADkHlpuvMDWrdLLL0uvvmrmx9mzx6w8/tprZh2rq2nZ0rTi7N1rRmdt3Wr68dx9t1keQpL27zejs+6808zJAwCAXbgtlQVvDDfXYlnS7NnSrl1mFfJhw64/83FoqLmVdeaMaRU6dUrq3FkaN04KCZE2bDBD0++7T3rggbz6JgCAgo5wk4WCFG6ulJJinufMkXr1MgEmg4+PVKKEdORI9t+vb1/pk09Mi8+SJZc6NzdrZvr7AADgLoSbLBTkcHO5PXvMbaeAAOn3383trMqVpTfeMIHFx0d65RWpY0epZ08zQutqSpSQEhNdJxX09TUroPtks0dXYKD07LNSTMyl7UKF3PAlAQBeg3CTBcLN9Z0/b54LFzbPycnSO++YWZOfe848L1pkQk9CgjmnYUMpIsIMUV++/OY+PzzcDGe/996b/CIAAK9BuMkC4cZ9jh83o6/CwqS2bU2nY8syHZT37cv++/zvf9KECWYSwsu99JL07ruXQhYAoOAi3GSBcOOZzp27tGbWG29In35q9jdrJk2ZIpUta3eFAAA7Mc8N8h0/P9PXJijI9PmZNk0KDpZ++01q184sLwEAQHYQbuCRunaV1q41/W+2bJHee8/uigAA+QXhBh6rcmXpww/N63HjLg01BwAgK4QbeLSuXaUiRczQ9XXr7K4GAJAfEG7g0QIDTZ8bSZoxw+5qAAD5AeEGHq9LF/M8fbrdlQAA8gPCDTzevfeaGYs3bZJ27LC7GgCApyPcwOOFhpqVySXphx/srgYA4OkIN8gX2rQxzze7tAMAwPsRbpAvNGhgntessbsSAICnI9wgX8gIN3/9Zda0AgDgWgg3yBdCQ6VKlczrtWvtrgYA4MkIN8g3GjY0zytX2l0JAMCTEW6Qb8TEmOelS+2uBADgyWwNN6NGjVLdunUVHBys4OBgxcTEaM6cOdc8f+LEiXI4HC6PgICAPK0Z9mnWzDwvXSqlp9tdDQDAU/na+eFly5bViBEjVLVqVVmWpS+++EKdOnXSunXrVKtWrav+THBwsOLj453bDocjDyuGnW691awzdeKE9McfUs2adlcEAPBEtoab++67z2X73Xff1ahRo7R8+fJrhhuHw6GIiIg8qhCepHBhc2tq4ULpv/8l3AAArs5j+tykpaVpypQpSklJUUxG54qrSE5OVvny5RUVFaVOnTpp8+bNWb5vamqqkpKSXB7Iv7p3N89ffCFZlt3VAAA8ke3hZuPGjSpWrJj8/f3Vr18/TZ8+XTWv8Z/k1atX1/jx4zVz5kxNmjRJ6enpatq0qfbt23fN94+NjVVISIjzERUVlYvfBrntoYfMSuFbt0rvvEPAAQBk5rAse/88nDt3Tnv27FFiYqKmTZumsWPHasmSJdcMOJc7f/68atSooe7du2v48OFXPSc1NVWpqanO7aSkJEVFRSkxMVHBwcFu/S7IG++9J736qnndqpVUooQJPQ8+aHdlAIDckpSUpJCQkGz9/bY93FypdevWqly5skaPHp2t8x966CH5+vpq8uTJ2To/JxcHnuuzz6TnnnPd9+mnmfcBALxDTv5+235b6krp6ekuLS1ZSUtL08aNGxUZGZnrdcGzDBxoFtEcOVLq18/se/55adkyuysDANjN1tFSQ4YMUfv27VWuXDmdOnVKcXFxWrx4sebNmydJ6tWrl2655RbFxsZKkt5++201adJEVapU0cmTJ/XBBx9o9+7deuqpp+z8GrBJ48bmYVlmePg330gdOkhz55r9AICCydZwc/jwYfXq1UsHDx5USEiI6tatq3nz5qlNmzaSpD179sjH51Lj0okTJ9S3b18lJCQoNDRUDRo00NKlS7PVPwfey+GQxoyR9u41E/y1bi3NmiXdcYfdlQEA7OBxfW5yG31uvFdystSpk7RokRQQIM2YIbVrZ3dVAAB3yNd9boAbVayYabHp2FE6e1a6/37pww+l3bvtrgwAkJcIN/AqgYHS99+bYeHnzkkvvSRVrCjVrm06Hv/1l90VAgByG+EGXsfPT5o8WXr5ZalKFdPhePNmafRoqVo1qWdP6Z//lPbvt7tSAEBuoM8NvN62bdKGDdLYsdLFgXjSxRD09NNS375S9eqSv7+dVQIAspKvJ/HLbYSbgm3FCmnSJGnlSvPIEB5uwk94uFmQs2hRO6sEAFyJcJMFwg0kc6tq8WJp+HDp559djxUvLtWpY/rvdOoklS9v9pcrZ/YDAPIe4SYLhBtcKTHR3J5auVI6edI8rqVSJalJExNypk+Xzp83o
7IqVZJKlZKCgqT33zctP6+9Zjoz//ST1Ly5WQPrSufPS/PnS/XqSWXL5urXzHV790obN0pt2kiFC9tdDQBvQ7jJAuEGWblwwbTkJCZKW7aY2Y4vXDAhZONGKS0tZ+/n52dGbRUuLJUsmfl4UpKUkmImIgwPN89XU6qUOe/CBdOydOzY9WspUsSErKNHpZAQs+/kScnHR7r7bnNs9mzz3XIiNFRKTzfXKINlSYcOmecyZcw5GRo2NOHu9GmzHRJigtzJk5Kvr+tQ/cKFpVq1zL5OnaT1683ntGkj/fCD6RAeGWnmMMrmKi2ZVK1qZrQ+etR83r33SqdOmfmR0tMvnedwmBC7Z4/UsqX5PS5Y4HrdT56UvvvOhNM33zQd2CVp+3YT9u6++8ZqBJAZ4SYLhBvcqH37pHXrzB+zw4elu+4yLTU//mj+0K5aZf5I9u5tQsu0aXZXjLxUqJDUvbt0223S669LZ86Y1sDbb7e7MsA7EG6yQLhBbklJMbMkh4eb7X37zH/ZV6sm/fmn+S//Kzkc5vi+febnryajdaBOHdPq8fPPprUhq9tYliWtXi39/ru5bbZ0qfms9u2l48dN8Dp/3rSOVKqU/e+YmmrCXOHC0j33uN5+KlbMfPfPP780n9DKldKaNaZFo0EDs2/FCmnXLvM6KMgsl5ExUm3HDlP39dStazp+59Tp0+Y2YGioaU3atcuEUsmE1YzfnWSu008/ubbmtGxpWqYuV6qUFB9v3vdKgwdLH31kWssA3BzCTRYIN0DeSk42t8AybrmlpZkg5+9vgkNg4KVzLcvchgoONvMQhYSYAHX4sAkVp06Z2acjI699Cy879QQEmFtiGbfTfH2vfduwWDHpyBETUEqVuvb7/vqr9O675lbm5erUMcEnIuLG6gVgEG6yQLgBkJsGDpQmTjQhKkPdutKSJaa/FIAbw9pSAGCTzz4zLUy//WZacsLCzCSSDz9sWooA5D7CDQDkgqZNpb//3fSRCggwI60aNzZLgQDIXYQbAMhFdeua0VOS6bz8yit2VwR4P8INAOSyV1+VXnjBvF648NKcPwByB+EGAHJZoULSBx+YpTzOnjVDzAHkHsINAOQBh8PMKyRJ//wnnYuB3ES4AYA88vzzpnPx4sVmYkYAuYNwAwB5pHx56eWXzesXXzRLNABwP8INAOShV16RoqLM4qDjxtldDeCdCDcAkIeKFLnUevOvf9H3BsgNhBsAyGO9epk1q/74Q1q0yO5qAO9DuAGAPBYcLPXubV6PHGl3NYD3IdwAgA369zfPP/wgHTxodzWAdyHcAIANatSQbr9dSk+X5s+3uxrAu2Q73HTo0EGJiYnO7REjRujkyZPO7WPHjqlmzZrurxAAvFTbtuaZcAO4V7bDzbx585Samurc/sc//qHjx487ty9cuKD4+Hj3VwgAXqpNG/P800+mBQeAe2Q73FhXjFe8chsAkDMxMVLx4tLhw6w3BbgTfW4AwCZ+ftJjj5nXY8bYXQ3gPbIdbhwOhxwOR6Z9AIAb17evef7hB+mybo0AboJvdk+0LEuPP/64/P39JUlnz55Vv379VLRoUUly6Y8DAMieOnWk6tWl+Hhp7lypWze7KwLyv2yHm94ZM05d9Oijj2Y6p1evXu6pCgAKkM6dpffekx55RAoKkjp0sLsiIH9zWAWsZ3BSUpJCQkKUmJio4OBgu8sBAK1ZIzVsaF5Xrixt3y5x1x9wlZO/3zfdoXj37t3asmWL0hnHCAA3pEEDafZs8/rPP82aUwBuXLbDzfjx4/Xxxx+77Hv66adVqVIl1alTR7Vr19bevXtzo0YA8Hrt20v33GNe16wpbdhgd0VA/pXtcDNmzBiFhoY6t+fOnasJEyboyy+/1KpVq1S8eHG99dZbuVUnAHi9Bx+89Pq11+ysBMjfsh1utm/froYZN4UlzZw5U506dVLPnj1Vv359/eMf/9DChQtzq04A8HqPPy69/755PWeOmdwPQM5lO9ycOXPGpQPP0qVL1bJlS+d2pUqVlJCQ4P4KAaCAKFRIeukls6BmWpoZFn7qlN1VAflPtsNN+fLltWbNGknS0aNHtXnzZjVr1sx5PCEhQSEhIblTJQAUIO++KxUpIi1eLI0caXc1QP6T7XDTu3dv9e/fX8OHD9dDDz2k6OhoNWjQwHl86dKlql27dm7VCQAFRps20iefmNdxcXZXA+Q/2Q43L7/8svr27avvv/9eAQEBmjp1qsvx3377Td27d8+NGgGgwHnwQbP21KZN0saNdlcD5C/ZDjc+Pj56++23tW7dOs2ZM0c1atRwOT516lT16dMnRx8+atQo1a1bV8HBwQoODlZMTIzmzJmT5c9MnTpV0dHRCggIUJ06dTQ7Y3IIAPAioaGXZiqm9QbIGVtXBS9btqxGjBihNWvWaPXq1br77rvVqVMnbd68+arnL126VN27d1efPn20bt06de7cWZ07d9amTZvyvHYAyG09epjnyZMl5kkFsi/byy9UqlQpW2/4119/3VRBYWFh+uCDD67aCtStWzelpKRo1qxZzn1NmjTRrbfeqs8//zxb78/yCwDyizNnpPBwM2Jq3DjpySftrgiwT07+fmd74cxdu3apfPny6tGjh0qXLu2OOl2kpaVp6tSpSklJUUxMzFXPWbZsmZ5//nmXfe3atdOMGTOu+b6pqakuK5YnJSW5sWoAyD2BgWYyv1dflf72N7PAZliY3VUBni/b4eabb75xLsHQvn17Pfnkk+rQoYN8fG7uztbGjRsVExOjs2fPqlixYpo+fbpq1qx51XMTEhIUHh7usi88PDzL+XViY2OZORlAvvXii6bPzYYN0ujR0pAhdlcEeL5sJ5OHHnpIc+bM0Y4dO9SgQQP97W9/U1RUlF599VVt3779hguoXr261q9frxUrVuiZZ55R7969tWXLlht+vysNGTJEiYmJzgfrXwHITwoVMgFHkj79VDp50u6KAM+X42aXW265Ra+99pq2b9+uuLg4rVixQtHR0Tpx4sQNFeDn56cqVaqoQYMGio2NVb169fTpp59e9dyIiAgdOnTIZd+hQ4cUERFxzff39/d3jsbKeABAftKtm1S1qnTokJnBGEDWbuie0tmzZzVp0iS99dZbWrFihR566CEVKVLELQWlp6e79JG5XExMTKb1qxYsWHDNPjoA4A38/KTx483rCROkffvsrgjwbDkKNytWrNDTTz+tiIgIffzxx3rggQe0f/9+TZkyRf7+/jn+8CFDhuiXX37Rrl27tHHjRg0ZMkSLFy9Wz549JUm9evXSkMtuMA8aNEhz587VRx99pD/++EPDhg3T6tWrNWDAgBx/NgDkJ82bS3feadacGj3a7moAz5btDsW1atXS4cOH1aNHDy1ZskT16tW76Q8/fPiwevXqpYMHDyokJER169bVvHnz1KZNG0nSnj17XDosN23aVHFxcXr99df197//XVWrVtWMGTNY9gFAgTBggFlvaswY6fXXpRv4b0qgQMj2PDc+Pj4qWrSofH195XA4rnne8ePH3Vmf2zHPDYD86sIFqWJF
c1tq0iTpYiM3UCDkyjw3EyZMcEdtAIAb5Osr9etnWm1GjiTcANeS7ZYbb0HLDYD87NAhKSpKOn9eWrVKatjQ7oqAvJGTv9+2ri0FAMiZ8HDp4YfN63/9y+5qAM9EuAGAfCZjgOjkydLRo3ZXA3gewg0A5DONG0sNGkipqWbWYgCuCDcAkM84HJfWmPrgA+n33+2uCPAshBsAyIceeEBq08a03rRoIe3YYXdFgOfI9lDwDM8///xV9zscDgUEBKhKlSrq1KmTwsLC3FEfAOAqHA7T56Z9ezNq6l//kv75T7urAjxDjoeC33XXXVq7dq3S0tJUvXp1SdK2bdtUqFAhRUdHKz4+Xg6HQ7/++qtq1qyZW3XfMIaCA/Ams2dLHTtKYWHSgQPMWgzvlatDwTt16qTWrVvrwIEDWrNmjdasWaN9+/apTZs26t69u/bv36+WLVvqb3/72818BwBANrRrJ91yi3T8uDRzpt3VAJ4hxy03t9xyixYsWJCpVWbz5s1q27at9u/fr7Vr16pt27Y66oFjFGm5AeBt3nhDeucdqW1bad48u6sBckeuttwkJibq8OHDmfYfOXJESUlJkqTixYvr3LlzOX1rAMANePJJ8zx/vjRqlN3VAPa7odtSTz75pKZPn659+/Zp3759mj59uvr06aPOnTtLklauXKlq1arlRr0AgCtUrChljPUYMEDav9/uigB75TjcjB49Wq1atdIjjzyi8uXLq3z58nrkkUfUqlUrff7555Kk6OhojR07NjfqBQBcxYcfSs2aSenpZsVwoCC74YUzk5OT9ddff0mSKlWqpGLFirm7tlxBnxsA3mrsWKlvX6laNWnLFqlQIbsrAtwnTxbOLFasmMLCwhQWFpZvgg0AeLOHH5ZCQ6Vt26QJE+yuBrBPjsNNenq63n77bYWEhDhvSxUvXlzDhw9Xenp67lQJALiu4GDp9dfN6+eekxYutLsiwB45nqH4tdde07hx4zRixAg1a9ZMkvTrr79q2LBhOnv2rN59993cqBMAkA0DBphQM3u29P/+n2nF8WGhHRQwOe5zU6ZMGX3++ee6//77XfbPnDlTzz77rPZ7eDd9+twA8HYpKVLZstLJk9KsWWYGYyC/y9U+N8ePH1d0dHSm/dHR0Tp+/HhO3w4A4GZFi16a+2bcOLurAfJejsNNvXr1NHLkyEz7R44cqXr16rmrLgDATejRwzzPn29WDgcKkhz3uXn//ffVsWNH/fTTT4qJiZEkLVu2THv37tXs2bNzo0YAQA7ddpsUGSkdPCh16SJNnWpadICCIMctN3fccYe2bdumLl266OTJkzp58qQeeOABxcfHq0WLFrlTJQAgR3x8pPvuM6/nzJGefdbuioC8c8OT+F1p3759evvttzVmzBh3vF2uoUMxgILiwAHpxRelyZPN9pIlUsuWdlcF3Jg8mcTvSseOHdM4eq4BgMcoU0aKi5Mee8xsz5pld0VA3mD2AwDwcm3bmufFi+2uBMgbhBsA8HJ33GGeV62SPv7Y7mqA3Ee4AQAvFxVlFtOUpBdekH791e6KgNyV7aHgDzzwQJbHT5486Y56AAC5IC5OatxYSkszq4c3b253RUDuyXa4CQkJue7xXr16uaMmAICbNWgg/e9/UtOm0rffSh99JJUoYXdVQO5w21Dw/IKh4AAKKssyIWfdOmnoUGnYMLsrArLPlqHgAADP5nBIf/+7ef3JJ1JCgt0VAbmDcAMABUiXLlL9+lJiovTSS3ZXA+QOwg0AFCCFCkmff25eT5kiHT5sd0WA+xFuAKCAuf1287hwwUzwx+0peBvCDQAUQL17m+fff5eee87uagD3ItwAQAHUp490553m9YIFZv4bwFsQbgCgAAoIMKEmKEg6edK04ADegnADAAWUr6/UsqV5PX++3dUA7kO4AYAC7N57zfMnn0hJSXZXA7gH4QYACrAnn5SqVpUOHZLeecfuagD3INwAQAHm5yf985/m9SefSFu22F0RcPMINwBQwHXsKHXoIJ0/L91/v5m9GMjPbA03sbGxuv322xUUFKTSpUurc+fOio+Pz/JnJk6cKIfD4fIICAjIs5oBwBtNmCCVKyf9+acUF3dj77F7t5kz56+/3F0dkDO2hpslS5aof//+Wr58uRYsWKDz58+rbdu2SklJyfLngoODdfDgQedj9+7deVYzAHij0qWlgQPN62+/vbH36NxZ+uwzqU0bt5YG5JjDsizL7iIyHDlyRKVLl9aSJUvUMmN84hUmTpyowYMH6+TJkzf0GTlZMh0ACpLdu6UKFczq4Vu2SNHROft5h+PSa8/5ywJvkZO/3x7V5ybx4o3esLCwLM9LTk5W+fLlFRUVpU6dOmnz5s3XPDc1NVVJSUkuDwBAZuXLS+3amWDSqZOZ3A/Ijzwm3KSnp2vw4MFq1qyZateufc3zqlevrvHjx2vmzJmaNGmS0tPT1bRpU+3bt++q58fGxiokJMT5iIqKysVvAQD525dfSlFR0rZtUs+etMAgf/KY21LPPPOM5syZo19//VVly5bN9s+dP39eNWrUUPfu3TV8+PBMx1NTU5WamurcTkpKUlRUFLelAOAa1q2TmjaVzp6VfvhBuu++7P0ct6WQm/LdbakBAwZo1qxZ+vnnn3MUbCSpcOHCuu2227Rjx46rHvf391dwcLDLAwBwbbfdJg0aZF6/+KJ06lT2fu7ycAPYydZwY1mWBgwYoOnTp2vRokWqWLFijt8jLS1NGzduVGRkZK7UCAAF0SuvSGXKmNtTGaOorsfXN7erArLH1nDTv39/TZo0SXFxcQoKClJCQoISEhJ05swZ5zm9evXSkCFDnNtvv/225s+fr7/++ktr167Vo48+qt27d+upp56y6VsAgPcJDZW++ca8/vprszzD9RQqlOtlAdlia7gZNWqUEhMTdeeddyoyMtL5+Cbjf1GS9uzZo4MHDzq3T5w4ob59+6pGjRrq0KGDkpKStHTpUtWsWdOmbwEA3ql5c6lxY+nCBemLL65//uXhhj43sJPHdCjOK8xzAwDZN26c9NRT0i23SDt2SFlNCF+8+KWlG06dkooVy7MyUQDkuw7FAADP1LOnVLastH+/mX04u7LbCRnIDYQbAMA1BQRIQ4ea16+/boaJX8tl3SXFfKmwE+EGAJClPn3MjMXnzkkvvXT1cy5cMMczEG5gJ8INACBLDof06adS4cLSwoXS2LGZzzl92nWb21KwE+EGAHBd5ctL/fqZ1337SosWuR6/MtzQcgM7EW4AANny8cdShw7m9bRprscIN/AkhBsAQLb4+krPPmtez57tOpfNleHms8+kBx7IvB/IC4QbAEC23XWX5O8v7d4tvffepf1XhpjVq6Xp06XPP8/zEgHCDQAg+4oUubSo5pAhl4aGp6Rc/fzsLNsAuBvhBgCQIyNGSF26mNdxceaZ20/wJIQbAECOOBzSY4+Z15MnS+np1w4358/naWmARLgBANyI9u2lokXNsgy//34p3Pj6up539Kgt5aGAI9wAAHIsIMB0LpakBQuk5GTzukIF1/MOH8772gD
CDQDghrRta57nzTMjoySpWTPXc+bNY8QU8p7Dsi6fqcD75WTJdADAtW3fLlWrdmm7UCEpPl6qUiXzuUuXSjExeVoevExO/n7TcgMAuCFVq0qPP35p+/HHpcqVr37uJ5/kWVmAfLNxDgAAV/XJJ1JamlS9uvTKK9c+b/p0s2q4n19eVoeCinADALhhISHSl19e/7zz56WhQ6VevaQaNfKiMhRk9LkBALjVt99Ko0ZJ9eub0VIbN5rh4roYhk6etLtC5Ec5+ftNyw0AwK0eftg8MvTufSncJCbaVhYKEDoUAwBy1ZW3oR57zIyqAnIL4QYAkKvKlHHdnjTJ3LICcgvhBgCQqzp2lEqXdt13+rS0datdFcHbEW4AALmqRAkpIUG6917X/ffdJz36KCuKw/0INwCAXOdwmHluLvfnn9LXX0sTJ9pVFbwV4QYAkCeioq6+f9euvK4E3o5wAwDIE2+/Ld1/vzR3rtSgwaX99L2BuxFuAAB5okwZaeZMqV07ado0qWtXs3/FCun226VWraT0dLurhDdghmIAgC2OHpVKlXLdt2YNw8RxdawKDgDweCVLZp4D58cf7aoG3oRwAwCwzRdfSE8/LdWta7Y/+MC03IwebXdlyM+4LQUAsN3Bg1L58mb18AypqZKfn51VwZNwWwoAkK9ERkoDBrjua93ahB4gpwg3AACP8NZbUp8+UtmyZvt//5MaN2aoOHKOcAMA8AhBQdLYsdLmzdLrr0u33CLt3Ss1ayZt3253dchPCDcAAI8SHCwNHy6tXy81bCidOCENHCj99JNZowq4HsINAMAjlSwpffmlVKiQNG+e1KaNVKmSmQCwYA2FQU4RbgAAHqtGDTM8vG5dM5rqzBnpoYek3r3trgyejHADAPBof/ub9Pvvpt/NCy+YfV99Jf3yi92VwVMRbgAA+ULhwtKHH0rPPGO2n33WdEDeudPuyuBpCDcAgHzljTekgAAzqqpvX6lqVWnUKLurgich3AAA8pXISGnw4EvbaWmmFYd1qZDB1nATGxur22+/XUFBQSpdurQ6d+6s+Pj46/7c1KlTFR0drYCAANWpU0ezZ8/Ok3oBAJ5h+HDp66/NPDgZt6nuvVd68EHp9Gm7q4PdbA03S5YsUf/+/bV8+XItWLBA58+fV9u2bZWSknLNn1m6dKm6d++uPn36aN26dercubM6d+6sTZs25WntAAD7+PpKPXqY2YzffVcKCTH7v/tOiouT0tPtrhB28qiFM48cOaLSpUtryZIlatmy5VXP6datm1JSUjRr1iznviZNmujWW2/V559/ft3PYOFMAPA+8+ZJ99xzabtBAzOaqkgRO6uCO+XbhTMTExMlSWFhYdc8Z9myZWrdurXLvnbt2mnZsmVXPT81NVVJSUkuDwCAd2nXTjpyRAoMNNtr1pjOxseO2V0Z7OAx4SY9PV2DBw9Ws2bNVLt27Wuel5CQoPDwcJd94eHhSrjGnNyxsbEKCQlxPqKiotxeOwDAfiVLmtaa/v3NdlycVLGiNGWK3ZUhr3lMuOnfv782bdqkKW7+VzhkyBAlJiY6H3v37nXr+wMAPEfDhtLIkWbk1K23SqdOSU89xVw4BY1HhJsBAwZo1qxZ+vnnn1U2Y637a4iIiNChQ4dc9h06dEgRERFXPd/f31/BwcEuDwCAd+vQwdyaatlSSkmRunc3E/5lMV4FXsTWcGNZlgYMGKDp06dr0aJFqlix4nV/JiYmRgsXLnTZt2DBAsXExORipQCA/MbHx7TiSNKKFaYPzq23Sp9+ymgqb2druOnfv78mTZqkuLg4BQUFKSEhQQkJCTpz5ozznF69emnIkCHO7UGDBmnu3Ln66KOP9Mcff2jYsGFavXq1BgwYYNO3AAB4qjp1pM6dL23v2GEmAJw61c6qkNtsHQrucDiuun/ChAl6/PHHJUl33nmnKlSooIkTJzqPT506Va+//rp27dqlqlWr6v3331eHDh2y9ZkMBQeAguXQIdO5uEEDs5r4rl0m9Dz8sFSlivTII3ZXiOzIyd9vj5rnJi8QbgCg4Dp+XCpfXkpOvrRvzx6JgbSeL9/OcwMAQG4KC5OGDnXd99FHdlWD3EK4AQAUKIMGmVtRGYNsR42Stm2zuyq4E+EGAFCgFC4sTZ4sHThglmw4d07q2VNavNgMHy9YnTW8E+EGAFAgORzSv/9tblWtXi3ddZeZBPAf/5D277e7OtwMwg0AoMCqWNEsutmihVS5stn3+utmtfENG+yuDjeKcAMAKNAaNjRrUm3bJl0+H+wnn0gnT9pZGW4U4QYAgIszGs+dK730ktmeMEGqWtXMk4P8hXADAMBFwcHSu+9Kt9xito8elZ57Tjp71u7KkBOEGwAALlO4sLRkifTee2b722+lmjWln38226xL5fkINwAAXKFyZenll6Vp00wrzs6d0r33mlacwEDTCRmei3ADAMA1dO0qxcdLbdtKp09Ln31m5sX5xz/srgxZIdwAAJCFokUv3ZrK8MsvZiHOAwfsrAzXQrgBAOA6QkLMrahnn720b+1a6f337awK10K4AQAgG8qWlf71Lyk09NK+SZOklBQ7q8LVEG4AAMiByZOlzp0lX1/p2DFzu2rsWLurwuUINwAA5EC7dtL06dKiRWbY+J49Ut++ZtFNeAbCDQAAN6BFC2nVKikiwmz37Stt3253VRDhBgCAG1evnrR0qZnZeN06qUYN6fbbpQUL7K6sYCPcAABwEypWNMGmbVspLU1avVp6+mkzHw7sQbgBAOAmVapkFt1cvNhs79olPfmkGU3Fcg15j3ADAIAbOBzSHXdI//mP2f76a+mxx8ysxshbhBsAANyoTx+pV69L24MHS/36SYcP21lVwUK4AQDAjRwOaeJEad8+swCnJI0eLTVqxIR/eYVwAwCAmzkcZjXxxYulMWPM7Ma7d0uvvirNnCkdP253hd7NYVmWZXcReSkpKUkhISFKTExUcHCw3eUAAAqAiROlJ564tF2ypPTf/0pNmthZVf6Sk7/ftNwAAJDLHn3UDA9v3lyKjJSOHjXbY8dKqal2V+d9aLkBACAPbdsmVa9+aXvAAHO76pZb7KzK89FyAwCAh6pWzXV75EjTJ2fFCrsq8j6EGwAA8tjLL2fe99ZbdlTinQg3AADksTfflMaNMwttDh1q9s2ZIz37rBlV9dxzrE91M+hzAwCAzd57TxoyRLIsyc/PrEvl5yd9+qlUtKj04INSYKDdVdorJ3+/ffOsKgAAcFWvvGLCy6BBlxbcPHdOeuYZ8/qvvy618OD6CDcAAHiAAQOkM2ckf38pKEh66qlLx4YNk+rVk+6/X/KhQ8l1cVsKAAAPtHatVKaMmRcnQ9++UqFCUmio9O67ZibkgoLbUgAA5HP165vnXr2kL780rzNWHJekChXMRIDIjMYtAAA82KhRZg6cXr2kgIBL+595RvrsM/OaWY5dEW4AAPBgRYqYFcW/+ML0yblwwfTHSU83Q8Zvu82MqJo0ye5KPQfhBgCAfKRQIXN7qls3s71+vZSWJv3733ZX5jnocwMAQD7073+bUV
Xbtkm//CItWyYdOGA6IRd0hBsAAPKhsLBLHYybNZOWLpXatpU6dZKioswkgE8+aXeV9iDcAACQzw0fbubA2bzZPDI0bCjVrWtnZfagzw0AAPnc3XebvjcPPOC6/8MPpbFjXQNPQUC4AQDAC1SpIk2bZkZQhYSYfV99ZSb+a9BA+u9/zb6pU6U+faSzZ20tN1fZGm5++eUX3XfffSpTpowcDodmzJiR5fmLFy+Ww+HI9EhISMizmgEA8FQOh1ls88QJ6eWXL+1PTTWtOl9/LT38sDR+vGnR8Va2hpuUlBTVq1dP//rXv3L0c/Hx8Tp48KDzUbp06VyrEQCA/MbhMCuNr18v7d8v1a5t5sd59NFL52zbZmeFucvWDsXt27dX+/btc/xzpUuXVvHixXOlJgAAvEW9euZ50iQzL058/KVja9dK589LhQvbVl6uyZd9bm699VZFRkaqTZs2+u2337I8NzU1VUlJSS4PAAAKknr1pK1bpc6dL+377TepXDkzhNzb5KtwExkZqc8//1zfffedvvvuO0VFRenOO+/U2rVrr/kzsbGxCgkJcT6ioqLytGYAADyBwyF9+60JOYGBZl9CgtS+vbRpk/Tmm9Lf/ibt3Gl3pTfPYVmWZXcRkuRwODR9+nR1vjxWZsMdd9yhcuXK6auvvrrq8dTUVKVetqJYUlKSoqKisrVkOgAA3mjcOOnnn82sxn/95Xrs3nsvjazyJElJSQoJCcnW3+98P4lfo0aN9Ouvv17zuL+/v/z9/fO0JgAAPFmfPuaxerUUE2M6G2eYPdv0zTl3TqpTx84qb1y+ui11NevXr1dkZKTdZQAAkO80bGgm+Pv5Zyk5WWrRwqw2Hh1tZjYePFjyjPs7OWNry01ycrJ27Njh3N65c6fWr1+vsLAwlStXTkOGDNH+/fv15ZdfSpI++eQTVaxYUbVq1dLZs2c1duxYLVq0SPPnz7fxWwAAkH9Vq2YekvTOO9I990hnzpjtTz+VZsyQHn9ceu21/DOyytZws3r1at11113O7eeff16S1Lt3b02cOFEHDx7Unj17nMfPnTunF154Qfv371eRIkVUt25d/fTTTy7vAQAAbkzLltLy5abPzZEjJtzs3i299Za0cqXpkFysmN1VXp/HdCjOKznpkAQAQEF19qzpj7N+/aV9DRpIP/4ohYfnfT05+fud7/vcAAAA9wsIMK04qanmuWRJac0aE3i2bTN9dd58Uxo1yvP65eT70VIAACB3ZAw2btzYTPZ3zz1m6HiNGqbjcYbISNcJAu1Gyw0AALiuqlVNwGnY0DXYSNIXX9hV1dURbgAAQLaEh0u//CJNmybNnCmtW2f2//ijtGKF3dVdwm0pAACQbYGBUteul7bbtpXmz5eaN5dat5bKlzfLOFSvbl+NjJYCAAA3LDlZatTIrFmVITxc2rvXvfPiMFoKAADkiWLFpIvT1DkNHGjvhH/clgIAADelZ0/pq69Mi80339hdDeEGAADcpMBAackSu6u4hNtSAADAqxBuAACAVyHcAAAAr0K4AQAAXoVwAwAAvArhBgAAeBXCDQAA8CqEGwAA4FUINwAAwKsQbgAAgFch3AAAAK9CuAEAAF6FcAMAALwK4QYAAHgVX7sLyGuWZUmSkpKS7C4FAABkU8bf7Yy/41kpcOHm1KlTkqSoqCi7SwEAADl06tQphYSEZHmOw8pOBPIi6enpOnDggIKCguRwONz63klJSYqKitLevXsVHBzs1vfGJVznvMO1zhtc57zBdc47uXGtLcvSqVOnVKZMGfn4ZN2rpsC13Pj4+Khs2bK5+hnBwcH8DycPcJ3zDtc6b3Cd8wbXOe+4+1pfr8UmAx2KAQCAVyHcAAAAr0K4cSN/f38NHTpU/v7+dpfi1bjOeYdrnTe4znmD65x37L7WBa5DMQAA8G603AAAAK9CuAEAAF6FcAMAALwK4QYAAHgVwo2b/Otf/1KFChUUEBCgxo0ba+XKlXaXlO/88ssvuu+++1SmTBk5HA7NmDHD5bhlWXrzzTcVGRmpwMBAtW7dWtu3b3c55/jx4+rZs6eCg4NVvHhx9enTR8nJyXn8TTxXbGysbr/9dgUFBal06dLq3Lmz4uPjXc45e/as+vfvrxIlSqhYsWLq2rWrDh065HLOnj171LFjRxUpUkSlS5fWSy+9pAsXLuTxt/Fso0aNUt26dZ2TmMXExGjOnDnO41zn3DFixAg5HA4NHjzYuY9r7R7Dhg2Tw+FweURHRzuPe9R1tnDTpkyZYvn5+Vnjx4+3Nm/ebPXt29cqXry4dejQIbtLy1dmz55tvfbaa9b3339vSbKmT5/ucnzEiBFWSEiINWPGDOv333+37r//fqtixYrWmTNnnOfcc889Vr169azly5db//vf/6wqVapY3bt3t+HbeKZ27dpZEyZMsDZt2mStX7/e6tChg1WuXDkrOTnZeU6/fv2sqKgoa+HChdbq1autJk2aWE2bNnUev3DhglW7dm2rdevW1rp166zZs2dbJUuWtIYMGWLTt/JMP/zwg/Xjjz9a27Zts+Lj462///3vVuHCha1NmzZZFtc5V6xcudKqUKGCVbduXWvQoEHO/Vxr9xg6dKhVq1Yt6+DBg87HkSNHnMc96ToTbtygUaNGVv/+/Z3baWlpVpkyZazY2Fhb68rPrgw36enpVkREhPXBBx849508edLy9/e3Jk+ebFmWZW3ZssWSZK1atcp5zpw5cyyHw2Ht378/j79B/nD48GFLkrVkyRLLunhNCxcubE2dOtV5ztatWy1J1rJlyyzrYgj18fGxEhISnOeMGjXKCg4OtlJTU234FvlHaGioNXbsWK5zLjh16pRVtWpVa8GCBdYdd9zhDDdca/cZOnSoVa9evase87TrzG2pm3Tu3DmtWbNGrVu3du7z8fFR69attWzZMltr8yY7d+5UQkKCy3UOCQlR48aNndd52bJlKl68uBo2bOg8p3Xr1vLx8dGKFStsqdvTJSYmSpLCwsIkSWvWrNH58+ddrnN0dLTKlSvncp3r1Kmj8PBw5znt2rVTUlKSNm/enOffIT9IS0vTlClTlJKSopiYGK5zLujfv786duzock3Fv2m32759u8qUKaNKlSqpZ8+e2rNnj+SB17nALZzpbkePHlVaWprLL0uSwsPD9ccff9hWl7dJSEiQLl7Xy4WHhzuPJSQkqHTp0i7HfX19FRYW5jwHl6Snp2vw4MFq1qyZateuLV28hn5+fipevLjLuVde56v9HnTZ7wnGxo0bFRMTo7Nnz6pYsWKaPn26atasqfXr13Od3WjKlClau3atVq1alekY/6bdp3Hjxpo4caKqV6+ugwcP6q233lKLFi20adMmj7vOhBuggOrfv782bdqkX3/91e5SvFb16tW1fv16JSYmatq0aerdu7eWLFlid1leZe/evRo0aJAWLFiggIAAu8vxau3bt3e+rlu3rho3bqzy5cvr22+/VWBgoK21XYnbUjepZMmSKlSoUKYe4YcOHVJERIRtdXmbjGuZ1XWOiIjQ4cOHXY5fuHBBx48f53dxh
QEDBmjWrFn6+eefVbZsWef+iIgInTt3TidPnnQ5/8rrfLXfgy77PcHw8/NTlSpV1KBBA8XGxqpevXr69NNPuc5utGbNGh0+fFj169eXr6+vfH19tWTJEv3f//2ffH19FR4ezrXOJcWLF1e1atW0Y8cOj/s3Tbi5SX5+fmrQoIEWLlzo3Jeenq6FCxcqJibG1tq8ScWKFRUREeFynZOSkrRixQrndY6JidHJkye1Zs0a5zmLFi1Senq6GjdubEvdnsayLA0YMEDTp0/XokWLVLFiRZfjDRo0UOHChV2uc3x8vPbs2eNynTdu3OgSJBcsWKDg4GDVrFkzD79N/pOenq7U1FSusxu1atVKGzdu1Pr1652Phg0bqmfPns7XXOvckZycrD///FORkZGe92/ard2TC6gpU6ZY/v7+1sSJE60tW7ZYTz/9tFW8eHGXHuG4vlOnTlnr1q2z1q1bZ0myPv74Y2vdunXW7t27LeviUPDixYtbM2fOtDZs2GB16tTpqkPBb7vtNmvFihXWr7/+alWtWpWh4Jd55plnrJCQEGvx4sUuwzlPnz7tPKdfv35WuXLlrEWLFlmrV6+2YmJirJiYGOfxjOGcbdu2tdavX2/NnTvXKlWqFMNmr/Dqq69aS5YssXbu3Glt2LDBevXVVy2Hw2HNnz/fsrjOuery0VIW19ptXnjhBWvx4sXWzp07rd9++81q3bq1VbJkSevw4cOW5WHXmXDjJp999plVrlw5y8/Pz2rUqJG1fPlyu0vKd37++WdLUqZH7969LevicPA33njDCg8Pt/z9/a1WrVpZ8fHxLu9x7Ngxq3v37laxYsWs4OBg64knnrBOnTpl0zfyPFe7vpKsCRMmOM85c+aM9eyzz1qhoaFWkSJFrC5dulgHDx50eZ9du3ZZ7du3twIDA62SJUtaL7zwgnX+/HkbvpHnevLJJ63y5ctbfn5+VqlSpaxWrVo5g43Fdc5VV4YbrrV7dOvWzYqMjLT8/PysW265xerWrZu1Y8cO53FPus4Oy/wfHgAAgFegzw0AAPAqhBsAAOBVCDcAAMCrEG4AAIBXIdwAAACvQrgBAABehXADAAC8CuEGAAB4FcINgALJ4XBoxowZdpcBIBcQbgDkuccff1wOhyPT45577rG7NABewNfuAgAUTPfcc48mTJjgss/f39+2egB4D1puANjC399fERERLo/Q0FDp4i2jUaNGqX379goMDFSlSpU0bdo0l5/fuHGj7r77bgUGBqpEiRJ6+umnlZyc7HLO+PHjVatWLfn7+ysyMlIDBgxwOX706FF16dJFRYoUUdWqVfXDDz84j504cUI9e/ZUqVKlFBgYqKpVq2YKYwA8E+EGgEd644031LVrV/3+++/q2bOnHnnkEW3dulWSlJKSonbt2ik0NFSrVq3S1KlT9dNPP7mEl1GjRql///56+umntXHjRv3www+qUqWKy2e89dZbevjhh7VhwwZ16NBBPXv21PHjx52fv2XLFs2ZM0dbt27VqFGjVLJkyTy+CgBuiNvXGQeA6+jdu7dVqFAhq2jRoi6Pd99917Isy5Jk9evXz+VnGjdubD3zzDOWZVnWmDFjrNDQUCs5Odl5/Mcff7R8fHyshIQEy7Isq0yZMtZrr712zRokWa+//rpzOzk52ZJkzZkzx7Isy7rvvvusJ554ws3fHEBeoM8NAFvcddddGjVqlMu+sLAw5+uYmBiXYzExMVq/fr0kaevWrapXr56KFi3qPN6sWTOlp6crPj5eDodDBw4cUKtWrbKsoW7dus7XRYsWVXBwsA4fPixJeuaZZ9S1a1etXbtWbdu2VefOndW0adOb/NYA8gLhBoAtihYtmuk2kbsEBgZm67zChQu7bDscDqWnp0uS2rdvr927d2v27NlasGCBWrVqpf79++vDDz/MlZoBuA99bgB4pOXLl2farlGjhiSpRo0a+v3335WSkuI8/ttvv8nHx0fVq1dXUFCQKlSooIULF95UDaVKlVLv3r01adIkffLJJxozZsxNvR+AvEHLDQBbpKamKiEhwWWfr6+vs9Pu1KlT1bBhQzVv3lxff/21Vq5cqXHjxkmSevbsqaFDh6p3794aNmyYjhw5ooEDB+qxxx5TeHi4JGnYsGHq16+fSpcurfbt2+vUqVP67bffNHDgwGzV9+abb6pBgwaqVauWUlNTNWvWLGe4AuDZCDcAbDF37lxFRka67Ktevbr++OMP6eJIpilTpujZZ59VZGSkJk+erJo1a0qSihQponnz5mnQoEG6/fbbVaRIEXXt2lUff/yx87169+6ts2fP6p///KdefPFFlSxZUg8++GC26/Pz89OQIUO0a9cuBQYGqkWLFpoyZYrbvj+A3OOwzKgBAPAYDodD06dPV+fOne0uBUA+RJ8bAADgVQg3AADAq9DnBoDH4W45gJtByw0AAPAqhBsAAOBVCDcAAMCrEG4AAIBXIdwAAACvQrgBAABehXADAAC8CuEGAAB4lf8PJ48umnSI2bwAAAAASUVORK5CYII=",
"text/plain": [
"<Figure size 640x480 with 1 Axes>"
]
@@ -1394,14 +1622,13 @@
}
],
"source": [
- "batch_src, batch_labels, batch_padding_mask = mktunebatch(BSZ, test=True)\n",
- "model.eval()\n",
- "with torch.no_grad():\n",
- " output = model(batch_src, batch_padding_mask)\n",
- "print(criterion(output.squeeze(1), batch_labels).item())\n",
- "x = batch_labels.detach().to(torch.float16).cpu().numpy().flatten()\n",
- "y = output.detach().to(torch.float16).cpu().numpy().flatten()\n",
- "plt.hist2d(x, y, bins=50, norm=mpl.colors.LogNorm())"
+ "with open('training-loss') as f:\n",
+ " train_err = list(map(float, f.read().split()))\n",
+ " plt.suptitle('Log MSE vs Epochs')\n",
+ " plt.plot(torch.log(torch.tensor(train_err)[:500]), label='Train', color='blue')\n",
+ " plt.xlabel('Epochs')\n",
+ " plt.ylabel('Log MSE')\n",
+ " plt.show()"
]
}
],