From d5534cfd1a5b2b2ff2e23c15506764dd3a5cc680 Mon Sep 17 00:00:00 2001 From: Ciprian-Florin Ifrim Date: Tue, 24 Mar 2026 18:58:16 +0000 Subject: [PATCH 1/2] =?UTF-8?q?Notable=20Non-Record=20Submission:=201.1239?= =?UTF-8?q?=20BPB=20-=20106.2M=20Binary=20U-Net=20(15L=20768d=208192BPE=20?= =?UTF-8?q?relu=C2=B2=204xMLP=20FP8=20SmearGate,=2050k=20steps)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- .../README.md | 162 + .../RESULTS.md | 1236 +++ .../binary_log.txt | 1518 +++ .../fineweb_8192_bpe.model | Bin 0 -> 370917 bytes .../fineweb_8192_bpe.vocab | 8192 +++++++++++++++++ .../requirements.txt | 10 + .../run_cuda_binary.sh | 72 + .../setup.sh | 143 + .../submission_binary.json | 11 + .../train_gpt_cuda_binary.py | 1350 +++ 10 files changed, 12694 insertions(+) create mode 100644 records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/README.md create mode 100644 records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/RESULTS.md create mode 100644 records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/binary_log.txt create mode 100644 records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/fineweb_8192_bpe.model create mode 100644 records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/fineweb_8192_bpe.vocab create mode 100644 records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/requirements.txt create mode 100644 records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/run_cuda_binary.sh create mode 100644 records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/setup.sh create mode 100644 
records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/submission_binary.json create mode 100644 records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/train_gpt_cuda_binary.py diff --git a/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/README.md b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/README.md new file mode 100644 index 0000000000..2f7f235d78 --- /dev/null +++ b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/README.md @@ -0,0 +1,162 @@ +# Notable Non-Record Submission: 1.1239 BPB — 106.2M Asymmetric Binary U-Net Transformer + +**1-bit Quantisation + 15L (7 Encoder - 8 Decoder) + NeoMuon + 4x relu² MLP + SmearGate + Factored Tied Embedding + Poly5 Softcap + YaRN 2048 + 8192 BPE + FP8 QAT + LZMA + Stride-16 Sliding Eval** + +**Sliding BPB: 1.1239** (stride 16, seed=42) | **15.67 MB** artifact | 8×H100 SXM, 50k steps (~2.15h) + +> **This is a non-record submission — training exceeds the 10-minute wallclock constraint (50,000 steps / ~2.15 hours). Submitted to demonstrate the compression frontier: 106.2M parameters in 15.67MB via 1-bit quantisation. Over 120M parameters would be possible with FP4 (implemented), at a worse bpb. Full experiment log: [RESULTS.md](RESULTS.md).
Complete training logs: [logs/](https://github.com/CiprianFlorin-Ifrim/openai-parameter-golf-submission/tree/main/logs/cuda).** + +## Results (seed=42, 8×H100 SXM) + +| Metric | Value | +|--------|-------| +| Sliding BPB (s16) | **1.1239** | +| val_bpb | 1.1497 | +| RT bpb | 1.1516 | +| Steps | 50,000 | +| ms/step | 155.3 | +| Training time | 7,763s (~2.15h) | +| optimal_T | 0.90 | +| Artifact | 15,670,651 bytes (15.67MB) | +| Parameters | 106,154,616 | + +### Comparison to Ternary Submission + +Binary reaches better absolute quality but requires roughly 13x more training time. Within the 10-minute budget, binary's best-fitting run (14L, 4,820 steps) scores 1.1824 sliding — 0.025 bpb worse than ternary (my previous record PR). Under the wallclock constraint, ternary's zero state is worth more than binary's 60% parameter-density advantage. + +The results document linked here and in my repo covers all methods and sweeps applied to both binary and ternary BitNets; both are unfortunately incompatible with many techniques, such as Tversky layers, EMA, Muon WD, LM logit-head ranking, and more.
+ +## Architecture + +- 15 transformer layers, dim=768, 8 heads, 4 KV heads (GQA), head_dim=96 +- Binary quantisation: weights {-1, +1}, 1 bit/param, per-group (128) absmean scaling +- 4x MLP expansion (hidden=3072) with **relu²** activation, fused gate+up projection +- U-Net encoder/decoder with learned skip weights (ones-init) and per-block residual mix from input embedding +- **SmearGate:** causal cumulative mean blending with learned tanh gate, zero-init for safe residual start +- Factored tied embedding: 8192×254 bottleneck with learned projections +- Polynomial softcap (degree 5, cap=10) with Z-loss regularisation (1e-4) +- YaRN positional encoding (max_len=2048, ROPE_BASE=5000) +- Fused QKV projection +- FlashAttention-3 (Hopper native kernels) +- 106.2M parameters, 15.67MB artifact (97.3M binary + 2.5M fp8 + 70KB code) + +## Key Techniques + +### Architecture +- **Binary quantisation:** 1 bit/param packs 60% more parameters per MB than ternary (1.6 bits/param), allowing 15 layers vs 10 within similar budget +- **4x relu² MLP:** same as ternary — relu² strictly dominates relu; 4x width outperforms 3x even with fewer layers at matched budget +- **SmearGate:** blends each position with causal cumulative mean; adds 22ms/step overhead but provides -0.007 bpb at scale. Viable here because the run is not wallclock-constrained + +### Training +- **NeoMuon** with 3 Newton-Schulz steps: same optimizer as ternary, effective for binary STE as well +- **50,000 steps unconstrained:** binary converges slower than ternary — at 4,000 steps (the 10-minute equivalent) binary lags by 0.025 bpb. 
Extended training closes the gap and surpasses ternary +- **524k batch tokens:** same optimal batch size as ternary + +### Evaluation +- **Temperature scaling (T=0.90):** same auto-calibrated grid as ternary +- **Sliding window (stride=16):** same evaluation protocol + +### Compression +- **Bit-packing + LZMA (preset=9):** binary weights pack at exactly 1 bit/param before LZMA entropy coding +- **FP8 QAT (e4m3):** same as ternary for non-binary parameters. Clean roundtrip — binary has no zero state, so `mean(|Q|)=1.0` always; no shrinkage correction needed +- **No EMA:** despite clean binary roundtrip math, EMA still hurts quality by 0.03 bpb in practice + +## Setup and Run + +```bash +# Environment setup (conda + Python 3.13 + PyTorch + FlashAttention-3 + Triton + dataset) +bash setup.sh + +# Activate and run +conda activate golf +SEED=42 bash run_cuda_binary.sh +``` + +
+Full run command + +```bash +RUN_ID=binary_run \ +DATA_PATH=./data/datasets/fineweb10B_sp8192 \ +TOKENIZER_PATH=./data/tokenizers/fineweb_8192_bpe.model \ +ATTN_PROJ_TYPE=standard \ +LOGIT_HEAD_TYPE=standard \ +TVERSKY_MEMBERSHIP=sigmoid \ +TVERSKY_NUM_FEATURES=0 \ +TVERSKY_FEATURE_POOLS=0 \ +VOCAB_SIZE=8192 \ +BITNET_GROUP_SIZE=128 \ +BIGRAM_HASH=0 \ +EMBED_DIM=254 \ +TRAINING_DEPTH_RECURRENCE=0 \ +EVAL_DEPTH_RECURRENCE=0 \ +NUM_LAYERS=15 \ +MODEL_DIM=768 \ +NUM_KV_HEADS=4 \ +NUM_HEADS=8 \ +DIFF_ATTN=0 \ +MLP_MULT=4 \ +MLP_GROUPS=0 \ +MATRIX_OPTIMIZER=muon \ +ADAM_LR=0.05 \ +ADAM_WD=0.05 \ +MUON_BACKEND_STEPS=3 \ +MUON_MOMENTUM=0.95 \ +MUON_MOMENTUM_WARMUP_START=0.85 \ +MUON_MOMENTUM_WARMUP_STEPS=500 \ +MUON_WD=0.0 \ +MATRIX_LR=0.04 \ +SCALAR_LR=0.02 \ +TIED_EMBED_LR=0.02 \ +WARMDOWN_FRACTION=0.2 \ +LOGIT_SOFTCAP=10 \ +QK_GAIN_INIT=2.25 \ +ROPE_TYPE=yarn \ +YARN_MAX_LEN=2048 \ +ROPE_BASE=5000 \ +BATCH_TOKENS_START=0 \ +BATCH_SCHEDULE_FRACTION=0.33 \ +TRAIN_BATCH_TOKENS=524288 \ +SEQ_LEN_START=0 \ +SEQ_SCHEDULE_FRACTION=0.0 \ +TRAIN_SEQ_LEN=1024 \ +SMEAR=1 \ +ITERATIONS=50000 \ +WARMUP_STEPS=5 \ +MAX_WALLCLOCK_SECONDS=0 \ +VAL_LOSS_EVERY=0 \ +TRAIN_LOG_EVERY=500 \ +CHURN_LOG_EVERY=1000 \ +VAL_MAX_TOKENS=0 \ +TIE_EMBEDDINGS=1 \ +UNTIE_AT_FRACTION=0.00 \ +HEAD_LR=0.02 \ +CORR_WEIGHT_LR=0.02 \ +ACTIVATION=relu2 \ +SOFTCAP_TYPE=poly \ +MTP_HEADS=0 \ +REFINER=0 \ +REFINER_KERNEL=3 \ +SLIDING_EVAL=1 \ +SLIDING_EVAL_STRIDE=16 \ +SLIDING_BATCH_SIZE=256 \ +TEMP_SCALING=1 \ +FP_STORAGE=FP8 \ +EMA=0 \ +EMA_DECAY=0.995 \ +EMA_START_FRACTION=0.5 \ +SEED=42 \ +COMPILE_MODE=default \ +OMP_NUM_THREADS=1 torchrun --standalone --nproc_per_node=8 train_gpt_cuda_binary.py +``` + +
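The `SLIDING_EVAL_STRIDE=16` setting above drives the headline evaluation: every token is scored exactly once, and all windows after the first contribute only their newest `stride` targets, so each scored position keeps near-full left context. A minimal sketch of the window schedule this implies (the function name and the 1024-token window are illustrative; the real eval additionally applies the auto-calibrated temperature T=0.90):

```python
def sliding_windows(n_tokens, window=1024, stride=16):
    """Window schedule for sliding evaluation: the first window scores all
    of its next-token predictions; every later window slides forward by
    `stride` and scores only its newest `stride` targets, so each position
    is scored exactly once with near-full left context."""
    first_end = min(window, n_tokens)
    yield 0, first_end, first_end - 1      # (start, end, n_scored_targets)
    pos = first_end                        # first position not yet scored
    while pos < n_tokens:
        end = min(pos + stride, n_tokens)
        yield max(0, end - window), end, end - pos
        pos = end
```

The stride trades eval cost for context: stride=16 means roughly `window/stride` forward passes per window-length of text, which is why the eval phase takes ~429s in the P-series runs.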
+ +## Compliance + +- [x] Artifact <=16,000,000 bytes (15,670,651) +- [x] Sliding window eval stride=16 +- [x] No test-time training on validation data +- [x] No network calls during evaluation +- [x] No external compute +- [ ] Train time <=600s — **non-record submission** (7,763s / 50,000 steps) diff --git a/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/RESULTS.md b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/RESULTS.md new file mode 100644 index 0000000000..82fcd581f0 --- /dev/null +++ b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/RESULTS.md @@ -0,0 +1,1236 @@ +# Parameter Golf — Complete Experiment Log + +**Author:** Ciprian-Florin Ifrim +**Date:** March 2026 + +--- + +## Challenge Overview + +Train the best language model that fits in a 16MB artifact and trains in under 10 minutes on 8×H100 SXM GPUs, evaluated by tokenizer-agnostic bits-per-byte (BPB) compression on the FineWeb validation set. + +- **Baseline:** 1.2244 bpb (9L 512d int8+zlib, 1k vocab) +- **Our best (ternary, valid):** 1.1565 bpb sliding (P2, 10L 768d relu² 4×MLP fp8, EMBED_DIM=254, seed=42, 16.00MB) +- **Our best (binary, unconstrained):** 1.1239 bpb sliding (15L 768d binary relu² 4×MLP fp8, 50k steps / ~2h compute, 15.67MB) +- **Our best (quality, over budget):** 1.1771 bpb (F59, 12L 768d swiglu 3×MLP, 21.96MB) +- **Challenge period:** March 18 – April 30, 2026 +- **Compute sponsor:** OpenAI ($1M in compute credits) + +The challenge is framed as L(N) optimisation — minimising loss given fixed parameter count N, unconstrained by data, compute, steps, or architecture. Related challenges include NanoGPT Speedrun (L(T): lowest loss given constrained time) and NanoGPT Slowrun (L(D): lowest loss given constrained dataset). 
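Since BPB is the headline metric throughout this log, here is the conversion it implies: summed next-token NLL in nats, converted to bits and normalised by the raw UTF-8 byte count of the validation text (the helper name is illustrative):

```python
import math

def bits_per_byte(total_nll_nats, total_utf8_bytes):
    """Tokenizer-agnostic compression metric: the model's summed next-token
    NLL (in nats) converted to bits, per byte of the raw validation text."""
    return total_nll_nats / (math.log(2) * total_utf8_bytes)

# A model averaging 2.6 nats/token on text averaging 3.3 bytes/token
# compresses at about 1.14 bits per byte.
```

Because the denominator is bytes rather than tokens, a more aggressive tokenizer lowers per-byte loss only if its longer tokens are actually predicted well, which is what makes the metric tokenizer-agnostic.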
+ +--- + +## Run Numbering Convention + +| Prefix | Description | +|--------|-------------| +| Plain (1–100) | Dev runs on RTX 5090, 100 steps | +| R prefix (R1...) | Record runs — 600s on 8×H100, leaderboard-targeted | +| S prefix (S1...) | Scaling runs — 1500 steps or 300s on 8×H100, controlled sweeps | +| SB prefix (SB1...) | Binary scaling runs | +| F prefix (F1...) | Final runs — 600s on 8×H100, official submissions | +| P prefix (P1...) | Pushed/submission runs — final config pushed to GitHub | + +Additionally, 20 early architecture iterations were performed on MLX (Mac Studio M1 Ultra, 32GB unified memory) and 2 on MPS (MacBook Pro M1 Pro, 32GB unified memory) for rapid prototyping before GPU scaling. + +> **Note:** This document covers ~85 named runs (F, S, R series). An additional ~165 dev runs (plain numbered 1–100, repeated sweeps, smoke tests) were conducted but are not individually listed. Key findings from those runs are incorporated into the sweep tables and decision rationale. Separate synthetic-data notebooks were used to isolate the behaviour of specific techniques (Tversky similarity, linear alternatives, grouped projections) before committing H100 compute. + +--- + +## Hardware + +| System | Spec | Notes | +|--------|------|-------| +| Dev | RTX 5090 32GB, single GPU | Triton smem ceiling 101KB/SM; blocks value embeddings and some kernels | +| Mac (MLX) | Mac Studio M1 Ultra 32GB | MLX early iteration, 20 runs | +| Mac (MPS) | MacBook Pro M1 Pro 32GB | MPS early iteration, 2 runs | +| Final | 8×H100 SXM 80GB | Primary training platform | + +**Step times at 768d (12L):** relu² 2x: 89ms | relu² 3x: 99ms | relu² 4x: 91ms | swiglu 3x: 127ms | leaky relu 3x: 103ms + +**Step times at 512d:** 26L baseline: 149ms → 136ms with FA3 → 127ms with FA3 + fusions + EMBED=256 at 25L + +**FlashAttention-3** reduced step time by ~9% (~380 free training steps per 600s run). 
+ +**Kernel fusion optimisations** (fused QKV + fused SwiGLU + dataloader + softcap) saved a further ~7-10ms/step. + +**Width vs depth discovery:** 12L 768d at 106ms/step gets ~5640 steps in 600s vs ~4720 steps for 25L 512d — 920 extra steps from the faster per-step time of wider/shallower models. Final 10L 768d 4×MLP at 91.8ms/step gets ~6530 steps. + +--- + +## Architecture: Ternary U-Net Transformer + +### Quantisation Scheme + +BitNet b1.58 ternary quantisation — weights constrained to {−1, 0, +1} with per-group absmean scaling. Approximately 1.6 bits per parameter. + +**Compression pipeline:** Base-3 packing (5 trits/byte) or bitmask packing → LZMA (preset=9). Best method auto-selected per run. Bitmask wins when zero fraction is high. + +**Quantisation shrinkage fix:** When ternary Q contains zeros, `mean(|Q|) < 1.0`, causing scale mismatch on reload. Fix: inflate by `1/mean(|Q|)` during dequantisation. Eliminates all roundtrip gaps. + +### U-Net Skip Connections + +The model uses a U-Net style encoder/decoder structure with learned skip connections. The first `num_layers // 2` blocks (encoder) store their outputs; the second half (decoder) receives these via `x = x + skip_weight[i] * skips.pop()`. This allows the decoder to simultaneously access high-level semantic representations (from deep processing) and low-level token-level features (from early processing), without requiring the decoder to reconstruct low-level information from the compressed residual stream. + +Additionally, each block receives `x0` (the original input embedding) via a learned residual mix: `x = mix[0] * x + mix[1] * x0`, giving every layer direct access to the raw token representation regardless of accumulated residual drift. + +For odd layer counts, the decoder receives the larger half (e.g. 27L → 13 encoder + 14 decoder), which is the standard U-Net convention — more processing power applied after skip injection. 
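Putting the two update rules above together, the encoder/decoder flow can be sketched as below; the relative order of the mix and skip additions and the guard for the unmatched decoder block are assumptions, since only the two formulas are given:

```python
import numpy as np

def unet_forward(x, x0, blocks, skip_w, mix):
    """U-Net residual flow: encoder blocks push their outputs onto a stack,
    decoder blocks pop them back in through learned skip weights, and every
    block re-mixes the raw input embedding x0 via learned coefficients."""
    n_enc = len(blocks) // 2            # decoder gets the larger half for odd depths
    skips = []
    for i, block in enumerate(blocks):
        x = mix[i][0] * x + mix[i][1] * x0          # per-block mix with the input embedding
        if i >= n_enc and skips:                    # decoder: consume newest skip first
            x = x + skip_w[i - n_enc] * skips.pop()
        x = block(x)
        if i < n_enc:
            skips.append(x)                         # encoder: store output for the decoder
    return x
```

The LIFO pairing means the first decoder block receives the deepest encoder output, and the shallowest encoder output is injected last, closest to the logits.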
+ +### Factored Embedding + +With `EMBED_DIM=254`, token embedding is `[8192, 254]` instead of `[8192, 768]`, with learned projections `embed_proj` (254→768) and `embed_proj_rev` (768→254) for the tied output head. + +**EMBED_DIM history:** Started at 128 (dev runs), upgraded to 256 after an optimizer coverage fix revealed that the projection layers had not been receiving gradients (−0.024 bpb improvement vs 128 once trained), then trimmed to 254 to fit artifact+code under the 16,000,000 byte budget (~0.0004 bpb cost, 0.00018/dim from 128→256 scaling data). + +### Fused Operations + +**Fused QKV:** Single `TernaryLinear(dim, dim + 2*kv_dim)`. **Fused SwiGLU/relu²:** Gate and up projections combined into single wide matrix. Combined saving: ~4-6ms/step. + +### Z-Loss Regularisation + +`1e-4 * logsumexp(logits)²` (from PaLM/Gemma) anchors logits near zero, keeping gradients sharp through the ternary STE. + +--- + +## Compression Scheme + +### Base-3 + LZMA (Primary) + +5 trits per byte (1.585 bits/trit), lossless. LZMA at preset=9 achieves ~39% reduction over int8+zlib. Ternary distribution at convergence: ~20–29% zeros, ~35–40% each ±1. The skewed distribution (more zeros) is exploited by LZMA's entropy coding. + +### Bitmask Compression (Alternative) + +Encodes "is this weight zero?" and "if nonzero, is it +1?" as separate bitmasks. Both methods are tried and the smaller is selected automatically. In practice, bitmask and base-3+LZMA produce nearly identical artifact sizes — bitmask wins marginally in some runs (e.g. S72: 15.84MB vs 15.87MB). Zero fraction would need to drop below ~5% for bitmask to provide a clear advantage; our zero fraction ranges from 17–29% at convergence, making bitmask non-competitive. + +### 3D Tensor Support + +Conv1d weights (`[dim, dim, kernel]`) are reshaped to 2D before ternary quantisation and restored to original shape on load. 
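The base-3 stage packs 5 trits into one byte because 3^5 = 243 fits in the 0-255 range. A minimal sketch of the pack/unpack pair feeding LZMA (function names are illustrative):

```python
import lzma

def pack_trits(trits):
    """Pack ternary values {-1, 0, +1} at 5 trits per byte, then let LZMA
    (preset=9) entropy-code the byte stream."""
    pad = (-len(trits)) % 5
    vals = [t + 1 for t in trits] + [0] * pad       # map {-1,0,+1} -> {0,1,2}
    out = bytearray()
    for i in range(0, len(vals), 5):
        b = 0
        for v in reversed(vals[i:i + 5]):           # first trit = least significant
            b = 3 * b + v
        out.append(b)                               # max value 2*121 = 242 < 256
    return lzma.compress(bytes(out), preset=9)

def unpack_trits(blob, n):
    """Inverse of pack_trits; n is the original trit count (drops padding)."""
    vals = []
    for b in lzma.decompress(blob):
        for _ in range(5):
            vals.append(b % 3 - 1)
            b //= 3
    return vals[:n]
```

Raw packing already reaches 1.6 bits/trit; the LZMA pass is what exploits the skew toward zeros that the text above describes.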
+ +### FP8 QAT + +Non-ternary parameters (embeddings, projections) stored at fp8 (e4m3) with Quantisation-Aware Training via STE. Halves fp_params storage (~5MB → ~2.5MB). Typical roundtrip gap: 0.001–0.002 bpb. + +--- + +## Submission Runs (P prefix) — Ternary + +Configuration: F88 (10L 768d relu² 4×MLP fp8, WD=0, EMBED_DIM=254, 599s wallclock, TEMP=0.90) + +| Seed | Steps | val_bpb | RT bpb | Sliding bpb | Train Time | Eval Time | Artifact | Budget | +|------|-------|---------|--------|-------------|------------|-----------|----------|--------| +| 1337 | 6520 | 1.1825 | 1.1839 | **1.1568** | 599.1s | 428.7s | 15.92MB | 16.00/16.00MB | +| 42 | 6530 | 1.1816 | 1.1837 | **1.1565** | 599.7s | 429.3s | 15.92MB | 15.99/16.00MB | +| 7 | 6530 | 1.1823 | 1.1850 | **1.1578** | 599.6s | 429.0s | 15.92MB | 15.99/16.00MB | +| **Mean** | **6527** | **1.1821** | **1.1842** | **1.1570** | **599.5s** | **429.0s** | **15.92MB** | | +| **Std** | **5** | **0.0005** | **0.0007** | **0.0007** | **0.3s** | **0.3s** | **0.00MB** | | + +All three seeds fit within the 16,000,000 byte budget. The standard deviation of 0.0007 bpb across seeds confirms high reproducibility. All runs achieve p < 0.001 improvement over the 1.2244 bpb baseline. + +### Batch Size Sensitivity (Ternary, 599s wallclock) + +| Batch Tokens | Steps | ms/step | val_bpb | Sliding bpb | Tokens Seen | Fits Budget | +|-------------|-------|---------|---------|-------------|-------------|-------------| +| 262,144 | 10,000 | 49 | 1.2413 | — | 2.6B | No | +| **524,288** | **6,530** | **92** | **1.1850** | **1.1578** | **3.4B** | **Yes** | +| 1,048,576 | 3,480 | 172 | 1.1925 | 1.1659 | 3.5B | No | + +524k batch tokens is the optimal operating point. Halving the batch (262k) doubles the step count but degrades quality by 0.056 bpb due to noisier gradients interacting poorly with the ternary STE. Doubling it (1M) sees similar total tokens but fewer gradient updates, costing 0.008 bpb. 
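The fp8 QAT round-trip above can be sketched as fake-quantisation with a straight-through estimator. This numpy stand-in approximates e4m3 as 3 stored mantissa bits with saturation at ±448 and ignores subnormals; the training code's actual fp8 kernel may differ:

```python
import numpy as np

def fake_fp8_e4m3(x):
    """Round to roughly e4m3 precision: 3 stored mantissa bits (plus the
    implicit leading bit) and saturation at +/-448. Subnormals ignored."""
    x = np.clip(x, -448.0, 448.0)
    m, e = np.frexp(x)               # x = m * 2**e with 0.5 <= |m| < 1
    m = np.round(m * 16.0) / 16.0    # quantise mantissa to 4 bits total
    return np.ldexp(m, e)

def qat_forward(w):
    """QAT via straight-through estimator: the forward pass sees quantised
    weights; in an autograd framework the (q - w) term would be detached so
    gradients flow to the full-precision w unchanged."""
    return w + (fake_fp8_e4m3(w) - w)
```

Training against the quantised forward is what keeps the reload gap at the 0.001-0.002 bpb level quoted above, instead of paying the rounding error only at export time.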
+ +--- + +## Current Best Configuration + +### Ternary: 10L 768d relu² 4×MLP fp8, WD=0, EMBED_DIM=254 + +```bash +NUM_LAYERS=10 MODEL_DIM=768 NUM_HEADS=8 +NUM_KV_HEADS=4 MLP_MULT=4 VOCAB_SIZE=8192 +ACTIVATION=relu2 LOGIT_SOFTCAP=10 SOFTCAP_TYPE=poly +QK_GAIN_INIT=2.25 ROPE_BASE=5000 ROPE_TYPE=yarn +YARN_MAX_LEN=2048 EMBED_DIM=254 TIE_EMBEDDINGS=1 +BITNET_GROUP_SIZE=128 FP_STORAGE=FP8 MUON_WD=0.0 +MATRIX_LR=0.04 SCALAR_LR=0.02 TIED_EMBED_LR=0.02 +MUON_BACKEND_STEPS=3 MUON_MOMENTUM=0.95 WARMDOWN_FRACTION=0.2 +MAX_WALLCLOCK_SECONDS=599 +SLIDING_EVAL=1 SLIDING_EVAL_STRIDE=16 TEMP_SCALING=1 +TRAIN_BATCH_TOKENS=524288 +``` + +| Metric | Value | +|--------|-------| +| val_bpb (mean) | 1.1821 | +| RT bpb (mean) | 1.1842 | +| Sliding bpb (mean) | 1.1570 | +| Artifact + code | 15,992,753–15,995,705 / 16,000,000 bytes | +| Steps | 6520–6530 | +| ms/step | 91.8 | +| zero_frac | 0.335–0.336 | +| optimal_T | 0.90 | +| Params | 73,685,840 | + +--- + +## Dev Runs (RTX 5090, 100–500 steps) + +### Phase 0 — Ternary vs Binary (500 steps, 16L 512d, 1k vocab) + +| Run | Config | val_bpb | RT bpb | Artifact | ms/step | +|-----|--------|---------|--------|----------|---------| +| 17 | Ternary baseline | 1.7110 | 1.7300 | 23.95MB | 1312 | +| 18 | Binary {−1,+1} | 1.7121 | 1.7316 | 23.93MB | 1309 | + +Ternary wins by 0.0016 bpb. The zero state provides representational benefit. 
+ +--- + +### Phase 1 — Training Techniques (100 steps, 9L 512d, 1k vocab) + +| Run | Config | val_bpb | RT bpb | Artifact | Notes | +|-----|--------|---------|--------|----------|-------| +| 19 | Ternary 16L 512d baseline | 2.3371 | 2.3793 | 7.33MB | | +| 20 | + Untie lm_head at 2/3 | 2.3569 | 2.3983 | 8.13MB | Deferred — needs wallclock fix | +| 21 | + Value embeddings | — | — | — | Blocked: RTX 5090 Triton smem | +| 22 | + Smear module | 2.3593 | 2.3985 | 7.33MB | Deferred — gate needs many steps | +| 23 | Baseline 9L 512d | 2.4483 | 2.4768 | 4.45MB | Switched from 16L | +| 24 | + Polynomial softcap | 2.3981 | 2.4438 | 4.45MB | **−0.033 rt** | +| 25 | + Seq length schedule | 2.4633 | 2.5106 | 4.45MB | Deferred — recompile cost | +| 26 | + NorMuon | 2.4018 | 2.4104 | 4.40MB | **−0.033 rt**, 5× smaller RT gap | +| 27 | + Grad accum delay | 2.6298 | 2.6571 | 4.40MB | Deferred — needs 2000+ steps | + +--- + +### Vocabulary Sweep (100 steps, 9L 512d) + +| Run | Vocab | val_bpb | RT bpb | Artifact | Notes | +|-----|-------|---------|--------|----------|-------| +| 23 | 1024 | 2.4483 | 2.4768 | 4.45MB | Baseline | +| 28 | 4096 | 2.0930 | 2.0974 | 6.68MB | −0.32 vs 1k | +| **29** | **8192** | **1.9946** | **1.9990** | **9.64MB** | **−0.42 vs 1k — largest single win** | + +8192 vocab locked. The tokeniser merges ~1.57× more aggressively than 1k, directly reducing BPB. Val token count drops from 63.8M (sp1024) to 40.5M (sp8192) for the same 50k documents. + +--- + +### Activation Sweep (100 steps, 9L 512d, 8k vocab) + +| Run | Activation | val_bpb | RT bpb | Artifact | ms/step | +|-----|-----------|---------|--------|----------|---------| +| 29 | relu2 | 1.9946 | 1.9990 | 9.64MB | 838 | +| 30 | relu | 1.9846 | 1.9879 | 9.63MB | 830 | +| **31** | **SwiGLU** | **1.9704** | **1.9743** | **10.70MB** | **960** | +| 32 | SwiGLU + MTP(2) | 1.9627 | 1.9672 | 10.69MB | 1111 | + +SwiGLU with MTP auxiliary loss gives −0.032 bpb but +16% slower. SwiGLU alone gives −0.025 bpb. 
MTP deferred. + +--- + +### Embedding Factorization Sweep (100 steps, 9L 512d, 8k vocab) + +| Run | EMBED_DIM | val_bpb | RT bpb | RT gap | Artifact | +|-----|-----------|---------|--------|--------|----------| +| 33a | 0 (=512) | 1.9931 | 1.9962 | 0.003 | 9.63MB | +| **33d** | **128** | **1.9656** | **1.9656** | **0.000** | **9.12MB** | +| 33c | 256 | 2.0538 | 2.1339 | 0.080 | 6.68MB | +| 33e | 64 | 2.0936 | 2.0968 | 0.003 | 4.49MB | +| 33f | 1024 | 2.0709 | 2.1845 | 0.114 | 15.60MB | + +128 was optimal at dev scale. After an optimizer fix revealed the projection layers had not been training, 256 became optimal at full convergence — see EMBED_DIM Sweep at full convergence. + +--- + +### Tversky Neural Network Investigation + +Based on Doumbouya et al. (2025). Three-term Tversky similarity: `S = theta * f(A intersection B) - alpha * f(A - B) - beta * f(B - A)` with learned membership functions. + +**Feature count sweep (FP16 features, ternary prototypes, 100 steps, 9L 512d):** + +| Run | Features | val_bpb | RT bpb | RT gap | Artifact | +|-----|----------|---------|--------|--------|----------| +| — | No Tversky | 1.9751 | 1.9751 | 0.000 | 5.33MB | +| 38 | 16 | 1.9877 | 2.0186 | 0.031 | 5.46MB | +| 39 | 32 | 1.9843 | 2.0133 | 0.029 | 5.57MB | +| 40 | 64 | 1.9790 | 2.0097 | 0.031 | 5.79MB | +| **41** | **128** | **1.9427** | **1.9865** | **0.044** | **6.20MB** | +| 42 | 256 | 1.9737 | 2.0863 | 0.113 | 5.63MB | +| 43 | 512 | 2.0036 | 2.0965 | 0.093 | 5.90MB | +| 44 | 128 + shrinkage fix | 1.9425 | **1.9425** | **0.000** | 6.20MB | + +Tversky showed genuine quality benefit (~-0.017 bpb) at dev scale with 128 features and fp16 prototype storage. However, subsequent investigation at full convergence (12L 768d) and with corrected prototype storage showed all Tversky variants within noise of the linear baseline. 
Additional experiments included full ternary prototypes, shared feature pools across layers, no-features mode, logit-head application, and different membership functions (sigmoid, poly, tanh). A synthetic-data notebook confirmed that Tversky's asymmetric similarity only helps on tasks with genuine directional feature relationships (hypernym/hyponym, cause/effect); next-token prediction on FineWeb web text is not such a task. + +At the 768d architecture with relu², Tversky also incurred a 19ms/step overhead because the smaller MLP no longer masked the compute cost. + +**Conclusion:** Tversky is quality-neutral on FineWeb language modelling regardless of configuration. Not a quantisation issue, not an optimizer issue — the task simply does not benefit from asymmetric similarity. + +--- + +### Key Hyperparameter Sweeps (100 steps, 9L 512d, 8k vocab) + +**QK_GAIN_INIT sweep:** + +| Run | QK_GAIN | val_bpb | Delta | +|-----|---------|---------|-------| +| 75 | 1.0 | 2.0007 | +0.0076 | +| 73 | 1.5 | 1.9931 | baseline | +| 81 | 2.15 | 1.9913 | −0.0018 | +| **79** | **2.25** | **1.9898** | **−0.0033** | +| 77 | 2.5 | 1.9915 | −0.0016 | +| 80 | 2.75 | 1.9975 | +0.0044 | +| 78 | 3.0 | 2.0011 | +0.0080 | + +Clear inverted-U response. **QK_GAIN_INIT=2.25 locked.** + +**LOGIT_SOFTCAP sweep:** + +| Run | SOFTCAP | val_bpb | Delta | +|-----|---------|---------|-------| +| 74 | 5 | 1.9942 | −0.0013 | +| **73** | **10** | **1.9931** | **−0.0024** | +| 72 | 20 | 1.9935 | −0.0020 | +| 71 | 50 | 1.9957 | +0.0003 | + +**LOGIT_SOFTCAP=10 locked.** + +**Softcap type (poly vs tanh):** + +| Run | Type | val_bpb | Notes | +|-----|------|---------|-------| +| S23 | poly | 1.3680 | | +| S24 | tanh | 1.3693 | | +| S28/S29 | both at EMBED=1024 | 1.3460–1.3462 | Identical at convergence | + +Zero effect. Polynomial retained as default. 
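For reference, the tanh form of the softcap and the z-loss term look as follows; the polynomial variant replaces `tanh` with a degree-5 odd polynomial whose coefficients are not recorded here, and the sweep above found the two interchangeable:

```python
import numpy as np

def tanh_softcap(logits, cap=10.0):
    """Soft-clip logits into [-cap, +cap] while staying differentiable."""
    return cap * np.tanh(logits / cap)

def z_loss(logits, coeff=1e-4):
    """PaLM/Gemma z-loss: coeff * logsumexp(logits)^2 per position,
    anchoring the log-partition function near zero."""
    m = logits.max(axis=-1, keepdims=True)                       # stable logsumexp
    z = m.squeeze(-1) + np.log(np.exp(logits - m).sum(axis=-1))
    return coeff * np.mean(z ** 2)
```

The two interact: the softcap bounds individual logits, while the z-loss keeps the overall partition function from drifting, which is what keeps gradients sharp through the ternary STE.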
+ +**ROPE_BASE sweep:** + +| Run | ROPE_BASE | val_bpb | Notes | +|-----|-----------|---------|-------| +| **70** | **5000** | **1.9959** | Best at short training | +| 73 | 10000 | 1.9931 | Close second | +| 69 | 20000 | 2.0008 | | +| 68 | 50000 | 2.0017 | | + +**KV Heads:** + +| Run | KV_HEADS | val_bpb | Artifact | +|-----|----------|---------|----------| +| **58** | **4 (GQA)** | **1.9955** | **7.75MB** | +| 66 | 8 (MHA) | 2.0148 | 8.46MB | + +**MLP_MULT:** + +| Run | MLP_MULT | val_bpb | Artifact | +|-----|----------|---------|----------| +| **58** | **2** | **1.9955** | **7.75MB** | +| 64 | 3 | 2.0004 | 9.09MB | +| 65 | 4 | 1.9992 | 10.39MB | + +**Storage precision:** + +| Run | Storage | val_bpb | RT bpb | RT gap | Artifact | +|-----|---------|---------|--------|--------|----------| +| **90** | **fp16** | **1.9656** | **1.9656** | **0.000** | **9.06MB** | +| 91 | fp8 | 1.9662 | 1.9702 | 0.004 | 7.83MB | +| 92 | fp4 | 1.9661 | 1.9955 | 0.029 | 7.11MB | + +**TTT-LoRA sweep (100 steps, ROPE=5000):** + +| Run | Rank | LR | TTT bpb | Delta | +|-----|------|-----|---------|-------| +| **85** | **8** | **0.01** | **1.9368** | **−0.0315** | +| 86 | 8 | 0.005 | 1.9378 | −0.0312 | +| 87 | 8 | 0.02 | 1.9644 | −0.0038 | +| **88** | **4** | **0.01** | **1.9371** | **−0.0285** | +| 89 | 16 | 0.01 | OOM | — | + +TTT confirmed working at dev scale (−0.0315 bpb). Incompatible at convergence — see TTT investigation. + +**EMBED_DIM sweep at 512d (12L, 100 steps):** + +| Run | EMBED_DIM | Tversky feat | RT bpb | Artifact | bpb/MB efficiency | +|-----|-----------|-------------|--------|----------|-------------------| +| 95 | 64 | 128 | 2.1961 | 8.40MB | worst | +| 98 | 96 | 128 | 2.0356 | 8.74MB | | +| 97 | 128 | 128 | 1.9656 | 9.12MB | best | +| 99 | 192 | 128 | 2.0409 | 10.07MB | | +| 94 | 256 | 128 | 2.0703 | 10.93MB | | +| 100 | 256 | 256 | 2.0340 | 10.09MB | RT gap 0.021 | +| 96 | 512 (off) | 128 | 2.0642 | 13.50MB | | + +128 confirmed optimal at dev scale. 
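The factored embedding swept above replaces the full token table with a bottleneck table plus two learned projections. A sketch at the final EMBED_DIM=254, MODEL_DIM=768 shapes (array names follow the `embed_proj`/`embed_proj_rev` naming earlier in this document; the init scale is illustrative):

```python
import numpy as np

V, E, D = 8192, 254, 768
rng = np.random.default_rng(0)
tok_emb = rng.standard_normal((V, E)) * 0.02         # factored token table
embed_proj = rng.standard_normal((E, D)) * 0.02      # 254 -> 768, into the model
embed_proj_rev = rng.standard_normal((D, E)) * 0.02  # 768 -> 254, for the tied head

def embed(token_ids):
    return tok_emb[token_ids] @ embed_proj           # [n, 768] model inputs

def head_logits(h):
    """Tied output head: project hidden states back to the bottleneck and
    score against the same token table."""
    return (h @ embed_proj_rev) @ tok_emb.T          # [n, 8192]

# Parameter cost: 8192*254 + 254*768 + 768*254 = 2,470,912 floats,
# vs an unfactored tied table at 8192*768 = 6,291,456.
```

The optimizer-coverage fix mattered precisely because `embed_proj` and `embed_proj_rev` sit on both the input and output paths: untrained projections throttle the whole model.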
+ +--- + +### Architecture Sizing Table (Ternary, EMBED_DIM=128, standard proj) + +| Config | Layers | Artifact | Under 16MB? | RT gap | Headroom | +|--------|--------|----------|-------------|--------|----------| +| fp16 | 20 | 14.23MB | Yes | 0.0001 | 1.77MB | +| **fp16** | **22** | **15.48MB** | **Yes** | **0.0001** | **0.52MB** | +| fp16 | 24 | 16.74MB | No | — | −0.74MB | +| fp8 QAT | 24 | 14.63MB | Yes | 0.028 | 1.37MB | +| fp8 QAT | 26 | 15.77MB | Yes | 0.066 | 0.23MB | +| **fp8 QAT** | **27** | **15.42MB** | **Yes** | **0.0025** | **0.58MB** | +| fp8 QAT | 28 | 15.92MB+code | Marginal | 0.0029 | ~0MB | +| fp8 QAT | 30 | 16.92MB | No | 0.0029 | −0.92MB | + +--- + +## H100 Record Runs (R prefix) + +**Hardware:** 8×H100 SXM 80GB | **Time limit:** 600 seconds + +| Run | Config | Steps | val_bpb | RT bpb | Artifact | Notes | +|-----|--------|-------|---------|--------|---------|-------| +| R1 | 22L Tversky fp16 | 4299 | 1.2789 | 1.2792 | 15.80MB | | +| R2 | 26L standard fp16 | 3973 | 1.2649 | 1.2650 | 15.85MB | Pre-LR tuning best | +| R3 | 16L Tversky fp16 | 5949 | 1.2900 | 1.2904 | 11.95MB | Too shallow | +| R4 | 9L Tversky fp16 | 10112 | 1.3374 | 1.3394 | 7.48MB | Way too shallow | +| R5 | 30L fp8 | 2852 | 1.2689 | 1.2815 | 17.22MB | Over budget | +| R6 | 26L fp16, 2× LR | ~4003 | 1.2991 | — | ~15.85MB | LR overshot | +| **R7** | **26L fp16, LR=0.02** | **4008** | **1.2608** | **1.2610** | **15.83MB** | **Best pre-FA3** | +| R8 | 26L fp16, LR=0.01 | 4017 | 1.2853 | 1.2855 | 15.72MB | LR too low | +| R9 | 26L BigramHash | 4010 | 1.2804 | 1.2802 | 15.81MB | BigramHash negative | +| R10 | 26L untie@66% | 3706 | 1.2754 | 1.2753 | 23.15MB | Over budget | +| R11 | 26L tied, updated code | 4009 | 1.2806 | 1.2808 | 15.81MB | Code regression | + +**LR sweep (R-series):** + +| LR | val_bpb | Notes | +|----|---------|-------| +| 0.08 | 1.2991 | Overshoots — ternary STE amplifies gradient noise | +| **0.02** | **1.2608** | **Optimal** | +| 0.01 | 1.2853 | Too slow | + 
+--- + +## Scaling Runs (S prefix) + +**Hardware:** 8×H100 SXM 80GB | **Steps:** 1500 | **Timer:** disabled (MAX_WALLCLOCK_SECONDS=0) +**Base config:** 26L 512d, EMBED_DIM=128, ROPE=5000, QK_GAIN=2.25, SOFTCAP=10, LR=0.02 all, VOCAB=8192, SwiGLU, SEED=1337 + +--- + +### Warmdown Sweep + +| Run | Fraction | val_bpb | +|-----|----------|---------| +| S3 | 10% | 1.3467 | +| **S1** | **20%** | **1.3438** | +| S2 | 30% | 1.3443 | +| S4 | 30% repeat | 1.3458 | +| S5 | 40% | 1.3501 | + +S2 vs S4 (identical config): 0.0015 bpb spread — confirmed seed variance floor. + +### Muon Backend Steps + +| Run | Steps | ms/step | val_bpb | +|-----|-------|---------|---------| +| S8 | 3 | 144.87 | 1.3491 | +| S9 | 4 | 146.61 | 1.3448 | +| **S1** | **5** | **149.19** | **1.3438** | +| S7 | 8 | 164.31 | 1.3441 | +| S6 | 10 | 157.95 | 1.3456 | + +At full convergence (F6 vs F1): 3 steps matches 5 due to +190 extra training steps. Locked at 3. + +### Muon Momentum + +| Run | Momentum | val_bpb | zero_frac | Artifact | +|-----|----------|---------|-----------|---------| +| S11 | 0.90 | 1.3680 | 0.179 | 15.39MB | +| **S1** | **0.95** | **1.3438** | **0.205** | **15.56MB** | +| S10 | 0.99 | 1.3505 | 0.259 | 15.78MB | + +Higher momentum increases zero_frac, inflating artifact size. + +### Architecture Experiments + +| Run | Config | ms/step | val_bpb | Notes | +|-----|--------|---------|---------|-------| +| S12 | 20L 640d (80M params) | 160.58 | 1.6676 | 17.75MB — over budget | +| **S1** | **26L 512d baseline** | **149.19** | **1.3438** | **Reference** | +| S13 | 26L, TRAINING_DR=2 | 281.63 | 1.3727 | ~795 effective steps, OOM at DR=3 | + +### Eval Depth Recurrence Sweep + +| Run | EVAL_DR | val_bpb | +|-----|---------|---------| +| S15 | 0/1 | 1.3685–1.3690 | +| S16 | 2 | 1.3688 | +| S17 | 3 | 1.3681 | +| S18 | 4 | 1.3690 | +| S19 | 5 | 1.3683 | + +Total range: 0.0009 bpb — pure noise. 
+ +### Weight Decay (1500 steps) + +| Run | MUON_WD | val_bpb | zero_frac | Artifact | +|-----|---------|---------|-----------|---------| +| **S15** | **0.00** | **1.3685** | **0.179** | **15.39MB** | +| S20 | 0.04 | 1.3722 | 0.145 | 15.12MB | + +WD hurts at 1500 steps but saves 0.27MB. Reversed at full convergence — see Final Ternary Record Runs. + +### BigramHash + +| Run | Config | Steps | val_bpb | Artifact | +|-----|--------|-------|---------|---------| +| S21 | 26L + BigramHash | 1500 | 1.3681 | 15.45MB | +| R9 | 26L + BigramHash | 4010 | 1.2804 | 15.81MB | + +At full convergence: 0.020 bpb worse than R7. The 2.1MB fp16 cost of the bigram table displaces ternary layer depth at convergence. **Not viable within budget.** + +### Tied Embedding / Correction Weight / Untie Investigation + +| TIE_EMBEDDINGS | UNTIE_AT_FRACTION | LM_HEAD_RANK | Behaviour | +|---------------|-------------------|--------------|-----------| +| 0 | any | any | Untied from start — unstable, loss = log(8192) = 9.01 | +| 1 | 0.0 | 0 | Always tied — current best | +| 1 | 0.66 | 0 | Tied → full-rank untie at 66% of wallclock | +| 2 | 0.0 | 0 | Tied + correction weight residual on tok_emb | +| 2 | 0.66 | 0 | Tied + correction → full-rank untie at 66% | +| 2 | 0.66 | r | Tied + correction → SVD rank-r untie at 66% | + +**1500-step results:** + +| Run | TIE | UNTIE | RANK | val_bpb | Artifact | +|-----|-----|-------|------|---------|---------| +| S15 | 1 | 0.00 | 0 | 1.3685 | 15.39MB | +| S30 | 2 | 0.00 | 0 | 1.3678 | 15.39MB | +| S36 | 1 | 0.66 | 0 | 1.3648 | 22.83MB | +| **S37** | **2** | **0.66** | **0** | **1.3642** | **22.84MB** | +| S38 | 1 | 0.66 | 0 | 1.3667 | 22.84MB | +| S39 | 0 | 0.66 | 0 | 3.4890 | 10.88MB | + +Untie gives +0.005 bpb gain but adds 7.3MB — over budget. 
**TIE=1, no untie locked.** + +### LM Head Factorization (SVD-at-Untie) + +| Run | RANK | val_bpb | Artifact | Delta vs baseline | +|-----|------|---------|---------|-------------------| +| S37 | 0 (full) | 1.3642 | 22.84MB | +0.004 — over budget | +| S43 | 32 | 1.4873 | 17.27MB | −0.119 | +| S41 | 64 | 1.4243 | 17.60MB | −0.056 | +| S42 | 128 | 1.3889 | 18.40MB | −0.020 | + +SVD factorization does not recover within the remaining 34% of training. The model requires full-rank lm_head for 8192-class separability in 512-dimensional space. + +### Tied Embed LR Sweep + +| Run | TIED_EMBED_LR | MATRIX_LR | SCALAR_LR | val_bpb | +|-----|--------------|-----------|-----------|---------| +| S33 | 0.01 | 0.02 | 0.02 | 1.3723 | +| **S15** | **0.02** | **0.02** | **0.02** | **1.3685** | +| S34 | 0.03 | 0.02 | 0.02 | 1.3742 | + +Symmetric degradation. **TIED_EMBED_LR=0.02 locked.** + +### TTT-LoRA Investigation + +Test-time training with per-document LoRA adapters. Confirmed working at dev scale (−0.0315 bpb). Incompatible at convergence across 6 diagnostic runs. + +| Run | Config | val_bpb | TTT bpb | Notes | +|-----|--------|---------|---------|-------| +| S22 | TTT_LR=0.01 | 1.3690 | 1.5065 | TTT hurts | +| S23 | No lm_head_lora | 1.3690 | 1.4993 | Still hurts | +| S24 | tanh softcap | 1.3693 | 1.4982 | No improvement | +| S25 | Q/V loras only | 1.3692 | 1.5193 | Worse | +| S26 | EMBED_DIM=1024 | 1.3473 | 1.4746 | Bottleneck not cause | +| S27 | 9L (original depth) | 1.4039 | 1.5189 | Still incompatible at 9L | + +**Root cause:** Every `TernaryLinear` applies RMSNorm to its input before the weight multiply. The LoRA adapter delta is computed on the pre-normalised representation, but injected into a forward pass where base weights operate on a differently-normalised space. At 100 steps the model is poorly calibrated and LoRA signal dominates. 
At convergence, the base model's representations are precisely calibrated to this normalised space, and any LoRA delta corrupts rather than adapts. This incompatibility is architectural. **TTT permanently disabled.** + +### MTP (Multi-Token Prediction) + +| Run | MTP_HEADS | ms/step | val_bpb | Notes | +|-----|-----------|---------|---------|-------| +| **S47** | **0** | **149** | **1.3693** | **Baseline** | +| S45 | 2 | 157 | 1.3704 | +0.0011 worse | +| S62 | 2 | 144 | 1.3727 | +0.0034 worse | + +Confirmed at both 1500 steps and full convergence (post-fix retest: 0.006 bpb worse at both MTP=1 and MTP=2). A 60M+ parameter, 1.58-bit model does not have the parameter bandwidth for auxiliary future-planning objectives. + +### Smear Module + +| Run | SMEAR | val_bpb | ms/step | +|-----|-------|---------|---------| +| **S48** | **0** | **1.3687** | **149** | +| S49 | 1 | 1.3675 | 182 | + ++22% slower, −0.0012 bpb at 1500 steps. At full 600s wallclock, smear costs ~740 fewer training steps. Not viable within the ternary 10-minute budget but explored further in the binary track. + +### Sequence Length Schedule + +| Run | Config | val_bpb | ms/step avg | +|-----|--------|---------|-------------| +| S48 | baseline | 1.3687 | 149 | +| S51 | smear + seq@33% | 1.3660 | ~240 | +| S52 | smear + seq@33% repeat | 1.3640 | ~221 | +| **S58** | **smear + seq@33% + YaRN** | **1.3628** | **~221** | + +Real gain at 1500 steps but severe step penalty at full 600s. **Disabled for final runs.** + +### Batch Size Schedule + +| Run | Config | val_bpb | +|-----|--------|---------| +| S48 | baseline | 1.3687 | +| S50 | smear + batch | 1.3698 | +| S53 | smear + seq + batch | 1.3667 | + +Noisier gradients interfere with ternary STE convergence. 
**Not viable.** + +### YaRN Positional Encoding + +| Run | Config | val_bpb | +|-----|--------|---------| +| S48 | RoPE baseline | 1.3687 | +| S54 | YaRN 4096 | 1.3705 | +| S55 | YaRN 2048 | 1.3679 | +| S56 | YaRN 2048 + seq@33% | 1.3672 | +| S57 | YaRN 2048 + seq@50% + smear | 1.3637 | +| **S58** | **YaRN 2048 + seq@33% + smear** | **1.3628** | + +YaRN 4096 hurts (scale=0.25 too aggressive). YaRN 2048 marginally better. **YaRN 2048 retained; seq schedule disabled.** + +ROPE_BASE with YaRN: S63 (10000) = 1.3692, **S61 (5000) = 1.3686**. ROPE_BASE=5000 locked. + +### Sliding Window Evaluation + +| Run | Stride | Sliding bpb | Eval time | +|-----|--------|-------------|-----------| +| S60 | 16 | 1.3452* | >600s | +| S67 | 24 | 1.3146 | 592s | +| **S61/S66** | **32** | **1.3139–1.3452*** | **~350s** | + +*S60/S61 used incorrect momentum=0.90. At full convergence (F1): stride=32 gives 1.2312 sliding bpb in 280s. + +### Temperature Scaling + +Grid search over T in [0.80, 1.20] on 65,536 training tokens. 5-point grid. Optimal T was consistently 1.00 at convergence for the 512d SwiGLU architecture. At the 768d relu² architecture, T=0.90 was consistently optimal (relu² logits slightly underconfident). **TEMP_SCALING=1 in all final runs.** + +### Group Size Sweep (S73–S76, 2000 steps, 27L) + +| Run | Group Size | Layers | val_bpb | Artifact | Total | +|-----|-----------|--------|---------|----------|-------| +| S76 | 32 | 27 | 1.2739 | 17.64MB | 17.73MB | +| S75 | 64 | 27 | 1.2683 | 16.22MB | 16.31MB | +| **S73** | **128** | **27** | **1.2677** | **15.53MB** | **15.62MB** | +| S74 | 256 | 27 | 1.2699 | 15.19MB | 15.28MB | + +128 wins on both quality and compression. + +### Skip Weights Init — Zero vs Ones (S77) + +| Run | Init | val_bpb | artifact | +|-----|------|---------|---------| +| S73 | ones | 1.2677 | 15.62MB | +| S77 | zeros | 1.2781 | 15.62MB | + +Zero-init is **0.0104 bpb worse**. Decoder needs skip signal from step 0. 
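The group size swept above sets how many weights share one quantisation scale. A minimal per-group ternary quantiser might look like the following numpy sketch; the absmean scale and rounding rule are assumptions, not necessarily the submission's `TernaryLinear` internals:

```python
import numpy as np

def ternary_quantize_grouped(w, group_size=128):
    """Quantise a flat weight vector to {-1, 0, +1} with one fp scale per group."""
    g = w.reshape(-1, group_size)
    scale = np.abs(g).mean(axis=1, keepdims=True)      # absmean scale per group
    q = np.clip(np.round(g / (scale + 1e-8)), -1, 1)   # ternary codes
    return q.astype(np.int8), scale

rng = np.random.default_rng(0)
w = rng.normal(size=4 * 128).astype(np.float32)
q, scale = ternary_quantize_grouped(w)
assert set(np.unique(q)).issubset({-1, 0, 1})
assert scale.shape == (4, 1)
```

Smaller groups mean more scales to store (hence the larger artifacts at group size 32/64 above), while larger groups force more weights to share one scale.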
+ +### FP8/FP4 Storage with QAT + +**FP8 sweep:** + +| Run | Config | val_bpb | RT bpb | RT gap | Sliding bpb | Artifact | +|-----|--------|---------|--------|--------|-------------|---------| +| S64 | 26L fp16 | 1.3390 | 1.3390 | 0.000 | 1.3150 | 15.58MB | +| S65 | 30L fp8, no QAT | 1.3346 | 1.3394 | 0.0048 | 1.3150 | 16.92MB | +| S66 | 30L fp8, QAT | 1.3351 | 1.3380 | 0.0029 | **1.3139** | 16.92MB | +| S71 | 27L fp8, QAT | 1.3380 | 1.3405 | 0.0025 | 1.3164 | 15.42MB | +| S72 | 28L fp8, QAT | 1.3377 | 1.3406 | 0.0029 | 1.3166 | 15.92MB | + +QAT reduces fp8 RT gap from 0.0048 to 0.0029 (40% improvement). However at full convergence (F3), 28L fp8 QAT (1.2353 sliding) loses to 26L fp16 (1.2312 sliding). + +**FP4 sweep:** + +| Run | Config | val_bpb | RT bpb | RT gap | Sliding bpb | Artifact | +|-----|--------|---------|--------|--------|-------------|---------| +| S68 | 30L fp4 QAT | 1.3377 | 1.3643 | **0.0266** | 1.3404 | 16.49MB | +| S69 | 26L fp4 Tversky QAT | 1.3543 | 1.3835 | **0.0292** | 1.3606 | 15.01MB | +| S70 | 28L fp4 QAT | 1.3405 | 1.3666 | **0.0261** | 1.3424 | 15.43MB | + +FP4 RT gap of ~0.026–0.029 even with QAT is unrecoverable. **FP4 not viable at any layer count.** + +### EMBED_DIM Sweep (Full Convergence, 25L) + +| Config | EMBED_DIM | Steps | val_bpb | sliding_bpb | artifact | Notes | +|--------|-----------|-------|---------|-------------|---------|-------| +| S80 | 0 (=512) | 4500 | 1.1902 | ~1.168 est | 19.78MB | OOM on sliding eval | +| **F22** | **256** | **4720** | **1.2012** | **1.1739 (s16)** | **16.21MB** | **Best 512d result** | +| F16-era | 128 | 4310 | 1.2245 | — | 16.19MB | Pre-fix baseline | + +**EMBED_DIM=256 locked.** Budget impact: fp_params ~4.85MB vs ~2.48MB at 128 (+2.37MB). 
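For context on the budget impact quoted above: EMBED_DIM factors the token embedding into a narrow table plus an up-projection to the 512-d model width. A hedged numpy sketch of the idea (names are illustrative, not the script's):

```python
import numpy as np

vocab, embed_dim, model_dim = 8192, 256, 512
rng = np.random.default_rng(0)
tok_emb = rng.normal(size=(vocab, embed_dim)).astype(np.float32)       # stored table
embed_proj = rng.normal(size=(embed_dim, model_dim)).astype(np.float32)  # up-projection

def embed(ids):
    return tok_emb[ids] @ embed_proj       # (seq, 512) activations

x = embed(np.array([0, 17, 8191]))
assert x.shape == (3, 512)

# Parameter count: factored table + projection vs a full-width table.
factored = vocab * embed_dim + embed_dim * model_dim   # 2,228,224
full = vocab * model_dim                               # 4,194,304
assert factored < full
```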
+ +--- + +## Final Ternary Record Runs (F prefix) + +**Hardware:** 8×H100 SXM 80GB | **FlashAttention-3 enabled** | **Time limit:** 600 seconds + +| Run | Config | Steps | val_bpb | RT bpb | Sliding bpb | Eval time | Artifact | +|-----|--------|-------|---------|--------|-------------|-----------|---------| +| **F1** | **26L fp16, no smear, no seq** | **4362** | **1.2560** | **1.2560** | **1.2312** | **280s** | **15.85MB** | +| F2 | 26L fp16, smear + seq@33% | 3044 | 1.2779 | 1.2778 | 1.2535 | 390s | 15.85MB | +| F3 | 28L fp8 QAT, no smear, no seq | 4019 | 1.2571 | 1.2601 | 1.2353 (s24) | 385s | 16.14MB | +| F4 | 26L fp16, EMA=1 | 4145 | 1.2589 | 2.3307 | — | — | 14.52MB | +| F5 | 26L fp16, EMA fix v1 (smoke) | 407 | 1.5483 | 2.3642 | — | — | 14.90MB | +| F6 | 26L fp16, MUON_BACKEND_STEPS=3 | 4552 | 1.2558 | 1.2558 | 1.2311 (s24) | 362s | 15.81MB | +| F7 | 26L fp16, WD=0.04, steps=3 | 4499 | 1.2552 | 1.2551 | 1.2302 (s24) | 362s | 15.60MB | +| F8 | 28L fp16, WD=0.04, steps=2, LR=0.02 | 4219 | 1.2799 | 1.2801 | 1.2558 (s16) | 577s | 15.92MB | +| F9 | 28L fp16, WD=0.04, steps=2, LR=0.03 | 4231 | 1.2673 | 1.2676 | 1.2431 (s16) | 577s | 16.00MB | +| F10 | 28L fp16, WD=0.04, steps=2, LR=0.04 | 4226 | 1.2636 | 1.2636 | 1.2391 (s16) | 578s | 16.01MB | +| F11 | 28L fp16, WD=0.04, steps=3, LR=0.04 | 4137 | 1.2489 | 1.2488 | — | — | 16.69MB | +| F12 | 28L fp16, WD=0.04, steps=4, LR=0.04 | 4047 | 1.2496 | 1.2500 | — | — | 16.71MB | +| F13 | 28L fp16, WD=0.04, steps=3, LR=0.05 | 4048 | 1.2512 | 1.2510 | — | — | 16.73MB | +| F14 | 28L fp16, WD=0.04, steps=3, LR=0.08 | 4036 | 1.2576 | 1.2574 | — | — | 16.75MB | +| F15 | 27L fp16, AdamW matrix, LR=0.01 | 4676 | 1.2943 | 1.2942 | — | — | 15.71MB | +| F16 | 27L fp16, Muon, LR=0.04, WD=0.04 | 4310 | 1.2245 | — | — | — | 16.19MB | +| **F22** | **25L fp16, EMBED=256, steps=3, WD=0.04** | **4720** | **1.2012** | **1.2011** | **1.1739 (s16)** | **493s** | **16.21MB** | + +**Key findings:** F22 with EMBED_DIM=256 and corrected optimizer 
achieves a 0.055 bpb improvement over F1 (the best pre-fix config). 28L was attempted extensively (F8–F14), but the artifact always landed over budget at competitive LR. AdamW for matrix params (F15) is clearly worse than Muon.
+
+---
+
+## Phase 2 — Post-Optimizer-Fix Experiments (25L 512d EMBED=256)
+
+### EMA (Exponential Moving Average)
+
+| Run | Config | Steps | val_bpb | RT bpb | Artifact |
+|-----|--------|-------|---------|--------|----------|
+| F4 | EMA=1, decay=0.999 | 4145 | 1.2589 | 2.3307 | 14.52MB |
+| — | Full run with EMA | 4144 | 1.2584 | 1.3776 | 14.94MB |
+
+**EMA is fundamentally incompatible with ternary quantization.** EMA averaging in fp32 produces smoother, more zero-centered weights. More latent weights near zero → more round to 0 in ternary → scale factor mismatch → 0.12 bpb RT gap. **Permanently disabled.**
+
+### Muon Backend Steps — Full Convergence
+
+| Run | Steps | step_avg | val_bpb | sliding_bpb | artifact |
+|-----|-------|----------|---------|-------------|---------|
+| F1 (steps=5) | 4362 | 137ms | 1.2560 | 1.2312 | 15.85MB |
+| F6 (steps=3) | 4552 | 131ms | 1.2558 | 1.2311 | 15.81MB |
+
+6ms/step saving → 190 extra steps → quality equivalent. **MUON_BACKEND_STEPS=3 locked.**
+
+### Weight Decay — Full Convergence
+
+| Run | WD | Steps | val_bpb | sliding_bpb | zero_frac | artifact |
+|-----|-----|-------|---------|-------------|-----------|---------|
+| F6 | 0.00 | 4552 | 1.2558 | 1.2311 | 0.294 | 15.81MB |
+| F7 | 0.04 | 4499 | 1.2552 | 1.2302 | 0.221 | 15.60MB |
+
+WD=0.04 wins at full convergence on the 26L architecture. However at 10L 4×MLP (Phase 4), WD=0.00 was better — wider MLP needs full weight freedom.
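Both the EMA failure and the weight-decay effect on artifact size trace back to how many latent weights land in the ternary zero bucket. A small numpy demonstration of the EMA case; the absmean scale and the toy oscillation model are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def zero_frac(w):
    scale = np.abs(w).mean()                     # absmean scale (assumed)
    q = np.clip(np.round(w / scale), -1, 1)
    return float((q == 0).mean())

n = 4096
base = rng.choice([-1.0, 1.0], size=n)           # converged latents near ±1
flip = rng.random(n) < 0.2                       # 20% of weights still oscillating
snapshots = []
for t in range(10):
    w = base.copy()
    w[flip] *= (-1.0) ** t                       # oscillating coords change sign
    snapshots.append(w + 0.05 * rng.normal(size=n))
ema = np.mean(snapshots, axis=0)                 # stand-in for the EMA average
last = snapshots[-1]

# Averaging cancels the oscillating coordinates toward zero, inflating the
# ternary zero bucket relative to the final raw weights.
assert zero_frac(ema) > zero_frac(last)
```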
+
+### MTP Retest (Post-Fix)
+
+| Run | MTP_HEADS | Steps | step_avg | val_bpb | artifact |
+|-----|-----------|-------|----------|---------|---------|
+| F22 baseline | 0 | 4720 | 127ms | 1.2012 | 16.29MB |
+| Run 26 | 1 | 4560 | 131ms | 1.2074 | 16.30MB |
+| Run 27 | 2 | 4420 | 135ms | 1.2074 | 16.29MB |
+
+**MTP confirmed not viable post-fix.** 0.006 bpb worse at both heads. **MTP_HEADS=0 permanently locked.**
+
+### Tversky Phase 2 (Post-Fix, 12L 768d, fp16 Prototypes)
+
+Comprehensive retest with corrected optimizer and fp16 prototype storage:
+
+| Run | Config | Features | Pools | val_bpb | RT gap |
+|-----|--------|----------|-------|---------|--------|
+| 49 | No Tversky | — | — | **1.1888** | 0.0002 |
+| 50 | Attn proj only | 128 | 1 | 1.1893 | 0.0000 |
+| 51 | Attn proj only | 256 | 1 | 1.1894 | 0.0001 |
+| 52 | Attn proj only | 32 | 1 | 1.1898 | 0.0001 |
+| 53 | Attn + head | 128 | 1 | 1.1892 | — |
+| 54 | Attn + head | 128 | 0 (local) | 1.1897 | +0.0006 |
+
+All variants sit within 0.001 bpb of baseline — pure noise. Confirmed by synthetic-data analysis that Tversky's asymmetric similarity only helps on tasks with directional feature relationships, which next-token prediction on web text is not.
+
+---
+
+## Phase 3 — Architecture Exploration (Post-Optimizer-Fix)
+
+### Width vs Depth
+
+The central Phase 3 finding: wider models with fewer layers beat deeper models.
+
+#### 768d Scaling Curve
+
+| Run | Layers | Steps | step_avg | val_bpb | Artifact |
+|-----|--------|-------|----------|---------|----------|
+| 34 | 8 | 8110 | 74ms | 1.2894 | 12.94MB |
+| 30 | 12 | 5640 | 106ms | 1.1893 | 17.50MB |
+| 38 | 14 | 4900 | 122ms | 1.1870 | 19.79MB |
+| 33/37 | 16 | 4320 | 139ms | 1.1825–1.1837 | 22.08MB |
+| 39 | 18 | 3870 | 155ms | 1.1801 | 24.39MB |
+| 36 | 20 | 3510 | 171ms | 1.1854 | 26.67MB |
+
+Peak at 18L, then step penalty dominates. 8L collapses (U-Net encoder too shallow). Seed variance: Run 33 vs 37 = 0.0012 bpb.
+ +#### Cross-Architecture Comparison + +| Config | Layers | Dim | Steps | val_bpb | +|--------|--------|-----|-------|---------| +| F22 | 25 | 512 | 4720 | 1.2012 | +| Run 30 | 12 | 768 | 5640 | 1.1893 | +| Run 40 | 8 | 1024 | 5870 | 1.1858 | +| Run 41 | 10 | 896 | 5400 | 1.1862 | +| Run 35 | 20 | 640 | 4170 | 1.1927 | +| Run 42 | 6 | 896 | 8510 | 1.2157 | + +Width beats depth: 12L 768d (1.1893) beats 25L 512d (1.2012). Minimum viable depth: 768d ~10–12L, 896d ~10L, 1024d ~8L. + +### FP8 at 768d + +| Run | Layers | Storage | val_bpb | RT bpb | RT gap | +|-----|--------|---------|---------|--------|--------| +| 49 | 12 | fp16 | 1.1888 | 1.1886 | 0.0002 | +| 42 | 13 | fp8 | 1.1879 | 1.1900 | 0.0021 | + +FP8 RT gap acceptable at 768d. Enables extra layers within budget. + +### LM_HEAD_RANK Investigation (Post-Fix, 768d) + +| Run | Config | val_bpb | RT bpb | Total | Notes | +|-----|--------|---------|--------|-------|-------| +| Run 49 | baseline | 1.1888 | 1.1886 | 17.50MB | Reference | +| Run 43 | TIE=2, rank=256, fp8 | 1.2021 | 1.2028 | 20.41MB | Artifact bloated | +| Run 44 | TIE=0, rank=512, untie=0.0 | 1.3196 | 1.3195 | 16.92MB | Random head, no learning | +| Run 45 | TIE=2, rank=512, fp16 | 1.2312 | 1.2317 | 26.87MB | Catastrophic artifact blowup | + +Root cause: the SVD factors U and V require fp16/fp8 precision to maintain approximation quality. At any viable compression level, the two new matrices cost more storage than the original tied embedding saves. 
**Not viable.** + +--- + +## Phase 4 — Final Architecture Search + +### Activation Sweep (12L 768d 3×MLP, 600s) + +| Run | Activation | MLP | ms/step | Steps | val_bpb | Artifact | +|-----|-----------|-----|---------|-------|---------|----------| +| F55 | relu | 2× | 88.7 | 6760 | 1.2284 | 14.49MB | +| **F56** | **relu²** | **2×** | **89.5** | **6700** | **1.2042** | **14.48MB** | +| F60 | leaky relu | 3× | 102.6 | 5840 | 1.2094 | 17.50MB | +| **F57** | **relu²** | **3×** | **101.5** | **5910** | **1.1878** | **17.51MB** | +| F58 | swiglu | 3× | 127.4 | 4700 | 1.1786 | 22.05MB | +| **F59** | **swiglu** | **3×** | **127.3** | **4710** | **1.1771** | **21.96MB** | + +relu² beats relu by 0.024 bpb at no cost — strictly dominant. relu² locked for budget-constrained path. + +### MLP Width Sweep (600s) + +| Run | Activation | MLP | Layers | ms/step | Steps | val_bpb | Artifact | +|-----|-----------|-----|--------|---------|-------|---------|----------| +| F56 | relu² | 2× | 12 | 89.5 | 6700 | 1.2042 | 14.48MB | +| F64 | relu² | 3× | 12 | 99.4 | 6030 | 1.1873 | 17.50MB | +| F75 | relu² | 4× | 12 | 91.6 | 6550 | 1.1795 | 20.54MB | +| F82 | relu² | 4× | 10 | 91.6 | 6550 | 1.1861 | 16.04MB | + +4× MLP at 10L beats 3× at 12L within similar budget. 
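For clarity, relu² is an elementwise relu followed by squaring, applied between the up- and down-projections. A minimal numpy sketch of a 4× relu² MLP at 768d (the real module uses quantised weights and a fused wide matmul):

```python
import numpy as np

dim, mult = 768, 4
rng = np.random.default_rng(0)
w_up = rng.normal(size=(dim, mult * dim)).astype(np.float32)
w_down = rng.normal(size=(mult * dim, dim)).astype(np.float32)

def relu2_mlp(x):
    # relu² needs no gate matmul, which is why it is cheaper than swiglu here.
    h = np.maximum(x @ w_up, 0.0) ** 2
    return h @ w_down

y = relu2_mlp(rng.normal(size=(8, dim)).astype(np.float32))
assert y.shape == (8, dim)
```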
+ +### Layer Count vs MLP Width (fp8, 600s) + +| Run | Config | Layers | ms/step | Steps | val_bpb | RT bpb | Artifact | +|-----|--------|--------|---------|-------|---------|--------|----------| +| F78 | relu² 3× fp8 | 12 | 99.3 | 6040 | 1.1884 | 1.1898 | 15.80MB | +| F77 | relu² 3× fp8 | 13 | 106.6 | 5630 | 1.2065 | 1.2077 | 16.96MB | +| F80 | relu² 2× fp8 | 15 | 106.9 | 5610 | 1.2120 | 1.2136 | 15.45MB | +| F81 | relu² 2× fp8 | 16 | 113.9 | 5270 | 1.1996 | 1.2009 | 16.33MB | +| F79 | relu² 3× fp8 | 11 | 91.5 | 6560 | 1.1920 | 1.1933 | 14.66MB | +| **F82** | **relu² 4× fp8** | **10** | **91.6** | **6550** | **1.1861** | **1.1877** | **16.04MB** | +| F83 | swiglu 3× fp8 | 10 | 105.5 | 5690 | 1.1842 | 1.1853 | 17.29MB | + +### Weight Decay at 10L 4×MLP fp8 + +| Run | WD | val_bpb | RT bpb | Artifact | +|-----|-----|---------|--------|----------| +| F82 | 0.04 | 1.1861 | 1.1877 | 16.04MB | +| F84 | 0.08 | 1.1983 | 1.1998 | 16.04MB | +| **F85** | **0.00** | **1.1828** | **1.1844** | **16.02MB** | +| S87 | 0.00 | 1.1831 | 1.1843 | 16.01MB | +| **F88** | **0.00 (EMBED=254)** | **1.1820** | **1.1839** | **16.00MB — FITS** | + +WD=0 optimal at 10L 4× — opposite to 26L result. Wider MLP needs full weight freedom. + +--- + +## Binary Quantisation Track + +### Motivation + +Binary quantisation constrains weights to {-1, +1} with no zero state. At 1 bit/param vs ternary's 1.6 bits/param, binary packs approximately 60% more parameters per MB. The hypothesis was that additional depth could compensate for the loss of the zero state. + +Starting point: the ternary best config (10L, 768d, 8h, 4kv, 4× relu², FP8, 524k batch, 599s) scoring 1.1578 sliding bpb. 
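The binary quantiser itself is simple: every weight becomes its sign, with one full-precision scale per group, and a straight-through estimator passes gradients through the rounding during training. A hedged numpy sketch of the forward pass only (group size and absmean scaling follow the ternary sweeps; the script's bf16 version may differ):

```python
import numpy as np

def binary_quantize(w, group_size=128):
    """Forward quantiser: {-1, +1} codes plus one absmean scale per group."""
    g = w.reshape(-1, group_size)
    scale = np.abs(g).mean(axis=1, keepdims=True)
    q = np.where(g >= 0, 1.0, -1.0)   # no zero state, unlike ternary
    return q, scale

rng = np.random.default_rng(0)
w = rng.normal(size=2 * 128)
q, scale = binary_quantize(w)
# No zero bucket means mean(|q|) is exactly 1.0, the clean-roundtrip property
# that makes EMA mathematically sound (if empirically unhelpful) for binary.
assert np.abs(q).mean() == 1.0
```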
+ +### Binary Scaling Runs + +| Run | Layers | MLP | FP | Other | Steps | ms/step | Sliding bpb | Artifact | Fits | +|-----|--------|-----|-----|-------|-------|---------|-------------|----------|------| +| F17 | 17 | 4× | FP8 | — | 4010 | 149 | 1.2022 | 17.45MB | No | +| **F1** | **14** | **4×** | **FP8** | **—** | **4820** | **124** | **1.1824** | **14.74MB** | **Yes** | +| F2 | 14 | 4× | FP8 | EMA | 4800 | 125 | 1.2110 | 14.56MB | Yes | +| S3 | 15 | 4× | FP8 | — | 1000 | 133 | 1.3114 | 15.65MB | Yes | +| S4 | 20 | 3× | FP8 | — | 1000 | 160 | 1.3077 | 16.90MB | No | +| S5 | 21 | 3× | FP4 | — | 1000 | 167 | 1.3676 | 16.64MB | No | +| S6 | 19 | 3× | FP8 | — | 1000 | 152 | 1.3130 | 16.16MB | No | +| S7 | 15 | 4× | FP8 | refiner | 1000 | 135 | 1.3123 | 15.89MB | Yes | +| S8 | 15 | 4× | FP8 | smear | 1000 | 155 | 1.3043 | 15.67MB | Yes | +| S9 | 15 | 4× | FP8 | tversky_attn | 1000 | 179 | 1.4016 | 15.74MB | Yes | + +### Key Decisions from Binary Scaling + +**MLP width (4× vs 3×):** 4× won even when 3× received 4–5 extra layers. S3 (15L 4×) outperformed S6 (19L 3×) at matched steps. Width matters more than depth past a minimum viable layer count. + +**FP storage (FP8 vs FP4):** FP4 added a 0.06 bpb roundtrip penalty and was immediately ruled out. FP8 used for all non-binary tensors. + +**Layer count:** 17L was the theoretical maximum at 4× FP8 but landed 1.45MB over budget. 15L at 15.65MB was the maximum that fit. 14L left 1.26MB headroom. + +**EMA:** Mathematically sound for binary (no zero bucket means `mean(|Q|)=1.0` always, clean roundtrip). In practice, 0.03 bpb worse — the smoothed weights apparently hurt binary's learning dynamics despite the clean quantisation math. + +**Smear:** 0.007 bpb gain at 1000 steps but added 22ms/step overhead (133→155ms). Retained for the extended binary run to test whether the gain survives the step penalty at longer training. + +**Refiner (causal conv):** Neutral at 1000 steps, added 2ms/step. Not justified. 
+ +**Tversky attention projection:** 0.09 bpb worse. Completely incompatible with binary weights. + +**Activation:** relu² inherited from ternary sweeps, not retested for binary. SwiGLU would cost ~4MB extra across 15 layers, eliminating the layer budget advantage. + +### Extended Binary Run (Unconstrained Compute) + +To measure the binary architecture's convergence ceiling without the 10-minute wallclock constraint, a single extended run was conducted at 50,000 steps (~2 hours on 8×H100). + +**Configuration:** 15L 768d, 4× relu², FP8, smear, 524k batch tokens, seed=42, MUON_WD=0.0 + +``` +step:50000/50000 val_loss:2.9692 val_bpb:1.1497 train_time:7763s +artifact:15.60MB binary:97320960(13685760B) fp:2542200(2585072B) code:70399 +budget:15670651/16000000 (15.67/16.00MB) FITS +final_binary_roundtrip val_loss:2.9743 val_bpb:1.1516 +temp_scaling optimal_T:0.90 +final_sliding val_loss:2.9027 val_bpb:1.1239 (stride=16, T=0.90) +``` + +| Metric | Value | +|--------|-------| +| val_bpb | 1.1497 | +| RT bpb | 1.1516 | +| Sliding bpb | **1.1239** | +| Artifact | 15.60MB (15.67MB total) | +| Params | 97,320,960 | +| Steps | 50,000 | +| ms/step | 155.3 | +| Training time | ~2.15 hours | + +The 1.1239 sliding bpb demonstrates that with sufficient compute the binary architecture reaches strong quality. This validates the compression approach — nearly 100M parameters in 15.67MB via 1-bit quantisation — though the 50k steps required far exceeds the competition's 10-minute budget. + +### Binary vs Ternary at Equal Architecture (Dev Scale) + +| Metric | Binary | Ternary | Delta | +|--------|--------|---------|-------| +| val_bpb | 1.8609 | 1.8113 | Ternary wins by 0.050 | +| Artifact | 9.14MB | 11.56MB | Binary saves 2.42MB | +| ms/step | 918 | 924 | Identical | +| RT gap | 0.000 | 0.000 | Both clean | + +Ternary is better at equal architecture. Binary's only advantage is fitting more layers in the same budget. + +### Binary Conclusion + +Binary lost the depth-for-sparsity trade. 
The 5 extra layers (15L binary vs 10L ternary) could not overcome ternary's representational advantage from the zero state. The 0.0016 bpb gap measured at 500 dev steps significantly understated the true difference at convergence. Ternary at 1.1578 sliding bpb (10-minute budget) outperforms binary's best fitting run (F1: 1.1824 at 14L without smear) by 0.025 bpb. Even the over-budget 17L binary run (1.2022) could not match ternary. + +The extended 50k-step binary run reaching 1.1239 sliding bpb shows that binary has a competitive convergence ceiling, but it requires approximately 8× more training steps to approach competitive quality — well beyond the competition constraints. + +--- + +## Grouped MLP Investigation + +Tested GroupedTernaryLinear: splits MLP into independent groups for parameter/speed savings. + +### Real Model Results (relu² 3×, 768d, 600s) + +| Run | Config | Layers | ms/step | Steps | val_bpb | Artifact | +|-----|--------|--------|---------|-------|---------|----------| +| F64 | standard | 12 | 99.4 | 6030 | 1.1873 | 17.50MB | +| F72 | g=2 | 12 | 87.4 | 6870 | 1.2180 | 12.97MB | +| F71 | g=4 | 12 | 83.5 | 7190 | 1.2429 | 10.74MB | +| F73 | g=2 | 16 | 114.2 | 5260 | 1.2037 | 16.04MB | +| F74 | swiglu g=2 | 12 | 113.3 | 5300 | 1.2084 | 15.24MB | + +Cross-group isolation costs 0.031–0.056 bpb. Even with 4 extra layers (F73), only recovers 0.014 of the deficit. **Not viable for language modelling.** + +--- + +## Differential Attention + +Microsoft (2024): computes two attention maps from split Q/K and takes their difference. + +| Run | Config | ms/step | Steps | val_bpb | +|-----|--------|---------|-------|---------| +| F64 | standard | 99.4 | 6030 | 1.1873 | +| F68 | diff_attn | 109.3 | 5480 | 1.2094 | + +Splits 96-dim heads into 48-dim sub-heads — insufficient dimensionality for meaningful attention patterns at this model scale. 
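For reference, differential attention computes two softmax maps from split Q/K and subtracts them so common-mode attention noise cancels. A minimal non-causal numpy sketch; the fixed λ here stands in for the learned reparameterised λ of the original paper:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def diff_attn(q1, k1, q2, k2, v, lam=0.8):
    d = q1.shape[-1]
    a1 = softmax(q1 @ k1.T / np.sqrt(d))   # first attention map
    a2 = softmax(q2 @ k2.T / np.sqrt(d))   # second map, subtracted as noise estimate
    return (a1 - lam * a2) @ v

rng = np.random.default_rng(0)
T, d = 16, 48                              # a 96-dim head split into two 48-dim halves
q1, k1, q2, k2 = (rng.normal(size=(T, d)) for _ in range(4))
v = rng.normal(size=(T, 96))
assert diff_attn(q1, k1, q2, k2, v).shape == (16, 96)
```

The halved sub-head dimensionality (96 → 48 here) is exactly the cost the F68 result above attributes the quality loss to.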
+ +--- + +## Sequence Refiner (CausalConvRefiner) + +| Run | Config | ms/step | Steps | val_bpb | Artifact | +|-----|--------|---------|-------|---------|----------| +| F64 | none | 99.4 | 6030 | 1.1873 | 17.50MB | +| F69 | k=3 | 102.2 | 5860 | 1.1885 | 19.92MB | +| F70 | k=5 | 103.0 | 5820 | 1.2018 | 18.13MB | + +Noise-level quality improvement with storage bloat. 12 attention layers already saturate local pattern capture. + +--- + +## ByteCNN Vocabulary Generator + +Replaces `nn.Embedding(8192, 256)` with a CNN that generates the embedding matrix from byte spellings. + +``` +step:500 loss:9.0471 — step:2000 loss:9.0471 (flat, no learning) +``` + +All 8192 CNN-generated embeddings converge to near-identical vectors at initialisation. The CNN's inductive bias (byte-similar tokens → similar embeddings) destroys the initial diversity needed for gradient signal. + +--- + +## Asymmetric Tokenizer Investigation + +8k BPE input with 256-byte output to eliminate large output projection. + +| Model | BPB | Notes | +|-------|-----|-------| +| Standard (tied, emb=256) | 3.10 | reference | +| Asymmetric parallel (emb=256) | 8.65 | byte independence assumption fails | +| Asymmetric autoregressive (emb=256) | 8.17 | tiny GRU insufficient capacity | + +Multi-byte parallel heads assume conditional independence between bytes within a token — mathematically incorrect. Sequence-length mismatch (7 BPE tokens → 70 bytes) also incompatible with the evaluation framework. + +--- + +## Linear Alternative Exploration + +Systematic notebook testing of linear layer alternatives at real model dimensions (768d). 
+ +### Projection Benchmark (DIM → DIM, H100) + +| Model | Params | ms | vs Linear | +|-------|--------|-----|-----------| +| Linear | 589,824 | 0.07ms | 1.00× | +| LowRank r=64 | 98,304 | 0.03ms | 0.44× | +| BlockDiag b=4 | 147,456 | 0.03ms | 0.40× | +| Grouped g=4 | 147,456 | 0.03ms | 0.40× | +| BD4 + mix32 | 196,608 | 0.07ms | 0.97× | +| Hash 65536 | 65,536 | 0.08ms | 1.13× | + +BlockDiag/Grouped offer speed advantages but cross-group isolation degrades LM quality in practice. + +--- + +## H100 Microbenchmark Results + +Standalone kernel timing vs torch.compile behaviour (critical lesson: standalone microbenchmarks can mislead when torch.compile fuses operations). + +### STE Speed + +| Variant | ms/call | +|---------|---------| +| Current | 0.041 | +| Reciprocal | 0.043 | + +No gain — 48 STE calls/step = ~2ms overhead (unavoidable). + +### Contiguous Checks + +Q and K are contiguous after RoPE. V is non-contiguous (view into fused QKV). V's `.contiguous()` costs 0.065ms/call = 0.78ms/step (necessary for flash_attn). + +### RoPE Variants + +Current (half-split + cat) is fastest at 0.52ms/call. + +### Softcap: Poly5 vs Tanh + +| Variant | ms/call | +|---------|---------| +| Poly5 (current) | 8.43 | +| Poly3 | 5.98 | +| Tanh | 2.12 | +| Hardtanh | 0.71 | + +**Critical finding:** Tanh is 4× faster standalone due to H100 hardware transcendental units. However in the real training loop, torch.compile fuses poly5 with surrounding ops into a single kernel. **Switching to tanh broke fusion — F63 was 16ms/step slower.** Poly5 retained. + +### CE + Z-Loss Fusion + +| Variant | ms/call (fwd+bwd) | +|---------|-------------------| +| Separate (current) | 16.56 | +| Fused (shared LSE) | 12.33 | + +**Same lesson:** 4.2ms saving standalone, but torch.compile already optimises `F.cross_entropy`. Manual gather+logsumexp prevents optimisation. Current approach retained. 
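The z-loss paired with cross-entropy above penalises the squared logsumexp of the logits so their absolute scale stays anchored. A single-position numpy sketch (the 1e-4 coefficient matches the setting quoted in the optimisation table; batching and the compiled CE path are omitted):

```python
import numpy as np

def ce_with_zloss(logits, target, z_coeff=1e-4):
    """Cross-entropy for one position plus z-loss on the logsumexp."""
    m = logits.max()
    lse = m + np.log(np.exp(logits - m).sum())   # numerically stable logsumexp
    ce = lse - logits[target]
    return ce + z_coeff * lse ** 2               # z-loss anchors logit magnitude

loss = ce_with_zloss(np.array([2.0, -1.0, 0.5, 3.0]), target=3)
assert loss > 0.0
```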
+ +--- + +## Efficiency Analysis + +### BPB Gained Per Component + +| Component | BPB gain | Source | +|-----------|----------|--------| +| relu → relu² | −0.024 | F55 vs F56 | +| MLP 2× → 3× (relu²) | −0.017 | F56 vs F64 | +| MLP 3× → 4× (relu²) | −0.008 | F64 vs F75 | +| relu² → swiglu (at 3×) | −0.010 | F64 vs F59 | +| +1 layer (average) | −0.0012 | scaling data | +| fp16 → fp8 (RT penalty) | +0.002 | run 42 vs 49 | +| Sliding eval stride=16 | −0.025 | F22 data | +| WD=0.04 vs WD=0 (at 26L) | −0.001 | F7 vs F6 | + +### MB Cost Per Component + +| Component | MB/layer | +|-----------|----------| +| relu² 2× layer | 0.767 | +| relu² 3× layer | 1.003 | +| relu² 4× layer | 1.220 | +| swiglu 3× layer | 1.357 | +| fp16 → fp8 (fixed saving) | −2.51 | + +### Efficiency Ratio (BPB Gained Per MB Spent) + +| Change | BPB gain | MB cost | BPB/MB | +|--------|----------|---------|--------| +| relu → relu² | −0.024 | 0.00 | infinite (free) | +| Sliding eval | −0.025 | 0.00 | infinite (free) | +| MLP 2× → 3× | −0.017 | +2.83 (12L) | −0.0060/MB | +| MLP 3× → 4× | −0.008 | +2.83 (12L) | −0.0028/MB | +| relu² → swiglu | −0.010 | +4.25 (12L) | −0.0024/MB | +| +1 layer (relu² 2×) | −0.0012 | +0.767 | −0.0016/MB | +| +1 layer (relu² 3×) | −0.0012 | +1.003 | −0.0012/MB | + +MLP 2×→3× is the most efficient paid upgrade. relu² and sliding eval are free wins. 
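The efficiency ratios above are straight division of BPB gain by MB cost; a quick arithmetic check of the table:

```python
changes = {  # (bpb_gain, mb_cost) taken from the tables above
    "mlp_2x_to_3x": (0.017, 2.83),
    "mlp_3x_to_4x": (0.008, 2.83),
    "relu2_to_swiglu": (0.010, 4.25),
    "plus_layer_relu2_2x": (0.0012, 0.767),
    "plus_layer_relu2_3x": (0.0012, 1.003),
}
ratios = {k: bpb / mb for k, (bpb, mb) in changes.items()}
assert round(ratios["mlp_2x_to_3x"], 4) == 0.0060
assert round(ratios["mlp_3x_to_4x"], 4) == 0.0028
assert round(ratios["relu2_to_swiglu"], 4) == 0.0024
```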
+ +### Layer Budget at 768d + +| Config | Max Layers | Est ms/step | +|--------|-----------|-------------| +| relu² 2× fp16 | 14L | ~95ms | +| relu² 2× fp8 | 17L | ~97ms | +| relu² 3× fp16 | 10L | ~99ms | +| relu² 3× fp8 | 13L | ~106ms | +| relu² 4× fp8 | 10L | ~92ms | +| swiglu 3× fp8 | 9L | ~105ms | + +--- + +## Ternary-Incompatible Techniques + +These are not merely unhelpful but structurally incompatible with 1.58-bit quantisation: + +| Technique | Mechanism of failure | +|-----------|---------------------| +| **EMA** | Weight averaging → values cluster near zero → ternary rounds most to 0 → 0.12 bpb RT gap | +| **TTT-LoRA** | LoRA delta computed outside RMSNorm space that TernaryLinear normalises into. Corrupts calibrated representations at convergence | +| **Ternary prototypes + sigmoid** | Sigmoid membership needs continuous values. Ternary {-1,0,+1} collapses membership patterns → 0.077 RT gap | +| **LM head rank factorisation** | SVD factors U,V need fp16 precision. Storage exceeds original tied embedding | + +--- + +## Software Optimisations + +| Optimisation | Saving | Notes | +|---|---|---| +| Fused QKV (c_q+c_k+c_v → single matmul) | ~2ms/step | Safe: in_features divisible by all group sizes | +| Fused SwiGLU/relu² (gate+up → single wide matmul) | ~2-4ms/step | Same params, fewer kernel launches | +| Z-loss regularisation (1e-4 x logsumexp²) | quality | Anchors logits, keeps STE gradients sharp | +| DataLoader int16 transfer (pin then cast on GPU) | ~1ms/step | 4× less PCIe bandwidth | +| FlashAttention-3 | ~13ms/step | ~9% speedup, ~380 free training steps | +| TernaryLinear bf16 weights, cleaner STE | ~1ms/step | Eliminates fp32 roundtrip | +| DDP static_graph + gradient_as_bucket_view | ~1ms/step | Free when find_unused=False | +| Fused optimizer loop (LR set + step in one pass) | ~0.5ms/step | Fewer Python-level iterations | +| Removed CUBLAS determinism tax | ~1ms/step | Not required for competition | +| Temperature grid: 5 points instead of 21 | 
~1s total | T=0.90 consistently with relu² | +| Temp scaling moved to eval phase | ~3 steps gained | No longer steals training time | +| `_e()` helper for Hyperparameters | -1.8KB code | Eliminates env var boilerplate | +| 3D tensor ternary quantisation | storage fix | Conv1d weights reshaped to 2D for ternary | + +--- + +## Rejected Techniques (Summary) + +| Technique | Reason | +|-----------|--------| +| Tversky (all variants) | Quality-neutral on FineWeb LM — confirmed via synthetic data analysis; speed penalty with relu² | +| Differential attention | Halved head_dim (96→48) degrades quality at this model scale | +| Grouped MLP (g=2, g=4) | Cross-group isolation costs 0.031–0.056 bpb; not recoverable with extra layers | +| CausalConvRefiner | Noise-level quality; storage bloat from Conv1d weights | +| ByteCNN vocabulary generator | Embedding collapse — CNN inductive bias destroys initial diversity | +| Asymmetric tokenizer | Byte independence assumption incorrect; sequence mismatch with eval framework | +| EMA | Incompatible with ternary — weight averaging causes 0.12 bpb RT gap | +| TTT-LoRA | Architectural incompatibility with RMSNorm space in TernaryLinear | +| LM head factorisation | SVD factors bloat artifact beyond budget; unrecoverable quality loss | +| MTP | 0.006 bpb worse — model capacity too limited for auxiliary objectives | +| BigramHash | 0.020 bpb worse at convergence; fp16 table displaces ternary layers | +| Seq/batch schedule | Recompile and step penalties dominate at 600s wallclock | +| SmearModule | +22% step cost for −0.001 gain within ternary 10-minute budget | +| Depth recurrence | Halves effective steps; OOM at DR=3 | +| AdamW for matrix params | Clearly inferior to Muon for ternary weights | +| FP4 storage | 0.026–0.029 RT gap even with QAT — unrecoverable | +| Tanh softcap | Faster standalone but breaks torch.compile kernel fusion | +| Fused CE+Z-loss | Same — breaks compile optimisation | +| 16 heads at 768d | 48-dim head_dim 
insufficient for meaningful attention | +| relu (plain) | Strictly dominated by relu² | +| leaky relu | Strictly dominated by relu² | +| Distillation (in-run) | Train-from-scratch teacher always worse than supervised | +| reduce-overhead compile | Rotary + embed_proj_rev incompatible with CUDA graphs | +| max-autotune compile | 30+ minute kernel search prohibitive for 600s runs | +| Skip weights zero-init | 0.010 bpb worse — decoder needs skip signal from step 0 | +| EMBED_DIM=0 (full 512) | 19.78MB artifact — 3.78MB over budget | +| Untie lm_head full-rank | 7.3MB budget overrun not justified by 0.005 bpb gain | + +--- + +## Decision Log + +| Decision | Rationale | +|----------|-----------| +| 8k vocabulary | −0.42 bpb, largest single win | +| relu² activation | −0.024 bpb vs relu, free (no cost) | +| 4×MLP width | Best BPB within budget at 10L; 0.008 better than 3× | +| 10L 768d | Minimum viable depth at 768d with maximum MLP width | +| WD=0.0 at 10L 4× | Opposite to deep models — wider MLP needs full weight freedom | +| fp8 storage | Halves fp_params (5MB→2.5MB), enables wider MLP within budget | +| EMBED_DIM=254 | 256-2 dims to fit artifact+code under 16,000,000 byte budget; ~0.0004 bpb cost | +| BITNET_GROUP_SIZE=128 | Same quality as 64; saves 0.69MB | +| 8 heads, 4 KV, 96-dim head_dim | 16h at 48-dim insufficient; MHA only +0.0012 at +1.5MB | +| Poly softcap | Fuses with torch.compile; tanh breaks fusion | +| ROPE_BASE=5000 + YaRN 2048 | Best frequency calibration | +| Muon optimizer | Newton-Schulz normalisation compensates for ternary STE gradient attenuation | +| MUON_BACKEND_STEPS=3 | Equivalent to 5 at convergence; +190 extra steps | +| MUON_MOMENTUM=0.95 | Both directions degrade; affects artifact via zero_frac | +| WARMDOWN=20% | Asymmetric — too little hurts more than too much | +| MATRIX_LR=0.04 | Higher LR compensates for ternary STE gradient attenuation | +| SCALAR_LR=0.02 | Optimal — scalars do not pass through STE | +| TIED_EMBED_LR=0.02 | 
Optimal | +| TRAIN_BATCH_TOKENS=524k | Optimal tradeoff between gradient quality and step count | +| Base-3 + LZMA | 39% reduction over int8+zlib | +| Shrinkage fix | Eliminates all RT gaps universally | +| Skip weights ones-init | Decoder needs skip signal from step 0; zeros costs 0.010 bpb | +| Tied embeddings | Untie costs 7.3MB; not justified | +| Standard attn projection | Tversky quality-neutral; grouped destroys quality | +| No EMA | Fundamentally incompatible with ternary | +| No TTT | RMSNorm space incompatibility confirmed across 6 runs | +| No MTP | Confirmed post-fix: 0.006 bpb worse | +| Temperature scaling T=0.90 | relu² logits slightly underconfident; auto-calibrated | +| Fused QKV + relu² | ~130-180 free training steps per run | +| Z-loss regularisation | Anchors logits; keeps STE gradients sharp | +| FlashAttention-3 | Free ~380 extra training steps per 600s run | +| Sliding eval stride=16 | Best quality when eval budget unconstrained | +| Optimizer coverage fix | embed_proj/embed_proj_rev now train; +0.055 bpb improvement | +| MAX_WALLCLOCK_SECONDS=599 | 1s leeway for safety margin | +| Binary 15L 768d 4× fp8 | 97M params in 15.67MB — maximum parameter density; convergence ceiling validated at 50k steps | diff --git a/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/binary_log.txt b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/binary_log.txt new file mode 100644 index 0000000000..f75377dcdf --- /dev/null +++ b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/binary_log.txt @@ -0,0 +1,1518 @@ +"""Binary training script for OpenAI's Parameter Golf Challenge. 
Ciprian-Florin Ifrim - 24 March 2026""" + +import copy +import glob +import io +import math +import os +import random +import sys +import time +import lzma +from pathlib import Path +import numpy as np +import sentencepiece as spm +import torch +import torch.distributed as dist +import torch.nn.functional as F +from torch import Tensor, nn +from torch.nn.parallel import DistributedDataParallel as DDP +from flash_attn_interface import flash_attn_func + +# --------------------------------------------------------------------------- +# Hyperparameters (all configurable via environment variables) +# --------------------------------------------------------------------------- +def _e(k, d, t=str): + v = os.environ.get(k, str(d)) + if t == bool: return bool(int(v)) + return t(v) + +class Hyperparameters: + data_path = _e("DATA_PATH", "./data/datasets/fineweb10B_sp1024") + train_files = os.path.join(data_path, "fineweb_train_*.bin") + val_files = os.path.join(data_path, "fineweb_val_*.bin") + tokenizer_path = _e("TOKENIZER_PATH", "./data/tokenizers/fineweb_1024_bpe.model") + run_id = os.environ.get("RUN_ID", f"run_{int(time.time())}") + seed = _e("SEED", 1337, int) + compile_mode = _e("COMPILE_MODE", "default") + val_batch_size = _e("VAL_BATCH_SIZE", 524288, int) + val_loss_every = _e("VAL_LOSS_EVERY", 500, int) + train_log_every = _e("TRAIN_LOG_EVERY", 10, int) + iterations = _e("ITERATIONS", 2000, int) + warmdown_fraction = _e("WARMDOWN_FRACTION", 0.2, float) + warmup_steps = _e("WARMUP_STEPS", 20, int) + train_batch_tokens = _e("TRAIN_BATCH_TOKENS", 524288, int) + train_seq_len = _e("TRAIN_SEQ_LEN", 1024, int) + max_wallclock_seconds = _e("MAX_WALLCLOCK_SECONDS", 0.0, float) + vocab_size = _e("VOCAB_SIZE", 1024, int) + num_layers = _e("NUM_LAYERS", 16, int) + num_kv_heads = _e("NUM_KV_HEADS", 4, int) + model_dim = _e("MODEL_DIM", 512, int) + num_heads = _e("NUM_HEADS", 8, int) + mlp_mult = _e("MLP_MULT", 2, int) + tie_embeddings = _e("TIE_EMBEDDINGS", 1, int) + rope_base 
= _e("ROPE_BASE", 10000.0, float) + rope_type = _e("ROPE_TYPE", "rope") + yarn_max_len = _e("YARN_MAX_LEN", 4096, int) + logit_softcap = _e("LOGIT_SOFTCAP", 30.0, float) + softcap_type = _e("SOFTCAP_TYPE", "poly") + tied_embed_init_std = _e("TIED_EMBED_INIT_STD", 0.005, float) + qk_gain_init = _e("QK_GAIN_INIT", 1.5, float) + activation_type = _e("ACTIVATION", "swiglu") + embed_dim = _e("EMBED_DIM", 0, int) + bigram_hash = _e("BIGRAM_HASH", 0, bool) + mtp_heads_count = _e("MTP_HEADS", 0, int) + training_depth_recurrence = _e("TRAINING_DEPTH_RECURRENCE", 1, int) + eval_depth_recurrence = _e("EVAL_DEPTH_RECURRENCE", 1, int) + attn_proj_type = _e("ATTN_PROJ_TYPE", "standard") + logit_head_type = _e("LOGIT_HEAD_TYPE", "standard") + tversky_num_features = _e("TVERSKY_NUM_FEATURES", 16, int) + tversky_feature_pools = _e("TVERSKY_FEATURE_POOLS", 0, int) + tversky_membership = _e("TVERSKY_MEMBERSHIP", "sigmoid") + diff_attn = _e("DIFF_ATTN", 0, bool) + refiner = _e("REFINER", 0, bool) + refiner_kernel = _e("REFINER_KERNEL", 3, int) + mlp_groups = _e("MLP_GROUPS", 0, int) + embed_lr = _e("EMBED_LR", 0.6, float) + head_lr = _e("HEAD_LR", 0.008, float) + adam_lr = _e("ADAM_LR", 1e-3, float) + adam_wd = _e("ADAM_WD", 0.05, float) + untie_at_fraction = _e("UNTIE_AT_FRACTION", 0.0, float) + tied_embed_lr = _e("TIED_EMBED_LR", 0.05, float) + corr_weight_lr = _e("CORR_WEIGHT_LR", 0.05, float) + smear = _e("SMEAR", 0, bool) + seq_len_start = _e("SEQ_LEN_START", 0, int) + seq_schedule_fraction = _e("SEQ_SCHEDULE_FRACTION", 0.33, float) + batch_tokens_start = _e("BATCH_TOKENS_START", 0, int) + batch_schedule_fraction = _e("BATCH_SCHEDULE_FRACTION", 0.33, float) + churn_log_every = _e("CHURN_LOG_EVERY", 500, int) + matrix_lr = _e("MATRIX_LR", 0.04, float) + scalar_lr = _e("SCALAR_LR", 0.04, float) + muon_momentum = _e("MUON_MOMENTUM", 0.95, float) + muon_backend_steps = _e("MUON_BACKEND_STEPS", 5, int) + muon_wd = _e("MUON_WD", 0.0, float) + matrix_optimizer = _e("MATRIX_OPTIMIZER", 
"muon") + muon_momentum_warmup_start = _e("MUON_MOMENTUM_WARMUP_START", 0.85, float) + muon_momentum_warmup_steps = _e("MUON_MOMENTUM_WARMUP_STEPS", 500, int) + beta1 = _e("BETA1", 0.9, float) + beta2 = _e("BETA2", 0.95, float) + adam_eps = _e("ADAM_EPS", 1e-8, float) + grad_clip_norm = _e("GRAD_CLIP_NORM", 0.0, float) + bitnet_group_size = _e("BITNET_GROUP_SIZE", 64, int) + sliding_eval = _e("SLIDING_EVAL", 0, bool) + sliding_eval_stride = _e("SLIDING_EVAL_STRIDE", 64, int) + sliding_batch_size = _e("SLIDING_BATCH_SIZE", 64, int) + temp_scaling = _e("TEMP_SCALING", 0, bool) + _fp_raw = os.environ.get("FP_STORAGE", "0") + fp_storage = True if _fp_raw == "FP8" else ("fp4" if _fp_raw == "FP4" else False) + ema = _e("EMA", 0, bool) + ema_decay = _e("EMA_DECAY", 0.995, float) + ema_start_fraction = _e("EMA_START_FRACTION", 0.5, float) + +CTP = ("attn_scale","attn_scales","mlp_scale","mlp_scales","resid_mix","resid_mixes","q_gain","diff_lambda","skip_weight","skip_weights","vocab_bias","refiner.gate") + +# --------------------------------------------------------------------------- +# Binary packing — bitpacking (8 weights/byte = 1 bit/param, lossless) +# --------------------------------------------------------------------------- +def pack_binary(q: Tensor) -> tuple[bytes, int]: + bits = ((q.reshape(-1).to(torch.int8) + 1) // 2).numpy().astype(np.uint8) + n = len(bits) + pad = (8 - n % 8) % 8 + if pad: + bits = np.concatenate([bits, np.zeros(pad, dtype=np.uint8)]) + groups = bits.reshape(-1, 8) + packed = np.zeros(len(groups), dtype=np.uint8) + for i in range(8): + packed |= groups[:, i] << i + return packed.tobytes(), n + +def unpack_binary(data: bytes, n: int) -> Tensor: + packed = np.frombuffer(data, dtype=np.uint8) + bits = np.zeros((len(packed), 8), dtype=np.int8) + for i in range(8): + bits[:, i] = (packed >> i) & 1 + flat = bits.reshape(-1)[:n] + return torch.from_numpy(flat.astype(np.int8) * 2 - 1) + +# 
--------------------------------------------------------------------------- +# FP4 quantization (per-row absmax, 2 values packed per byte) +# --------------------------------------------------------------------------- +def quantize_to_int4(t: Tensor) -> tuple[Tensor, Tensor, list]: + t32 = t.float() + orig_shape = t32.shape + if t32.ndim < 2: + t32 = t32.unsqueeze(0) + absmax = t32.abs().amax(dim=-1, keepdim=True).clamp(min=1e-8) + scale = absmax / 7.0 + q = torch.clamp(torch.round(t32 / scale), -7, 7).to(torch.int8) + flat = q.reshape(-1) + if flat.numel() % 2 != 0: + flat = F.pad(flat, (0, 1)) + low = (flat[0::2] + 8).to(torch.uint8) + high = (flat[1::2] + 8).to(torch.uint8) + return low | (high << 4), scale.half().squeeze(-1), list(orig_shape) + +def dequantize_from_int4(packed: Tensor, scale: Tensor, shape: list) -> Tensor: + low = (packed & 0x0F).to(torch.int8) - 8 + high = ((packed >> 4) & 0x0F).to(torch.int8) - 8 + flat = torch.zeros(packed.numel() * 2, dtype=torch.int8) + flat[0::2] = low + flat[1::2] = high + numel = 1 + for s in shape: + numel *= s + flat = flat[:numel].float() + if len(shape) <= 1: + return (flat * scale.float().squeeze()).reshape(shape) + return (flat.reshape(-1, shape[-1]) * scale.float().unsqueeze(-1)).reshape(shape) + +# --------------------------------------------------------------------------- +# State dict serialization (binary + fp16/fp8/fp4) +# --------------------------------------------------------------------------- +def q_sd(state_dict: dict, group_size: int = 64, fp_storage=False, binary_override_names: set | None = None) -> tuple[dict, dict]: + "Binary for large 2D weight matrices, fp16/fp8/fp4 for everything else." 
+ quantized = {} + stats = {"binary_params": 0, "binary_bytes": 0, "fp_params": 0, "fp_bytes": 0} + for name, tensor in state_dict.items(): + if "mtp_heads" in name: + continue + t = tensor.detach().cpu().float().contiguous() + t_orig_shape = list(t.shape) + if t.ndim == 3: + t = t.reshape(t.shape[0], -1) + is_binary_candidate = ( + t.ndim == 2 and t.numel() > 65_536 + and "tok_emb" not in name and "lm_head" not in name and "embed_proj" not in name and "bigram_emb" not in name and "lm_head_correction" not in name and "lm_head_U" not in name and "lm_head_V" not in name + and "prototypes" not in name and "tversky" not in name + ) or (binary_override_names is not None and name in binary_override_names) + if is_binary_candidate: + pad = (group_size - t.shape[1] % group_size) % group_size + t_padded = F.pad(t, (0, pad)) if pad > 0 else t + t_grouped = t_padded.reshape(-1, group_size) + scale = t_grouped.abs().mean(-1, keepdim=True).clamp(min=1e-8).half().float() + q = torch.where(t_grouped >= 0, + torch.ones_like(t_grouped, dtype=torch.int8), + -torch.ones_like(t_grouped, dtype=torch.int8)) + packed_bytes, n_bits = pack_binary(q) + quantized[name] = { + "type": "binary", "packed": packed_bytes, + "scale": scale.half().squeeze(-1), + "shape": list(t.shape), "padded_cols": t_padded.shape[1], + "group_size": group_size, "n_bits": n_bits, + "orig_shape": t_orig_shape, + } + stats["binary_params"] += t.numel() + stats["binary_bytes"] += len(packed_bytes) + scale.numel() * 2 + elif fp_storage == "fp4" and t.ndim == 2: + packed, scale, orig_shape = quantize_to_int4(t) + quantized[name] = {"type": "fp4", "packed": packed, "scale": scale, "shape": orig_shape} + stats["fp_params"] += t.numel() + stats["fp_bytes"] += packed.numel() + scale.numel() * 2 + elif fp_storage and t.ndim == 2: + quantized[name] = {"type": "fp8", "data": t.to(torch.float8_e4m3fn)} + stats["fp_params"] += t.numel() + stats["fp_bytes"] += t.numel() + else: + quantized[name] = {"type": "fp16", "data": 
t.half()} + stats["fp_params"] += t.numel() + stats["fp_bytes"] += t.numel() * 2 + return quantized, stats + +def deq_sd(quantized: dict, target_dtype=torch.bfloat16): + "Reconstruct full-precision state dict from quantized representation." + out = {} + for name, entry in quantized.items(): + if entry["type"] == "binary": + q = unpack_binary(entry["packed"], entry["n_bits"]) + q = q.float().reshape(-1, entry["group_size"]) + scale = entry["scale"].float().unsqueeze(-1) + # No shrinkage correction needed: binary has no zeros, q.abs().mean() == 1.0 always + t = (q * scale).reshape(-1, entry["padded_cols"]) + shape = entry["shape"] + result = t[:shape[0], :shape[1]].to(target_dtype) + orig = entry.get("orig_shape") + out[name] = result.reshape(orig).contiguous() if orig and orig != shape else result.contiguous() + elif entry["type"] == "fp8": + out[name] = entry["data"].to(torch.float32).to(target_dtype).contiguous() + elif entry["type"] == "fp4": + out[name] = dequantize_from_int4(entry["packed"], entry["scale"], entry["shape"]).to(target_dtype).contiguous() + else: + out[name] = entry["data"].to(target_dtype).contiguous() + return out + +# --------------------------------------------------------------------------- +# Binary diagnostics (logged during training) +# --------------------------------------------------------------------------- +_prev_committed: dict = {} +def churn_fn(model: nn.Module, group_size: int = 64): + global _prev_committed + total = flipped = 0 + with torch.no_grad(): + for name, p in model.named_parameters(): + if p.ndim == 2 and ("weight" in name or "prototypes" in name) and p.shape[0] > 1: + w = p.detach().float().reshape(-1, group_size) + q = torch.where(w >= 0, torch.ones_like(w), -torch.ones_like(w)).cpu().numpy() + if name in _prev_committed: + flipped += int(np.sum(q != _prev_committed[name])) + total += q.size + _prev_committed[name] = q + return flipped / max(total, 1) + +# 
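Note that the `fp4` path above is a symmetric int4 grid (per-row absmax, levels −7…+7), not an IEEE FP4 format, so its round-trip error is bounded by half a quantisation step per row. A small NumPy check of that bound (standalone sketch mirroring `quantize_to_int4`, not the module's own code):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 64)).astype(np.float32)

# Per-row absmax scaling onto the symmetric int4 grid {-7, ..., +7}.
absmax = np.abs(w).max(axis=-1, keepdims=True)
scale = absmax / 7.0
q = np.clip(np.round(w / scale), -7, 7).astype(np.int8)
w_hat = q * scale  # dequantized reconstruction

# Round-to-nearest error is at most half a step per element.
assert np.abs(w_hat - w).max() <= scale.max() / 2 + 1e-6
```

The 0.026–0.029 bpb round-trip gap reported for FP4 storage in the rejected-techniques table is this quantisation error propagated through the network, which is why FP8 (a direct cast, much finer grid) was kept instead.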
--------------------------------------------------------------------------- +# Muon optimizer (Newton-Schulz orthogonalized momentum) +# --------------------------------------------------------------------------- +def ns_orth(G: Tensor, steps: int = 10, eps: float = 1e-7) -> Tensor: + a, b, c = (3.4445, -4.7750, 2.0315) + X = G.bfloat16() + X /= X.norm() + eps + transposed = G.size(0) > G.size(1) + if transposed: + X = X.T + for _ in range(steps): + A = X @ X.T + B = b * A + c * A @ A + X = a * X + B @ X + return X.T if transposed else X + +class Muon(torch.optim.Optimizer): + def __init__(self, params, lr: float, momentum: float, backend_steps: int, nesterov: bool = True, wd: float = 0.0): + super().__init__(params, dict(lr=lr, momentum=momentum, backend_steps=backend_steps, nesterov=nesterov, wd=wd)) + @torch.no_grad() + def step(self, closure=None): + loss = None + if closure is not None: + with torch.enable_grad(): + loss = closure() + distributed = dist.is_available() and dist.is_initialized() + world_size = dist.get_world_size() if distributed else 1 + rank = dist.get_rank() if distributed else 0 + for group in self.param_groups: + params = group["params"] + if not params: + continue + lr, momentum = group["lr"], group["momentum"] + backend_steps, nesterov = group["backend_steps"], group["nesterov"] + total_params = sum(int(p.numel()) for p in params) + updates_flat = torch.zeros(total_params, device=params[0].device, dtype=torch.bfloat16) + curr = 0 + for i, p in enumerate(params): + if i % world_size == rank and p.grad is not None: + g = p.grad + state = self.state[p] + if "momentum_buffer" not in state: + state["momentum_buffer"] = torch.zeros_like(g) + buf = state["momentum_buffer"] + buf.mul_(momentum).add_(g) + if nesterov: + g = g.add(buf, alpha=momentum) + g = F.rms_norm(g.float(), (g.size(-1),)).bfloat16() + g = ns_orth(g, steps=backend_steps) + g *= max(1, g.size(0) / g.size(1)) ** 0.5 + updates_flat[curr:curr + p.numel()] = g.reshape(-1) + curr += 
p.numel() + if distributed: + dist.all_reduce(updates_flat, op=dist.ReduceOp.SUM) + wd = group.get("wd", 0.0) + curr = 0 + for p in params: + g = updates_flat[curr : curr + p.numel()].view_as(p).to(dtype=p.dtype) + if wd > 0: + p.mul_(1 - lr * wd) + p.add_(g, alpha=-lr) + curr += p.numel() + return loss + +# --------------------------------------------------------------------------- +# Data loading +# --------------------------------------------------------------------------- +def ld_shard(file: Path) -> Tensor: + header_bytes = 256 * np.dtype("<i4").itemsize + with file.open("rb") as f: + header = np.frombuffer(f.read(header_bytes), dtype="<i4") + num_tokens = int(header[2]) + tokens = np.frombuffer(f.read(), dtype=np.uint16)[:num_tokens] + return torch.from_numpy(tokens.astype(np.int16)) + +class TokenStream: + def __init__(self, pattern: str): + self.files = sorted(glob.glob(pattern)) + self.file_idx = -1 + self.tokens = torch.empty(0, dtype=torch.int16) + self.pos = 0 + self._advance_file() + def _advance_file(self): + self.file_idx = (self.file_idx + 1) % len(self.files) + self.tokens = ld_shard(Path(self.files[self.file_idx])) + self.pos = 0 + def take(self, n: int) -> Tensor: + chunks = [] + remaining = n + while remaining > 0: + avail = self.tokens.numel() - self.pos + if avail <= 0: + self._advance_file() + continue + k = min(remaining, avail) + chunks.append(self.tokens[self.pos:self.pos + k]) + self.pos += k + remaining -= k + return chunks[0] if len(chunks) == 1 else torch.cat(chunks) + +class DistributedTokenLoader: + def __init__(self, pattern: str, rank: int, world_size: int, device: torch.device): + self.rank, self.world_size, self.device = rank, world_size, device + self.stream = TokenStream(pattern) + def next_batch(self, global_tokens: int, seq_len: int, grad_accum_steps: int) -> tuple[Tensor, Tensor]: + local_tokens = global_tokens // (self.world_size * grad_accum_steps) + per_rank_span = local_tokens + 1 + chunk = self.stream.take(per_rank_span * self.world_size) + start = self.rank * per_rank_span + local = chunk[start:start + per_rank_span].pin_memory().to(self.device, non_blocking=True).to(torch.int64) + x = local[:-1].reshape(-1, seq_len) + y = local[1:].reshape(-1, seq_len) + return x, y +# --------------------------------------------------------------------------- +# Model +# --------------------------------------------------------------------------- +class RMSNorm(nn.Module): + def __init__(self, eps: float | None = None): + super().__init__() + self.eps = eps + def forward(self, x: Tensor) -> Tensor: + return F.rms_norm(x, (x.size(-1),), eps=self.eps) + +def 
apply_qat_ste(w: Tensor, fp_storage: str | bool) -> Tensor: + """Applies Straight-Through Estimator (STE) for FP4 or FP8 simulated quantization.""" + if not fp_storage: + return w + if fp_storage == "fp4": + absmax = w.abs().amax(dim=-1, keepdim=True).clamp(min=1e-8) + scale = absmax / 7.0 + q = torch.clamp(torch.round(w / scale), -7.0, 7.0) + w_sim = q * scale + return (w_sim - w).detach() + w + elif fp_storage is True or fp_storage == "fp8": + w_sim = w.to(torch.float8_e4m3fn).to(w.dtype) + return (w_sim - w).detach() + w + return w + +class QATLinear(nn.Linear): + def __init__(self, in_features: int, out_features: int, bias: bool = False, fp_storage: str | bool = False): + super().__init__(in_features, out_features, bias=bias) + self.fp_storage = fp_storage + def forward(self, x: Tensor) -> Tensor: + w_qat = apply_qat_ste(self.weight, self.fp_storage) + return F.linear(x, w_qat.to(x.dtype), self.bias.to(x.dtype) if self.bias is not None else None) + +class QATEmbedding(nn.Embedding): + def __init__(self, num_embeddings: int, embedding_dim: int, fp_storage: str | bool = False): + super().__init__(num_embeddings, embedding_dim) + self.fp_storage = fp_storage + def forward(self, input: Tensor) -> Tensor: + w_qat = apply_qat_ste(self.weight, self.fp_storage) + return F.embedding(input, w_qat, self.padding_idx, self.max_norm, + self.norm_type, self.scale_grad_by_freq, self.sparse) + +class BinaryLinear(nn.Linear): + def __init__(self, in_features, out_features, bias=False, group_size=64): + super().__init__(in_features, out_features, bias=bias) + self.group_size = group_size + def forward(self, x: Tensor) -> Tensor: + w = self.weight.bfloat16() + g = self.group_size + w_g = w.reshape(-1, g) + scale = w_g.abs().mean(-1, keepdim=True).clamp(min=1e-8) + q = torch.where(w_g >= 0, torch.ones_like(w_g), -torch.ones_like(w_g)) + w_binary = w + ((q * scale).reshape(w.shape) - w).detach() + return F.linear(x, w_binary, + self.bias.to(x.dtype) if self.bias is not None else 
None) + +class NormedBinaryLinear(BinaryLinear): + "Binary linear with RMSNorm on input — for output projections receiving un-normalized activations." + def forward(self, x: Tensor) -> Tensor: + return super().forward(F.rms_norm(x, (x.size(-1),))) + +class GroupedBinaryLinear(nn.Module): + "Grouped linear with binary STE. Weight stored as 2D [groups*group_out, group_in] for binary quantization compatibility." + def __init__(self, in_features, out_features, groups=4, group_size=64, normed=False): + super().__init__() + assert in_features % groups == 0 and out_features % groups == 0 + self.groups = groups + self.group_in = in_features // groups + self.group_out = out_features // groups + self.group_size = group_size + self.normed = normed + self.weight = nn.Parameter(torch.randn(groups * self.group_out, self.group_in) * 0.02) + def forward(self, x: Tensor) -> Tensor: + if self.normed: + x = F.rms_norm(x, (x.size(-1),)) + w = self.weight.bfloat16() + g = self.group_size + w_g = w.reshape(-1, g) + scale = w_g.abs().mean(-1, keepdim=True).clamp(min=1e-8) + q = torch.where(w_g >= 0, torch.ones_like(w_g), -torch.ones_like(w_g)) + w_binary = w + ((q * scale).reshape(w.shape) - w).detach() + w_grouped = w_binary.reshape(self.groups, self.group_out, self.group_in) + bsz = x.shape[:-1] + x_g = x.reshape(*bsz, self.groups, self.group_in) + out = torch.einsum('...gi,goi->...go', x_g, w_grouped) + return out.reshape(*bsz, self.groups * self.group_out) + +class TverskyProjection(nn.Module): + "Tversky similarity: S = θ·f(A∩B) - α·f(A\\B) - β·f(B\\A). Three modes." 
+ def __init__(self, in_features: int, out_features: int, num_features: int = 16, + group_size: int = 64, use_shared_features: bool = False, + membership: str = "sigmoid"): + super().__init__() + self.group_size = group_size + self.num_features = num_features + self.membership_type = membership + self.no_features_mode = (num_features == 0) + if not self.no_features_mode and not use_shared_features: + self.features = nn.Parameter(torch.empty(num_features, in_features).uniform_(-0.02, 0.02)) + else: + self.register_parameter('features', None) + self.prototypes = nn.Parameter(torch.empty(out_features, in_features).uniform_(-0.02, 0.02)) + self.theta = nn.Parameter(torch.tensor(1.0)) + self.alpha = nn.Parameter(torch.tensor(0.5)) + self.beta = nn.Parameter(torch.tensor(0.5)) + + def _binary_ste(self, w: Tensor) -> Tensor: + w_bf16 = w.bfloat16() + g = self.group_size + w_grouped = w_bf16.reshape(-1, g) + scale = w_grouped.abs().mean(-1, keepdim=True).clamp(min=1e-8) + q = torch.where(w_grouped >= 0, torch.ones_like(w_grouped), -torch.ones_like(w_grouped)) + w_binary = w_bf16 + ((q * scale).reshape(w_bf16.shape) - w_bf16).detach() + return w_binary.reshape(w.shape) + + def _membership(self, t: Tensor) -> Tensor: + if self.membership_type == "poly": + return torch.clamp(t * 5.0 / 4.0 + 0.5, 0.0, 1.0) + elif self.membership_type == "tanh": + return (torch.tanh(t * 5.0) + 1.0) * 0.5 + else: + return torch.sigmoid(t * 5.0) + + def forward(self, x: Tensor, shared_features: Tensor | None = None) -> Tensor: + proto = self._binary_ste(self.prototypes) + if self.no_features_mode: + x_f = x @ proto.t() + p_norm = F.normalize(proto, dim=-1) + p_f = p_norm @ p_norm.t() + else: + feat = (shared_features if shared_features is not None else self.features).float() + x_f = x @ feat.t() + p_f = proto @ feat.t() + x_s = self._membership(x_f) + p_s = self._membership(p_f) + x_a = x_f * x_s + p_a = p_f * p_s + t, a, b = self.theta.abs(), self.alpha.abs(), self.beta.abs() + return t * (x_a @ 
p_a.t()) - a * (x_a @ (1 - p_s).t()) - b * ((1 - x_s) @ p_a.t()) + +def restore_low_dim_params_to_fp32(module: nn.Module) -> None: + with torch.no_grad(): + for name, param in module.named_parameters(): + if (param.ndim < 2 or any(p in name for p in CTP)) and param.dtype != torch.float32: + param.data = param.data.float() + +class Rotary(nn.Module): + def __init__(self, dim: int, base: float = 10000.0, no_cache: bool = False, + rope_type: str = "rope", yarn_max_len: int = 4096, train_seq_len: int = 1024): + super().__init__() + self.no_cache = no_cache + inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2, dtype=torch.float32) / dim)) + if rope_type == "yarn": + scale = train_seq_len / yarn_max_len + freq_idx = torch.arange(0, dim, 2, dtype=torch.float32) + ramp = torch.clamp((freq_idx / dim - 0.25) / 0.75, 0.0, 1.0) + inv_freq = inv_freq / (ramp * (1.0 / scale - 1.0) + 1.0) + self.register_buffer("inv_freq", inv_freq, persistent=False) + self._seq_len_cached = 0 + self._cos_cached: Tensor | None = None + self._sin_cached: Tensor | None = None + def forward(self, seq_len, device, dtype): + if self.no_cache: + t = torch.arange(seq_len, device=device, dtype=self.inv_freq.dtype) + freqs = torch.outer(t, self.inv_freq.to(device)) + return freqs.cos()[None, :, None, :].to(dtype=dtype), freqs.sin()[None, :, None, :].to(dtype=dtype) + if ( + self._cos_cached is None + or self._sin_cached is None + or self._seq_len_cached != seq_len + or self._cos_cached.device != device + ): + t = torch.arange(seq_len, device=device, dtype=self.inv_freq.dtype) + freqs = torch.outer(t, self.inv_freq.to(device)) + self._cos_cached = freqs.cos()[None, :, None, :] + self._sin_cached = freqs.sin()[None, :, None, :] + self._seq_len_cached = seq_len + return self._cos_cached.to(dtype=dtype), self._sin_cached.to(dtype=dtype) + +def apply_rotary_emb(x: Tensor, cos: Tensor, sin: Tensor) -> Tensor: + half = x.size(-1) // 2 + x1, x2 = x[..., :half], x[..., half:] + return torch.cat((x1 * cos + x2 * 
sin, x1 * (-sin) + x2 * cos), dim=-1) + +class CausalSelfAttention(nn.Module): + def __init__(self, dim, num_heads, num_kv_heads, rope_base, qk_gain_init, + group_size=64, attn_proj_type="standard", tversky_num_features=16, + tversky_feature_pools=0, no_cache=False, rope_type="rope", + yarn_max_len=4096, train_seq_len=1024, tversky_membership="sigmoid", + diff_attn=False): + super().__init__() + self.num_heads, self.num_kv_heads = num_heads, num_kv_heads + self.head_dim = dim // num_heads + self.diff_attn = diff_attn + self.q_size = self.num_heads * self.head_dim + self.kv_size = self.num_kv_heads * self.head_dim + self.c_qkv = BinaryLinear(dim, self.q_size + 2 * self.kv_size, bias=False, group_size=group_size) + self.proj = NormedBinaryLinear(dim, dim, bias=False, group_size=group_size) if attn_proj_type != "tversky" else None + if self.proj is not None: + self.proj._zero_init = True + self.tversky_proj = TverskyProjection( + dim, dim, num_features=tversky_num_features, group_size=group_size, + use_shared_features=(tversky_feature_pools > 0), + membership=tversky_membership, + ) if attn_proj_type == "tversky" else None + self.shared_features = None + self.q_gain = nn.Parameter(torch.full((num_heads,), qk_gain_init, dtype=torch.float32)) + if diff_attn: + self.diff_lambda = nn.Parameter(torch.full((num_heads,), 0.5, dtype=torch.float32)) + self.rotary = Rotary(self.head_dim, base=rope_base, no_cache=no_cache, + rope_type=rope_type, yarn_max_len=yarn_max_len, + train_seq_len=train_seq_len) + def forward(self, x: Tensor) -> Tensor: + bsz, seqlen, dim = x.shape + qkv_out = self.c_qkv(x) + q_out, k_out, v_out = qkv_out.split([self.q_size, self.kv_size, self.kv_size], dim=-1) + q = q_out.reshape(bsz, seqlen, self.num_heads, self.head_dim) + k = k_out.reshape(bsz, seqlen, self.num_kv_heads, self.head_dim) + v = v_out.reshape(bsz, seqlen, self.num_kv_heads, self.head_dim) + q, k = F.rms_norm(q, (q.size(-1),)), F.rms_norm(k, (k.size(-1),)) + cos, sin = self.rotary(seqlen, 
x.device, q.dtype) + q, k = apply_rotary_emb(q, cos, sin), apply_rotary_emb(k, cos, sin) + q = q * self.q_gain.to(dtype=q.dtype)[None, None, :, None] + if self.diff_attn: + half = self.head_dim // 2 + q1, q2 = q[..., :half], q[..., half:] + k1, k2 = k[..., :half], k[..., half:] + v1, v2 = v[..., :half], v[..., half:] + y1 = flash_attn_func(q1.contiguous(), k1.contiguous(), v1.contiguous(), causal=True) + y2 = flash_attn_func(q2.contiguous(), k2.contiguous(), v2.contiguous(), causal=True) + lam = self.diff_lambda.to(dtype=y1.dtype)[None, None, :, None] + y = torch.cat([y1 - lam * y2, y1 + lam * y2], dim=-1) + else: + y = flash_attn_func( + q.contiguous(), + k.contiguous(), + v.contiguous(), + causal=True + ) + y = y.reshape(bsz, seqlen, dim) + return self.tversky_proj(y, self.shared_features) if self.tversky_proj is not None else self.proj(y) + +class MLP(nn.Module): + def __init__(self, dim, mlp_mult, group_size=64, activation="swiglu", mlp_groups=0): + super().__init__() + hidden = mlp_mult * dim + self.activation = activation + if mlp_groups > 0: + if activation == "swiglu": + self.gate_up = GroupedBinaryLinear(dim, hidden * 2, groups=mlp_groups, group_size=group_size) + else: + self.fc = GroupedBinaryLinear(dim, hidden, groups=mlp_groups, group_size=group_size) + self.proj = GroupedBinaryLinear(hidden, dim, groups=mlp_groups, group_size=group_size, normed=True) + else: + if activation == "swiglu": + self.gate_up = BinaryLinear(dim, hidden * 2, bias=False, group_size=group_size) + else: + self.fc = BinaryLinear(dim, hidden, bias=False, group_size=group_size) + self.proj = NormedBinaryLinear(hidden, dim, bias=False, group_size=group_size) + self.proj._zero_init = True + def forward(self, x: Tensor) -> Tensor: + if self.activation == "swiglu": + gu = self.gate_up(x) + gate, up = gu.chunk(2, dim=-1) + return self.proj(F.silu(gate) * up) + elif self.activation == "relu": + return self.proj(torch.relu(self.fc(x))) + elif self.activation == "leaky_relu": + return 
self.proj(F.leaky_relu(self.fc(x), negative_slope=0.01)) + else: # relu2 + return self.proj(torch.relu(self.fc(x)).square()) + +class SmearModule(nn.Module): + def __init__(self, dim: int): + super().__init__() + self.gate = nn.Parameter(torch.zeros(dim, dtype=torch.float32)) + def forward(self, x: Tensor) -> Tensor: + cumsum = x.cumsum(dim=1) + counts = torch.arange(1, x.size(1) + 1, device=x.device, dtype=x.dtype).view(1, -1, 1) + smeared = cumsum / counts + gate = torch.tanh(self.gate.to(dtype=x.dtype)) + return x + gate * (smeared - x) + +class CausalConvRefiner(nn.Module): + "Causal Conv1d that refines hidden states using local n-gram context." + def __init__(self, dim: int, kernel_size: int = 3): + super().__init__() + self.kernel_size = kernel_size + self.conv = nn.Conv1d(dim, dim, kernel_size, padding=0, bias=False) + self.gate = nn.Parameter(torch.zeros(1, dtype=torch.float32)) + def forward(self, x: Tensor) -> Tensor: + h = x.permute(0, 2, 1) + h = F.pad(h, (self.kernel_size - 1, 0)) + h = self.conv(h) + h = h.permute(0, 2, 1) + return x + torch.tanh(self.gate.to(dtype=x.dtype)) * F.rms_norm(h, (h.size(-1),)) + +class Block(nn.Module): + def __init__(self, dim: int, num_heads: int, num_kv_heads: int, mlp_mult: int, + rope_base: float, qk_gain_init: float, group_size: int=64, + activation: str="swiglu", attn_proj_type: str="standard", + tversky_num_features: int=16, tversky_feature_pools: int=0, no_cache: bool=False, + smear: bool=False, rope_type: str="rope", yarn_max_len: int=4096, + train_seq_len: int=1024, tversky_membership: str="sigmoid", + diff_attn: bool=False, mlp_groups: int=0): + super().__init__() + self.attn_norm = RMSNorm() + self.mlp_norm = RMSNorm() + self.attn = CausalSelfAttention(dim, num_heads, num_kv_heads, rope_base, qk_gain_init, + group_size, attn_proj_type, tversky_num_features, + tversky_feature_pools, no_cache, rope_type, yarn_max_len, + train_seq_len, tversky_membership, diff_attn) + self.mlp = MLP(dim, mlp_mult, group_size, 
activation, mlp_groups) + self.attn_scale = nn.Parameter(torch.ones(dim, dtype=torch.float32)) + self.mlp_scale = nn.Parameter(torch.ones(dim, dtype=torch.float32)) + self.resid_mix = nn.Parameter(torch.stack((torch.ones(dim), torch.zeros(dim))).float()) + self.smear = SmearModule(dim) if smear else None + def forward(self, x: Tensor, x0: Tensor) -> Tensor: + mix = self.resid_mix.to(dtype=x.dtype) + x = mix[0] * x + mix[1] * x0 + n = self.attn_norm(x) + x = x + self.attn_scale.to(dtype=x.dtype) * self.attn(n) + x = x + self.mlp_scale.to(dtype=x.dtype) * self.mlp(self.mlp_norm(x)) + if self.smear is not None: + x = self.smear(x) + return x + +class GPT(nn.Module): + def __init__(self, vocab_size, num_layers, model_dim, num_heads, num_kv_heads, mlp_mult, + tie_embeddings, tied_embed_init_std, logit_softcap, rope_base, qk_gain_init, + group_size: int = 64, activation: str = "swiglu", mtp_heads_count: int = 0, + embed_dim: int = 0, attn_proj_type: str = "standard", logit_head_type: str = "standard", + tversky_num_features: int = 16, tversky_feature_pools: int = 0, + training_depth_recurrence: int=1, fp_storage=False, bigram_hash: bool=False, + softcap_type: str="poly", no_cache: bool=False, + smear: bool=False, rope_type: str="rope", yarn_max_len: int=4096, + train_seq_len: int=1024, tversky_membership: str="sigmoid", + diff_attn=False, mlp_groups=0, refiner=False, refiner_kernel=3): + super().__init__() + self.training_depth_recurrence = training_depth_recurrence + self.fp_storage = fp_storage + self.tie_embeddings = tie_embeddings + self.logit_softcap = logit_softcap + self.softcap_type = softcap_type + self.embed_dim = embed_dim if embed_dim > 0 else model_dim + self.tok_emb = QATEmbedding(vocab_size, self.embed_dim, fp_storage=fp_storage) + self.bigram_emb = QATEmbedding(vocab_size, self.embed_dim, fp_storage=fp_storage) if bigram_hash else None + if self.bigram_emb is not None: + nn.init.zeros_(self.bigram_emb.weight) + self.lm_head_correction = nn.Parameter( + 
torch.zeros(vocab_size, self.embed_dim)) if tie_embeddings == 2 else None + self.embed_proj = QATLinear(self.embed_dim, model_dim, bias=False, fp_storage=fp_storage) if self.embed_dim != model_dim else None + self.embed_proj_rev = QATLinear(model_dim, self.embed_dim, bias=False, fp_storage=fp_storage) if ( + self.embed_dim != model_dim and logit_head_type != "tversky") else None + self.num_encoder_layers = num_layers // 2 + self.num_decoder_layers = num_layers - self.num_encoder_layers + self.num_skip_weights = min(self.num_encoder_layers, self.num_decoder_layers) + self.skip_weights = nn.Parameter(torch.ones(self.num_skip_weights, model_dim, dtype=torch.float32)) + # Shared Tversky feature pools (if enabled and num_features > 0) + if attn_proj_type == "tversky" and tversky_feature_pools > 0 and tversky_num_features > 0: + self.tversky_feature_pools_list = nn.ParameterList([ + nn.Parameter(torch.empty(tversky_num_features, model_dim).uniform_(-0.02, 0.02)) + for _ in range(tversky_feature_pools) + ]) + else: + self.tversky_feature_pools_list = None + self.blocks = nn.ModuleList([ + Block(model_dim, num_heads, num_kv_heads, mlp_mult, rope_base, qk_gain_init, + group_size, activation, attn_proj_type, tversky_num_features, tversky_feature_pools, + no_cache, smear, rope_type, yarn_max_len, train_seq_len, tversky_membership, + diff_attn, mlp_groups) + for _ in range(num_layers) + ]) + # Inject shared feature pool references into attention layers + if self.tversky_feature_pools_list is not None: + for i, block in enumerate(self.blocks): + pool_idx = (i * tversky_feature_pools) // num_layers + block.attn.shared_features = self.tversky_feature_pools_list[pool_idx] + self.final_norm = RMSNorm() + self.refiner = CausalConvRefiner(model_dim, kernel_size=refiner_kernel) if refiner else None + self.mtp_heads = nn.ModuleList([ + nn.Linear(model_dim, vocab_size, bias=False) for _ in range(mtp_heads_count) + ]) + for h in self.mtp_heads: + nn.init.zeros_(h.weight) + 
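The asymmetric U-Net wiring set up here (7 encoder and 8 decoder layers for `num_layers=15`, with `min(enc, dec)` learned per-channel skip weights) reduces to a small stack discipline. A minimal sketch, using toy callables in place of the real `Block` modules (`unet_forward` is a hypothetical name, not part of the script):

```python
import torch

def unet_forward(blocks, skip_weights, x, num_encoder_layers):
    # Encoder half: run the first num_encoder_layers blocks, pushing each
    # output onto a stack of skip activations.
    skips = []
    for i in range(num_encoder_layers):
        x = blocks[i](x)
        skips.append(x)
    # Decoder half: pop skips back in (last-in-first-out), scaled by a learned
    # per-channel weight; with more decoder than encoder layers the stack
    # simply runs dry for the deepest decoder blocks.
    for i in range(num_encoder_layers, len(blocks)):
        if skips:
            x = x + skip_weights[i - num_encoder_layers] * skips.pop()
        x = blocks[i](x)
    return x
```

Because the stack is LIFO, the shallowest decoder layer receives the deepest encoder activation, and the surplus decoder layers run without a skip input.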
self.logit_head_type = logit_head_type + if logit_head_type == "tversky" and tversky_num_features == 0 and vocab_size > 1024: + raise ValueError( + f"Tversky logit head with no-features mode creates O(V^2) = {vocab_size}x{vocab_size} " + f"matrix per forward pass. Use tversky_num_features > 0 or a smaller vocab." + ) + self.tversky_head = TverskyProjection( + model_dim, vocab_size, num_features=tversky_num_features, + membership=tversky_membership, + ) if logit_head_type == "tversky" else None + self.lm_head = QATLinear(model_dim, vocab_size, bias=False, fp_storage=fp_storage) + self.lm_head._zero_init = True + if self.lm_head is not None and (tie_embeddings or logit_head_type == "tversky"): + self.lm_head.weight.requires_grad_(False) + self.vocab_bias = nn.Parameter(torch.zeros(vocab_size, dtype=torch.float32)) + self._init_weights(tied_embed_init_std) + def _init_weights(self, tied_embed_init_std: float) -> None: + if self.tie_embeddings: + nn.init.normal_(self.tok_emb.weight, mean=0.0, std=tied_embed_init_std) + for module in self.modules(): + if isinstance(module, BinaryLinear) and not getattr(module, "_zero_init", False): + nn.init.normal_(module.weight, mean=0.0, std=0.02) + elif isinstance(module, nn.Linear) and getattr(module, "_zero_init", False): + nn.init.zeros_(module.weight) + def _compute_logits(self, x: Tensor) -> Tensor: + if self.tversky_head is not None: + logits_raw = self.tversky_head(x) + elif self.tie_embeddings: + if self.embed_proj_rev is not None: + proj = self.embed_proj_rev(x) + else: + proj = x + weight = self.tok_emb.weight + if self.lm_head_correction is not None: + weight = weight + self.lm_head_correction + logits_raw = F.linear(proj, weight.to(x.dtype)) + else: + logits_raw = self.lm_head(x) + return logits_raw + self.vocab_bias.to(x.dtype) + def _softcap(self, logits: Tensor) -> Tensor: + s = self.logit_softcap + if self.softcap_type == "tanh": + return s * torch.tanh(logits / s) + x_sc = torch.clamp(logits / s, -2.0, 2.0) + x2 = 
x_sc * x_sc + return s * torch.clamp(x_sc * (1.0 - x2 / 3.0 + x2 * x2 / 15.0), -1.0, 1.0) + def forward(self, input_ids: Tensor, target_ids: Tensor, reduction: str = "mean", temperature: float = 1.0) -> Tensor: + x = self.tok_emb(input_ids).float() + if self.bigram_emb is not None: + prev = F.pad(input_ids[:, :-1], (1, 0), value=0) + x = x + self.bigram_emb(prev).float() + if self.embed_proj is not None: + x = self.embed_proj(x) + x = F.rms_norm(x, (x.size(-1),)) + x0 = x + # U-Net style encoder/decoder with skip connections + skips = [] + for i in range(self.num_encoder_layers): + for _ in range(max(1, self.training_depth_recurrence)): + x = self.blocks[i](x, x0) + skips.append(x) + for i in range(self.num_decoder_layers): + bi = self.num_encoder_layers + i + if skips: + x = x + self.skip_weights[i].to(dtype=x.dtype) * skips.pop() + for _ in range(max(1, self.training_depth_recurrence)): + x = self.blocks[bi](x, x0) + x_normed = self.final_norm(x) + if self.refiner is not None: + x_normed = self.refiner(x_normed) + # Standard training/eval path + x_flat = x_normed.reshape(-1, x_normed.size(-1)) + targets = target_ids.reshape(-1) + logits = self._softcap(self._compute_logits(x_flat)) + logits = logits / temperature # calibration temperature from find_temp (T=1.0 is a no-op) + if reduction == "none": + return F.cross_entropy(logits.float(), targets, reduction="none").reshape(input_ids.shape) + # Fused CE + Z-loss: single logsumexp computation + logits_f = logits.float() + lse = torch.logsumexp(logits_f, dim=-1) + target_logits = logits_f.gather(1, targets.unsqueeze(1)).squeeze(1) + main_loss = (lse - target_logits).mean() + 1e-4 * (lse ** 2).mean() + # Multi-token prediction auxiliary loss (training only) + if self.training and len(self.mtp_heads) > 0: + mtp_loss = torch.zeros((), device=main_loss.device) + for k, head in enumerate(self.mtp_heads): + shift = k + 2 + if target_ids.shape[1] > shift: + mtp_tgt = target_ids[:, shift:].reshape(-1) + mtp_in = x_normed[:, :target_ids.shape[1] - shift, :].reshape(-1, x_normed.shape[-1]) + mtp_loss = mtp_loss + 
F.cross_entropy(head(mtp_in).float(), mtp_tgt, reduction="mean") + main_loss = main_loss + 0.1 * mtp_loss / len(self.mtp_heads) + return main_loss + +# --------------------------------------------------------------------------- +# Validation +# --------------------------------------------------------------------------- +def build_luts(sp, vocab_size: int, device: torch.device): + sp_vocab_size = int(sp.vocab_size()) + table_size = max(sp_vocab_size, vocab_size) + base_bytes_np = np.zeros((table_size,), dtype=np.int16) + has_leading_space_np = np.zeros((table_size,), dtype=np.bool_) + is_boundary_token_np = np.ones((table_size,), dtype=np.bool_) + for token_id in range(sp_vocab_size): + if sp.is_control(token_id) or sp.is_unknown(token_id) or sp.is_unused(token_id): + continue + is_boundary_token_np[token_id] = False + if sp.is_byte(token_id): + base_bytes_np[token_id] = 1 + continue + piece = sp.id_to_piece(token_id) + if piece.startswith("\u2581"): + has_leading_space_np[token_id] = True + piece = piece[1:] + base_bytes_np[token_id] = len(piece.encode("utf-8")) + return ( + torch.tensor(base_bytes_np, dtype=torch.int16, device=device), + torch.tensor(has_leading_space_np, dtype=torch.bool, device=device), + torch.tensor(is_boundary_token_np, dtype=torch.bool, device=device), + ) + +def ld_val(pattern, seq_len, max_tok=int(os.environ.get("VAL_MAX_TOKENS", 500000))): + files = sorted(glob.glob(pattern)) + assert files, f"No files: {pattern}" + tok = torch.cat([ld_shard(Path(p)) for p in files]).contiguous() + if max_tok > 0: tok = tok[:max_tok + 1] + u = ((tok.numel() - 1) // seq_len) * seq_len + return tok[:u + 1] + +def eval_val(args, model, rank, world_size, device, grad_accum_steps, val_tokens, + base_bytes_lut, has_leading_space_lut, is_boundary_token_lut, temperature: float = 1.0): + local_batch_tokens = args.val_batch_size // (world_size * grad_accum_steps) + local_batch_seqs = max(1, local_batch_tokens // args.train_seq_len) + total_seqs = 
(val_tokens.numel() - 1) // args.train_seq_len + seq_start = (total_seqs * rank) // world_size + seq_end = (total_seqs * (rank + 1)) // world_size + loss_sum = torch.zeros((), device=device, dtype=torch.float64) + token_count = torch.zeros((), device=device, dtype=torch.float64) + byte_count = torch.zeros((), device=device, dtype=torch.float64) + model.eval() + with torch.inference_mode(): + for batch_start in range(seq_start, seq_end, local_batch_seqs): + batch_end = min(batch_start + local_batch_seqs, seq_end) + raw_start = batch_start * args.train_seq_len + raw_end = batch_end * args.train_seq_len + 1 + local = val_tokens[raw_start:raw_end].to(device=device, dtype=torch.int64) + x, y = local[:-1].reshape(-1, args.train_seq_len), local[1:].reshape(-1, args.train_seq_len) + with torch.autocast(device_type="cuda", dtype=torch.bfloat16): + batch_loss = model(x, y, temperature=temperature).detach() + n = float(y.numel()) + loss_sum += batch_loss.to(torch.float64) * n + token_count += n + prev_ids, tgt_ids = x.reshape(-1), y.reshape(-1) + tok_bytes = base_bytes_lut[tgt_ids].to(torch.int16) + tok_bytes += (has_leading_space_lut[tgt_ids] & ~is_boundary_token_lut[prev_ids]).to(torch.int16) + byte_count += tok_bytes.to(torch.float64).sum() + if dist.is_available() and dist.is_initialized(): + for t in (loss_sum, token_count, byte_count): + dist.all_reduce(t, op=dist.ReduceOp.SUM) + val_loss = loss_sum / token_count + bpb = (val_loss.item() / math.log(2.0)) * (token_count.item() / byte_count.item()) + model.train() + return float(val_loss.item()), float(bpb) + +def eval_val_sliding(args, model, rank, world_size, device, grad_accum_steps, val_tokens, + base_bytes_lut, has_leading_space_lut, is_boundary_token_lut, + stride: int = 64, temperature: float = 1.0): + seq_len = args.train_seq_len + batch_size = args.sliding_batch_size + total_tokens = val_tokens.numel() - 1 + loss_sum = torch.zeros((), device=device, dtype=torch.float64) + token_count = torch.zeros((), 
device=device, dtype=torch.float64) + byte_count = torch.zeros((), device=device, dtype=torch.float64) + all_starts = list(range(0, total_tokens - seq_len, stride)) + my_starts = all_starts[rank::world_size] + model.eval() + with torch.inference_mode(): + for i in range(0, len(my_starts), batch_size): + batch_starts = my_starts[i:i + batch_size] + starts_t = torch.tensor(batch_starts, dtype=torch.int64) + offsets = torch.arange(seq_len + 1, dtype=torch.int64) + indices = starts_t.unsqueeze(1) + offsets.unsqueeze(0) + local_batch = val_tokens[indices].to(device=device, dtype=torch.int64, non_blocking=True) + x = local_batch[:, :-1] + y = local_batch[:, 1:] + with torch.autocast(device_type="cuda", dtype=torch.bfloat16): + per_token_loss = model(x, y, reduction="none", temperature=temperature).detach() + for b, start in enumerate(batch_starts): + score_from = 0 if start == 0 else seq_len - stride + scored = per_token_loss[b, score_from:] + sx, sy = x[b, score_from:], y[b, score_from:] + loss_sum += scored.to(torch.float64).sum() + token_count += scored.numel() + tok_bytes = base_bytes_lut[sy].to(torch.int16) + tok_bytes += (has_leading_space_lut[sy] & ~is_boundary_token_lut[sx]).to(torch.int16) + byte_count += tok_bytes.to(torch.float64).sum() + if dist.is_available() and dist.is_initialized(): + for t in (loss_sum, token_count, byte_count): + dist.all_reduce(t, op=dist.ReduceOp.SUM) + val_loss = loss_sum / token_count + bpb = (val_loss.item() / math.log(2.0)) * (token_count.item() / byte_count.item()) + model.train() + return float(val_loss.item()), float(bpb) + +# --------------------------------------------------------------------------- +# Temperature scaling +# --------------------------------------------------------------------------- +def find_temp(args, base_model, rank, world_size, device, grad_accum_steps, + calibration_tokens, base_bytes_lut, has_leading_space_lut, + is_boundary_token_lut): + best_t, best_loss = 1.0, float("inf") + for t in [0.90, 0.95, 
1.00, 1.05, 1.10]: + loss, _ = eval_val(args, base_model, rank, world_size, device, grad_accum_steps, + calibration_tokens, base_bytes_lut, has_leading_space_lut, + is_boundary_token_lut, temperature=t) + if loss < best_loss: + best_loss = loss + best_t = t + return best_t + +# --------------------------------------------------------------------------- +# Training +# --------------------------------------------------------------------------- +def main() -> None: + args = Hyperparameters() + code = Path(__file__).read_text(encoding="utf-8") + if args.matrix_optimizer != "adamw": + global ns_orth + ns_orth = torch.compile(ns_orth) + distributed = "RANK" in os.environ and "WORLD_SIZE" in os.environ + rank = int(os.environ.get("RANK", "0")) + world_size = int(os.environ.get("WORLD_SIZE", "1")) + local_rank = int(os.environ.get("LOCAL_RANK", "0")) + grad_accum_steps = max(1, 8 // world_size) + grad_scale = 1.0 / grad_accum_steps + if not torch.cuda.is_available(): + raise RuntimeError("CUDA is required") + device = torch.device("cuda", local_rank) + torch.cuda.set_device(device) + if distributed: + dist.init_process_group(backend="nccl", device_id=device) + dist.barrier() + master_process = rank == 0 + torch.backends.cuda.matmul.allow_tf32 = True + torch.backends.cudnn.allow_tf32 = True + os.makedirs("logs/cuda/", exist_ok=True) + logfile = f"logs/cuda/{args.run_id}.txt" if master_process else None + if master_process: + print(logfile) + def log0(msg: str, console: bool = True) -> None: + if not master_process: + return + if console: + print(msg) + if logfile: + with open(logfile, "a", encoding="utf-8") as f: + print(msg, file=f) + log0(code, console=False) + log0("=" * 100, console=False) + log0(f"Python {sys.version}", console=False) + log0(f"PyTorch {torch.__version__}", console=False) + random.seed(args.seed) + np.random.seed(args.seed) + torch.manual_seed(args.seed) + torch.cuda.manual_seed_all(args.seed) + sp = 
spm.SentencePieceProcessor(model_file=args.tokenizer_path) + val_tokens = ld_val(args.val_files, args.train_seq_len) + base_bytes_lut, has_leading_space_lut, is_boundary_token_lut = build_luts( + sp, args.vocab_size, device) + + # --- Model --- + base_model = GPT( + vocab_size=args.vocab_size, num_layers=args.num_layers, model_dim=args.model_dim, + num_heads=args.num_heads, num_kv_heads=args.num_kv_heads, mlp_mult=args.mlp_mult, + tie_embeddings=args.tie_embeddings, tied_embed_init_std=args.tied_embed_init_std, + logit_softcap=args.logit_softcap, rope_base=args.rope_base, qk_gain_init=args.qk_gain_init, + group_size=args.bitnet_group_size, activation=args.activation_type, mtp_heads_count=args.mtp_heads_count, + embed_dim=args.embed_dim, attn_proj_type=args.attn_proj_type, logit_head_type=args.logit_head_type, + tversky_num_features=args.tversky_num_features, tversky_feature_pools=args.tversky_feature_pools, + training_depth_recurrence=args.training_depth_recurrence, fp_storage=args.fp_storage, + bigram_hash=args.bigram_hash, softcap_type=args.softcap_type, no_cache=(args.compile_mode == "reduce-overhead"), + smear=args.smear, rope_type=args.rope_type, yarn_max_len=args.yarn_max_len, train_seq_len=args.train_seq_len, + tversky_membership=args.tversky_membership, diff_attn=args.diff_attn, + refiner=args.refiner, refiner_kernel=args.refiner_kernel, mlp_groups=args.mlp_groups, + ).to(device).bfloat16() + for module in base_model.modules(): + if isinstance(module, nn.Linear): + module.float() + restore_low_dim_params_to_fp32(base_model) + if base_model.lm_head is not None and (args.tie_embeddings or args.logit_head_type == "tversky"): + base_model.lm_head.weight.requires_grad_(False) + torch._dynamo.config.optimize_ddp = False + compiled_model = torch.compile(base_model, mode=args.compile_mode if args.compile_mode != "default" else None) + use_find_unused = args.untie_at_fraction > 0 or args.mtp_heads_count > 0 or not args.tie_embeddings + model = DDP(compiled_model, 
device_ids=[local_rank], broadcast_buffers=False, + find_unused_parameters=use_find_unused, + static_graph=not use_find_unused, + gradient_as_bucket_view=True) if distributed else compiled_model + + # --- Optimizers --- + _excl = {"tok_emb.weight", "lm_head.weight", "lm_head_correction"} + all_other_params = [(n, p) for n, p in base_model.named_parameters() + if not any(eh in n for eh in _excl)] + matrix_params = [p for n, p in all_other_params + if p.ndim == 2 and not any(pat in n for pat in CTP)] + scalar_params = [p for n, p in all_other_params + if p.ndim < 2 or any(pat in n for pat in CTP)] + token_lr = args.tied_embed_lr if args.tie_embeddings else args.embed_lr + opt_tok = torch.optim.Adam( + [{"params": [base_model.tok_emb.weight], "lr": token_lr, "base_lr": token_lr}], + betas=(args.beta1, args.beta2), eps=args.adam_eps, fused=True) + if args.matrix_optimizer == "adamw": + opt_muon = torch.optim.AdamW( + [{"params": matrix_params, "lr": args.adam_lr, "base_lr": args.adam_lr}], + betas=(args.beta1, args.beta2), eps=args.adam_eps, weight_decay=args.adam_wd, fused=True) + else: + opt_muon = Muon(matrix_params, lr=args.matrix_lr, momentum=args.muon_momentum, + backend_steps=args.muon_backend_steps, wd=args.muon_wd) + for g in opt_muon.param_groups: + g["base_lr"] = args.matrix_lr + opt_scalar = torch.optim.Adam( + [{"params": scalar_params, "lr": args.scalar_lr, "base_lr": args.scalar_lr}], + betas=(args.beta1, args.beta2), eps=args.adam_eps, fused=True) + opt_head = torch.optim.Adam( + [{"params": [base_model.lm_head.weight], "lr": 0.0, "base_lr": 0.0}], + betas=(args.beta1, args.beta2), eps=args.adam_eps, fused=True) + optimizers = [opt for opt in [opt_tok, opt_muon, opt_scalar, opt_head] if opt is not None] + if base_model.lm_head_correction is not None: + opt_corr = torch.optim.Adam( + [{"params": [base_model.lm_head_correction], + "lr": args.corr_weight_lr, "base_lr": args.corr_weight_lr}], + betas=(args.beta1, args.beta2), eps=args.adam_eps, fused=True) 
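The optimizer partitioning above boils down to a name/ndim filter: 2-D weight matrices go to the matrix optimizer, everything low-dimensional goes to plain Adam, and embedding/head weights are excluded and handled separately. A minimal sketch (`split_params` is a hypothetical helper; the real code additionally routes matrices matching the `CTP` patterns into the scalar group):

```python
import torch
from torch import nn

def split_params(model, exclude=("tok_emb.weight", "lm_head.weight")):
    # 2-D weight matrices -> matrix optimizer (Muon or AdamW above);
    # gains, norms, biases and other <2-D parameters -> plain Adam.
    # Embedding and head weights are excluded and get their own optimizers.
    matrix, scalar = [], []
    for name, p in model.named_parameters():
        if any(e in name for e in exclude):
            continue
        (matrix if p.ndim == 2 else scalar).append(p)
    return matrix, scalar
```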
+ optimizers.append(opt_corr) + + # --- Log all hyperparameters --- + log0("--- Hyperparameters ---", console=False) + log0(" ".join(f"{a}={getattr(args,a)}" for a in sorted(dir(args)) if not a.startswith("_") and a not in ("train_files","val_files") and not callable(getattr(args,a))), console=False) + n_params = sum(p.numel() for p in base_model.parameters()) + log0(f"params:{n_params} L:{args.num_layers} d:{args.model_dim} h:{args.num_heads} kv:{args.num_kv_heads} ws:{world_size} ga:{grad_accum_steps} s:{args.seed}") + # --- Data loader & helpers --- + train_loader = DistributedTokenLoader(args.train_files, rank, world_size, device) + def zero_grad_all(): + for opt in optimizers: + opt.zero_grad(set_to_none=True) + max_wallclock_ms = 1000.0 * args.max_wallclock_seconds if args.max_wallclock_seconds > 0 else None + def lr_mul(step: int, elapsed_ms: float): + if args.warmdown_fraction <= 0: + return 1.0 + if max_wallclock_ms is None: + warmdown_start = int(args.iterations * (1.0 - args.warmdown_fraction)) + return max((args.iterations - step) / max(args.iterations * args.warmdown_fraction, 1), 0.0) if step >= warmdown_start else 1.0 + warmdown_ms = max_wallclock_ms * args.warmdown_fraction + remaining_ms = max(max_wallclock_ms - elapsed_ms, 0.0) + return remaining_ms / max(warmdown_ms, 1e-9) if remaining_ms <= warmdown_ms else 1.0 + _seq_switched = False + _batch_switched = False + active_seq_len = args.seq_len_start if args.seq_len_start > 0 else args.train_seq_len + active_batch_tokens = args.batch_tokens_start if args.batch_tokens_start > 0 else args.train_batch_tokens + # --- Compiler warmup --- + if args.warmup_steps > 0: + _ms = {n: t.detach().cpu().clone() for n, t in base_model.state_dict().items()} + _os = [copy.deepcopy(o.state_dict()) for o in optimizers] + model.train() + for ws in range(args.warmup_steps): + zero_grad_all() + for mi in range(grad_accum_steps): + if distributed: model.require_backward_grad_sync = mi == grad_accum_steps - 1 + x, y = 
train_loader.next_batch(active_batch_tokens, active_seq_len, grad_accum_steps) + torch.compiler.cudagraph_mark_step_begin() + with torch.autocast(device_type="cuda", dtype=torch.bfloat16): loss = model(x, y) + (loss * grad_scale).backward() + for o in optimizers: o.step() + zero_grad_all() + log0(f"warmup:{ws+1}/{args.warmup_steps}") + base_model.load_state_dict(_ms, strict=True) + for o, s in zip(optimizers, _os): o.load_state_dict(s) + zero_grad_all() + train_loader = DistributedTokenLoader(args.train_files, rank, world_size, device) + + # --- EMA model --- + ema_model = None + _ema_started = False + _ema_steps = 0 + if args.ema: + ema_model = copy.deepcopy(base_model) + for p in ema_model.parameters(): + p.requires_grad_(False) + + # --- Main training loop --- + training_time_ms = 0.0 + stop_after_step: int | None = None + _untied = False + train_loss = torch.zeros((), device=device) + torch.cuda.synchronize() + t0 = time.perf_counter() + step = 0 + while True: + last_step = step == args.iterations or (stop_after_step is not None and step >= stop_after_step) + if last_step or (args.val_loss_every > 0 and step % args.val_loss_every == 0): + torch.cuda.synchronize() + training_time_ms += 1000.0 * (time.perf_counter() - t0) + val_loss, val_bpb = eval_val(args, model, rank, world_size, device, grad_accum_steps, + val_tokens, base_bytes_lut, has_leading_space_lut, is_boundary_token_lut) + log0(f"step:{step}/{args.iterations} val_loss:{val_loss:.4f} val_bpb:{val_bpb:.4f} " + f"train_time:{training_time_ms:.0f}ms") + torch.cuda.synchronize() + t0 = time.perf_counter() + if last_step: + if stop_after_step is not None and step < args.iterations: + log0(f"stopping_early: wallclock_cap train_time:{training_time_ms:.0f}ms step:{step}/{args.iterations}") + break + elapsed_ms = training_time_ms + 1000.0 * (time.perf_counter() - t0) + scale = lr_mul(step, elapsed_ms) + # Sequence length schedule + if args.seq_len_start > 0 and not _seq_switched: + if max_wallclock_ms is not 
None: + should_switch_seq = elapsed_ms >= args.seq_schedule_fraction * max_wallclock_ms + else: + should_switch_seq = step >= int(args.iterations * args.seq_schedule_fraction) + if should_switch_seq: + active_seq_len = args.train_seq_len + _seq_switched = True + torch._dynamo.reset() + train_loader = DistributedTokenLoader(args.train_files, rank, world_size, device) + log0(f"step:{step} seq_len_switch:{args.seq_len_start}->{active_seq_len}") + + # Batch size schedule + if args.batch_tokens_start > 0 and not _batch_switched: + if max_wallclock_ms is not None: + should_switch_batch = elapsed_ms >= args.batch_schedule_fraction * max_wallclock_ms + else: + should_switch_batch = step >= int(args.iterations * args.batch_schedule_fraction) + if should_switch_batch: + active_batch_tokens = args.train_batch_tokens + _batch_switched = True + log0(f"step:{step} batch_switch:{args.batch_tokens_start}->{active_batch_tokens}") + zero_grad_all() + train_loss.zero_() + for micro in range(grad_accum_steps): + if distributed: + model.require_backward_grad_sync = micro == grad_accum_steps - 1 + x, y = train_loader.next_batch(active_batch_tokens, active_seq_len, grad_accum_steps) + torch.compiler.cudagraph_mark_step_begin() + with torch.autocast(device_type="cuda", dtype=torch.bfloat16): + loss = model(x, y) + train_loss.add_(loss.detach()) + (loss * grad_scale).backward() + train_loss /= grad_accum_steps + + # Untie lm_head at configured fraction of training + if args.untie_at_fraction > 0: + if max_wallclock_ms is not None: + should_untie = not _untied and elapsed_ms >= args.untie_at_fraction * max_wallclock_ms + else: + should_untie = not _untied and step >= int(args.iterations * args.untie_at_fraction) + if should_untie and base_model.tie_embeddings: + with torch.no_grad(): + base_weight = base_model.tok_emb.weight.float() + if base_model.lm_head_correction is not None: + base_weight = base_weight + base_model.lm_head_correction.float() + if base_model.embed_proj_rev is not None: 
+ full_weight = base_weight @ base_model.embed_proj_rev.weight.float() + else: + full_weight = base_weight + base_model.lm_head.weight.copy_(full_weight) + base_model.tie_embeddings = False + base_model.lm_head.weight.requires_grad_(True) + for g in opt_head.param_groups: + g["lr"] = g["base_lr"] = args.head_lr + _untied = True + torch._dynamo.reset() + log0(f"step:{step} untied lm_head (head_lr={args.head_lr})") + + # Muon momentum warmup + if args.matrix_optimizer != "adamw": + frac = min(step / args.muon_momentum_warmup_steps, 1.0) if args.muon_momentum_warmup_steps > 0 else 1.0 + for g in opt_muon.param_groups: + g["momentum"] = (1 - frac) * args.muon_momentum_warmup_start + frac * args.muon_momentum + + # LR scheduling + for opt in optimizers: + for g in opt.param_groups: + g["lr"] = g["base_lr"] * scale + opt.step() + zero_grad_all() + # EMA update + if ema_model is not None: + if not _ema_started: + if max_wallclock_ms is not None: + should_start_ema = elapsed_ms >= args.ema_start_fraction * max_wallclock_ms + else: + should_start_ema = step >= int(args.iterations * args.ema_start_fraction) + if should_start_ema: + _ema_started = True + _ema_steps = 0 + with torch.no_grad(): + for ep, bp in zip(ema_model.parameters(), base_model.parameters()): + ep.data.copy_(bp.data) + log0(f"step:{step} ema_started") + if _ema_started: + _ema_steps += 1 + decay = min(args.ema_decay, (1.0 + _ema_steps) / (10.0 + _ema_steps)) + with torch.no_grad(): + for ep, bp in zip(ema_model.parameters(), base_model.parameters()): + ep.data.mul_(decay).add_(bp.data, alpha=1.0 - decay) + step += 1 + approx_ms = training_time_ms + 1000.0 * (time.perf_counter() - t0) + + if args.train_log_every > 0 and step % args.train_log_every == 0: + log0(f"step:{step}/{args.iterations} loss:{train_loss.item():.4f} t:{approx_ms:.0f}ms avg:{approx_ms/step:.1f}ms") + if args.churn_log_every > 0 and step % args.churn_log_every == 0: + log0(f"step:{step} churn:{churn_fn(base_model, 
args.bitnet_group_size):.4f}") + # Wallclock cap sync + if stop_after_step is None and max_wallclock_ms is not None and step % 10 == 0: + reached_cap = approx_ms >= max_wallclock_ms + if distributed: + cap_t = torch.tensor(int(reached_cap), device=device) + dist.all_reduce(cap_t, op=dist.ReduceOp.MAX) + reached_cap = bool(cap_t.item()) + if reached_cap: + stop_after_step = step + + # --- Serialization --- + if master_process: + sd = (ema_model if ema_model is not None and _ema_started else base_model).state_dict() + if base_model.tie_embeddings or args.logit_head_type == "tversky": + sd.pop("lm_head.weight", None) + + # Compute binary overrides for no-features Tversky prototypes + binary_overrides = set() + for n, m in base_model.named_modules(): + if isinstance(m, TverskyProjection) and m.no_features_mode: + binary_overrides.add(n + ".prototypes") + binary_overrides = binary_overrides or None + q_obj, q_stats = q_sd(sd, group_size=args.bitnet_group_size, fp_storage=args.fp_storage, binary_override_names=binary_overrides) + buf = io.BytesIO() + torch.save(q_obj, buf) + final_blob = lzma.compress(buf.getvalue(), preset=9) + with open("final_model.binary.ptz", "wb") as f: + f.write(final_blob) + artifact_bytes = len(final_blob) + code_bytes = len(code.encode("utf-8")) + total = artifact_bytes + code_bytes + log0(f"artifact:{artifact_bytes/1e6:.2f}MB binary:{q_stats['binary_params']}({q_stats['binary_bytes']}B) fp:{q_stats['fp_params']}({q_stats['fp_bytes']}B) code:{code_bytes}") + log0(f"budget:{total}/{16000000} ({total/1e6:.2f}/{16.00:.2f}MB) {'FITS' if total <= 16000000 else 'OVER'}") + if args.eval_depth_recurrence > 0: + base_model.training_depth_recurrence = args.eval_depth_recurrence + log0(f"eval_depth_recurrence:{args.eval_depth_recurrence}") + + # --- All ranks load roundtrip weights and evaluate --- + if distributed: + dist.barrier() + with open("final_model.binary.ptz", "rb") as f: + loaded = torch.load(io.BytesIO(lzma.decompress(f.read())), 
map_location="cpu", weights_only=False)
+        base_model.load_state_dict(deq_sd(loaded), strict=False)
+        if ema_model is not None:
+            ema_model.load_state_dict(deq_sd(loaded), strict=False)
+        torch._dynamo.reset()
+        q_val_loss, q_val_bpb = eval_val(args, model, rank, world_size, device, grad_accum_steps,
+                                         val_tokens, base_bytes_lut, has_leading_space_lut, is_boundary_token_lut)
+        log0(f"final_binary_roundtrip val_loss:{q_val_loss:.4f} val_bpb:{q_val_bpb:.4f}")
+
+    opt_temp = 1.0
+    if args.temp_scaling:
+        torch.cuda.synchronize()
+        t_temp = time.perf_counter()
+        calibration_tokens = train_loader.stream.take(65536).to(device)
+        opt_temp = find_temp(args, base_model, rank, world_size, device, grad_accum_steps,
+                             calibration_tokens, base_bytes_lut, has_leading_space_lut,
+                             is_boundary_token_lut)
+        torch.cuda.synchronize()
+        temp_time_ms = 1000.0 * (time.perf_counter() - t_temp)
+        log0(f"temp_scaling optimal_T:{opt_temp:.2f} eval_time:{temp_time_ms:.0f}ms")
+
+    if args.sliding_eval:
+        torch.cuda.synchronize()
+        t_sliding = time.perf_counter()
+        sw_loss, sw_bpb = eval_val_sliding(args, base_model, rank, world_size, device, grad_accum_steps,
+                                           val_tokens, base_bytes_lut, has_leading_space_lut,
+                                           is_boundary_token_lut, stride=args.sliding_eval_stride,
+                                           temperature=opt_temp)
+        torch.cuda.synchronize()
+        sliding_time_ms = 1000.0 * (time.perf_counter() - t_sliding)
+        log0(f"final_sliding val_loss:{sw_loss:.4f} val_bpb:{sw_bpb:.4f} "
+             f"(stride={args.sliding_eval_stride}, T={opt_temp:.2f}) eval_time:{sliding_time_ms:.0f}ms")
+
+    if distributed:
+        dist.destroy_process_group()
+
+
+if __name__ == "__main__":
+    main()
+
+====================================================================================================
+Python 3.13.12 | packaged by Anaconda, Inc. | (main, Feb 24 2026, 16:13:31) [GCC 14.3.0]
+PyTorch 2.10.0+cu128
+--- Hyperparameters ---
+activation_type=relu2 adam_eps=1e-08 adam_lr=0.05 adam_wd=0.05 attn_proj_type=standard batch_schedule_fraction=0.33 batch_tokens_start=0 beta1=0.9 beta2=0.95 bigram_hash=False bitnet_group_size=128 churn_log_every=1000 compile_mode=default corr_weight_lr=0.02 data_path=./data/datasets/fineweb10B_sp8192 diff_attn=False ema=False ema_decay=0.995 ema_start_fraction=0.5 embed_dim=254 embed_lr=0.6 eval_depth_recurrence=0 fp_storage=True grad_clip_norm=0.0 head_lr=0.02 iterations=50000 logit_head_type=standard logit_softcap=10.0 matrix_lr=0.04 matrix_optimizer=muon max_wallclock_seconds=0.0 mlp_groups=0 mlp_mult=4 model_dim=768 mtp_heads_count=0 muon_backend_steps=3 muon_momentum=0.95 muon_momentum_warmup_start=0.85 muon_momentum_warmup_steps=500 muon_wd=0.0 num_heads=8 num_kv_heads=4 num_layers=15 qk_gain_init=2.25 refiner=False refiner_kernel=3 rope_base=5000.0 rope_type=yarn run_id=pushing_run_binary_1 scalar_lr=0.02 seed=42 seq_len_start=0 seq_schedule_fraction=0.0 sliding_batch_size=256 sliding_eval=True sliding_eval_stride=16 smear=True softcap_type=poly temp_scaling=True tie_embeddings=1 tied_embed_init_std=0.005 tied_embed_lr=0.02 tokenizer_path=./data/tokenizers/fineweb_8192_bpe.model train_batch_tokens=524288 train_log_every=500 train_seq_len=1024 training_depth_recurrence=0 tversky_feature_pools=0 tversky_membership=sigmoid tversky_num_features=0 untie_at_fraction=0.0 val_batch_size=524288 val_loss_every=0 vocab_size=8192 warmdown_fraction=0.2 warmup_steps=5 yarn_max_len=2048
+params:106154616 L:15 d:768 h:8 kv:4 ws:8 ga:1 s:42
+warmup:1/5
+warmup:2/5
+warmup:3/5
+warmup:4/5
+warmup:5/5
+step:500/50000 loss:3.6805 t:77540ms avg:155.1ms
+step:1000/50000 loss:3.3485 t:155075ms avg:155.1ms
+step:1000 churn:0.0000
+step:1500/50000 loss:3.3714 t:232880ms avg:155.3ms
+step:2000/50000 loss:3.3187 t:310516ms avg:155.3ms
+step:2000 churn:0.1984
+step:2500/50000 loss:3.2573 t:388417ms avg:155.4ms
+step:3000/50000 loss:3.1844 t:465980ms avg:155.3ms
+step:3000 churn:0.1457
+step:3500/50000 loss:3.3885 t:543772ms avg:155.4ms
+step:4000/50000 loss:3.3496 t:621381ms avg:155.3ms
+step:4000 churn:0.1252
+step:4500/50000 loss:3.3527 t:699211ms avg:155.4ms
+step:5000/50000 loss:3.2171 t:776797ms avg:155.4ms
+step:5000 churn:0.1151
+step:5500/50000 loss:3.0536 t:854512ms avg:155.4ms
+step:6000/50000 loss:3.1355 t:932007ms avg:155.3ms
+step:6000 churn:0.1087
+step:6500/50000 loss:3.1928 t:1009731ms avg:155.3ms
+step:7000/50000 loss:3.2378 t:1087253ms avg:155.3ms
+step:7000 churn:0.1041
+step:7500/50000 loss:3.1585 t:1164994ms avg:155.3ms
+step:8000/50000 loss:3.1436 t:1242513ms avg:155.3ms
+step:8000 churn:0.1009
+step:8500/50000 loss:3.0573 t:1320248ms avg:155.3ms
+step:9000/50000 loss:3.0523 t:1397837ms avg:155.3ms
+step:9000 churn:0.0982
+step:9500/50000 loss:3.3082 t:1475596ms avg:155.3ms
+step:10000/50000 loss:3.3521 t:1553112ms avg:155.3ms
+step:10000 churn:0.0964
+step:10500/50000 loss:3.1877 t:1630835ms avg:155.3ms
+step:11000/50000 loss:2.7388 t:1708388ms avg:155.3ms
+step:11000 churn:0.0948
+step:11500/50000 loss:3.2052 t:1786100ms avg:155.3ms
+step:12000/50000 loss:3.2859 t:1863613ms avg:155.3ms
+step:12000 churn:0.0935
+step:12500/50000 loss:3.0326 t:1941282ms avg:155.3ms
+step:13000/50000 loss:3.2551 t:2018764ms avg:155.3ms
+step:13000 churn:0.0924
+step:13500/50000 loss:3.1339 t:2096463ms avg:155.3ms
+step:14000/50000 loss:3.0606 t:2173965ms avg:155.3ms
+step:14000 churn:0.0915
+step:14500/50000 loss:3.1752 t:2251634ms avg:155.3ms
+step:15000/50000 loss:3.0206 t:2329140ms avg:155.3ms
+step:15000 churn:0.0907
+step:15500/50000 loss:3.2017 t:2406858ms avg:155.3ms
+step:16000/50000 loss:3.1705 t:2484387ms avg:155.3ms
+step:16000 churn:0.0900
+step:16500/50000 loss:3.0774 t:2562139ms avg:155.3ms
+step:17000/50000 loss:3.2494 t:2639671ms avg:155.3ms
+step:17000 churn:0.0894
+step:17500/50000 loss:3.2024 t:2717393ms
+step:18000/50000 loss:3.1627 t:2794977ms avg:155.3ms
+step:18000 churn:0.0888
+step:18500/50000 loss:3.1733 t:2872744ms avg:155.3ms
+step:19000/50000 loss:3.2055 t:2950389ms avg:155.3ms
+step:19000 churn:0.0885
+step:19500/50000 loss:3.2026 t:3028137ms avg:155.3ms
+step:20000/50000 loss:2.9144 t:3105704ms avg:155.3ms
+step:20000 churn:0.0880
+step:20500/50000 loss:3.2154 t:3183466ms avg:155.3ms
+step:21000/50000 loss:3.1016 t:3261044ms avg:155.3ms
+step:21000 churn:0.0878
+step:21500/50000 loss:3.2065 t:3338791ms avg:155.3ms
+step:22000/50000 loss:3.1611 t:3416326ms avg:155.3ms
+step:22000 churn:0.0875
+step:22500/50000 loss:3.2578 t:3494047ms avg:155.3ms
+step:23000/50000 loss:3.0689 t:3571604ms avg:155.3ms
+step:23000 churn:0.0871
+step:23500/50000 loss:3.2047 t:3649319ms avg:155.3ms
+step:24000/50000 loss:3.0689 t:3726856ms avg:155.3ms
+step:24000 churn:0.0868
+step:24500/50000 loss:3.2355 t:3804562ms avg:155.3ms
+step:25000/50000 loss:3.2085 t:3882065ms avg:155.3ms
+step:25000 churn:0.0865
+step:25500/50000 loss:3.2235 t:3959778ms avg:155.3ms
+step:26000/50000 loss:3.2484 t:4037303ms avg:155.3ms
+step:26000 churn:0.0863
+step:26500/50000 loss:3.2419 t:4114994ms avg:155.3ms
+step:27000/50000 loss:3.1215 t:4192502ms avg:155.3ms
+step:27000 churn:0.0861
+step:27500/50000 loss:3.1305 t:4270187ms avg:155.3ms
+step:28000/50000 loss:3.2679 t:4347697ms avg:155.3ms
+step:28000 churn:0.0858
+step:28500/50000 loss:3.1768 t:4425383ms avg:155.3ms
+step:29000/50000 loss:3.1519 t:4502876ms avg:155.3ms
+step:29000 churn:0.0857
+step:29500/50000 loss:3.1614 t:4580510ms avg:155.3ms
+step:30000/50000 loss:3.2341 t:4658001ms avg:155.3ms
+step:30000 churn:0.0855
+step:30500/50000 loss:3.1673 t:4735648ms avg:155.3ms
+step:31000/50000 loss:3.0884 t:4813158ms avg:155.3ms
+step:31000 churn:0.0854
+step:31500/50000 loss:3.0147 t:4890803ms avg:155.3ms
+step:32000/50000 loss:3.1793 t:4968281ms avg:155.3ms
+step:32000 churn:0.0853
+step:32500/50000 loss:3.1626 t:5045990ms
+step:33000/50000 loss:3.3086 t:5123506ms avg:155.3ms
+step:33000 churn:0.0851
+step:33500/50000 loss:2.9607 t:5201190ms avg:155.3ms
+step:34000/50000 loss:3.1584 t:5278703ms avg:155.3ms
+step:34000 churn:0.0850
+step:34500/50000 loss:3.2311 t:5356349ms avg:155.3ms
+step:35000/50000 loss:3.0574 t:5433881ms avg:155.3ms
+step:35000 churn:0.0848
+step:35500/50000 loss:3.1880 t:5511613ms avg:155.3ms
+step:36000/50000 loss:3.0474 t:5589157ms avg:155.3ms
+step:36000 churn:0.0848
+step:36500/50000 loss:3.1925 t:5666894ms avg:155.3ms
+step:37000/50000 loss:3.0935 t:5744417ms avg:155.3ms
+step:37000 churn:0.0847
+step:37500/50000 loss:3.1454 t:5822114ms avg:155.3ms
+step:38000/50000 loss:2.9914 t:5899675ms avg:155.3ms
+step:38000 churn:0.0846
+step:38500/50000 loss:3.1192 t:5977449ms avg:155.3ms
+step:39000/50000 loss:3.1994 t:6055002ms avg:155.3ms
+step:39000 churn:0.0845
+step:39500/50000 loss:3.1586 t:6132704ms avg:155.3ms
+step:40000/50000 loss:3.1402 t:6210265ms avg:155.3ms
+step:40000 churn:0.0845
+step:40500/50000 loss:3.2176 t:6287989ms avg:155.3ms
+step:41000/50000 loss:3.1743 t:6365543ms avg:155.3ms
+step:41000 churn:0.0831
+step:41500/50000 loss:3.1811 t:6443269ms avg:155.3ms
+step:42000/50000 loss:3.0934 t:6520796ms avg:155.3ms
+step:42000 churn:0.0810
+step:42500/50000 loss:3.0804 t:6598538ms avg:155.3ms
+step:43000/50000 loss:3.1341 t:6676105ms avg:155.3ms
+step:43000 churn:0.0788
+step:43500/50000 loss:3.0942 t:6753855ms avg:155.3ms
+step:44000/50000 loss:3.0144 t:6831414ms avg:155.3ms
+step:44000 churn:0.0769
+step:44500/50000 loss:2.8582 t:6909098ms avg:155.3ms
+step:45000/50000 loss:3.3925 t:6986654ms avg:155.3ms
+step:45000 churn:0.0745
+step:45500/50000 loss:3.0488 t:7064379ms avg:155.3ms
+step:46000/50000 loss:2.9942 t:7141950ms avg:155.3ms
+step:46000 churn:0.0721
+step:46500/50000 loss:3.0737 t:7219653ms avg:155.3ms
+step:47000/50000 loss:3.1052 t:7297260ms avg:155.3ms
+step:47000 churn:0.0688
+step:47500/50000 loss:3.1031 t:7375013ms
+step:48000/50000 loss:3.0978 t:7452604ms avg:155.3ms
+step:48000 churn:0.0648
+step:48500/50000 loss:3.0704 t:7530338ms avg:155.3ms
+step:49000/50000 loss:3.0631 t:7607877ms avg:155.3ms
+step:49000 churn:0.0586
+step:49500/50000 loss:2.9547 t:7685573ms avg:155.3ms
+step:50000/50000 loss:3.0994 t:7763153ms avg:155.3ms
+step:50000 churn:0.0453
+step:50000/50000 val_loss:2.9692 val_bpb:1.1497 train_time:7763355ms
+artifact:15.60MB binary:97320960(13685760B) fp:2542200(2585072B) code:70399
+budget:15670651/16000000 (15.67/16.00MB) FITS
+final_binary_roundtrip val_loss:2.9743 val_bpb:1.1516
+temp_scaling optimal_T:0.90 eval_time:245ms
+final_sliding val_loss:2.9027 val_bpb:1.1239 (stride=16, T=0.90) eval_time:768782ms
diff --git a/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/fineweb_8192_bpe.model b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/fineweb_8192_bpe.model
new file mode 100644
index 0000000000000000000000000000000000000000..6574784f5f13df431a81b76359dfc0c643b4c679
GIT binary patch
literal 370917
z5DySe$gygo=YT`hU$kZR$g_>F!{=8Zr?w}}J|!fR^cVj~RnXeyS1(piMmJO_y#|cU zi7k#xc^k+FV+|H--z34NA79}Nyw0PAW$9nNZwrFlsaRw5u>p2L#qN+qYC}0cY+Z2R zQ*-1FoE4`#hD#XeE~KXw;qWqK9o4RSNBL1EqW(Y+3MG6{L0}#r+w)IlE;oQFiE7U^ zmLZ35E{PLvpT$PN&dm47Pjh<%Vc|I+$b_{ApH4IoK}&CUC0I@+r9E z*~uV3zfJS~$X0|bn!#bbAH2p5u=KhTMWcy(7z={zo^vgrtTC$Rl{UC!DoL`y576gjY1*O@x`Wnzyp9>LaQgz}D2*9wjd#})KXJo|H-+CQoNV-;j6h}Hp! zW|_b%=?2JYZHKqiT4Ihv(7H4ef6IuK;x;BKgr}h`u(QhwdPLdSfN^&Fc=MDfi)~0Z zKsePDXAcS=7Bl+(1%=O}W!}+R&QZcwTI%ZoEoJva7qj*g{}98X0k~L%jxbaH5Hdv? z6B+@4zx^?m9TKv)j|l+%?R}g(srz8(=#XQm5h)$U`lbzGW&e~yy~~e$qc&uku&WCp zXZek7{-Al&9470w?n#j$-$RxUb6~_1t_7Sms{E8;lZ$znAMu9EP)S_rv;^q4oM9?ZR> zQyD{}{C6@A9-Z3N*SK4G0^w%v5U;EVhJR+pPp}Lg(fZEoLu`U) z=W9=${yk6?(?OOOAeRD*cG${8;sMPdW&kdda4>*Ec!Y4iUtPS%Ym;%quNYq?`;<(C z^hcSsf1=V2hb7JlbB`ePZC`Xzqof$g;(v^_-$3!XI66G$na}B(UDcFZpj95n z8mRaazsyN_nh2grOsAPhdTK=Dr#;Xa$obcSXFfHD@TUkrMG0dL$*>d|>+A~{^v#h- z*#p2D#kYvUUV{C_OI8R``D6ZHa}1uKkl0>Bc&aT-PC; z-pkm7Y=DdX*~MxN!a3M{$L=rbHDR2dE^7<@u0uIJHB!Y>uLW6+PHCsg!O;y^wMpz= zF~9_%OLk%9#M1#;mhbVKqw@eeX?A3&^q2!oRpmANQ7!BxN7>Q>sc-JXSspk^EjNC{L6QsG00X&%vQ|AKe7%S5g&}ghA{7OI#kY2N>Hi+B^Fw&z&Ju)i&Xa9^QSUvAv4d15_rLkfOFp6UaWg(dUmaF~d za5iWLn(Q}*C(!*f$yq8!s+#>Rl2rLnm@2vlNaU5GmONVCLyejqflllc_deYSg0+}q z@07K@GLd_NTl-Ug36%#W=zPH$_ltxX*Z!&MgPYoTuLGT9&R58462BqbFvF0<92@5M zrZHdjI^#OnIo*zt1LaLWlcYB5v{uSLHz3K>3nsU19-XN@WvT<`%uA$T+t!QhI$TD<_U#M(GTaD^W>lS%%4dIeANkS8FqZb3QgHv9os^ z_Qh@>ZQdmxWwU5=8-PW@ z>QrUNN$A)hlH4ZovuCRl2;0noPM6*#JUyq%P4o3w38CjnLP1vf;2fe<5FL=@KldRafdpABUseJ8d%ir9?&1hYf8n|PvYv>*IOp> znG?(y-4FuNkNMb$N0jmdr&X{31J+NeFJpi+br?H*w_vAp`YG-*pFr5ywiR+buI;|d zmsmYit(!tvag0{%oifY+sbBL|U~@Rcz&!);?g1`$b|vUpfKHehbnKTxSbN+-bS^tu zCKKKPvIQ6m7n`vju)xemNw#_we{&DkUGy!w*UFF<^&B0C%YZ- zAF&D;3mUQVnn2Y0W*dv(Q8iSfr54ZR?H&GSm;%6BRc}K5Pt5r`Y`XI+tj4ViwiJq; zz#onvtWYqKTli)#DId5));^TY2?Gqv;adYpSBK6Ei~xjK$2i7e1^|oob9lzzfSqiY zK7<#>5a?4^LZ>^|Eu8!2V{H;pud03N3BOn|U1j<%zs4L$hkK_Gr3AzGm8JoPY@Kl) z&hs!D2Po&E6qg#Qdq|}I5(jA)04MXUR;C^j#=-q%{$H_)5nL1XN58|IHwu3RbUr!z 
ztLgp|geAK=8HQ^@?Vqa@&b1Am*43dX=z{}{f=4Tk^21oy!2Tc(1S-%bMClXaUMO;> z;nF%Z26Fs~)~4lW4r^i`JgZb9>QdW)c_X>&0Q}`W-{9JXuq7x0Q;Zppp!8KOQ}z5D zH$L@wcFAwi2Mj>Y;{g*Xjqisrr7A1r7RRJ82{_l5ys1b{1S}~;D7rDvd{xtHYWgk6 z;(UXnW?a}c$xpBv!{CkX8r!@}_Dn5x@p7KRT2Fd6f>KLQGiX$;3lQc!Iur6BzXw_I zT{ezrCBRk>K_Ik!^8n#o$|FI{62=m03LNu|M<~8I@MS$KkiW?;0*OE6C*+N-OxFJS z>Jz$lH5N+hP?Q*5CcifTmV)(5xu35kA>T=I--NJOaU~-KyAEUJ(`hqGRJCuxp=J`r zBzP`XuFj;LkR=7o;u-4jbJvh#HeugKz+#SgYLi$GqLjUKucjgyym?|jdH~~!SFS$R zPADOJnA6L@sVq?2#m z0i5q$Efr4<)qzUWGoU}Kb8}wendd(aBLW`KndjzL?IA6az$rzpcMlNOM))$5%_WRK zOMO;MAvE*|XMNz&cmDnh9WkR$NJ@yw0(Qc)CH+R*xwU^Gp_=D9&{n1UqOcLLlX%Ho z)EED*C3m2>a1KQQhw1<6KZ}X*h+e1_G+`}(Q`=IDvyt%hjNZQube3?Fi<2207}RiP z_iW0o3*r3u^qMUN($$??=srhWKpKV~ED3*!Ffx6BGxBZp!vipad?{DQ4}n&K4LctqxPjt($6w^rH#Wer043i7ik3cVvvvY@K2FP`{~e4!_nABOVijo$WmUY^ zU}Kh$j)(>W=uhs2{P$oBsH?qrVF6Ke#i&q~>mixz3m>Rl=0~_rjls!dzOg5N68^J- zk$!!)l&m~KP;jqV6|dbnzp-dv2UywznMosI#7bUEh=U~U1W=7D?FiafH@fx(UM-L{ zp`GR!+iXB#jvZ~uw0V}bgo--;Cv{Ax)4CwHvRHNrvx3A&(4LB^@%@$KK+m6|0p+be z!1`2ANIwFu^ci!aA%x|kS}Q~rHUMQ~XXCP5jdbJJSl&sK97EY8*@LTYcE5!r9}IJ; zH4^~(P-8*uy8}6CCaX$knuO9t*cn8zAG&_E7;_j)ryb-Nb>An&wZvYESwLA5mQqjN zdLQ@!6762g(Pzmsf|LWUk02*Q`;NaO<^WTa02RxEPf2~$g*2+QwSTeFOZZAh6|BQK z>8yfS2_(j}&!BS6kB1$sgGU5o?R@5Up0trC1w^utG5Bh?AV=-+uGVdOY9(X zbRjS9m2C)sA2(5yktaZ2;JuUufg2pYZ)xYQ-QS5gqA13@Mwh zG2R5a{1o4Uf8Rh@YdTUFi$`N9i$y`i9FJBaIUHjdaOtlTxN79U`cUOpa0iXz4pH2c zN5od>Xbb~jb-jw1{c~{D2wW2Gl%WMok;TWaX(^ywG!BUn<@JzXVyuTCORzKbohSZ} z5dQQ|S*TyZ_|w|iWHq;!XTe7N>6Rd2=ah`wEia(OUyRwalvum&th0zfT~Xr_DG`N2>wj#1u4W3c@Pv13Lpa<-K=U%sgQ%$lWK5{mliC?_6^% z0CRZ&S-CIcNYXMf<2inR1lx8ZxS#JiT|v4$tCMLeh9}q}g{oBTU#`Bq&6*iT0CtYl zOh+G*3}?pTAmSQ~%OEbhQN!1SLf;Vwa6SKkcU!nQwek;KEF=FJtRX2jhd%dJ^V?81 zhi}LLt= zYQrS7HqAdq5LSlvC^SyHNrshT&rqJ3n6}b@z*z67FyqZnU@hJ6ScAn-60_CBD1Hid z?qi=ef;P?IXkYr2%bZ6S+m^zS+$Y@DFk}J5TJYA;?#%|(b~UKy{B`SZt&}HY9zFe+Thueaz2hW1Bht+Tj@4I z*1{l;nXD&fAk5IRWgJ_jdVd3K$>}De6x+!}2TOvTWEeBV@@f~xB|!DidG9c}BUsx_ z?v+O-d+^b8tAP6eX_A~F`~Xx6Ep9#;f~_{4$BMWfqon8wIsQlUe*;yLj|&d1qcJSW 
zFfU~0dke6V`ntn?0(3gtacc7}v4S^X9AKB|=P$8VGJ~*QHV1>ak7S+Mr#oMEG-QO#IcuT2FDHnxtS$w z&A>RHb{Ixq8&J;YM~3A{Psz67&_4_HvKc^8c&xs6!4~CPxg^q9G)umE@lf@a{r=!~AnxheP`~g0Vt5MuI1QjM;7=`Qo4am2@Qzp`_3Y9T-s^B z{OlLfA3?dQJO}ZM3Ui;Y`dg?2tOK5BkhZX+zk7+S@bfO1L02W5Luj!1zgArz>1&1XFf#if7-3hSd z8*$Xxu3)NsGrHHTKeI0BeygG<=Bop73c_gDyr^76c|YUS!-cGIb`=Oq%(V^<*`XZ&Rk(3 z`3YtOd}0cAZrQDtOV$j+c_B=l!yC+@+*%T8QTN%$k*9lD@5`{@`2X3lS?~)oGg-rE z9{?7|K0H8l&rw%p;1T2uy&(W7CIYrJIv_%hpOW}qhrKbBf4zD`KIyzshp=f7hn0;` z8jNIO3`_M~gDK^sp;C-OlZp$@BaAXBXFh0Biu(TI(LZS_}v|4pS zxK0ubIpt!}g|I=!5}LN1yutx*rMywA)=T(Ni*kLS(_vQ;@dse1!z5Jw>JY+tR;(}$ zH%7@Yy1!NiZ(y9~cYR$1hpNU&ak_@xc&OWFD#9>BO`u$9Uo%C$16k6J8vQBIqA}rO zac%}Kq7kK?WakiN*vJH`gu}hhw=gm-W-gLAW!^@j4-kA;eL^&COR%M9kKZ-w5vm$L zXagRMB-aB|KtwgKaW7zh-fFQ`%_ao0d1 zl|d8ITKR!~aUFzNQ0ip41+Wa?>Uy9Jzo{)}GS&uJVW0b&#&`U3HF4orln>v+(a|wd#5j1i8Wa!s4&BuWgn;dkW&gHg|oE*+ry9%Nw&#)oW+ z!~`}%rw?EY&y~h%#+FI2C?9vlK)B$x8JEQkJ}W3IsRR>KYw!f=#>=(ZWnB9=qX2Qx z;oEfsI%O=n0dSsvCl##$$uQSEXj<1aDt@=?AZLQ3dSs%N*e*jG;3^Z5P#eM}pi^in zla3j?o5d1)7si>`c|JOG1gU1|LR+Ri1NuCa+y_|P7wSg`33rsUArR?fX$=DaS)Lt* zh@<==N_Eg75B@NQartuyOhX}+Kb$3EO{TgoWCB_B3G{I98QhbdJ6JldO*)z>kG7P} zLu?_@E@d(Yk)3nA(HUs~oP#YbA}$iL1BM|0tz#mpYnTDH$h6zxm+%NNF0ZsLPjbUOkUlfUNP&6tfO zMOx~bu?Q%qPlx`VPMj|f;5_DIjfGC6v?@brm(c;%FfvkoMdTZh^Zky{YgooJjy~u% zF*lN9c@wbpfLEGk3wIEfwj1uz^m9`vi>9GI5v?hz8JtzafP}RgA2FZ9QdC>7un#}( zZDJcdTHaSO4G>nueh(vwr2!5`r1J=HiSB5XZT``GSg6;WeC?U=L4YQN^Ge76F@k%qL#fIRl*tyr0uc=ia{!VL&QeD}J&-m1;qsgX zu#R8$1{GrnpcO0MCKP*=#E;i;Ua$vIQfDtHX7hb0YnqijqHjouL2_4kR(VBg2#1tR ziuv#efLLdY-DLnEi}m(W3yfp1<>qW#IoQdtlBrhpKEI)RwWFC?KuKyZjOPbF#U-=L`~ZhrG=_Ujg7_Yqs;nSN zgI0MIVHHWf)h!5KFsl9eXeQ3J)l~QMrLOjC@GJqsy>Ja+iLV`IqzU0%>w4;NnORRt z*Zjm7s0HOBde4b{`weZtS+bbJyH9RIlGtw|JX8lj#TZgwyF9A6drq(ejk4>T_&{BO=G*?OW2<6m1f5nl(gYy+6-_-6u zX5nuzq8um^MC~tB6`9#IHUTVkf}W8wX(Yj+53N0|L0CEXU`BV^OeS_IkzyUjg1sWm zSaohe6%$np&Uz&8Z9sZ9;XpLiIzSuVt;^49&^kb?0h#z)H}QbeQnVjn3;o*r)4e2c z!&61>Ls+1$9hR{XD6Mwrkj{{26y$-~2tavYanWmloHiXX`Wl0s)(6Lsx`nVl2+k0d 
zd6Ep_K4X}42V+}dR3c+Uq$wO}veZBhW&o$cnkY)1C)N%DRslvKER?vp09uG+;(}sN z4@q$HTVgLEEESvCF;{v_hCz5VpDX|AIhOttgvF#&Es->8v#6dwW2ki=T`Di%>|+@T ze-{CK6HhoejDJ_ew3YdfRqYL zgPP=bLFkK#gJ%;*;9`^tU4{mGFwRTFhKO3+PwwJGu_CqXJAkuYGU+F@LbM-4NZKTY zoEY(leC+?yQ1Fa=3W9+l09<6;m=O~IEiPMsE;P;*$({o#}A*#ti1k;92&QX7Tqh#m4Z9j_}6| zC`4nNPT@TOD9Y=y!VPkEX(qxeEC0^13ONyD5l}82cBe|O@&soU5G%2=<=VenRcO{G zi8)i&1nbab;05)$!K3xUwM+CIKxDxj<@C@5m5g&V5UT)NHX>um)LIbE8iz+5KxqSt z>a(ez+dPv^ZtO%c9e`6}NJaAiBqe%3nf(#SDaqR!8vr{ca&v(a`w&)i++@M`2FcLf z$I9do#wuW>e}di0F>(ayHjNEJnyqre^9C9*kJV_7{Txdrw>(?QJ@xJrAX0MNNMn{e zkfq#@xTMp>zJhF)n9~SY#~jSZ&cA*MM}d2w^T4U3>yNy!fN?=;P3?o%F5ytxbvUmt z0Zx&yv1IxYY`t)+fhNBz2)-FTh*ue&VEj$3rm*nH-)jF}rJ2p2!#7QTtwVYQ7EVY+ zE|hIGV6lzEO^ks@bhNX-gDLQAy|7}8B><6&>W51;|7wA&tg$jK*Q{g)Cp2v|2W=P| z%g@XkWAC8@WfAagQoMjzChfu&OVPyV2xuLAV8saA09yy!TpmmA`VdZBJ#nl^_!S+N z@zUN9!b)Ky@zdAXC>X)nT*6<1W*nv5K$87$$nuy+vi~+tkooyiCuk--tMbO!>keR9 zJ8>o?;Tw&5W^yLxF#-WCL?PT#LKmXh(QZn*(lGVD3 z)-2S4bzv!#?jFX#BZY$V7>2;J3gSZDQ6Erh2IEEkG)O`>6C>IX!qUq9HI7w{pllXQ zkqN?~m6;njC5A@EemtPLy5*5{>9;Weo{N1-FLyxeaVrLj)5I(iQd%>=9&YxCF$Y>u zq>!)heG)7?YnZZtD3uAHtk?uihLG#w-noRa@oG_zVhBla5tEL5-a@#Dm|1cgAw~h?%n~k_ zwF;WRJE+pYJbV65lPr^RRp=R{EII1z96*kA>YtzPjbif-(*RlL-$r zU*3eoAb=$!IwyoJByoB+)aRd)AYzc}QSCphY=~-_hsf)&REkZkv%#ZPw=dPLB|Okp zLlcPb*Pp_xVjW~P5#!N8-C;_T!M1k{c{iY}Cfv?(&HtFM=<>Qomsghqit7mE;@f9lN%<$nt&HO6gRMwT1w`XH zfU%aC$urZJIKyWSL!!fm8zom9PO+{VxH6vfy0J%HW9aIb0fQS{fXc|@mLJH`mh6H@ zY=mBcwt;7x<2Uqb(?q``Lo*}Ia}i<=a`w1!0&Beoy9}=ABl+wC!nujy*CF`>9LmF& z4TUTLHh>t0JOceeR$ka_SS5ZJ6OO0EnjK;k^Xh%>SEI+MgDn6D8nkuXfUtu5igrQE zw+3UgAj+SEc{HJH1MKj`5jB*#4ryiXJjc$)G@#24_H{%^;Z8Emq&DDDnue47xeX#) zIAled9iXMqX_Xi@z_=m!Out%%7&?Nb6kezldOY(%CP&dkfUFn>Gt5f|i3!$3xeUS1 zIG*x4Q8`M2o?CumHxSM@kzM4=O4pIpsdl?G%)3oW+Y!;qPoPQ}vP)a`wUSeCTaAw~ z6gZ2>VGC`AgDnzKuCzlrhp-k7SnE>h-{)u4O4v7L0f%`|=jjt305*?T7ZmR@;ZK?a zJ_4QM9uqvQB{369RQLpTmiHM#g|}esKQ3*aPRiE}s1AIr0kCwJZ+(sES~6D}qw-D@ z#^M}k2DT1V&8Gt-do7TqISg4!W!|pP7*rtM;UdKV8E|ZJc4iz 
zIIzKDGVk=Du#q!`H|_ImdGNi(D1eq=lQt2f02c{x*g(Wl66{H740QwHOcS0YZhapo z6Z19_yoIq0#8LC0cLHUTh((&v-yK{wq;@vKCLqK=d= z8F4Cm3nN`W!*35D=jiyowtAL{@h-$%9}{21s=x~D{ODAN2K`SkuEu3I)NNwzKM4u? zvHDdvKxZ^rjRMZz@$WT&HTZ(LDb>801OpI_pw}UgkV@?}=@z&c`8DykX@WMA(aNpz z*oLu2<$xMXavdmZH13>^R->DoeDOV}K7w<*QN;j_wgML2h9g*?XVl4{Qfl z9upaYouMD>aWqQeqeopgNwCPHG8iZE6(fbXg>V*ice~P`z*u@M#CyPh2S*tnv%-QM z7+lAq;S6A_-Bi;xPxwL}f%ib#FbN1NSFPMT zO?6jLSm7&~1CQ3u-T~dme_Bbhb*Ve66YeXxQ3L2)X>rCIn`>~D{s|eRrh@`)!uqQ( zwlRtP%$4@ZT0m#vAbedn5;L5{q}qO6JGEM_=m4!W*FT~Qsfu`DT8rR@|cXbZlSD( z>MmW~RW^Y|9%?oJx&t`<_n)riY(9l>#x|)e8a&UEVTmrjJ5T0=wq&uKdl*aoK8BLA ziKMi`daOXB4^S>Dt{c+=7ajc)uJr09o-sl^LYDG>gJC2(PrkY(zqqILYEMk)NOYeXRESc?ML05-4W*xUnKKmvF~Ov8RMUs(;%^7a76BGX61 zprIj@8%);yVlyD-C?jZ(dvx%dh3hhRxq+7r#vo`6u(Z3oRD|1vF`1eG$(b55^l;}N zan_3OPk}B$9B9iFVwM~+EX$FFIh@s3Z8Gt#dq~TSIm%_u(RmlpR^`rj*oaZA{a_s+ zYfFbSFpR`Zs?=tH$%?K?SG)fNv^DQ}WxV#ESK7TgBL)+eT8FVNX-U1&(txtkSVhxa zISjouWmrp|i8k`acpTj~;VtlA@h4acNG<4FO$1v2%gga|u59xb=p6qLUMCa(%)EigH1-2@iQplp z>HRbb#yT2r&XQoUS$~;B@I|~>9Lww;Y^7Z=g?vlFERx}Fi7?X#m{O_bkE~(k63QyN zB4U~w#7DR?`eCMdCYJ$h2vW8B5c~0Ccb|6Uzo-gXcT3bksz2-s(Bt^0&BIrUxxWUo z0=j)UpfthGv9`sLb{)dK%?2^b{SCze|J%io6RB zQ~D5{`4P~`Y@JXyFbRla*s+j7I{l>hi&y;yP@ZLKw?Lagl$xX*@&k#plAvy31aji9 zW0r6Oc4`Ds|G;oy4B<@SCKbhYn+(35m>L~TG9MTfsMoxMv5e)QF7}WV0X951n?bqc zbO??R>Qtn8a-5Un6qb6td$_7JiErr{-U1TcUZKGUKWepY$)jz!ul3PKptExU>NSxGo|%8NniUftG9HcNK~0R6vcP>mh3io#zy4%i&YrVk|2B&;MBJ@+g%5u zYG#zG$}Ny(AqSY+2H2`TQ5C=}lAy`Z$gTrnX$ci!ceHL&g#Xng<_toTqLN=k%<^7R zc+z^6tPf>f)0MMQ8jdx|{Y((wvi$*PT`PKJo)&|Fa0wr9o;b!R>rmMBce_{&j~E&&k0@aS;KV%? zZ-dbx65CdQPV3Z%=(k29W&U04x)gPl_EwILU3Qgm$)Wli9l@rX~5Dz!Xc_!8Aa& zlzmys)IT!lQ;nJ#F*T3w#E6kogwO+7U2l&y++BeA5DgEGQ#2X=S+{e?hQShC8V2k; z9Lye}O25L@e(uFraIPJ?)3t>x39j-0K7Ee=b7ifoQCBvpdrU=mOye5ipbkLaCgCL@&&z$)CNgX{VyBGbtT0M_TV zR&aVi%Z0lEa`nM3F{0pW2Xg@75_@rWbs^87VKOw@F-0FeGb|ctj(U@fcbq8sWMZPu zJGU@Ji`!Ir`kxr1*Y_3EfBZ? 
zw8T_CE%>&diO=}wVhIq=0?S9(E7kyGweK;OjuoFHDCbJkQ2UMa;H>w!^u<%SKBV=2 zg)cYL0uNgU&^EU>2iOOQtg+{fgd>B?i?4K4HvpuUUxnJAF|bHa6T%ev7Q({5=5)BK z^duQ}9Eq-oJtV_mHip{MWY|;)w}=@`u`JHs`RF;6v&h2Q8J!5S-NThE#xd4#yPu!#4O53loRnFcb}7SQQZ(vT31t}% zP$NAN|9nhNmrY~sq?JJpo`*!Rvk-8OdJpDSqto<;mBp2NQbcE8OcS6|MssKU8qY-k_<7y9QOvs z8q=metBmw6Nxfrq9Ye<+lm^aMjH~-RtJRCpR09KrOu2CX6C+w@70n&Uil7u2T1*ooE_eoZzFGE(ebad|)VjEf;yxKRdZO1@ zz@Wa<7A0VidT}*80$1^I+BadX9&1TA;XEM(HXlfw<&K@2EG!Xw?CFJ)$Tb`I) zpp6FDr6o^Vq+Uzn57%)>s|n$35LhcN7+Z(2P3~!H2}`Ba;gE8dFKqa+iVhmKC#osggvS>w&Gr)h9Q&p~OYp5$8_~%i8G682j~OPTpe*ty8Nt|qC?3xU zZ=ftVo?(5@T*h!Ha`SM9(&iZ%OLV3Q$mRPs=3;kXt4inBm~>AeoWhIh9>7^rg#Gkc z?KxDb!8*_wg?ta^+8WD(be&YX1*B{1shk?ijOhW^dedblOAtkU_KVhsAN>=N7_@@9 z0$E%fOYwqvO3Y>eB~<(GEAhM9Z(%+GVYM=Qy;dXIfN_RsXt<6PEv_X+#5`Y!-b{+Y zr;j(TLs9yj0}O#j8DoTP*)TXc*E(1m;4=R#=cNwN^6nff*l0KL%eWxuC^73rf50ff zRzJ#DZ`Fse{JPIv%Vqsh;4=^h*J1C!fEVN z`yVRin)Bb9fYxCwHXDvym80c=2AqwQ?Ln4jJT_i~ElZ!Al8oB}V&u_;xAA^U3hN-} zVP76zt!Jn$>kXjua6qOpm&7L@G*%tB*3kVmPkpZXu_~iy*$#<{m6~+kX1S_!Zgz!09{d+Q@t8qPWVUe_y_Z2}f!!s5I)M5GjCjV&g=pydW zB&~s@+*q(^0<5)7U01xGkjQluf5I!l4WLEYas_E8v48nh`RpVi$0U;OLXaDcfnv@0 z2wYi!)}_?Q_25cN8|P!AkL*LD5|+WS0G^Tf>|9?Q8W7!>Uq%3nzoXx768=u+IR;vl zbgWp?E!gt!Xkm-C0>Y(Ehs5X-JAiSi)BSkFJ^d9`!e!5%DJE6Xxy_-RiFcQSuf=Qj z9+EQ9Ju(X(EooPU_>l0e+(MT?N(j#pSJV97)R`|C1XXX18@7)y3+<#s<&W z#dR#UuK}FomaMS}v?f>pBE@z8%&I^*LbZUFNRt9}5=l^O%0e5$#l}T};Z@dwan?6K zqb)7Rb)lS(_f)8JEl2X#BiItcMFRu%WJP;DenI+^W4iM`&q)761j8QyEIl4-SV+Pv zO%F!T@F%SN2I#yU;zW;0B&JsJ?_0mFb7{)j1n9i(`OrS?Bpn8iL`u7jOJ=DVsS4ZpzJ1G?vhHL!E+)g$a9 z8M?c$fOQy`Wgm-SIv#ys3)Z#JvDV7{MH|qxkq2r$cvK$L?R9vTMShf2U4YAUFZMft z&cw@Di0CE$nOo_x2e31tQp7+85ElCM{P<_-aF~qZZ0fwy2*w46CsT2~Nb0oOuo^?T z;1v9qPL6-x!nu&b$4Q6aF_k1KB-~Oys>!>9EF-fx6~Pc5cA6}Unb(zp&tOpi(-E{W zHy{SK3-$%XHud`KoD`lG=mjC04h>nxVyAEBHTWyac3LMUS-DEK2u*)MDw$WPOiv86?(~c$fj$@<)_R3<1L8QqyTDumuZl-`@d|?Pc3%P}9&^{z)(;Ery z!6_4CT`T~`rt>W;_js;i0ZDvDmxg6+-c%5gL(-s|n14N!Vkj+nP0B3Vs_fj;Up-gPue`JuO30eze9WX88 
z*NwyqgiIR&c0s-3L=k`JKv)Gt>Y+nJ;aw>Ukzw$1`!Y`ELz7#Q&*nvmvdDYk`=1b36AC-wp$<|v z6TVaQ`E{Uk^d+{fT3`#z^8F)_ zzmL-m)^bK?!cVL$#?SfWirb(RIO=Jd93bTyL=gF|bAD!PljKzzq zJsI_o$pxef==J6SMv?HX`tfDL%eNTtV?vHd&<_AD<|h7Zn1O%RS!4}eYY&x7+eG5i zQe8cntI~}(U|e24Z;hJQplo2fe8#Rf&E#T^6EO|e;iwu6h^Rv?1G*}b*)Y&0nKS|b zXK?p3s_Y~T_l+(Pi@?c9L){~gGjxC-H5LK3-nV?>sSjbE;*5uO;0$U6pD#C-*dxCJ5;8H;r)kY(Uj8)^A237s>Cm?nXn zm!h!GlHjthIOYLi>(D?(0^TQs&@m%hz?AliC@|54JtXDgLYqq{D1>14$~1N{um?ZHL%4dibc3bHIq4s$a6Fj zFsO%%A4>PrGU0oBr1lMnVqz)>t!lTE`myQ?J4w-p)5TFNT__jg70VR_W>Qmk1nJ=s z&8+5Qn4CcmC>DrSaVe0K!0F1XW?GQ0u1*jc@e zW$z^9wMvS&{zpU*kpvU4wMATvIAnSUQ;L+Tzz`2xNh&-kVk80OY|_fc$|UwNhot-t z(BD0el;7EV7UCB?Tgon@NScepT6n-bz*w@jUR0ouKxbmtVXjw6uzIAzc}jv0=!jD7 z|9qC2@t9WR>PfvnIX#YJMUAAGlzP*BEvf5)j=-SPCX|g84`U{wihi^X=iJ7oj*oP< zAl*h659Tocjbs&*n-!YqHQKPMxpCcM2S6FVldnDo1G07$cOP3xe08aTWDo2T!nx!^ zZAbrC%tcuE0LEoTbb8d@I%~M<=xB4e zq1+~`_Vnz6Wgi-zNwO@PT~|Y)J6Ov9Tyg29Jd3!tsx;By=ud`N=hziOwJ&xAwfJto1rr&PE~;@o4fCkSMSH9NW3*Z!|c$KTL~ zXpm4(Mi=U62dV*M9W#LOGU6IkF$?DD_?v6JvI%M3$zPHM;@FK?heaPak8qKI4 zlX=f)Wh|>?aJph)kj4bY6=j=doia+970q&_I-x+8!yzpvK+#&4o8X=@B} z=Goq(9Bvacf#U56xKv?QC=v563C5-R!ZZo`QLQh|k|4OE!p+YitPqac#SIhp$>2#4 z?pq5OD}EB*n>x!1X6UxB@Z!_qmci-GX$eEnW^e3j6voSOVNw?BliuRm>|R*)-v*(Y?F@z0CG~fR@=j4SZ z60J0f2%}tFr7_xwQ3{0;t~*qhKL7Li{65diUVH7ep7pHvwf9j6F#;nGk`-x$ zB&#Kvh+vY;r)g*+5GgbU^lHc$B$HzP#R=2TH-Y&_YV$!&L{k#cyva)VI9oB(EU_op z3SM^AVexuf!Ala_*Sb;g2%RTc`w>~l#tMA6+^W0V(HJYt;dMLf^yD5}#d}>$Yw?nU zwrYibBvETofvsZZYMR(GRK|R)8`B$W`&Db}*eUhD#dQ*$acMlrS~oS#2x_gRW~Jv6 zlCjQ*)|X(eAscfrmrJ5SYqk~oXHE=F?%E0lvzo+*C`@iSxQ`#uexNbA{Z)H)mw#ObMFydaJ|2DWDjX$@{6wC=|x2;=km83QVS?K(O4_;gI@9b~4m0E}K zWBr8i9+Wz?oQmTZR$u%sfc58}Y?Rs3yYJy8NaUaC7Pp`D3yCWUcMiedm}whu$=t^N zy%z8_B$NAhtfsr!mgiV!ZSLB#w!=6CGLczdkReCU5REN(-^BX=7`fH1(ZaO;8XJxA zn0c{$x9~w~6ZwNOX*u6Qw(>g>C9NS2gsfbIwW9FWoIMg08&Nh8Q8< zYGD!(jQPx_!+K=nEbsHrNwi3ATfu6ao)65j6}sbyuWq*$tIdhlJt&M#b@kfnQP3C* zvg4w!5a0@^jJZ~`DPaa(W$27)mg=!+-NVH%P_{R|y838dF+7PDGsNn2=Nvp)U2Yu= zyo6w!6kxU`eZXz$SWpg|}4%uX! 
zZyV{la)PaxL($NQthW`k7-~qjt@tqB`c7z;t>7IL?Ju_53g%^+iajVyL2~q&&xJ8d z!RQCk8Y8s-0jbqX71(#r){oZESfDUY^`ZaM{k3TPPDPEeoa!VhQykV_hjo_k96Doq znqupFdOY{W*U;ird{PrnfyeYZf5cG`FfM=EqHgmL$(VDBwYmDtmMLzm zCoS#fzXLH5y=g7x&BI@i$e;Pdb_{}Xt=YQdJ>M4jxFzm_X!1)F ziyUXm96RRFRRXecpUEt#VxaXX{E1I8A9{47F$vXgQt_uN+Q?_wO2#|wCx^D9#QUeH z!x&Z%E+2#?wM&=lXtuxz@gM4&qh$#GTA{=IT3h^ub}A=rads^I=s85K{<^A%6Cjy< z#aeIjUo(OE=B%}d&}_uz{>`boh$ff(uaM5?9{!-ve*mDluocVADJc23_FEWNSf9y3 zVGP3`)mQ}|hQ^dCui5)Qs@fBs4tT9L0^V{%Ypoo?8zi9N zIf?e%d_4+N)a?7M^lyu~e^Anft5fp8ER^j-Cz{11Qwg`D^$(p^SgY{<{WH`UC0>_z z4#FEdu!WC5*kaOeb@>7jv(QXwKT~TY2CPfzDo!GqBA@h;jwR07@`@N;gTI7K%rq1< z7W5v7#)$kqy82Rz}M$B>8!M2=ey$3Vm2RU9xUXjQqoqPe5 zzl6a-(3o_ZH@it^6e?46W}V)u1~Di}(@#khK7v3gtW$?t0#AZiCW@h^#hGAOiJFjL z1aqKly%AU~qHS~|m`o+;8f_M$$;LZL`c0DUw#-Rmp0URS*N-}A!*bAwW{Hb}M>NI5 zT9fW8me~sb$L4BL5WgSTa5Hm(!$4~#&AjR8&~nC$#q7edY{ff=cPq30(~iI?`MPrYYTiZf}IiaIRe?F>-|`EET~wKw#NJg7tZj5 zhLOK^5K*>9SEtloF=$L~Vq!n$y?PQc-&W~N*Zc|L52_8U+GsZxhsxyH`ok$JFA27e z-jOw0?UL7{^Vd(Df|YRV?7G|5vS7?MHhILQvurIBR0d||z2xn-78hrg)okDGL5n}X z%cSWbtooe)U-t?SO&R{&9CR#0VQlqt|51@zG$!%p?`(Ai`lPL+GviOp)SJ)QI^2~m zP7`^_)~#dV$TULN(3!eut+krd{LLs$&Z#NZ_X)H;ylZPYV$N|^^Zky8XiXQHXeNf# z7K-8-YJY=Fi!cA)?%S*NP(=qsF8aSA9EM~Z z?bQfd!AEk{p(AYtM>JJ25rs+TuUD+HX5~?6Od(I!tq9FsjICl{g++fFr}?(Z?2VyX zg38p>_gSJfbG0~hrnu(l89hYM zUABs?W%CCl=TMm}|3VjItanl`p)>Ap4JEi;hh*2#(gLp3sV%I@*3aXuM#h$78H?gN zw!GT4fR*Vx->iATvQxrVaiO(v~UdJHd*UVRS zxHTWiSipRUa0#;Z39A!%i;mvPh`P0*UY~&I_iEZ|T@A4noVnBUHEtBf09Z6uKR}yh zYxKVvt;Pk7F$r^Z^V;7YTcyhkY~JH8<~~N~?L)BwH2$*)ENK2P&niQ2TKg%S@~<`0 zPxVKuVkG^76=^w#WODnC`L-Xj$q&^_|MBD+3X?Sb=W5I3&9-WJd(&|D2Nk)t8sCSg zOu=Zy@_&%y8A{VT#cDU${?P`xXS!g9lpSCxrC5E}Fbtkj(pNZWUl2_FOVpRU`XQ1< zzQL{+2OEi1ZykP*Kr)$nmoJOcu-LLDn05u3sM?1GQHUnb?->_}u@#)9)(GdD2$(H# z%#n(_1lg1+pD?i!5QoN?X`QuePlK0$%5MPczvWCTg3^@C^vU=J5*eg<>jUiOaFK9wPu0+_8!!?lRUs*J@bkdwB| z1_#aEIa_9R9%l&psGoi3X>o|z5i{+IK^F8=5bS6i0v@Pw4h!4e>)4lzS^r1=YU=vi= 
z>~#b;0?}lKT}B-`McN7uk7{-&+6u;)1n0GClixbjFdQ@EQMK!?gONx1u04JAq{FwVsu7!HDH^0D-{8`kN%z(iZa`6%lEM3k-ht^WB2JI^s_+MR@qSv?9W z;}OwIm-F_Awe^C|l=K`uYN(S(#v3TLg6k4Vg5mk18siLjQ)-jnqa#5w$($0)1GCJ< zrM&M(tFzrm{`~JjGG+CSW%Yx$tPK(lfNaX{UG1Q80$ahqYwJ9q)>h0jA0#-5;)l}J z8OAwVqZ_4KQ7)k|Hu$x!=d0=(Dr18f^XIG7fM#^YUHPX?ngVkU^)6adl=N8oB`hRn$q1Gs<5taxL1~IDilog)B59n2 zAnQvIOc_qnky)HAMrocBY%y9tYOx-Xc&ItL8skPXshx~zDSum@YHhk~M>eUSWkVv=r->evOiSeku^)nyI6yS%nQ}_ZhxzaTu zZLZ@GO)m7!G873YOzBLemsM|DZ)^BMc&z%9+tyfL14WZ%YiJFvD!3hu$^2w%h{2=x z*s9;et}~wmIf$yga6a@8WeU)>dpBo>*`y;`Whg1JSe-f68lEq_(_6s%8><*R0tq{C zc!mPIgkZ|=L+w|t+2V3-U7O8~yxgPh{9PntCH{Mwq&~D|zM)F~p4l=-`?*`%+RYZn z*DkoB0}5ldcUei%D8p=x`5Rn11zW?pDl1GsTcbPP`jLylwnop>YJ)ez)+Dm_sZ}8o zjWI1FdjFdUCXn?hCkmbvMxim5Rp+Br(8ZuLmc`GQ9$N)9AEj|MCb^6Qc{P`yOKh!v zz?Wvknw`Yi+TX=;C{fR@k)8y!xaA~{6Rn3e#SzaJXxvEZmV5vY7eO$^5v6{)-4<8r zyZw92jrzZ9Ts-KPSbZcd1xTiF%-5sJ%&pe!m4dBBG%l1s@_&e?4a+u$mhU za0%HI7N_d0Z^K>tK|{NtT}U$;V`Q^e&uw>Ym3bLK{lr$8*H?MiGgQX#2`s9!&V!C` z1N#jhW4;xxspx>xbUiu@=Oemi`ymW9*0h$|VX+^lu<@sU2*!Tzsr?2cVrdRi>8isB zBPHR?NRdd!VAHG~Wuh%lw@&Ux*>W^*6yO)OJcq?%+6-hKr4xBAjU|Z2z`x|5P*KF8 zFj>?WWcAb$PJqt%mYLI8MqH25?_1X5vW~xT3$!F%n>zIiB$HTHoRGI8o2O~f@U%Tf z!9wUN)1MwhVTxv^{vEAWwgOwFtJv1Dy)s*+CorrovKEy|qlRLvtYesyC`}sq+qF1r zqkxB=Lu(v)&ignF5(UBkp_9{V2&N#`>Ni81O-NqQ*AwA)kxXtFdo$#Fh-`8bWBv^8 znXUNnV{7wK9<@8;43-Mjyd7-KM~usLZV-mXKtoiPj^%YfEuK_;R!hOtJuzgxi;9_ur&(E*t~V81G%H70$BdDC@5u zx7+eZN!l>%F}JfGVY&T5L}Tkoyd%QX3T(w}X1-Xc%vNyJ!1~u36vnXn-+z|Rp0riE zUd^%&&Vb5qGX61rt+i5e$=2asR0BQmcFoq|S^mQ^&9+WAV$5XhuC4P=G%(;pbfySq z`~N8neVY=yH&aNhX`p?`JD!&`Dy$ z!R8LD(^6L$fo$4Ft9_+{M4~Y^n#wnItu4}two3mb#@gtJLgjDvtOFR@aKxZ9d3u-B z`2TWZzOB^Hi~HZOSz;^WKjfd;&DRU#P~vK<7|tfZ;%ZbtoCe;w*VOs=3=;0eV3Y@D zA(-q(TWgTpZF!~jitQd`V-B4_bHhOsK)Io{ulR_j+H1$!6X zWO_2w%ZIi!JsK}T!UVc%gmXN$0_s1QkiQN{Cc{%!u(64Q*ou$*e{o_fm`>@|YCjal z1bS{w{cSKBf97-*Q{R5YPf+^TTlq|Tg8GGqjYQr4iocF0v@=yVn~0vW9QadO5X1h4 zT*bhW!#L}WCV1n?{COIk4k8)GFLmD~&PbG+j?NO0OfgTP81V>O<_j0x>$YX{ZPF}c zlcir+3tekCkriogyk{|?EM%WT!JIY4K* 
zP_3=vRXsDrI*H173~&1R{{i(JI#cAAov6QtDiEAJk+f1_l>Ef9!zY}kbT)%I>Zq`9M*fP5r`i9iOL(^lV11cF{W%NSWH)ZbN7g%E+tHY;%)uQwR7yW&tMseV?6=}P zs7&s7fyn>%LBS6?T^+YpP|9o_2dH#2h9am%XVN@b7e!7Y`qOOf|DCgC_Jy$aB@^8G zXeHUdW<>3Obo$+lXmZHe73;K!t>B;HbUyyjR`7Lg{V4Y{6eiEQvZWoVJZaA}&7%2V zgATUp{r0uUFjOYb`gL!!D$x_2DN(If%a@xzwjWwrgE=uYE{0u;lZ=3+HJJ2^DL9Pb zS);~N;5+=tK~kd(N2X&8M;kKVaI|ep3`chnXE+*;1jAvM^@hVFZo@H?%rYG1yxnkE ze~;l9JRLL~2lxsM$9S&Ha9F94uv`OKrGe7}0R_3?mFT0t*^$B=$2LJw=q^Fnf&Q z-obu`@919wbkK?)FEGvA04cqzsSkZ(ShZWnkIoQ!?{|9z7+#A@j{rXNk(P(F3N5e5J zI%l|n*wJvm!j4bR{CMGy|8}_bil2TSXE-w@mMdv0hAod@_3Pnm11%lCmP2a#@7^CL z4*xalZ0lk_TpT%a>|ZbbFd??5v!8q7+;!ef6V5ptSx(vGbewd$&N&^IoX%@bd42cg zF7%hH*Dn6;uU9VnapvVNQtuQ=6aBdz!1ouY)CP)p2Ecy{KVBqB`R|NAUHP~8W^uj9 zA3KWs5cm?LhI4Z-?gL}`ihDSi1||UaNc13|aM{rD^Ih^*luM@nNZfD7xx|r95*!Y3 zZ5455iv;cx3D_YLyk8_}uSmy2kq(DMI+ck0C{HBhxJc&)kx#RJQ9g`Dbn}3$kND;vY2!(o7hD@ zeAq!^2Zzc>?k=+YTB!W`VW_M~=qzy~I?Kw3o#i*zI?L*)5LwkHMB>AKEWce9A~VTH zLg#LhSkO&A{;r!O4GxpF?}f>_OI_u6uZ78mOFy9-3zK;~N9qcvq)c^+XS7q?Aak`#GUmF(GnR6Rb;=OZ_?aU>vah&UU3ST~jv^7u&dFi7SLS$BU$fyTljflqT|(GKBDJU6Shw zl19S1%Si(`3-94t9%XtF-gnm}O-r1zkhlscx24eKU^Td^VOJ2(p93VNt3z%fC!)Ut zeF|QIHc*X?dlzCl~ev>%16#^=z{wY@3ZAjPuE~cdE}6Xowqu(=VI5}4hiL2W{ZOaB8QOY4I-OJ zlLzYFpUwDa-2_0S0P8wfKTx{2$5)Utja zB;io^l_2q)3bO3c6c8vWv&dsou((Um9}kj@(ZP~ABv{fWiR>nheCpj8;>o%fEZI%L zvLD^HQ^B%@XXV2G1z*Zv6eMN%WFc}X$ek1<6S!w4*v2!bLFa*zH9@i%`VXYh_YA*y z(bO52KRE?0A0H%{#GA28q#&F$W%K7w=u-+Qn=2wEj$pZryaV4% z+3S!C#L?6_5FZH?54y~B2Y>PIka(Wih#gLn)-CwPY|^v;jEAa;kp&$WDLL6yh>y^rUXLVbTB z{7Hw@z&ENs-6aEfw&%7>p6WC1x#XYKmU%8Y%XJrdEXJLy z(=*H=SMd1yPTN!K>Yc*_HkFX69} zhs~rbYjvn(b8k=X`A5BPpvYn^2kaFK4UM3k0TG}F7zAE%uX_l20$m6b94_v(4fKUj zPc*)B$RYiSEB7Voi=I zN3PvO_b22v@b2TG)ay{W$g@+ohDrx?5&B&GY!Gx1;f6y;K|TA4vp@EzQv2}yyF9b9 zGwp6?Iow2^4uRY{?E4t+!*jZw$Jr(DL)#OT06YCxA@KX&y9{XHMXHZXjuS zPMH+AWHI_#=(bF!oF7rYkh2Ll1D*Q8I^=lZE5^^xi8Pa^%oQT7P)`GO5WD@m>W({P z9{23XGyb%jdln&Qk&XlK`*w&dL%v5?t^41?UjzRe{3}SEN?gR@0q#79jG>$|)`+a& 
zS{iBF++E}|@{x_Lvk1G5^v83}2e$BxeA?{8(;c#ow3m^l+y(Rvh4>`!D9DQ&l&o<` zGGU4|-mTb?=S%>(*n1vyCRhwSs|gRv10C`u&m4|!6ZuHuz5<@N4L%c4r^Q!Bx~a=< zklN&ueDtND2Kee}C+MfMX%BpT)R!3_oS{sqH@>$(9Q+igq(+N05FO7I&9FjZLC23>n!wGjs!~U6ai*m|x(*K>7 z8F6ibug3_Rexcu=iMEI59G3nR)A3)uU2pfyvI7q`%czn-2U1TWX-o#c7gwe6d+wfDsZ1fvJ z9pj?;JToO+$0+pQy@F{AY0vPd%ptUgJA&m2>HBO2zH~iEvY^>dg5`bg*+$>Je?}XAVpu@V`TUANm5gXrmuPgVE)dFh78985krXN~4LFbfv7u zCX5489E?w}(>`>W&sLV8ltxfkB8eGBbHH*{9}-$I?>zAO-?Gz5E6&iVNKX5!w7 z?&sJucc4RJpk)gv7xY?>>Yydv9dZc$P~yu#_xFVF4L<DIQoso0DUw4d%qJU;=wz!vQ0g?nm&#-9z?YkRQVMem{feLcgash?AH z-t;}Xrds;`;vm__J^x3;3=t`U&s97i{6xYt7OFt51D=lfKh)A+a!LVh!2i&6@CcQsj>mmcWXE4ydCv_@5 z3)Ma-W06DdW2-c1E3}C@rSBnauA`4+Z*a&9blc|A$1HWoV|dTazT)W~Avrw5vnWEg zcp@ZoQUv*@+~VjPq9eqU7a>D=R%#*F3A^R%zVcJ#%qkk)U0mM_uv2OSayoo=gxn`S z9cPu0=WMP&M%F$xlsvtFcZX13FGXCm!@1WZBzHq!$))VNbG>AAgyb*nEBQRHg8S15 zdmNgJek9K=A^x;Aly(0IDIecg%6Vo!*Kd%Y-rRp4_$n#yvOe-5`f~CU2hBs@jU;vg z13+)!K2H8YN`LIgm@N&v3`L)jOX%jas3YJExHWi-}0O~{|5O#jvoFU?(==bIsx^53t^Y(yeUXU%R~RV zo@aVq5+8QYLBAOk0e63{0Q|`@VZhaa31<*rXEK7Zf zdB@E_t6ud{J2yM!2G_H&i6@%0fM0QME4n4n9bA|uW z>LP4zL$Gv$_9hwsLGn65zsw`;0X#pDFvtZje0Nj=BDew4}V`{$3)Z zxt@vN{I!Pr8TDdBfIR2=0roy3(QToBX=44wNBu9K&YX>8=z4^%10CGkU4F+h-Mi@0 zo{;as-2Y~P{3E(j+V6s7ha3wL_mLG2nM)oj881`~!9O}W1VIg-xEizgY_?P0=&gDCIjlcRP1-jv22%A6GrC)tI)@T?mXjL;=2PPxt5(y zUHdjz8V@nIp|u&)&M=!Z^BOm?}0sU z(pFF&hnb%>ZNc{of+T$fYq#h|0$8{2p-Y0Hqro$tSH4SR1NT-ykAZslMsODV11JSa zggs1p2SWFe{vLltn7JyW)r zUqAX?o&QgSegMU2O{~TG>VqX+ALL+88WJkYi0=gVqz8(ugr5z(>G;MF)&d5G%3|b{ z{-KgSM5L1Eq|;U`M_0i-_*i>v<>=q(EHFzuIGf%*TO#rX;(T+`n@i)lfuco*+ojx?X9F8f;NI8 zFox$=Ko6{SNF8)XPaeFLxss;~g!p-;gMg|*o6YwGy}#!Q62MDSbC z^aY`knk{nZXCdO5O68s2MOxA4G51MdiY*?#CO!Iz`}kL^*-@#Rm@``QsZ*@)qi@_E zXxX{^I~@55@uS~|t*Y>~L0qq)Ox@(cLwaI9*%PHKqaZIr!MlVXP}~O~VZJ$CHmd zN_l)Eg|O+Ph39c!3u|^<+YQEHQ+?(_Xew#4`lTWG=R?Z1jPVIJo5!`;;D2JOdBnFE zxM{mRJZm|8yvjUZ*UeJ!k%h=j^u5cVD?uX2VvK))IIQ_uJ>y^I1^+_a8H@vVbKOlH z_PoT-8ZYVeOc&;S)Zag21Ly(5>;t)=iT-yd?Mf!`HIYtVXKZzxdwE7G?MF@?dqkscZ|tZFX(w@ 
z6Z=X{tSL0jA^td?y^8YV`bX5YoqK6pPL2~d_1$wFUm&eb9}#Xn;XXlEf-i&gSjuoe z{48`}XAA8TV|{HKa?UW1=l)y7nbqI;;~jlIV=&#LX{sdO%;7zZ8FH{qdPmm9p>2eD z3cU(_1r3IV0vC9JoR7^q!FL0`TIvg95A9HO%|-pc2YQVw9Y1&hKZ3Gx)AxJm;56Sl zFLn_}`U2Kkx&9bn6f2*ZTHh&~w0o>k!Drm2ztZcJrTXt~y&f(y8QlTpvF{AMK9}pO z&GU1nQtpJ+x;2Vxn+a#tKjO{t-#-EUKL7pG^!|I;?-9OBe7UOE^n;mP8>81(a~+$i zucYQVBv{i=e)ajC;9n7@H}IXq9)EVqDDr$(pF5Uxfs620!3A(x_2fbGa|1p?<3EFZ zg)-538si{0ya!!6_okkw9pf2)qjprirj6$gLid#WH|qW9)Tdt|=jJayV6p*#`Et$zR?c=O=tUu4XGC{VHH}?bV@QC%8 zJdp{=X@%sKv9#uG8gkb2wi12PkKJp{bsJp0$2ov-bDY zmpCYUu{u|8T92MJ|0dS{eRrYf82?63kkpq>1}Mzs?m1-!anANm#zs~*N4Eb!M@aGZ4dPR zOu}lLuD+EDzZtCLxz--dx8unN?af{8=}i3eQM$g7g0H0Bo**v5b^_hNQ0_TG8snj3 zRPIl|1Fca1BcGm06J#CoGOllgE{3`Xq8~Cra&};cWaj&X-AVoTU85{n_gPMuoVg;F zpOQYBJ@-0v8YZ7Otakqp*SV&CoJRSTqStb)fo75Sl&`sVoiI&cAUrnE`r63#B2WS9 zKu$USPaLP=t@yf&2|tYR<3$oRz3L~#r~O~52maD{8BRWZ{j~nlRx+;NOt@;&nFGyc z9ptLUM;zMUS@~Gao))?~?%TmLx8rL$H^<5E9Rae7bno0XPVR9159CYRk{t4Sny_u? zwEkXz-}!_wH`H3Ux=DI&(6WjW>d(5B|M@pXKjdh?XJ z*B#wCz*d#pO+K5x;rh34OCqwb9=`T%#%+uT7$cINlyprYFFa=g6n16S4sQx z74#(3ZSr-Hcy({Ww}*Bj27N#D`i#!nZh-Y5fP5_wVOUOju-jJ-`I_g{A)nIwU*rA` z)X{UKpYl4Z;Z#4^DWy-s@V&*DYcz2&HqKytm!RQj6MhHHpAslz zpqn2M7HB_`HIO)Bxi5iyt>t+mpv+5067CbA{m{>#KGxcoE2TZuwkU)9be^`2^k$@u zkQq=n>jD`sIUk_m=!>$bGdtj?1Lb$CK8pCMBmYTww)@<>PX@{#=u>v}WMWMD=genL z%rmn{)Ana?Ne1@$V?m&#O%-`p@BhXrpH~ugk;tCnK*|3uK(e$ft`RT&XKsCfHN2fU!9Lx6k*E&4uy{Tk91$a*ny&0XfKeFCIvFnZ>370}Hacz!Z_nQH?i17Gug z$$TRTUAV|4?m6)u<<4B8_AmHI0dZl+W9W5nDpSXRkxrS2>U zUPiu7p3j9bRsdgvoujemNX`nxIpv>`uM_u9^8F3`uKu($)Q5MwvS-pm++%xESD_lm zF!-u#E~#YhtZ`hQ@01vHdhhi)#KH3iAvdpOU4yxIeRpgFZFV~)8NKHqd)41L<;!!d zTYq59v1OZvPjl+}hvdM2`OL}TRn~2MF4=X7HAQq4%`Q32J^7^J_lz@th0c45Gds}T z&^Ty3_#M#thNcy)(Ik1a8 ze*9`bVZV67noGGuj;?pgRi0Z3_Ks)&2D%U5`Me|T9{hRG9OcwAbW;3*Q%bPoLv-JP z@4>%<%q1?l4K1a-yws1~*!>xDbFowY0?l~Fyy`obG*>z$pcmnY@91{Q9V(wI{mv>wVzE zz-vICnX!fEQ-}0fyQuenpw9K#;gEkB(F{Q{fq2b1A{4s;KFK6DxQ4OjyXOcz-X-2^@W z8DJ~82+n|J@GoE|NM$VKdG3%u!54xuPymYLgKY5b<81NJ3dchGT0O14tg_u*Rv 
zU&jYpcZP}1KXzv`hIoy3WIb)H1jsU`a?8O=5DyZ;er(mYB~HFL#J)fBM)0S-jQOCc zAQNl``PsD3&>T=W-66Z7#f7Y4Ko5XiP%+RU`OqRz3Tl||RgY!;Xe7RlW%WUm_I$ZCGmvAO4`j?b=jb8KzSb8LC|E629K+;{x_{B4f1D?Oy# z5husCu4D~%rBqH|DOJ1TWhiksCpcs{bQBl^#sN3=NY|6JeojE11`g#p(Flmn?Nc!yZ{@Z&x9{>khfv%LBMYVr4QKK041Op6!46~ zVGcP$Jt`XSkYZ>F@=<6h^*V=pc7uK30LTR?4U_?JgWCP9rPkcq76jab2R6whs)u0AG1K-MITzP`D)s1F7 zpE}$CyvdBOlUR>@fSz!T;50Z34x`Hh7vPIFkheALX~ACxrDMq(Z9~ai@(Qk^y8&*2 zR&WQj6w|h;yd+54ppU^*&^9MXUO-;~Z{_Ra8c#g|oj^uhAIYRm4~6drdVt=beH}#o zH3E4M7z&1iQJ^54etMEij({Ri3`(X`-`7yzL0Z3X8AG^nU;>y7rhyq?CaAqf+xCPr z$uH?=A7TF;*dKUlI~ze+3;ToM7^nc1pz480^>z9h(AqgzTKa(zwBa3sWj1ln0}H`o zuna5*D?vPH+Y>B_&~?gZ;nUep*$BT0c-^l{DzqA17086Ii>2(qX83KOkuioBxq-5) z2Q}k^q#9I#%Kn_g9Zg*U<@jw`NA}C`W$&yYX@J&KH|thYCpHAh@ijqG3$^CYCA0^5 z^gYDC4;%owARm+x?kIJrg!_s?k%#)Q7oXn|Bn8j`gvkL#AoonTltS~Mhd~8=hEV?* zUmY3JSBgOK*uGK%j>4A?>?;-UHH51Jjo>t>e%V*5z**H_?<*%bFK_|=B4}I@A>O4C z(g3YL)t8rm`${1=0*XNKXyRv2|0uL{SYNr!Jy*dEa0|48JK#QO1GTp!m7WI{5(2AE4Qc1Wf!dF9UZtA%M&L$AgQ}6=30*4zSBoDZl zP8RK@?If;@Ec*2KXcNfy(f*`SZQ)@OJ)O1yq%6R{_V$%v?y1V|E7h|&0{|Zix`7kJ zsHX!dbLw#e)C(HBv!54D{z=;r(pUs7MlRV;9c3I*>Y+cTF81L5-XH=D0E56#FdWq4 zOUG9mAFHEHs7G!R9#{w#gJobjSPA}n{f|dp2R4FC z#_ky(1W4L@0kR1>6{H0ON^3vbJNT9Y>I}4P=j*Z!nge!&ec%Ad1^J-j5bXxERC!PX zs*@RifI9fP1&lvtxug;PG-#xq@PY*D{kX|lzSDl8q{2&{^tcs4RO|iMsOA!J`^Z_ea2 zpSm%YIx>>F@^zqGBuvF4r(A|!1=Z}c-+BH+bxTF>N4)BhoE{b@7+h+Ou_CFRdK+i;3?15jB`+t9!|Ko#@PLe3TtM<);px`7^`H;4dj1+R;jgJUf_ z9Wns9wGaJm2>!bU{{@542{wbn=3hsdWpbb0*wWR+z=|54@g?S40#kLJlFVOLM*%_YS#PgvQUge{WJhuh9NB z(Dy&2ZV^xGYn+FHw$Cf~!Kak4wlDPS-v!D6Djjkb+IFyuT!5aY%}jenn+<;%Tm=>MIjPVa;1*~F)#$3|d+)&49ik7Tf4vXi z1{#?&cuVNJKz$YG>hZs}5%lYGX$SB>?*}ekPx8qd@ia zU>O4)2dX%aaRStV3CNRycQoUIv8*3VWt=dc`9E})h<0f-ej1CP&Sm_u)L;LhMW7gztfv0&r;hAm4$D28 z!8VWsc7uK30LTTkggH*w6Njj$pneVYA9z6{D0@Kte?t8S6=$gbH<=q(IiTLj8Y1hi^N=Tlg&?CAe`9@qG5j*H9IOQKAQ80n`$X13 zE3%ViBXko;1(~3FYclhPWZ4W~H}z9F0k*;CfJS&PXqfvcW06mJS0!0?qu&P(fLw4m zI$82SK73L7r*Z@o!IvuFkAjk2pGq;%_NQXxr*`{O{b-YiMTqO_&;xOM0#n z&kb-3w1UG`pGscrCbGpK-`5sA~%O 
[GIT binary patch data for fineweb_8192_bpe.model (370917 bytes) omitted]
z$MxeR_qrbX{rDyxK-zt#@F0D(-Lq+$9==ciA#&>bthfxuJfF7V$NV2kTqpAD@|Ao3 zWvjr@UQS|{1HRilEcWaFC$-{ZeJ>wauv#-xt{-g&!3EIDh<$6rJjGW z=kJ?t*yy?Z*Oc&(w3_*SJWRIq@$bYV^b?qp_6!d54?W1|Bl>=j-RRk89XoCK>9*H> zL!MeXHJn9{V+k|Ka}j;*a)`cm8N|Fgit}tcAG6WAMf<;2yPw|dTfLaP6qloCt9OP3 zx{$Zy6H~+O^6?e6Ux}+wKTsYPkuS%q@LIe9(T6razNdV8vv9U|YIqyD`6bGypPe1v zO@ALgh!5jhOulP+_yqZBWPfXJ?w6;B&(X*EBCRJkp#j$;xuPK4NZyQFaXX@KU#hww z+(kdt|8_6E6}jWehNH@eLhlM4;=0j~x|f|D(&PxTn3UJ=C&w?E9==IVu;m)Q+5lyzw>_IVf(M~ zJ3N3t;vqbYwjJjG$NqtZ`Ue&m^JbUk=l}E7@8P@NTf)9azimkj>}C7z=l{Qleb4S4 zI&OZt@JPG)=@`qfeOuVT+KoA~t(N_d4s@amiLLDa`jh_ukGO^tm@=r1fU__Y=i)pJ z?O^{W*#9l;f42P-{y(T6qIWO*AN@#TAjSUAu>Z+H$0ga<=Sycco{fu3*qICeHC7dH{i{98=g4-Ogg+AXx zKbc&v{ZZlDTxtA|926e$%trN#43Wc2AM3aAY{uxBb)H*+XN2Abo+Hxq!6MIbmgk6$ z?XKItZ3o>4y?crGhOQOfBi`W}-;MX-gZMD6#V7D-Bv%xL&ynk~2@ObBS%09qC|pk; zRacG}GrW<0GjiJ3*;yys*kkI?Tg5dO7lqr&yUfvTd#UKcS9nv0p|!nGo(KdusUVv7>8`oLv7gYYpFiX{51P9nfH2(a>(? zFY^Cc+bd5igEPqc*EA}B#N^o_JU~7e_4k_p?|%i~r~eRjubdW=tEPmX&{Jd@Kd1i+zsAT` z?bCYKx5M?3S>d>rWOIUlpY6ZHAMp@c#kJsJdYn7ov1EF9gnj}&!rj6NvTMckFl9)- z#95e$)ArHA5Y81o59cHA|MqYCc<9f@#kdr0>frX2{)xeqh~(9`?jg! 
za{FC@D{&PT;pKQ0hV*j{@B6>_zv&z3o6G)}Rh_@cjJBdAU=$1@dkGoo^wtH&9}4vS(>qsr{D?I--oZP{Ou zEerUw;Meru;g9Gjl|PGJUy=Nol9&6~=h^{z`(uXsuOx(r?DsHEAp6v-!<1pq^=ns$ z&Az+o^s_J%=i)q^kN<7{-)#Fl8yDkJT#hSnC9c8~%U}6xzvsG3{z9zl|7rf;lgi)m zuj(6;&cvI17aOk*b7H`0t@ix2z??&|fe;-*}qaVI1gb&i=n9#N4 zC*t`#d++67dFVb|9(rc6?~hrZV~?@@UF>CJuyH=# z8>F+jOZ$0jS{PCv56@Re&(;5RNdMO{bu*%GP!7Gyl)h^De~J8$0r@&ru6`v1FEyrP@QHf2h?d+2cer~iL4-?@aNkIb$1xgB@mUbLbE zb-mLp3XP;`-nH(vEA0rO7PrkMiH9g>RCRKVB3jzq=?rKz|TP&pAK;(f!AN4}4$T z50Rpect)cs?E>MSBkS67Rj#web-KpaiW|YN@N4`If5bz082R<{)L|XSuMeD1_tdI` z*tc;kytk2kxr2R4CT$x~pT&7^aUR^D@R0c7?drf~#^3RT_5X#tj6WpUmU(+}2Ak74 zPoRTs*}11MOc_;GZQO6X&8?Tv9vaJo@>Vjb|gR&P*+x9xkSjE_817Y4!sl zTq>MjKUaO4DPe!lQU7i+cR`t5i;ep3F1Js9{$KqW;R<@^F>RA0?5;EU0ewN6e80BI zUTqYzYy6CGrG1h|_4SjBkghjYOTG#z&n3Su;@X1nTH!Y!yIQ>J?ejff z*V*YhW1A|mAP=KcG=dv>^&-ii)%qs`b~`*!{ByV>^2nEZda zcdFmfqffivK52{~i~I3SJb(vr^8e$H`9Jda9n-$aAiG^V2fbVMzb`#G2Kg`lKfXkM zk$2AQ`mT8yJe;xM`&Tf}?w#h@}@5KEP58+`+_$ zluJ+5_8`aqUH|C66@`uZt7{SabY{kQ_5S!h=J(&DkN-_)g!(U^9nKX$`I-yDd2#zi zQ^NV=#K+GMv&m=UVobKFlYXmRd5`rV&o)*y(Zo|Fays2rtL0@LE)$5-TxeTzz;gyJa;y zl3h4f%x)}YN0K??`Mvr9`u6D`IG}%k91u=bu_H0)xa859VU=`NV-40KU1lHlWEFjM z7WO4QBI;=`uzecPD^o6yvK{eJ=iq&;&5HzRbi)XyU@_C zyRzer zQ2*Ro{FHE;=l$iC#{Vje|M|Y|*fKNhb?iRu#{t|~YknU&q;49{$YbQc96wuE6#8n4 zLjN+p78p=xMIYrfIk?sLo6$d|522&NI)iJ>NwIIj+?DRN))|~t6l$Gglk(0`Nq&sz zpztBwE`MhA|0j2xZH%Qb92R#J$B=G0Tl@FyaGXBspLioj{CndP^Z)+M`h6wtb5`_k zI6I7?Amf?tKRXnXGtg>$w1}LA7RPl+r!#YQC>Ab3cZG6|&D6F1?2tgIxH*`Md6`QCnw%i5SEZju?)+R^c+@@6{y5Yq&?3##;}^6U;ks3a@sh{W_<}y z=s%fbjf4tou%WtZ__hjq<;`Fs!9_9@9V2R2Zu0>(HZK=-#W&CcDN9L$$bM z&Y#McJ{~&^**)=Z$nmnn!^|}IN}+v65ojJ*nuXrpdD?-?(_WvDg6V7 z)wzyIpqt)v(D;5k`yTzv*!LKyWZxrAAH-0#`3Fl~#}e1E(s}8b<@^K4+(Q0;rRrPf z=|c*`Xsu*_tA{i6&gJT6?4<8PkG4%W66nI?|3AFj_C2Wkiu#w`wU^!yjMsjnAnc>> z#{rDvAP%8zrTV{7KQKC5^v`OWC-y0)w9R{P*gi*a6vuEJ1=+B%5Hm1Tss6`^evr{! 
z>c0c(KQfCr*S$BT|4Ewtwd#KiRG;jdPOnk_Bd)(yB%N8vZhTfKCeszZpRy^Tgg#m= zZ&pnSrSv(-`L?n+wSMJXadG|qdE|Vw&YBVykY#9bT!(aGyLzE;Il7Ojd(dUS1eS_h zhU_$B+F^EBPG5lvRAQsLyw<&KazAzK_=aB3^UG7iO50XpHP&D)sxa}a} z3dtEr8#gZ^XCY<$e5A>^KZ=D*kZqYAa_z?MjkRa6M4EjlwXJTN{+#%}5!W}HBd%?Z z@p=5y{|O3=&!ZQ8=tmL*g~q;T@INrNKUl{9posrLvFj*x9pspB<{y`_RM_m}^(>v|cfd%w3^f<12+&=`4`j-F}ic8L)9m>fih<*=C z$z_Os5F_q!6w8HIAiGYxyhb~{$#tM@ulw8Y{>J$qplb#H19YPY6^^OIO02?atif7T zVd#MTuYSw>mu;2qcI|(h>fh~M;Q62*Nemp7|Bq|`lY@?H_O91Sry4b=MeAPge}8GH zr*|6nk7NH^=v$HhpQdj!XOw|i zGN;_=O=uhW*kVKQn{_Z&k{F=pzwbgHMOcXGM7W^IubWCdbBK_$5oEl0FnQu+}xg<+NOYILtKZjHEhK?3pLxv}p! z|9_+Lg0RLut*fp7Q*%aGOYf{@|Jx_7|C8-fCkw|pgo$n1B4m|)lI`rU->L(iCts!v>@6j%=5w1n_y~!bOr;}}aJzw{i_ft^bcdhk&cc}jxolAbM`$Ivfcia|i z#WplzJ9c159X^ba8O~EE-BS6#K>kOrjQua4pC13S-xKQpRRy6*8ZBr?v-Vsfh7aj) zQ8)L|Qy9h*{A2pqj)?Q)Gq&wSR@oi>*LTr(V-NNs?=N5M`_um3C%hjAu$jFO$KT_c zzz2m7;V_OMT`rH}D1B6Wdt}zsaEyK&xnX&`PaY@73Z{mFo$@OGph9v&|I7@s2(wU( z5|m;N=3*Y^V*w^#&wi-T|0fT3Xd`#x$>Sec>x7lrcOlBL1aYn6n5Q;k$TJ*9{`q?c zW1joz=TGmI*Z+I`!}Iq&q{@Dc5~O8;{NN| z|6AF~Wd8qu^aR_#JH<|Bv#)SWE$XoaTd@s|*p8uu@qPP_IA)X_GiIErVWX2d_I+^u-_j49nfd6`2z)Emt%Kh5B6do_G75p`^U&0`EQr^f57`E^W#7AUa$1}wz~g% z_qxpcuk`+rrVk?JIu1x@9O;EK!$I;8lI1hQVKR=@j=8@$RvZ8OcSPKH%om=+h-c8e zdS*B(?ih}v;C}C4e9O9-p^%=RKOo&1^dj`QzHTHCeK}@{E5_u1o)b#QQp~|z%)@+S zubCRk$b}exD_fMD_`uAtgp756EjDQf)O}H&{u4XXdJju&TZZLWfeKV&C01cI)?h8F zunyIzK`rXB1#NtT+R>qpuoGSM1iF^Z&-lgE)l4Xg)F}93hY57>=X$ko-SBB@}$!`(G#jqmVuWr`vD(|GJcS zZS&MW=qP`z|5<-}9NX`%W+$LYda;k~RR2JcH0pd0v&do$?N+}d*3qLFqi2v+XXem5 zgI$CE672&F&{K$O{0}0TnjT7|(_mmH`sbC>aqH_M*M2=Yu!KDvtxQ#=9uMJ zfeKV&C01cIhK{;_ZIu!7pVkkq{lEA>s`m$7{~BqW@)ul7k3N!BZ8(y=4pP%^y!qK-g?>|sA zJ#4YhAKO-bPZCjum*q{J;6@!|?d}FoIExA%iS(=v}ft z^etZ>`YYFmWX1Y0P_{m#=B`)1t`CF7>qA>&gYoALp#zE|{@V?q z!ucw(606YgC+m|8g|M2w21E84F4+)9=4>$kZ^Oy*{kS&bQRTbxzPD;a=tDn}OE-jp zCEn5U4Iy2=Aq%&%Z8(R0S4~^t@v@G;ok9e-wA>4%Sou2C+&)Ks{?DKrIA^Y@g_4CzBVt}67W8C+M z^=&Z3rXI!!Mr+hxtJPm?-5;6rt+q(B9SPKaUHw9)FpLb^v^Cn%;aPMZ^nXFmCP8+i 
zXI}_A?elp5;$6bKu?N|=>Nnn~F8jljuuphD4qzMyku?W%;#2Fxp%^z*6AqI{a1_UI z90j|4|CoUyOnz;Bm_-(&1abavDLDsoF%PHa|IVk6%4Z|$`|L%^c;VCYe~+^Dk|PeuZ-3`d{k1rboY_u0Ht}68Sc@vewN+B9*n#v>c|5KSR!y%# zu0+1KJwuMoF)ksl9$T;#+t7&ZnEb|uu!9``$NJDjPW)+uhp3!dr@i4@Xc5T`jUJ&lxJ}oqrObhoM(J$$kVZ=ImbeB5nfVzszB6mdnSEl|iSO1gABK3c< zy1i8WPY%v7_pRxpA<34`_rFc^+w zeeAR0u7jVoo@8~nbIupS9Y?+pZeLYn{H`i|W&g*+ZQ~ygw^n~DG_L(r_%d!utntsd zYr?kbYr>aGuML~#tPR&4{8-p0?fqz2eoffixGvOfUlZzaeapIV!|=Lrz_xMRSoQI6 zQ|%YR7Z+BAt+?5_ZfUOx2W>lqTh~;F+dQ8ykAF5a)>K>j(mJ~OYY*FY1V?cUceaOc zoNV?Ft$V$XCKPp+va3xGYXxG2pR;!OD;XK==+a%p#eKal3({4>o_V;T0t-)GUVI8WG z{ma>5c$sg|GahwaBVh!RKY>N7K^$Y0R7#BDG5-} zjMM(s|8v>5?ARnZFo*4qGzKxWi#^Vcj&tBg+0De9=+rI*VP` z9M^?z^k9i&mSP!}V+ATuiIo`Ah8y0ej$zM_l4HV|73u=y&?}8T^w0HvyvKogo~`qw zoo`V8#42g5#u_v$Z`P76Ek$ey|M~kFpE~{b>%?^&QWl|$&p^UI0K3t1fIrJp&waV_ z#Pe_Vy;aBKmrM<jY1MRud7*qQ^CY=C;(Yi-Aq^Ixqr>ahh| zu?>ybjvW|U&Awl&jX;jpoSgHQ5zY$d4ru?iumKYK2gs!M(f~cRLp{@|e_*A!8B;=& zbXw4k1ky$FDEjF6^>x|FDf%#St~V<`XAsxd&4}BHUD%B{?zo4H|Nh-eMw=q8`55~J zq8+_Y++pd)QKcoI9 zdxXcGv&8YGn1i{Phxu55p*6|4pfc?Ku{jW{XPbTd*P;Go3aebb_ zRmR_KE0fMblw%26%awCjN{@5mqW}9c`f~JS^uHs4u7c?@f$vXyc++a_=M~QTWAC?0 z`dDHAhF{9}`lIse11!{TnCJTVC1guo-+uSssQ+V!bCCb){wt(WiIrG|)mVeIsKQW{ z{D%>Y;`I8j`#fWLF@OJk?jOmm+CR>dTA}|BgIFhxX7^Z4)}U2es>(C3rMJ*Kid|O~ zTV8kz^6j#1+Hlc^UpYN&6}JtI*p3}&LJQiFKp#>V#^dX+jBwpX-}6Q`)!F*)y8LTi zfAlW#6JI$a>?ZeMFXCLZedK-|z&H-#5Dw!AlJeY9@|1u7G5T>7?ACu4ZS;fk=W#ar zVfjd3kVh-Ar3GFytKF|~BMh|+I$luH5@1^o`h5TLZ{Uco>e`9E# z@&O|YogZWL{Q5+;MPHu$xX&zQ*PN5<7PgRcop&Bu=S>Oo$ue{{>7UT|muP4E59^<3 z)IWhbV+h&Tl!nb8F#i7s^LX`PEwq0u2g=FGJEw*v z$${8@FAd{g@O=o!e*o2xwW!AyB(=x4lH1UT?TEgesiS<&=%Y=(&Hd_w|9X}_0p(YM z-l81n^R2Q6+Kh>}ml!{sXZ#TP`TcW^9~#r{As0Ailm8*tVxw{ICgFB8_&?ERb18>* z8ehiw-2WPBR!bY*^d1~^%pn}c5gf%a97n+e^8bGK@0vzj%P7VUxo%{Q{pa`s z^p?1P^ovVkz&4RuKaB>ZUd_7SbLZLL{8U`)yZw9^7_73cnkBWrj+6CQ+YZt_K z71z#+eNWl53qrB&X?h7+idfFXKBv*S<`KTe`{F;ArRU~*_X~Ujm}9@Wh`twDw*EZ& zWEtIuw~$-04MX-HF7=$}dCqw9{~hw{uUGKDTFKXn9PmC;Wy=3@<-h0OU{2xYZx@C} 
zX-$rWu$|n2rWpRB_xn6+fc(k#AUyFb-zquICP>7vF~Fo})kjZZ7{*lpWR*Ge|;FATNfcViFsqP0-{TRJ`Lqj#!%JJ{{}=?4)1Y1hqe zPpEsl$Z>Ix?>{;yd? z>2spb@+|+y`H$P!O8DQ_|DQDGyhd7UQH3}!W*u3L8ch6>Zzvi2zv^Syc+{2{_RVf3 zqyO7BvJtoPgWFD?@_*YwKjr_{L~p^9{omTfCD4cFQOzk>GkC28~gb# z8{)m%)8b?MV`1FSSV1|N|NjBsfu;0iSdJBFW_#{+PfO?(s6=a#@nfu{N1HVMn|>92 zHF{RDO>5Yw=qi{Q)`*KX{93XKY5F>{8Y%jSIryWf5w1m+p2Mki0^3%yf7d$4YW8nT zlS~MAlRd-dgnGwp!B%WTBer7)nlNOJ`LH<%BNfiGM!I|tGUg&=`5xp}@juvY-a7jC z>ubOOJ%zOW;{X3zq!H(_r_oN|s4bXZpWnDpvRHbh(vwaWImaaI(}xk;MrZhcKtg)L zdH=*N*;qrEFGvyT0Xf<3+=KKO;?474ucPfX52%L?`Ca`{%hTP$3H z{JgsP^09Iwf!f3DV@&?1Ihfi%rS_SNhL^Bw-!wJMqtC|zlp(JBy^x$RZ@!#df~8o7 z zb?pBd{q*G6TK4rW_Vph2HQ6hjK56%>>*D``2GpmqZcVRL|EiahbJ?5H*^NEei}W1( zlxcs^N7tx-S80FH48#0r+42Z&%2F=to>D z=E?rI@t=e9%j);8+}_Srk5c8J21PzH&*Xk!yMPtyl#3Z zwQUYsj~XK;=OL~k(6Lv&hWWw^(Cxd6>ssXhGrF@Nl!;r2axB48EW>iFKm{tX605Kp zYp@pCk6V8?Tph;0VVwGv#>&2UhMjzih~MO-KCt>%+Wu5+3j0s|__EOOrt+}a7+baQ z@1juRJf)a}-2O!&YkS7_F^tkj4lN3E zZM$jZMWOxR41Ex1hR#C$G9qij&FfH6mZ`omfT;sHG z-|(6Fak6`kE(}eD)*m=@wYAX7!+iTLKp7UIZ|BoPud%cII9bipLu#LKF_hc31e@l} z4cE<^8g1>vSGFA87W_JXjrbV0az zX<4|Xq%1W2)_Ac#*;^|YhTB%XBz*bki$Wv)_NJ@D9ZSnYrDIp3wnlx2JC|G?qR&{{ zp~a#7_+o9W#h$_9&{eva{j-?;vzYy}IIMEaYOKLpRAC*eQG+4-4HsMg0HbrQe^74y zgX5n0(M9|p7KPp#>mMw!{sEF0Kx(=559~Kswm3A`UL9(sQ;#j!iq`t8Lrde;VH>@Z z-Z6Z2XrymP&z7sTO|CX}a&_oJ-M96-9WD<$Y->XLPjFs zRx1}Uw(I{R?e63IF6%@8U-^C$N}vHMRHzW3N|Xpus>V?yNPmZJcG@#@X4E*d*`wX; zW;QboQYDa-q$SYMhBnXuX&cgn+}|`e&3){d-K?jXW|E_wWFrfiblQbXax{~`d45vb z!_Ln6oyX(%`{VU^zpv|aeXj54^Z8!4@9WNUl8gM7P&X?d|3LYcX|5ibj0SoNxe6_- zG8FKhma+*3+bM&o$l^(#ecS zzwxfnm|_3ld}FU$&yB&hnlImGzI;2`VNCY8XN&eu8htSy*12cr6>P-^%(0-aN5A@c zAgZI)HPKxC;Usl)%DFjfW9qifNn@~6#$L&8;d-{K+tEiqwnhEBUj1uMeRK7j`~ME; z|D*l1{{Q5PkmWZvAlq|nBsXDm%KhUQC&#rJ3-DUd! 
zHO3#bQ~I=5jur6J7RVJ(9=2dB`aDywXWvF2QnwCz)_i&aqCE{qbG>7>ch%M?-x*NE6|>*{js2(u~s`{wf7&;|Hp;%zp|9gY5v;`^)y-O zp10aps<^9BgVC=hg#HrqgDTiR%Gf`u*+0tJKh!7oB-CHg|IgF@P;MSYH+s-}#{8lQ z^NX--Kdg5?*X`{8H=oNF_2OFljz8ofB3Z(8tX%oXKt)CH)%YC=2~-;rp!@72YXICztLM$ z!npe;Fp1Nc!Zc6+G)#nt<&<@33V{Ke49>mZ}1H^d3UlW*SnK_+-Jmb z7V}uZBIc~aUm_DGyst5~8E0-K(WmcEl)2Q`$?}umOy-w@?rqv3aTI z*no}Lgw3d%WvlqMap&rUkjt-bodBbO^5?DN*7f2~j5wV;2!6s_ggzdox!T%|u2u}jD@l%oPWqp>ON#Y%b=s!@a52>(%cwZ1mJ z32o>?U!G@(pbusH?IO=WAHmoglx=Jnsp_1{bM)yXM-)0o8>ymfr-EO#k8%sjd4Me2<(OR~K`B>QC`bI=^&Nq>DCu3nV znT_@lbybD7X1%(L-rTF5P3E|kiwo;L{r7p?quLXt*TlnC`ZnaF0EO6&B9vffG{@f9 zS{c0@6{tiNsxjAT{yrJm%4^AJU2r|wgf?`c7ty+h0dfS#aRSq=3E?Ce+1HELNt0Rf zd#e7-tNQ6*xh#xFVQ!Um#f**Z*bD^M$1F0vqW0^Ghd%8}f~!Ho$566vhgb zOBgxfTVgtjOT4q>%qPu-A)Ky! z>i=GKB)h>7Im|u6);@|cbk^$s<0!h(gJ@k`U%mR@?{4vKkqsc)KX!uNm&MjD?v03S z08!t62sEBx@Q~mQGh~hM-fWUFU)|jgYLU<{z02ILcQlK(*IlX z{wKWuxc4V}ro2Dd$9;_KK5J}WJmsiBCHj(tS#?pUqA#z1TyI@Ey$1j5`o~&+b^fP% zvIz|vJU>|~-Ph-MuOjcY(eo&Wdil5Kc`xo(vd#6zYW?5i(gj-3iZ=PTUB2!>iLv@( z47hj4qBg=_>+k*VBizT)f5NqKeQa`QO1pqQVvK)uK|7)5&GC6-^heQ+9zM*wNadCLL0hpVg2oP?K})?3yxs9oi{1}mwEpR?@xBu zcz?2&yN^7kE!KUOTRa08!Ey9OemFrN+T{H=nCCzr#~Al0_Xs(heSMhVSJ!lXm?Tf5 zfj&h}qn_TJVro2hO@@9t#-#f^h(H2}zUk z1^Ub7e+6<<0!d&z8}3;Ing&M|BonJn*4A2DmzBB(d*+O zMV!&OoPBN9n9)mLRKJmf8RAb9f0p>kQH-tD|2ShFv+Fy*tp3$jOrx z9Kjc^57pc=SD4d7)}kIwh-}JjWEXldfaUcE>YfpLG$t{KbM0)@=ZdZ2V{5rwq`ro%WzfLgWze(#z zMhaG8HP#>%7tTMbQvcU$myAU1lDF7@7rno_{^)6SJ$iiCUYtMws4l_UF7c!x9T{j? 
z*Wx3e&X_w!!H+k z*n-9y^}lv#^Z1+fKfR57aN759M zk%CoNjWtNcTBIS`Kd$T=`-6)soqOgw?TvJ0O{(YBU&`RO-2XUb|G6*Q2hjNb^bh_f z-Z!tUw`gp~^`X=9|AaK~k~!$yQJZu$DjN`;Rns_c|1>nC1+DA!ZPAVn9QVBI+?Rz7 z*oaNojBNCuF~1*!`ocqH=EdtbkLVkZVyxHs06FHL=b3*_b_>_DX#P3+=*OnbpWmJm za>SF1JZwQjm;U#_WnnA5rCR^H(mVrtKBB$z+tG$rzfr)io6Vt++>X98<{FSC=%o*` zSq!mNWP6V?exqy?V{8V){6}i-Tff%$JDTzD&Oh|bxhFX1f{?qj3{YO2= zEd2~R*udM>3y9-CrjUIJeFQPU4$Bx_omt5W- zV7vAKmiOOFQ}?b?2dAk2HzkD)erF>#VKW*wTYrXZdP|x54>|N)bf^p45$#>piadT> zuoc^oj{+28JECt@f~dSK;*Q3QN>N5HN8<_g|D@kQbk=X{lDJdZMpv_&q!{}brUE;y zz20>ldpx}g)u_RR^LNU%|1n%GUtyF!hEDmcOIdrA>^{T(rd*E7=05r{cA4%xdn|~j z9!)46cCAL3LeIfHG~oa15(m8(V_D+H2!?BoJ@bp!4~&q<(Qw*cCFDueuk&6D?21d) zq;OAQ`P`T7>_M~246<=ieTn7uIqLh?H1%SJHZmEVy*cTg)0o0E-rD|~H%niBEuQMyTBM;*z2A#; z`jLd!;`M27#2Ze(5q~b>SMdyfwX5xsnyPP+rHoni%XnMDEAjS>U&K2Sei3h6_3L<3 z^{?a239rUmQhyomNq9NlyY*-B(eJz#FGXj<&*NP=bMd2_=i=Q3{~TW@zD%q~7B*lb zHeoZC_y7G+BD*5{{>vAIY<@Y&MIMSXrQa;+m)y#|4f%-r26NwL|5C0NaPRbvyY63# z7t$M)vC$mL?evy>|1FB>C0L$UovZxJ_HJ9u|EYLA-cO?o{(vikM-FzTw_h@Td23e%WH!^j)) z`hhp%XXq_w--tKkEPWmw-l_e>8}YWuH{z{W;MYC>dVG;wLZ9!M@VDA`=%o*CR4#0m zez{Yyd@daKaE?4oHhQ+EOy57t_uu6EZ}9!e_5$A@tK731YmkbyNJBa@(7#C;Ag;kY zze5g-bENQG8!xgocPjh4#+5&0_muL7?B$N;h#xC5|GRf7zD_)uh-|Ry$%d0l@p@#@ zThuSn+}n-xQRCws+|k_IHZt17^R4^eMC<=H30F@xU^9Jr{ohsk;OW}oY04L6%mw>` ze3mUt4swx)E!c`}$VX$5@=3YUj25(_tx5S*seD3biSngF`BEmYE3bOW1#L&xc z03%WOM*Vhj`s=U9PmnXONbBS{CNPQ9n8Gx=D-*&jd47G_8TwhwW6*y+Bs~o0OCL+h z=LLD;jQ)SR`Y%U4fo1!5m3pT}9ZN?0gm$@i0gG5d!cU}aBqIgO^Z%0ct<~j22uaNrjfhYV|Qu~*Ll`-?hLF$CK}YK^;l1D+3wv6^poiu5bYV%D)Q_a`^HXA?Ff8#%~D9=4!=tNiDiF1NFN<6-WoUA?@|zq+_{%KMAw zsQ9|cbwHQ4KiQ0^@83#x zEXe1mc25mzQI96Hp$ol;`u_dL<*n257#a2bhxtW)|IvAQQ$CAq0MXw2(HgyIeR^aA zIEKguP`t?+JnWD+BU`}e3-;b7`!FE><#QX^)1o~5(m=AwxD~~S`f0d^zyKc%*Xlle}(jD@5$x$e}`9w?c8H) zrOoxy=sIbYY&@&~rw`SvpV~sUMty5?d7nHo!96>S>zB-4?xS2D3b>1~^MLU{{M*nFBz?~KUSqbBd$5`SVoqk0+pyj zHAa2^8nPDkXu^zn6m8_RJtyjB#ZT|W07lTjuYQ5;f*!3YXx1M;K|hITO?~_Nb2g4v z^1}Ih+1?))_BSYC|G>C=CNPQ9c&mMFio1V8{~v=G^1H*+`u`Zk7&=#ZJ{(0idgi@9 
z`fv>0o^4h<%kzKx%}1d}*CHFqS^ChVxG_&(z*wfZF>=ChVUb^5js7K>Fy-BQ?fXY2 zqu#fUY$wqih7|5qXfHVTUXkr2n!|9R|F5mvQX%hGD+AD8t_(nQ{@iN!twFT^b}G3R zX-LO~{=a*MD%Fbv@*|=)_!v4D_5X2Hxzep%>6z01C;N`;{|ndMYmKdV)*%z?(YHnz zZHp}Wkmnl22Kq)sXKIhSKH_?G*7hcT(H?l4$s9yy-{z8esLxTRZTDW-`7w2@cZv2C z>rl3|l5MX4-_HNqBA%_-hI~Y40Tx&52Py-0EJSmUl>fM}{%15)Tm{t^LEB|tpe_Plhv>8qabAJ8(N&52ozfUQ@&uSm2dxsJ2+cP?$ z|37YDA7LgiiPM$d$Ee% zV%(uwTVpkS4LXbiwCAb+`L(*%=oy+kOEaQ%0{`ax-5T@i-Ln>HNXJ|Ae+GAdIr|2d z$N#kaC_x#@ zQGrTSp?}2ur>*8csfUNiVfF9`Im$goc53U|w@UdVeHv@*888R1-W;7;a{y61!p4&< z{jOop6<-ZTKWjeZRo3%A@nqT^qhGw2@utMfak13ORz~IDu$yhiH%2 z(HD$~`}Tv|%_rhNj^+9{Oa8;KFw|A)8rJUBY*2XXCwUw3E>R6 zJpba=>q2z?-&yWDW!pTtfJH2!L7J*RZ9l2Mm;R4STZ{HbrYB>0|3B~ChE`!}g^%_n zDY3_WF;awCh1FPt#^f99Ph$`K)Eh#}YJ1+N*!$l2emgqQx#|Y<%Weoq*-yJq-k^NH zA@m-ZLTbn(w_q!_p)T|K zkWUt%5ZlpUT)jTcyaalSarWjycH%~TK<=>1 z4Qf%3CiJJt{~6+6FMbTCiyxy?(*LCNzghkl=BO~;WKWi~N%pOk|D)d$M;Ce#ogpq5d^bz`ToIta@-Xia|E~qz`&h-sCxKFw^jtNBV-%0W`rZA0Vd!h6d&5xPoK7+H^ z)nhKv3yI72zXk3^EMe|t^Wa}po+BAEubBf+M(KYQxf*MbinZw8Vr-vGM+Vm6{Q93v zderA1#Cm!b#-^1G%7pXd-<<>MXB`Jy`@Uq08{j7bg&(?vmLaRX+L0=U-y}d!Wr@``snlI z0($A^?SG5hOZa#8KmGA0EU#0{^8M3%{|w)MlkdMl+h$r`Kr&LW3ahaOsaUrEZIb_U zrBC(F@OJ4Fqtfy6`F|(=U+jOvtrbTa(ott_Sq528Heeln&)R>AXOhh+(r=RVn=1X1 z?c5#Ylg8AZ*lKKT*Ijilj__O%l+<%H6+Hz4in3)u6HohY~v+W(> z>6|sj=*(xyy*?b+sBZm22`@oaM`c!shsX1;3Hvr(8}=7n8>*6=gOkw~Y3b1-%}|mMtuOVTNrSok^RNYT#+R5fbX42>t&*E;%})8Z?f(pOsiSaY-$`}=`f~gK zVr4M*&DZ~3&(K+Cozr*Ormk4G42xU58U5^Ro^9s+qc7F^gVoj_M9-ATK7pYmzt4YUUfGH<)NQ#eoc24V-$@8lbFV=vMthXMZR&sNy&3(O8Kmi0e1+P~Z*72b~=8+o(+ug*Ip83a zJEi@rZE(!>sQ;KHo(Cv3KXz%-MdJdM)ze{I>L$n5H^i*P~lgDyh z%R|F){Q+_->L=7aOUey(^fvB%wCA(itB<3#KW+3z@id`%z4kv^>1}AQ(f+Rrp};+b z*p4EUU`LZYsLqZ?3p|Dv&CPI(W??e1zFk}3svN<``G7KCWUHx4Whkz=02wXJ+Gh5U5_TT zAv%M%cUrrG9_Yw}<2837o_@nx(%e z4K@zQ2VK(M68p`xw8uWww(zE1Z7Vwy8&6j~yApfXG439+w~*~9FS1Legnn`$C$dYW zu=}T&`*7) zSBzwUasM0XXRONquP*P}YW<)7es_`a2YPS5@dxr)lqNFGvlCAy)*}mjx!N1q z=1$XxYSrPJjRVj(VXR5}gWm`_+-u(-e%Y9NB`M^PxyZv7Y(;UF_DIyfObuZh_w;wI 
zB_wC;$524liKCEQw*TF2pJ;l+Y2z0Y_IjhY_{PoS?ELgHM7I0LhTld;wu*9o-6Q%5 zWF`94=~ZMkdg+5X+68&Wqqu8v-YyXBK_8ui(O95;ie|Ln-<^NcYAvk$n$U(W^kM)b z=-+014TH(-^B7K5HY6z*QrQ2ah$+O)ahG{llANLKQKX` z#A!rl14L&8v}$)m`*xq-pLgyW?cdeDvAiDH;-`e2#m-;dsoXTTfPNP9SisCzR)n?E zVS=fM$n6p-46IoK^y$eFvOpoTcKte`gioud(kJ%v4W2-r=7a#kBU^Cf`+s17w@S4X8nA$F5h#Z^d)1l{M+dr zXhj>kr|t|V_%F{dFkW+#-e+BW@4}s7oIYgj{9u>0-ndxk+*g~i5{KFJGkCicJvgC6|wzcWV`m`D*HQ1Y{dPP=bAB>}+r?(Q^Ug%|GMgv& z^slUaC9|*r8?gzSk&PTg`~T#U=lB1~qi?}h{D0m5C)$5-oBQ)ofI@6XwDzxv{QtfG z&)pvlGujd}7hA{wmAgWTILlCu3RI#B)u_R!@+;aGWbT=}LOu8L{y*D%!!61IvWegM z{eRl%7xw@8Zle7u67`Gp|L67pH~J6ISg-%7E?eIJW~$CmW}#nq*bJBt6g|E>BrTm6e@|C@2~OkfhH(U-3N zRmV-yhYHlc`Pv`!S&Zq6kE)wTa@D_Ev_JSo``?@;=g}bS0=bC#t>@YCN!CX&b$R+d8sLy|4{zVg;jJ(-~c6tY*`TJXh+lp<-M*#}4 z9sT+8zkb4?KI9OF=_4u1{&n&{I#0|0I4WHCR{39D)l2r_m^!FrB<@c&_HHaEvVli? z(-(<%RQ~Hz2ba;yF{JJtL^QTH+{O0aq#PhCTx-~7yqz7YRR6WELVJ?m&R?;oMC&!? z*$T9W+q8#U$*3%d_5`aErW!SpUm8AwJ7=AO3)6dCpZSCcbG?2WK;by!1B#acvTpJ`+|GO!Nk&;QD# zxAb~9tfyz;{JzC)_1+a1=D%;!&&buU*rs2B_G*o8Fs##`zt$hkw+5>s5XsPn^-(fL?u^ zQGK$Z3~{6MOYyM9wXrq)R~i3K^IPfG2=J@(TRR_?Pwi0=&7I%IFCX>w`tIAj7Yeuw z(VpYI^o3gUy{A5Ycey>+_!prBW#}vM`~1u4Lsfpi;<8Xduf*7*-}fFPesdUA{Hjrd zTFk6uJ0Yj9VLu_~TJ32>cA*yo7{PIzKx2Y_9WI=&KyO1kqO}giQ@+ic>F=cQ<0!pT z{iJL8TdLT9D(n$RKZA}%_8;p++RE5}aF$(IVw4R)aG6wBzlos`3v-+?tvOyy{`zKl}zQMgw-59Mo ziu(5T+)Zdh7X~)02+ z)>T^jzuWl4&GN7IU#~bDgsFFJfZlS_b1kXw>Bq5be<<*r+db!m`3C&DtL;@!jw5PU zOpudUK0iobKiZe@H1`yi&n4Wbox*>F4PctzEY9F8=COc9%rz#3C35;hNg?5k@9#{< zBr+K(ScSS&=kH~lYePieXEncuNzbp%v4-9<@A=Q@Ue zzi1b42RZt4wm&>%ojf-_wcc-JVFSi)wH}t-gw4oCzx+Os zCx7bC4{en{_31}2iZLvYuT`l3%G7`CeLaQ#<0AQ+JjR_Pj$Cxhr+H+ryi@wHvYy^2 ze@sMW#?9ABU+UOGb!(U3P&bc}qXXvuZHtAi!fr!83eb=$|E;m-AbolMa+W#0^dfX9 zOQO8rMn?0O8`Xi)9^}m<-k)r3^ZxQhyZjN2zmDE|UDQ7*E<-sgP=#GzHb3CH)X*#?soR4Z0#6a*#B`}`wPqGug{vp|6^r|das22vKZCku0bv4z9Sux z(`*AxYQB z|7GJI!f++~FGewjPH}XJ^C-H}lWcAQ&Y!Qh9KUmJ=024ge2Vam=r@M_<_|n$y77TD z=bcrW$Gti=98NJ_ReXK;mM}BJ^zTU1lh*31pQCqr@XdSs>>Cj6@!$HQIdQ(#U_H0@ 
zeaQ8(3hVbV!fzOLtBv`d|GiZq?AewU8nS%<_1A}8!nYLq{#%8Kz5`oz$7bIjZTzCW z3Z9<2Dm*p2I#efI9S%%i5e^mH5Dso%6KWRMgl7`24f|#@!sDynA1d?H!(KeOxGFp` zzbaIf-D>SxhPeQ2;QQIo=HDG2s(5cGt$J@Lo4z&dOnqP2wf%je)$cy)IUdP*e|R+g zZO%Qj9&XK*;j#Iv!k*N(FTd9w*B{ThI_#^wChRYGN2pqSM|fiKy6|L?{TW`2hi5J| z{*~id{ZG$Gr&TMpeb8TiNoY+-)Rs`*m0T1~*(darKEroi+b7N!$r=13lEqO{|AE*9 zeXaELRF(0xoOi{BE0uR$(f_TyQ>XMRQwLld}rS$v*M@KDk~|XiPpbYZiZxyoG$1Fj?dukpF?)LVnD(d~D*5lfO&e z&p(&E*R%ftIh69w*k{Ch>Z%XNZs3mQgpJB)(HWYy>?|(crhivx{Iwts3G3u8K8fUlm(EpXi<5V>mN5erIOP;}Oj1ToN)Knu_gu2~%t>|Y?A5X16m@NadVAU)SLoEQ?=n}k z&-%|}!uBlKgXXMtp;_9F$DN}`Hf7p3a@-sf*ILQ86G`D&{&h)7;jrJRU2k4NYEpQP z`v_UT+I}~l;c55ZMdsk(R8ly!F3~u-{ktZU!k_ai#}Uu`-gkxKpLmDE^vC^o^H{(> z|9O)CKN*Ll|0lWo<;6dD?J+!sdbCBjHg78X#aHMf;8fxVW8`nuToRqJA>o{!{*W2e4)V{H5f?}?qdjeZmVm3PL*zx>hIscSwO8~;!wvtr}l z{y^;1J$KUA$4=e;f!KH>_rv6`J{UW7zu&o=yqUb1{O;P=_)YJQoodD1x5rK$NQ;es z_m0@9>u!pTUvX3HROv_D{}I>u5A$yo=2w3I#oJ@!U%x%(ILPAv?)#V$UwpZAK7QkS z-Fr){ncus8uTw`n?_uF@e|PMAiQKo}A^nK&iuJMaFWeP7fxSs!e#C??fGwDWi#Cgy9)AWuJ=h;=iFV+d3eKr)LJr_=LLmvBw&Fu!+*Hx-BSR)&MwE6opC8Q#PFbne>NQL+ceFogGl$*%Z^ z*q8C{*ncDc%zypo*psv}JXWBb-nuf}&;97e8)IMe{N^)Ip*Wb_+@FFPZ%G%*Z^`@_hq4~EfS6@{_a{#>THBUX}fM{GyJ z9kGX!?ueDHxg%DVdPi*Msykx4R^JhOIQ@=TdBz>FN7mjEdo=Bi*zWbxU}-Eow(gGD zp3FO9MdGhKb9vYv{oX0>;Cqxu`U~a;&6r!W z9l7a8%-5*Tv9Dd;<)MaOW78e6rnWm`Q9G}eyFXJOGz$Ak{r6taa%()Cy2?41{JPEW z=rO{HMYq;9r^jD z(SI9#>tmZ!^yN}QFMCd8+lkJLo)xBRjkz)DY;X(mw|P1vxyJl6R~idW34<6wKZ-{# z2_@PBJ60!#hfZFi{go8Trp$|)yd>-z=l-PhuWj*&{k9)Xxj5`zb#bV$U-x6Z?}_c1 zy(H}2cCohQ#m>~dIPA+;Ufq-ws!}fwPn5a$x+Luxb0)Gb4o_+8J?;Ky++XAVQ}WfR z8`TS2FA25UO3#v~hSdvQiQd=vFWJwY&{&cfuDnTIS8b2*Jnco#(bkk0uIAT~d9nG0 z7l*F-#PEZ-|G(nm(35_1tat4t=C@uFj-h|;rC}iL;=m+aJfuxDoRS(FnUXHnT^hzv z>|LAG@y%TcA*Niq7G-Oa!_MmDu&XLLJUpEo%G1(f@8iD#pTZyEo2bCvwo5|grX=&f zlES_%Nnt;#=ud1-3Qw*|3I{Ty_fHB>kx!>2g=bcq3#aZoNFGY|{!PiDc1b72y}&WlG(8gkJxN72!(LYp%ni z${Z&7;+ydB#}b2XXYsAjq%8grbh(QE5l2@V%LyyO&*-n9Pvh#Cvx64s|B6BS&@=2V zrR**mg>UdXWja_j4q<_j+hn6yJl-;Eyrv-F8gf5_@R&mRRwU ze6V;+Z25op-KDI)SMe*n2Hl0? 
zMBJ$!{H=G{%f&Z3yzr@5?dea&o;~xaSlvQlIC8eo-naLLhN^qRb0hbfAAE0UI(vU; zK7M~_SzBbzPmyu6qR_s&D0HM0h0ekv>90uoE3$s`fzXrpK!4;mjaKTBP< zlYZ6*Lh)~WMEmX*c^Q9&AK~xuGyDR-#)?}*@qflQ@d%zmGY0Xu_&fXq-kuSPi*p_f zC0id1JN)iLg%9fgJ{Zbs9t=Cn9}K%n9t;ncJs8R>9t@9=kDh%n>|S^*4~9xS-t?ezz#k0z2ObPnc!K^U4!BmG{XlqX%>&_SJd^T3s9AU*99()J93l^M z*RFoReiQNnd4#)u{R5#P{ekct8r{>h^?}fw|A6@s511eEfWH3&p`E*f>~vpO+5_Qe z#si_7?8)-|$-dlggkzJ1p?_&Rdr)B*Tr3Pjr?-b;`UwBgiNY{;x=^OLHtR>Xz4|}axN|>}d#`iQzZrH5S5foL@K~39ds4p{_Evs6 zR+)Zpc)aA^uy6d{u>bVEq3X=N^1;2~$%)+9f!cdR_14eDo|?ZmJiYmIv1cak4K-W7 z84i|wI(BICUj3hGL_?0D^);h~qm8Aj+Qu=A_m3cD~)pTf`aDsH?@8HrnQ8*azlc>M1B!zanFpi(>T zCGuBz4fn5;{?PLL{h{&Y`$N-<_lN!52XGjTc((KY&`A#CUvTQ~O|kK?DR%0$-`4N= zcq}GQydC>+2pt&0U*Y>W+IWBH#t-Q~!cXw`=(*?q(EHi@8AYA%R<Z_SEO*{1(nzas2&&wjEh$2k~z`oH9X z{HXt%p#Pf?o+h8+uHk>s^+V)gee2p@Hnx#yFN%b4gu8xH|92`OJV!R3HAbEf>8f6(eH zF=0AW6GK-@VmO+VXg+CTXw6~2%u5XIWQQ<~WD|EY*&?1JmD+P?z;if^T0Dz7Jd>?G zh=VwU1E|JRcpCdxX;0z_MD5AF8|Amn@*BC2dv}`rm?1xsd$Q!$B>6Q(ekC8}F6Ce5 z`c85eckxE~-Tgbrhq%8X4SWj^!JZn$kK!1H@E7IDQ9zgg?R8unUi351z!+ID|Sphhz9Yeu%%v z41S8A;TQNNevK8{3zy?+T#I+%J-7uQM6vRsq?i3q`SB20%3U^T-u;w$_vFK8_4O9m z|CSP@{{-pZTwDH+xxRJ`jWP3!O6;Yu z!tQW$sOY&RJl6l-uxB{k*%V(7Gl!kM)h}-S&t7qle%-!!$>xaZBkuT$z9v7*HNuR> z-`vXuVeYHs^L~GKC38rG-D zneST9@`8DLhZDoh%kl6LX=J9+Jc209k;L$$;5o!K9fb{vVNU#WcSd{S(eH_eS^j5m z7W0_s`A>*o)aXe8=i+|2cUuR~m@E>u1?F$Q_@Gg+uNUw1!0>E_6N?)fr(K%Gdh)+g;KjfMAk&M401=krW=a#vTV8~48Xyz}b3 zPT|SXPqNwakJfNhm$6mo;|zaz~Y zEU@Q%4R`cC>fgNH^rpLu`#QgUm3vO|`-FS&l(CkBr`ff!W7zfRnHK0Tuq!|8y08b0 zxg46{*YBR_o2;|%`m56KYp$<%Elrq+wRDVLBa9B-Q>*DK>7HxuCh7M@;pqnxgi97j z^en>6+$!GanfZUg`_8?>Kk3cqPPr(2nct4DyT4Vq{q|sFSe$u9ojH`!NOig~tk27nud=;I{=P?dzx;BuIb`>#qrKO$ zJB9OZy|2aPaced|=GnuW`QP{IkdLEqZFlp($M1XR=>HsjMw(kEy%lR)HF^$VcIafz z9Dxqx%uCYsgm`owt7nCGJ<1bv-}Y|GlIR`38hy`|#;mWv&BE)a&FHYsTqjJ%xiHah z@9-@TX1V6Ms@b=8xNkIi|JQK%T$~4&yvO{xFpow5iSO}56fXMRHT-?M9iMf-a%)GT zdtNgB8U4O*xLm&JA69Sm+n%fXr1oWvwErddDwhwH@IQZl75^UJ&^LXG9{qPaTFpz? 
z_};MNRc)W|dgovHR=35%KZ!ruySr4F(c8^Qx#HZjKJ41Tyr8`j!c$$!GuQX{->WzH zkG$uh6P|H*EIfV884keDu?i~nGe-+aOIr-tx0e$nrDc!wzcb%pCfr*vGMZjI9?t_ue@``xV} z{F!oZ#y9^Od7(adMY_84y5;mRHhzQWxgmVxE!U$w6s47K@r%Cua{9^i%;JgatzA*r zug1bpm7xcH|LDJnzW>}BbuMN)Q^VEQs(b0zk~iSj^mmc(!7Z5UkA?S>AB^;3&+ps4 zAx!sa&rk35&3cZrYc8?@kON!TL)RMHrw_U|BFwO`W4_5KJ^l_h0df5Yas@8M<@gNd zuDCB=M^3-6IsP0u^V#k37P14~IED-7d!APRqf}ZdGe*5LAu;SqO$raEv43W=|Ffk% z%C5JYovmVPQh1EMM_Sw4^uMY9bK=^+aqZu@_OJh@Fs}V84HOs)&$eeZ4&u<}csSf^ zZo!ECCdoSU*^V5ycRd>D&n=msN;Zw3+rz5mv^gs$&0Wbfhb4>so9q-vSC08}dFIcN zJ>0#8?B7M^*O2|(1N@^k@I&M<_eiGn?*3>V-jHuFhQGjH;lJYhIHkV-A^9Wx1b-j- zbt+%UpWYR@h!x(#1ub;`|%~@57Gd8H#bL2xqWFXK-@q{an>c{~ zjK}e(ID+T#FZel%v#o8z4m^a?0&_BojK39L6n5ob6dum8#!dU=kv9FATJ{H2;4vs7 zi{Ixt{tNlIvpx4Y=X1aMyUL*O6HC@#lLxq~Cs%~0GOl8WT@jw)u2Cl+tWejdUlpp; z*eLKco3FMdQhwOL*BY30K<`To!4#c#>0k0}!l%dbtI zdDuA333EsO+8nwc88?2B&5++`I;8Ce`Cp!WjP3VOly~u)Y+)DScRlal>Ze^QfA6_n zdBgoq?kCAQepj!TKiu~Szc++`{~fM#A4MJaaqs-suKhl;=Ai_d-j zU*d*wi+>im@AmJl6`pxjq!p=wG zZ~vAy^M>#${g;vZ)Fbh^8+XNvd#?#4C$BO7cWHPi<5J@u)<9z?b}d~K9-cA}zWN>E zku%qZM>A5x?i1IBiur59V?|elJ=5<9d-+#xwBPyW%fmi$zj50t@(J!I$pb}~hidXE z?x)4`Os;WV9NctyIE2Fkmxo$Bd-C#7ha8wsai#Xw`>^BY^stG%T3Y&uI6sCQ{AYX}CD@00^x}JX9{+%! 
zV-c6UTOEUZd>!w^11Mgl4U8Rl2&L=T|F9Fg@bD`3KRkj*vAcl%507CF_O4?8!{gdZ z`&P66r8uiYnf-(^`$_VEGW^2)r^1Wa|IB~N)NjkuZzB(nTVpn9{*$t`jyy82{euQP zhsFtI+~S+-xb%-&^@-ZhKCb<9LfJTSZjaHU+BMN0qdnR=(H^6H+|eGR{YBb8q;G3pfa-&4X-ve)$<{>Qlc z$PWwmo5)2c2C)Z+@Yg7rO;G@(s6Xscf-;n1*R=IV8P~{LDdB)PpC+F|&8C#_40)&^ zMZX~>RFbzS&ZGDqUcg1t=qTtdGZ z@5d+bX?z_Y!*uV&X6(gmq)X?&hd;m<@z28D z%KuLMCe|VwTW~-A6#MaQ{1uL10zbh~yj?!J7-#73B3~i@73=7C;bZtu_&mOfB9!78 z)S(k6@k7kw4XpHBS7R+cfc416Unmp)QJGMd#Lk2(FLDi^gd|s7il;FnobG$^YGH52 zNAMr91z*Q5JcS^R%W*w2a3}uoV(l|%mK49SLYb8;zT2-1A0{*S--Zw2CfBCbLEkrK z^mpoe!ruE+P$gnvbF-=pHK zpy!ft+|BPEd;;rnEgls2FNL{{emVYxdy=~nUvSR{`MnK)#{b9s_T!8EJIOyJXYPI* z`_GkZF8DDnPGJM#*NQK_O6K;#K0i3O}YlkDKT# zT>E3v)QIBk-jgh{v6pTd{d@%r9sd6k;s;E&>sl}E$lVozA-0#UGnIw~BCYj0oGP#P$m;@yd z6>W;BR8vb8mA0Igw&3rz$Mc-$`Tc(9`{(z0=CwcTw%6>vE_<)bXRQr=&_AE=&ch%K zfran8TeiP{zL#AWTz}=32CNc|q_TCN3VK0GP zWFGc>+zS?NP~of%DuUve4Jv_B%w!BfLgWTZ3 z+z3thdBKOd8Cv$SzX7yiZif#1I-v`5H}u4?KLYe(?t^~(24E2L5LnJ~{}8`?8jn4RbnV;Ae+S%vq2fL;geX{U7@+zOzFgX$S9rWD)k_y}bW<@0KFV zu$SZRz`p`niM^_m_x}Lze`L)e-v38g^K$%KGS^VmV{bUa+Lv>@|B;Q@n{fBy??X0Y zZ|P)x6Xyrlm&dZcX)9x+JJ9z>c4P0^OCM`LI?%{I?ESb8;6I2Q!rnyQd0nKZm-Iw7 zV0Z5#z2iu4WG#8^+C+ZDk{`$_?9Q3w7kOQS?01l!F4B|phmbwk{Wj9uPI@Cdj*}m_ zcj4cSY{A|dAioZgU&xG|H^>f|kOe6lZ%`_vK{_Pvyg@cdf@H9S+#oBUuc-tm-?3JX zY0A?n$`i5}d&#D?w71vNCRmF;%32ldTC2jiwJJhpb50KALLLnB3_1p=GgFG7+|6u0l@0t1!Z|?S0-&zvf*u0SrM~u7_LUQFs}0;62`N469pa5%z}=4Y$KX z@C;ZXAN~ob!C5YYt6(*3g1cZF{0w$O0aTcFuJMQScyGcT^atCHKdkoC53BQH-XXIY zOTz6w=!m&j-=^ub4fEs39>x&;@CyB~|KeR(k9>qV{?e7v8m1pONZ;)Ab;J#obYxE< z9ZQz0oIZjBD$X#jd?-ZkujJn7*FDZ0dnNPf$6y|DEvA3?Yu>LH5@$QyVTQr)ZquD? z`;`C4->zcRg`kM<3vI zh$OCG5jGkBxAFfw{&}3!2#1O1?}YWzF>mBQMqw5G*cad$;&#su(H(?KAWRG6$E`E? 
zKB+EK`<#X9`1a-M)FeJ`yk2s!PBbi=)ze>3M;=wnzRKKy##fAoJ&(f?sxU=k$LfJ#Bh zCv~}@v^ds3v5qhUvz;{)nXHeyrP#}m<@h_0 z71%40Rp(e2fvm<}gRI5hg*3loT#szvd^hs{wEtque#X%8ZzFs=vLp6~JpY;ho5fra z^u#d#i+eBreaL>|A3zSWw!_>@y&!Oj@}UR{A;)!zvLO$0!EpecjotJokQL+Ta!jMs z0cB7Mwg7AMAsLduI{y+X>kpK$fOY+>)wko80a=guI5d4^XpzjnqJ>WtYyEcZhhO8TByesSyRbj6|R-^yy#l7h) z<6%f2{_Ycumtk*2dQOJuN0gn%C>PC?k-t*bvMDp)qa4*yK5{987f?o?qD`vTW8o4AtU3$Ee&{FlMC zHxvF}_+3OCzk(Zyry#JLvzr=>L1^ z{~uk!_$&Q?utO$fLG}sy|BwrLkPih=2t`orWc~w6p$y8w0ToaQRXdsgApUCpvxfhy zHrMFn7a!glmTDiT6_e^Jf0%E8^`V{H?^B zL!4^}_bK^&Kkl{gBhDYj|G)76fH?OPW|aTKG+Qv<$@p357=_SSKxUm{tRC!;0qJwmO#)L#DFwREEmv0w`U^Qc2axSn z>OHa*dkb>NL0-WCn0ptPYpi?04?R%7^g`8vi%w}R)Xcn4)!@Wj1;PD}7osmdAANCT zBiH3Yy8TPkfd31mQ5NYlO4_uO_Q%Y8CjG7_ZQda5-XLGTghhm_$L%T7aW?6`igW&n z{4@N6^uLVs{}Q*mmOm5Ay66gu8*VmI=Sd?_KOy5$9)cH|ISC&q4xR zz%~38S;T)lL>%*oV;yY6{ULY+hG7+Mhha5w{S=;o73Aw`_$931zix%QAPov(AJ>=7 z_4L52VA)9hhj>VUM6f{;Bty#hHA>w?{hz#s_CNK1JM|xa$GM5zM|K2bZphsA7w8Yj zM@YMZaO(e3>i==-|1|ag6!jljc8>awbYQPIag7#}PF0~7Xall(`2~6wISel1tb>m+ zZzFxykQSbq><>!VCZu;R^PkA({fz%1Td}tx+wt!}c4F^BcCTmr59!C=i|oU{e>$9X zjm)jU5IjWO+u#v+6n+h_Kn--m+pvLha2q@X3Gg;tNZvdP-y?snhezRYXoP0CkbLwuHgAiR&`B z9iD(`W zeg)*jT%~N>@>r9f3k9sp&xa!GLi+fN_#VlXN;%3J=2KUqBYdUoTNWu}$5qNgW^P|3 z>y9g#=($n}OIZuP{7Tv8vX?&Q|IE=bCDWK_%;F3<7V#wWRri9J~i);vg33uQWexdHMpP_ z>UV^y4r%dTDQhR!Ho!F^6S3R4pCs-p8JWU8rA}R;G_EzB>&uwt9&yV=W)UXab)|A5 zu2Akt<{vTV6HkFHQiXdjV~+6qJR{lb8cL^0}VS!Yfpt%KOjH`wu*r8=tj~fD%$?9h+)MG_0lOd%dcX(mPzCkyH?I9G zoQK)iF9U`dEmuP%tbrThR=5N1gP*|@5C_l0%kT=MLM9YK8T3Iod>bAFhB_@5z}w$r zi~&A`X_$gPz`HQ(GWx~v8RkF17jPP;UcArvC-U!b4*mug5RPuGWgaYm%i%J(62f2w zSazWQyDLOXFemIr|7Tx_zJobwFZ#c6=>KkIj-N1T$aF9Izn$oU4xs;w%)*|16deJ| zSuW&Z&L2k?^bqAQK>0gB`8!AXJBw}s?q#@_6V5@npq|mNz3=h zm#gNZBLJ7dr4S0$e)_Eg^kZ2MZRZ6rXTCP&v3c!Gd#O_ZpHB&E1(~ngMQm_ zu6v66*u!&dFa3ty^qb@8M?((gLi)2s`{+mSp>K*FR_PY{&0Fb5Z!-1m=r#7| zN5`VCHrG(XEEEusR%8-=+hqE+iI{D;rL9M2mp*Vx1nZch>9;SS-#(vy{6hNi^o46k z(`wSzIiI{(@J*VVY40L!YS)wIX4;#101cR%Nq_U(EMCGkaZdX_(tbbfRcOK7&9lnf 
zTix7av4iyQoBn2R8viNu*ueZI)8IMu*iNIzc7}D}q~R>MjdXsWG(G`mVCw37j7xC8 z3+{tQAeJ!q;I|iZBBVmzQPK+Og9Bhb^9m1;c872uC+$vC_D)a+Pf-@3a+9s~8pfeIEn=e*6YtFowPbOp|8c`wo3`SOTlyCb$jm zhUcLKTo8bRa2RG?$~kZ~Fx_J5g>V0dET|M&u7izmFFXdjAsdRp3-7=btXhD67p#ZP zz-KcoKZT#eGmrvyD1cI^gj#Te4-UZV@CW!eeD}Mw@!$s70Cz$hWP=B~pbv)O6F3i+ zaoYb7KSlc=5;5B#=`8JkNWq*6Y51i>24*|_U(bIw%c(y{{x4-8#TfP=JVgEn$p4w_ zJvc@Fhs>oeGPmAJ{zDtr6kPvxl=jbYQ#XqI-$DM^XQDj=@JTm-E?9$#Na6h1bWZ#(#5 zJ?=lny#Sc*vs_4dxD*ya0%a`)M&V8PEBpfScFV+H6;l+f;|z_0SaY?UJ&P`w%53aDC2pB-6eWt~(i>)=afvJRBKI_5@b05|yT z|GmEq(rYu8DO`Yx_vWf|IeWPKSMd*8La=Ap}fu+tpAzu zjs6R|eMMWC|JuX+m+OC-|6M0PciMa32f-pW(6WNA7`AaKb9wZiQ!HFZ>1^ z&<1|^06vFlxQVj=0=x_%)R%d15qygMa^zKD;XQ5*WnBa^A)NLf?|$2Q)_)y1~@}U3;QfbmVXs5hALseEXGU<$+QU2B9J6i$ubt^2$&w1J%=NU&luacSP zRg8NXvJ_kk&ok$7p0)1hRX6{3U+X;McIVYd zSP!z3I69!4^SY3&yU+7Ypz~_qbzW^Su>QRIxt1Z~83cdWdFDjUs}I@Be_Bp{Ei1&I z{aOi-IQ_M3r@mIwiLaUe`I`BkuNmVw&-nLwrH7OMlneWM@*kOny`D6vg4`Xnf40*8 ziKYFsm-f$o+CRwuwf#ewicRD{{lqHPF#NCme=FnP&;V}mKqEAP7ktnREzk;W&<-6D zU#oNd*E|R5|3imxfs5{qD4${g0SHKI+*x5tf@^|E`gz3dRhPej5z}yZG5myEL0=K)cKgfSY5$AF2 zvpDB*+}7fDJy>vi55La|UyI*P{^KL;pTf^^`zO>fw>*0ZbsDOeZyx2H+lh2z_d@#* zxi0u4!_dE-#d{7m;O>G|um&y%3w78E!TsNMQ~!^ilP&d}l6F!59U)pon8k1vTm#p` z2gLaYps8hLav+TRy%P%IHTVEdz)5JK9JNxO{@3rjQU-hQ_v0V@4jpCG{O-#I zq~o_>H#Bga?GXRB8M>DL>c(xEndV@J0+@uAT-PIz3va+J=s#fs zzqjEf%)disPkl={kjt|v5Axl)MgfUaNM;pz+2fxuhq>=5XeSLnAq_3$)tSYNJHRW9 z)vQ{|`v%g%0k6S<<&+)hS>_>5J(hb4NFT!VBKzV^J= z^WcyhdQ^E!k1CBc`y)E|PB#W0RcQ)cMVKmea%LC~fH+bR6cO>pe$V>=Bk3!G3z{ z=O_abV&*6wX~oTgIXQfek_c<#oJ1&N@4eE!b5yc#4sD$|s$?I&3Zx@$4!YiR_$J~U z)&|T`4)eF!n2UDIQQ>xTkBB+ShsNn0* z^K%V*!Jsn3+tBH#FtSe$vj0b?ayRB0d8aFl{G)>^AbjD8LG=G}jpALoM#=tMqtrjh z^N;=C(D`sUF47s&yb^O&DEohGx=7XDSG}az(TM$IN%pv1k8}S?vD-9&||> zp$W4Wd|q_)U9A6LysEX}ZM8YxR=e#T`h9QH?|WNaQ}3!9dQQG8KlGk`SAEcrc>o6e z?`jAvo8OZa;d|!f3Ua0K4fYub9hKQe8oN_zeV=!;{N046Wi#=|B1d* zExK|!m|5If_&MJrtg0~HY$!HT$-CwO_6L2R=UqA1$33)cBz>3Aw&4Fek^ht%0_%!jF<#9%+3O~V2de^XSKk=r7g>rh!73$1p 
[base85 GIT binary patch payload omitted: fineweb_8192_bpe.model, 370917 bytes]

literal 0
HcmV?d00001

diff --git a/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/fineweb_8192_bpe.vocab b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/fineweb_8192_bpe.vocab
new file mode 100644
index 0000000000..6e194bf03c
--- /dev/null
+++ b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/fineweb_8192_bpe.vocab
@@ -0,0 +1,8192 @@
+	0
+	0
+	0
+	0
+<0x00>	0
+<0x01>	0
+<0x02>	0
+<0x03>	0
+<0x04>	0
+<0x05>	0
+<0x06>	0
+<0x07>	0
+<0x08>	0
+<0x09>	0
+<0x0A>	0
+<0x0B>	0
+<0x0C>	0
+<0x0D>	0
+<0x0E>	0
+<0x0F>	0
+<0x10>	0
+<0x11>	0
+<0x12>	0
+<0x13>	0
+<0x14>	0
+<0x15>	0
+<0x16>	0
+<0x17>	0
+<0x18>	0
+<0x19>	0
+<0x1A>	0
+<0x1B>	0
+<0x1C>	0
+<0x1D>	0
+<0x1E>	0
+<0x1F>	0
+<0x20>	0
+<0x21>	0
+<0x22>	0
+<0x23>	0
+<0x24>	0
+<0x25>	0
+<0x26>	0
+<0x27>	0
+<0x28>	0
+<0x29>	0
+<0x2A>	0
+<0x2B>	0
+<0x2C>	0
+<0x2D>	0
+<0x2E>	0
+<0x2F>	0
+<0x30>	0
+<0x31>	0
+<0x32>	0
+<0x33>	0
+<0x34>	0
+<0x35>	0
+<0x36>	0
+<0x37>	0
+<0x38>	0
+<0x39>	0
+<0x3A>	0
+<0x3B>	0
+<0x3C>	0
+<0x3D>	0
+<0x3E>	0
+<0x3F>	0
+<0x40>	0
+<0x41>	0
+<0x42>	0
+<0x43>	0
+<0x44>	0
+<0x45>	0
+<0x46>	0
+<0x47>	0
+<0x48>	0
+<0x49>	0
+<0x4A> 0 +<0x4B> 0 +<0x4C> 0 +<0x4D> 0 +<0x4E> 0 +<0x4F> 0 +<0x50> 0 +<0x51> 0 +<0x52> 0 +<0x53> 0 +<0x54> 0 +<0x55> 0 +<0x56> 0 +<0x57> 0 +<0x58> 0 +<0x59> 0 +<0x5A> 0 +<0x5B> 0 +<0x5C> 0 +<0x5D> 0 +<0x5E> 0 +<0x5F> 0 +<0x60> 0 +<0x61> 0 +<0x62> 0 +<0x63> 0 +<0x64> 0 +<0x65> 0 +<0x66> 0 +<0x67> 0 +<0x68> 0 +<0x69> 0 +<0x6A> 0 +<0x6B> 0 +<0x6C> 0 +<0x6D> 0 +<0x6E> 0 +<0x6F> 0 +<0x70> 0 +<0x71> 0 +<0x72> 0 +<0x73> 0 +<0x74> 0 +<0x75> 0 +<0x76> 0 +<0x77> 0 +<0x78> 0 +<0x79> 0 +<0x7A> 0 +<0x7B> 0 +<0x7C> 0 +<0x7D> 0 +<0x7E> 0 +<0x7F> 0 +<0x80> 0 +<0x81> 0 +<0x82> 0 +<0x83> 0 +<0x84> 0 +<0x85> 0 +<0x86> 0 +<0x87> 0 +<0x88> 0 +<0x89> 0 +<0x8A> 0 +<0x8B> 0 +<0x8C> 0 +<0x8D> 0 +<0x8E> 0 +<0x8F> 0 +<0x90> 0 +<0x91> 0 +<0x92> 0 +<0x93> 0 +<0x94> 0 +<0x95> 0 +<0x96> 0 +<0x97> 0 +<0x98> 0 +<0x99> 0 +<0x9A> 0 +<0x9B> 0 +<0x9C> 0 +<0x9D> 0 +<0x9E> 0 +<0x9F> 0 +<0xA0> 0 +<0xA1> 0 +<0xA2> 0 +<0xA3> 0 +<0xA4> 0 +<0xA5> 0 +<0xA6> 0 +<0xA7> 0 +<0xA8> 0 +<0xA9> 0 +<0xAA> 0 +<0xAB> 0 +<0xAC> 0 +<0xAD> 0 +<0xAE> 0 +<0xAF> 0 +<0xB0> 0 +<0xB1> 0 +<0xB2> 0 +<0xB3> 0 +<0xB4> 0 +<0xB5> 0 +<0xB6> 0 +<0xB7> 0 +<0xB8> 0 +<0xB9> 0 +<0xBA> 0 +<0xBB> 0 +<0xBC> 0 +<0xBD> 0 +<0xBE> 0 +<0xBF> 0 +<0xC0> 0 +<0xC1> 0 +<0xC2> 0 +<0xC3> 0 +<0xC4> 0 +<0xC5> 0 +<0xC6> 0 +<0xC7> 0 +<0xC8> 0 +<0xC9> 0 +<0xCA> 0 +<0xCB> 0 +<0xCC> 0 +<0xCD> 0 +<0xCE> 0 +<0xCF> 0 +<0xD0> 0 +<0xD1> 0 +<0xD2> 0 +<0xD3> 0 +<0xD4> 0 +<0xD5> 0 +<0xD6> 0 +<0xD7> 0 +<0xD8> 0 +<0xD9> 0 +<0xDA> 0 +<0xDB> 0 +<0xDC> 0 +<0xDD> 0 +<0xDE> 0 +<0xDF> 0 +<0xE0> 0 +<0xE1> 0 +<0xE2> 0 +<0xE3> 0 +<0xE4> 0 +<0xE5> 0 +<0xE6> 0 +<0xE7> 0 +<0xE8> 0 +<0xE9> 0 +<0xEA> 0 +<0xEB> 0 +<0xEC> 0 +<0xED> 0 +<0xEE> 0 +<0xEF> 0 +<0xF0> 0 +<0xF1> 0 +<0xF2> 0 +<0xF3> 0 +<0xF4> 0 +<0xF5> 0 +<0xF6> 0 +<0xF7> 0 +<0xF8> 0 +<0xF9> 0 +<0xFA> 0 +<0xFB> 0 +<0xFC> 0 +<0xFD> 0 +<0xFE> 0 +<0xFF> 0 +▁t -0 +▁a -1 +in -2 +he -3 +re -4 +on -5 +er -6 +▁the -7 +▁s -8 +▁w -9 +or -10 +at -11 +nd -12 +ou -13 +▁c -14 +it -15 +es -16 +▁f -17 +is -18 +ing -19 +en -20 +▁b -21 +▁p -22 
+▁o -23 +an -24 +ed -25 +▁to -26 +al -27 +▁m -28 +ar -29 +▁and -30 +▁in -31 +▁of -32 +▁d -33 +le -34 +ic -35 +as -36 +▁h -37 +om -38 +ion -39 +▁th -40 +il -41 +▁T -42 +▁l -43 +ent -44 +ve -45 +▁I -46 +ro -47 +st -48 +▁y -49 +▁e -50 +▁re -51 +▁n -52 +▁S -53 +▁g -54 +et -55 +ct -56 +▁A -57 +▁C -58 +▁you -59 +ly -60 +ay -61 +id -62 +▁for -63 +▁on -64 +▁is -65 +ot -66 +▁be -67 +ow -68 +ol -69 +am -70 +ac -71 +ig -72 +us -73 +ad -74 +el -75 +▁M -76 +im -77 +ver -78 +ith -79 +ut -80 +▁st -81 +▁P -82 +ation -83 +▁with -84 +ur -85 +▁B -86 +▁that -87 +ir -88 +▁W -89 +ch -90 +▁he -91 +▁it -92 +▁The -93 +ce -94 +ill -95 +ers -96 +un -97 +▁al -98 +▁D -99 +ul -100 +▁an -101 +▁H -102 +▁F -103 +out -104 +ra -105 +ke -106 +▁pro -107 +▁wh -108 +▁as -109 +▁are -110 +se -111 +ter -112 +▁we -113 +▁ha -114 +▁R -115 +oo -116 +if -117 +ge -118 +our -119 +pp -120 +▁at -121 +ate -122 +ess -123 +▁com -124 +▁or -125 +▁con -126 +▁L -127 +her -128 +ore -129 +est -130 +▁fr -131 +ment -132 +igh -133 +▁- -134 +ab -135 +▁N -136 +▁se -137 +▁ne -138 +ld -139 +ort -140 +▁G -141 +▁E -142 +ri -143 +ist -144 +▁( -145 +▁your -146 +op -147 +▁O -148 +▁ex -149 +em -150 +ure -151 +ity -152 +▁r -153 +ant -154 +qu -155 +▁v -156 +▁was -157 +art -158 +ust -159 +▁have -160 +ive -161 +um -162 +▁this -163 +▁from -164 +pe -165 +▁de -166 +oc -167 +▁sh -168 +th -169 +ain -170 +up -171 +ies -172 +▁will -173 +▁by -174 +ight -175 +▁ch -176 +and -177 +os -178 +▁can -179 +ie -180 +nt -181 +all -182 +▁us -183 +ome -184 +▁not -185 +ard -186 +ud -187 +▁le -188 +res -189 +▁J -190 +ast -191 +.. 
-192 +ost -193 +▁pl -194 +ear -195 +▁ab -196 +ack -197 +▁su -198 +iv -199 +▁wor -200 +gh -201 +▁all -202 +rou -203 +ide -204 +ould -205 +▁j -206 +ell -207 +ial -208 +te -209 +ak -210 +ine -211 +od -212 +ag -213 +are -214 +▁has -215 +ice -216 +▁U -217 +▁Th -218 +▁do -219 +age -220 +▁k -221 +ook -222 +fe -223 +▁ad -224 +▁me -225 +ip -226 +▁In -227 +▁comp -228 +▁but -229 +▁up -230 +▁out -231 +ake -232 +per -233 +red -234 +▁whe -235 +ions -236 +ally -237 +pt -238 +ry -239 +og -240 +one -241 +▁more -242 +ail -243 +able -244 +ind -245 +▁my -246 +ite -247 +▁our -248 +ther -249 +▁en -250 +▁“ -251 +very -252 +▁Y -253 +▁sa -254 +▁so -255 +ich -256 +ime -257 +cc -258 +▁cl -259 +ong -260 +▁their -261 +▁K -262 +ated -263 +ood -264 +ame -265 +orm -266 +▁St -267 +▁they -268 +▁one -269 +▁te -270 +ber -271 +ace -272 +ike -273 +iz -274 +▁about -275 +so -276 +ous -277 +du -278 +ick -279 +ase -280 +ans -281 +▁" -282 +▁V -283 +pl -284 +▁cont -285 +act -286 +ia -287 +▁im -288 +▁work -289 +▁un -290 +▁who -291 +ree -292 +cl -293 +ire -294 +▁fe -295 +ign -296 +▁off -297 +▁his -298 +▁man -299 +ue -300 +ff -301 +ance -302 +▁go -303 +ll -304 +ach -305 +▁year -306 +▁new -307 +▁tr -308 +ays -309 +ne -310 +reat -311 +▁It -312 +ction -313 +ub -314 +ib -315 +ult -316 +▁app -317 +erv -318 +und -319 +▁We -320 +ap -321 +▁Ch -322 +ass -323 +▁qu -324 +ep -325 +▁res -326 +ary -327 +ark -328 +▁sp -329 +▁per -330 +ations -331 +ile -332 +ove -333 +form -334 +▁int -335 +▁get -336 +▁also -337 +▁time -338 +▁which -339 +ount -340 +ven -341 +▁like -342 +own -343 +▁other -344 +ents -345 +▁some -346 +ond -347 +ord -348 +▁any -349 +ings -350 +vel -351 +av -352 +▁been -353 +ical -354 +▁over -355 +▁part -356 +ress -357 +▁This -358 +▁dis -359 +ks -360 +▁He -361 +ors -362 +ence -363 +▁said -364 +▁sc -365 +▁rec -366 +▁ar -367 +ition -368 +▁them -369 +▁ag -370 +▁when -371 +▁pe -372 +ild -373 +port -374 +▁her -375 +ound -376 +ough -377 +▁kn -378 +ose -379 +ob -380 +irst -381 +low -382 +▁just -383 +mer -384 +int -385 +▁ro 
-386 +ov -387 +ck -388 +ish -389 +▁what -390 +oy -391 +▁pr -392 +ru -393 +▁spe -394 +▁pre -395 +▁there -396 +ens -397 +wn -398 +▁acc -399 +day -400 +▁if -401 +ren -402 +▁than -403 +▁would -404 +▁need -405 +▁Re -406 +▁had -407 +vers -408 +▁its -409 +▁were -410 +ink -411 +fter -412 +ning -413 +▁am -414 +ater -415 +... -416 +▁des -417 +old -418 +itt -419 +clud -420 +ade -421 +rough -422 +▁tw -423 +▁into -424 +lp -425 +ory -426 +use -427 +ople -428 +ool -429 +ang -430 +▁first -431 +▁how -432 +▁bec -433 +▁help -434 +lic -435 +hed -436 +ons -437 +▁add -438 +anc -439 +ft -440 +▁make -441 +amp -442 +gr -443 +▁bl -444 +▁look -445 +▁– -446 +▁Wh -447 +▁prov -448 +▁col -449 +▁includ -450 +▁people -451 +▁comm -452 +▁produ -453 +▁You -454 +▁Ne -455 +ual -456 +▁know -457 +ful -458 +▁she -459 +ian -460 +ments -461 +ates -462 +iew -463 +round -464 +▁em -465 +▁every -466 +▁back -467 +▁only -468 +▁serv -469 +tern -470 +les -471 +ious -472 +▁no -473 +▁may -474 +rent -475 +▁through -476 +▁bu -477 +ict -478 +▁most -479 +cts -480 +ating -481 +▁see -482 +▁want -483 +▁two -484 +▁ph -485 +com -486 +pport -487 +▁As -488 +xt -489 +we -490 +ities -491 +ices -492 +iss -493 +▁use -494 +▁well -495 +ont -496 +▁bet -497 +▁after -498 +▁If -499 +ise -500 +hing -501 +▁ind -502 +ause -503 +▁play -504 +▁Se -505 +ph -506 +▁und -507 +je -508 +▁& -509 +▁co -510 +ife -511 +▁| -512 +ock -513 +ily -514 +▁stud -515 +lect -516 +row -517 +▁act -518 +ting -519 +iness -520 +▁fl -521 +hen -522 +▁years -523 +▁Com -524 +▁Un -525 +urn -526 +ts -527 +▁$ -528 +enc -529 +aw -530 +▁these -531 +▁tra -532 +▁An -533 +fore -534 +▁cons -535 +▁under -536 +als -537 +cial -538 +ange -539 +▁exper -540 +bs -541 +aking -542 +▁ke -543 +oth -544 +▁now -545 +ures -546 +ational -547 +▁very -548 +▁Pro -549 +▁wee -550 +▁bus -551 +▁good -552 +▁gu -553 +ased -554 +vent -555 +▁And -556 +formation -557 +▁many -558 +▁sm -559 +get -560 +▁way -561 +any -562 +▁reg -563 +erson -564 +oint -565 +ific -566 +ward -567 +▁De -568 +ert -569 +ility -570 
+▁start -571 +▁fin -572 +▁dif -573 +▁could -574 +rit -575 +lease -576 +▁great -577 +▁imp -578 +ork -579 +uch -580 +▁day -581 +fect -582 +▁rem -583 +▁Sh -584 +yst -585 +▁rel -586 +ience -587 +ible -588 +▁even -589 +▁For -590 +uring -591 +ty -592 +▁show -593 +▁high -594 +oss -595 +ics -596 +▁sec -597 +ull -598 +▁own -599 +nds -600 +velop -601 +▁inv -602 +▁where -603 +▁here -604 +▁don -605 +▁inc -606 +▁down -607 +). -608 +▁ent -609 +ident -610 +hes -611 +olog -612 +cess -613 +▁loc -614 +arch -615 +▁right -616 +ble -617 +▁then -618 +chool -619 +▁home -620 +▁should -621 +▁Al -622 +▁New -623 +elf -624 +alth -625 +The -626 +▁ass -627 +ied -628 +▁br -629 +its -630 +ited -631 +▁find -632 +ath -633 +air -634 +ular -635 +▁read -636 +▁too -637 +▁ac -638 +hip -639 +▁av -640 +▁set -641 +ix -642 +▁car -643 +▁fam -644 +ner -645 +▁information -646 +▁mon -647 +gan -648 +line -649 +▁best -650 +▁last -651 +ys -652 +▁min -653 +gram -654 +▁take -655 +io -656 +▁design -657 +▁Cl -658 +pect -659 +ract -660 +▁long -661 +ason -662 +▁did -663 +▁inst -664 +▁much -665 +omet -666 +▁che -667 +|| -668 +erm -669 +▁Be -670 +▁business -671 +ystem -672 +▁because -673 +▁before -674 +other -675 +ank -676 +▁dec -677 +ues -678 +▁But -679 +▁att -680 +▁ins -681 +▁Fr -682 +.” -683 +▁made -684 +▁team -685 +ative -686 +▁call -687 +▁Le -688 +▁him -689 +pr -690 +▁sur -691 +pen -692 +atch -693 +▁cre -694 +rib -695 +me -696 +▁think -697 +ject -698 +ollow -699 +az -700 +▁again -701 +▁world -702 +way -703 +ax -704 +ale -705 +ug -706 +▁Ad -707 +▁art -708 +▁mem -709 +▁does -710 +alk -711 +), -712 +▁vis -713 +arket -714 +▁being -715 +▁pres -716 +ave -717 +▁develop -718 +▁person -719 +oun -720 +▁requ -721 +arn -722 +ustom -723 +ower -724 +chn -725 +rest -726 +▁inte -727 +arm -728 +ient -729 +▁life -730 +▁those -731 +ener -732 +▁diffe -733 +▁such -734 +ins -735 +▁med -736 +ng -737 +ivers -738 +ince -739 +ouse -740 +▁support -741 +ving -742 +▁while -743 +ash -744 +irect -745 +▁Ar -746 +▁pol -747 +view -748 +land -749 +▁sk 
-750 +▁provid -751 +ss -752 +unity -753 +ier -754 +▁lead -755 +▁ra -756 +▁Te -757 +▁each -758 +▁around -759 +▁book -760 +der -761 +▁love -762 +▁free -763 +▁used -764 +ced -765 +akes -766 +▁care -767 +▁end -768 +read -769 +▁mod -770 +ailable -771 +▁ser -772 +▁comple -773 +▁post -774 +▁run -775 +▁gr -776 +ather -777 +▁disc -778 +▁sim -779 +ric -780 +▁program -781 +ality -782 +▁ret -783 +▁pub -784 +ces -785 +ional -786 +ages -787 +ually -788 +▁bo -789 +▁cur -790 +▁ed -791 +ines -792 +imes -793 +ton -794 +ives -795 +▁All -796 +▁det -797 +▁really -798 +roup -799 +ple -800 +oad -801 +ars -802 +▁eas -803 +ets -804 +▁On -805 +▁child -806 +▁system -807 +▁There -808 +▁So -809 +▁num -810 +iel -811 +au -812 +ize -813 +▁follow -814 +▁trans -815 +." -816 +led -817 +ene -818 +▁count -819 +▁going -820 +▁found -821 +,” -822 +▁top -823 +ah -824 +▁form -825 +▁char -826 +▁somet -827 +iet -828 +▁three -829 +ittle -830 +▁inter -831 +▁list -832 +▁cour -833 +ames -834 +man -835 +▁still -836 +▁Bl -837 +▁fun -838 +▁How -839 +▁month -840 +▁available -841 +▁place -842 +▁del -843 +ature -844 +▁Pl -845 +▁custom -846 +ute -847 +ness -848 +▁though -849 +▁They -850 +▁feel -851 +ways -852 +▁prof -853 +▁cle -854 +▁both -855 +▁To -856 +▁few -857 +▁sub -858 +cept -859 +▁aut -860 +orn -861 +meric -862 +▁str -863 +▁happ -864 +▁week -865 +▁sign -866 +▁open -867 +▁hand -868 +ved -869 +▁gl -870 +▁pur -871 +▁say -872 +uc -873 +▁report -874 +▁health -875 +▁game -876 +▁adv -877 +att -878 +▁rep -879 +▁market -880 +ital -881 +▁different -882 +oot -883 +ired -884 +orth -885 +▁frie -886 +bers -887 +▁keep -888 +▁same -889 +ering -890 +tt -891 +▁lot -892 +▁Ex -893 +▁She -894 +▁point -895 +▁Col -896 +ween -897 +▁techn -898 +▁family -899 +▁ev -900 +▁i -901 +ology -902 +▁exp -903 +iqu -904 +▁ext -905 +▁school -906 +ining -907 +▁little -908 +▁using -909 +," -910 +▁process -911 +ished -912 +atur -913 +▁company -914 +▁lar -915 +ata -916 +▁including -917 +▁Sc -918 +ross -919 +iving -920 +oh -921 +ants -922 +▁next -923 
+▁plan -924 +▁win -925 +▁Americ -926 +ott -927 +▁fil -928 +▁real -929 +▁during -930 +▁Tr -931 +▁between -932 +thing -933 +ized -934 +▁water -935 +ger -936 +▁sol -937 +▁Ph -938 +▁import -939 +▁Q -940 +ody -941 +cent -942 +▁state -943 +▁What -944 +gg -945 +ield -946 +▁things -947 +ik -948 +ves -949 +▁met -950 +arly -951 +els -952 +▁come -953 +aut -954 +ists -955 +be -956 +▁allow -957 +▁big -958 +less -959 +aint -960 +reen -961 +▁mus -962 +▁put -963 +▁contin -964 +uss -965 +▁Or -966 +▁rece -967 +▁experience -968 +ware -969 +▁service -970 +▁opt -971 +▁build -972 +cer -973 +self -974 +▁small -975 +▁dri -976 +▁days -977 +▁appro -978 +ined -979 +iversity -980 +ex -981 +▁organ -982 +▁full -983 +ling -984 +▁since -985 +▁cent -986 +▁always -987 +▁rest -988 +▁try -989 +▁phot -990 +▁better -991 +▁cr -992 +▁sure -993 +▁When -994 +ution -995 +▁pat -996 +▁online -997 +▁pri -998 +▁quest -999 +▁ref -1000 +▁Ind -1001 +▁second -1002 +▁pass -1003 +▁something -1004 +▁var -1005 +illion -1006 +▁bel -1007 +▁interest -1008 +rand -1009 +ever -1010 +over -1011 +▁iss -1012 +▁partic -1013 +▁class -1014 +▁poss -1015 +▁gener -1016 +▁def -1017 +▁group -1018 +▁tri -1019 +▁mov -1020 +ffect -1021 +▁perform -1022 +▁hard -1023 +▁direct -1024 +▁Z -1025 +▁pay -1026 +pping -1027 +ours -1028 +▁With -1029 +▁result -1030 +▁bro -1031 +▁today -1032 +▁head -1033 +▁special -1034 +gy -1035 +▁— -1036 +▁sl -1037 +ps -1038 +▁ty -1039 +▁ve -1040 +ploy -1041 +ER -1042 +▁At -1043 +joy -1044 +▁stand -1045 +ms -1046 +work -1047 +ared -1048 +outh -1049 +▁another -1050 +▁ide -1051 +▁give -1052 +br -1053 +▁ann -1054 +▁Con -1055 +▁wom -1056 +▁provide -1057 +uck -1058 +▁got -1059 +▁cor -1060 +ccess -1061 +ior -1062 +▁Chr -1063 +ote -1064 +oor -1065 +▁Res -1066 +oney -1067 +▁meet -1068 +▁students -1069 +▁resp -1070 +istr -1071 +▁current -1072 +ense -1073 +ately -1074 +▁wr -1075 +▁without -1076 +ision -1077 +▁conf -1078 +▁Our -1079 +ients -1080 +rence -1081 +ok -1082 +ium -1083 +▁old -1084 +▁area -1085 +ley -1086 +ope -1087 
+ards -1088 +▁number -1089 +▁four -1090 +▁bre -1091 +▁cost -1092 +aj -1093 +ems -1094 +ered -1095 +▁able -1096 +ically -1097 +▁soc -1098 +▁val -1099 +▁Sp -1100 +▁invest -1101 +▁must -1102 +con -1103 +▁access -1104 +▁services -1105 +▁unt -1106 +raph -1107 +ats -1108 +ird -1109 +▁ask -1110 +▁working -1111 +▁never -1112 +▁US -1113 +▁Cent -1114 +iver -1115 +▁No -1116 +stand -1117 +ww -1118 +▁webs -1119 +▁proble -1120 +▁public -1121 +▁vide -1122 +ission -1123 +▁visit -1124 +▁important -1125 +ann -1126 +▁light -1127 +pped -1128 +▁fact -1129 +let -1130 +▁sal -1131 +▁level -1132 +▁order -1133 +▁fac -1134 +ged -1135 +▁Comm -1136 +▁My -1137 +▁test -1138 +▁might -1139 +▁exc -1140 +ral -1141 +▁rese -1142 +▁product -1143 +▁local -1144 +▁night -1145 +▁season -1146 +inal -1147 +▁el -1148 +▁incre -1149 +ember -1150 +▁site -1151 +rol -1152 +▁That -1153 +▁sing -1154 +ruct -1155 +ample -1156 +▁expl -1157 +▁Mar -1158 +▁spec -1159 +▁grow -1160 +▁let -1161 +▁ca -1162 +▁proper -1163 +▁less -1164 +ording -1165 +▁enjoy -1166 +▁ob -1167 +▁past -1168 +▁event -1169 +▁products -1170 +▁Man -1171 +▁' -1172 +▁inf -1173 +▁May -1174 +▁looking -1175 +▁food -1176 +here -1177 +lection -1178 +▁within -1179 +▁profess -1180 +▁Fe -1181 +▁Is -1182 +▁data -1183 +▁making -1184 +▁pop -1185 +ertain -1186 +▁until -1187 +ases -1188 +ories -1189 +ffic -1190 +enn -1191 +ency -1192 +▁children -1193 +ently -1194 +▁University -1195 +We -1196 +gin -1197 +sh -1198 +▁job -1199 +▁offer -1200 +▁law -1201 +ery -1202 +ains -1203 +ney -1204 +urs -1205 +▁pos -1206 +eng -1207 +utes -1208 +▁power -1209 +▁view -1210 +▁turn -1211 +▁eng -1212 +▁email -1213 +ential -1214 +tend -1215 +▁oper -1216 +▁sit -1217 +▁check -1218 +▁against -1219 +ieve -1220 +▁est -1221 +▁Pr -1222 +ream -1223 +ised -1224 +▁Br -1225 +ina -1226 +▁prote -1227 +ids -1228 +ode -1229 +▁room -1230 +▁contact -1231 +IN -1232 +▁community -1233 +med -1234 +to -1235 +▁addition -1236 +▁prom -1237 +▁says -1238 +▁intern -1239 +load -1240 +▁toget -1241 +▁together -1242 +▁Fl 
-1243 +▁away -1244 +ivid -1245 +▁impro -1246 +▁quality -1247 +▁leg -1248 +ator -1249 +▁dist -1250 +▁creat -1251 +ills -1252 +irl -1253 +hor -1254 +▁indust -1255 +▁complete -1256 +▁news -1257 +aring -1258 +iron -1259 +ique -1260 +ret -1261 +▁App -1262 +icle -1263 +iday -1264 +agement -1265 +ified -1266 +oci -1267 +▁supp -1268 +osed -1269 +ability -1270 +▁project -1271 +▁website -1272 +▁Car -1273 +iety -1274 +ane -1275 +por -1276 +!! -1277 +▁change -1278 +co -1279 +▁success -1280 +▁dep -1281 +bo -1282 +▁learn -1283 +▁include -1284 +▁Co -1285 +pend -1286 +▁fav -1287 +▁chang -1288 +ym -1289 +▁Ste -1290 +▁detail -1291 +ism -1292 +▁offic -1293 +▁Can -1294 +▁members -1295 +▁dr -1296 +arent -1297 +son -1298 +▁buy -1299 +▁easy -1300 +▁please -1301 +rap -1302 +▁Me -1303 +aster -1304 +▁applic -1305 +ising -1306 +ury -1307 +▁name -1308 +▁pract -1309 +▁times -1310 +atures -1311 +▁along -1312 +▁equ -1313 +▁present -1314 +▁One -1315 +▁large -1316 +▁money -1317 +▁beaut -1318 +atter -1319 +augh -1320 +▁Am -1321 +aterial -1322 +the -1323 +▁Cont -1324 +iting -1325 +▁activ -1326 +vern -1327 +RE -1328 +▁employ -1329 +▁la -1330 +aff -1331 +une -1332 +▁house -1333 +ready -1334 +Th -1335 +▁course -1336 +▁expect -1337 +▁. -1338 +▁needs -1339 +ored -1340 +▁air -1341 +▁left -1342 +▁Christ -1343 +▁thing -1344 +itions -1345 +ift -1346 +sc -1347 +ably -1348 +▁cap -1349 +ider -1350 +ived -1351 +lish -1352 +▁music -1353 +▁dra -1354 +min -1355 +▁why -1356 +▁En -1357 +yle -1358 +ohn -1359 +ump -1360 +ify -1361 +▁hist -1362 +ec -1363 +ron -1364 +by -1365 +▁bas -1366 +ern -1367 +▁hum -1368 +▁video -1369 +rie -1370 +▁sw -1371 +▁account -1372 +ON -1373 +ffe -1374 +alf -1375 +ocus -1376 +veral -1377 +▁below -1378 +▁soft -1379 +▁hot -1380 +▁These -1381 +▁short -1382 +ries -1383 +▁Eng -1384 +▁line -1385 +▁live -1386 +pecial -1387 +▁opport -1388 +enef -1389 +▁create -1390 +book -1391 +▁cond -1392 +▁beh -1393 +▁... 
-1394 +▁perfect -1395 +uly -1396 +▁ce -1397 +▁page -1398 +▁word -1399 +▁/ -1400 +▁writ -1401 +AT -1402 +▁dem -1403 +ots -1404 +▁Med -1405 +▁mar -1406 +▁Please -1407 +fort -1408 +side -1409 +ows -1410 +mber -1411 +▁govern -1412 +▁pa -1413 +artment -1414 +▁already -1415 +▁Che -1416 +▁kind -1417 +▁After -1418 +▁enough -1419 +▁ever -1420 +▁research -1421 +ured -1422 +▁makes -1423 +▁following -1424 +▁million -1425 +▁Do -1426 +▁review -1427 +▁getting -1428 +▁dev -1429 +ten -1430 +itive -1431 +ush -1432 +▁friends -1433 +▁cut -1434 +▁conne -1435 +▁trad -1436 +ee -1437 +., -1438 +▁record -1439 +room -1440 +▁treat -1441 +▁side -1442 +▁const -1443 +vious -1444 +▁Ass -1445 +▁case -1446 +▁having -1447 +ajor -1448 +▁tell -1449 +▁Count -1450 +▁personal -1451 +▁move -1452 +▁based -1453 +▁story -1454 +viron -1455 +ention -1456 +▁John -1457 +rop -1458 +▁Your -1459 +▁Serv -1460 +▁won -1461 +unch -1462 +ips -1463 +▁Des -1464 +▁minutes -1465 +uper -1466 +▁become -1467 +uture -1468 +▁possible -1469 +osp -1470 +oice -1471 +iam -1472 +▁talk -1473 +▁city -1474 +ights -1475 +▁across -1476 +▁vers -1477 +▁share -1478 +ization -1479 +▁done -1480 +▁bit -1481 +▁camp -1482 +▁pack -1483 +▁didn -1484 +▁comes -1485 +▁men -1486 +▁understand -1487 +ead -1488 +▁several -1489 +▁-- -1490 +yn -1491 +▁: -1492 +▁country -1493 +▁Tw -1494 +▁hours -1495 +▁effect -1496 +▁cou -1497 +▁purch -1498 +iven -1499 +▁benef -1500 +ES -1501 +▁mil -1502 +▁women -1503 +uff -1504 +▁net -1505 +ividual -1506 +app -1507 +aces -1508 +▁percent -1509 +▁Comp -1510 +▁educ -1511 +wards -1512 +▁focus -1513 +▁often -1514 +▁material -1515 +ball -1516 +▁social -1517 +aim -1518 +▁elect -1519 +▁Wor -1520 +idd -1521 +ances -1522 +ination -1523 +uro -1524 +ides -1525 +ober -1526 +▁quick -1527 +▁Not -1528 +▁development -1529 +▁es -1530 +▁bring -1531 +▁return -1532 +orts -1533 +▁American -1534 +ister -1535 +ienc -1536 +▁doing -1537 +▁Bro -1538 +▁School -1539 +ript -1540 +▁pie -1541 +▁X -1542 +▁far -1543 +▁hold -1544 +arl -1545 +▁mult -1546 
+ted -1547 +▁body -1548 +arr -1549 +err -1550 +▁Gr -1551 +of -1552 +mend -1553 +▁pot -1554 +ference -1555 +iful -1556 +ones -1557 +AN -1558 +▁wa -1559 +ners -1560 +▁fund -1561 +▁took -1562 +ograph -1563 +▁Here -1564 +▁tre -1565 +ource -1566 +lished -1567 +▁blog -1568 +oose -1569 +itc -1570 +AR -1571 +▁State -1572 +▁doesn -1573 +reet -1574 +conom -1575 +▁jo -1576 +vironment -1577 +▁deal -1578 +lement -1579 +▁others -1580 +▁City -1581 +▁Rep -1582 +▁came -1583 +▁called -1584 +▁started -1585 +▁sum -1586 +▁rele -1587 +org -1588 +▁Inst -1589 +nder -1590 +▁least -1591 +▁months -1592 +▁Intern -1593 +▁space -1594 +acy -1595 +▁Gu -1596 +▁mom -1597 +▁future -1598 +▁orig -1599 +▁compet -1600 +▁individual -1601 +oon -1602 +lege -1603 +▁went -1604 +▁occ -1605 +▁yet -1606 +▁young -1607 +rodu -1608 +▁clean -1609 +▁non -1610 +▁mind -1611 +▁told -1612 +ai -1613 +▁five -1614 +▁early -1615 +▁series -1616 +▁control -1617 +af -1618 +utions -1619 +▁term -1620 +▁major -1621 +oll -1622 +hers -1623 +ille -1624 +ape -1625 +▁games -1626 +ained -1627 +▁comb -1628 +▁means -1629 +▁pict -1630 +▁industry -1631 +▁chall -1632 +yl -1633 +▁tool -1634 +anks -1635 +▁Min -1636 +▁ens -1637 +▁lim -1638 +▁cover -1639 +ctor -1640 +▁fore -1641 +▁ago -1642 +AS -1643 +▁low -1644 +sw -1645 +▁key -1646 +fer -1647 +ama -1648 +▁x -1649 +▁heart -1650 +▁features -1651 +▁Ed -1652 +ilt -1653 +▁tem -1654 +rew -1655 +▁price -1656 +unic -1657 +▁store -1658 +fact -1659 +jects -1660 +▁offers -1661 +▁Ab -1662 +itor -1663 +back -1664 +▁once -1665 +▁specific -1666 +come -1667 +▁range -1668 +▁thought -1669 +ges -1670 +urity -1671 +ither -1672 +ateg -1673 +▁Bo -1674 +▁Jan -1675 +sel -1676 +▁pick -1677 +illed -1678 +▁Now -1679 +eral -1680 +▁God -1681 +▁Dr -1682 +▁favor -1683 +▁appear -1684 +year -1685 +▁More -1686 +▁York -1687 +ilities -1688 +▁Ke -1689 +▁Im -1690 +▁hope -1691 +▁redu -1692 +▁discuss -1693 +OR -1694 +ibr -1695 +▁happen -1696 +▁require -1697 +yr -1698 +▁Pe -1699 +▁However -1700 +atic -1701 +It -1702 +▁mean -1703 
+▁single -1704 +nes -1705 +▁step -1706 +▁close -1707 +▁upd -1708 +▁land -1709 +▁break -1710 +▁ey -1711 +▁main -1712 +▁invol -1713 +most -1714 +anies -1715 +▁Pres -1716 +ourn -1717 +▁stay -1718 +▁government -1719 +▁Em -1720 +isk -1721 +isc -1722 +// -1723 +▁Sm -1724 +ony -1725 +▁field -1726 +de -1727 +▁priv -1728 +▁United -1729 +▁beautiful -1730 +resh -1731 +cle -1732 +▁Per -1733 +▁friend -1734 +▁everything -1735 +▁Qu -1736 +▁walk -1737 +ched -1738 +▁questions -1739 +▁added -1740 +▁hig -1741 +▁Cal -1742 +▁tax -1743 +aken -1744 +▁customers -1745 +▁strong -1746 +now -1747 +▁taking -1748 +▁install -1749 +for -1750 +:// -1751 +aps -1752 +ging -1753 +▁Pol -1754 +▁charact -1755 +▁wond -1756 +▁South -1757 +▁begin -1758 +▁study -1759 +ources -1760 +▁North -1761 +▁Just -1762 +▁announ -1763 +ief -1764 +ensive -1765 +▁miss -1766 +▁recom -1767 +▁travel -1768 +▁certain -1769 +▁Park -1770 +▁address -1771 +▁problem -1772 +▁By -1773 +▁County -1774 +▁actually -1775 +play -1776 +▁staff -1777 +▁tot -1778 +▁half -1779 +▁mess -1780 +▁z -1781 +aur -1782 +ew -1783 +inc -1784 +ians -1785 +▁search -1786 +▁technology -1787 +▁girl -1788 +▁media -1789 +urther -1790 +time -1791 +▁watch -1792 +▁typ -1793 +▁known -1794 +▁official -1795 +▁manag -1796 +▁National -1797 +▁six -1798 +irm -1799 +▁Pre -1800 +▁wind -1801 +▁enc -1802 +gle -1803 +atural -1804 +ural -1805 +▁front -1806 +ublic -1807 +▁Add -1808 +▁sound -1809 +▁improve -1810 +▁Post -1811 +wh -1812 +▁dig -1813 +irt -1814 +▁lat -1815 +▁content -1816 +▁Su -1817 +▁Stud -1818 +▁anal -1819 +▁track -1820 +itted -1821 +▁Mc -1822 +▁face -1823 +▁training -1824 +▁link -1825 +▁click -1826 +icy -1827 +▁ste -1828 +▁web -1829 +▁someone -1830 +ison -1831 +▁Oct -1832 +arning -1833 +▁works -1834 +▁author -1835 +▁later -1836 +▁building -1837 +not -1838 +lebr -1839 +▁host -1840 +ocu -1841 +▁Gl -1842 +▁environment -1843 +abor -1844 +cted -1845 +▁Center -1846 +▁mor -1847 +▁log -1848 +▁unique -1849 +▁everyone -1850 +▁Reg -1851 +raft -1852 +▁port -1853 +▁provides 
-1854 +IS -1855 +gest -1856 +▁ener -1857 +▁fall -1858 +▁cred -1859 +▁seen -1860 +▁Dep -1861 +▁film -1862 +ask -1863 +▁Day -1864 +▁prep -1865 +▁oil -1866 +▁particular -1867 +▁professional -1868 +▁aud -1869 +fully -1870 +▁Aug -1871 +▁Euro -1872 +ests -1873 +▁particip -1874 +lex -1875 +ided -1876 +unities -1877 +▁bar -1878 +ibility -1879 +▁results -1880 +▁ident -1881 +▁recommend -1882 +roll -1883 +▁press -1884 +ED -1885 +▁card -1886 +▁While -1887 +▁Will -1888 +▁whole -1889 +▁Don -1890 +aturday -1891 +▁World -1892 +rain -1893 +▁companies -1894 +ino -1895 +▁Ge -1896 +▁High -1897 +urch -1898 +▁Friday -1899 +▁office -1900 +IT -1901 +pper -1902 +▁Bar -1903 +▁March -1904 +▁color -1905 +▁events -1906 +▁anything -1907 +▁issues -1908 +EN -1909 +ancial -1910 +▁mot -1911 +▁eff -1912 +▁prob -1913 +▁mag -1914 +▁areas -1915 +▁pret -1916 +resent -1917 +▁vol -1918 +▁Some -1919 +▁comput -1920 +▁respons -1921 +ops -1922 +▁points -1923 +▁Acc -1924 +▁performance -1925 +▁near -1926 +▁pain -1927 +ster -1928 +obile -1929 +▁red -1930 +▁print -1931 +▁cook -1932 +▁Apr -1933 +itch -1934 +umb -1935 +▁given -1936 +▁history -1937 +▁econom -1938 +pecially -1939 +crib -1940 +obal -1941 +.... 
-1942 +▁feature -1943 +go -1944 +ili -1945 +ands -1946 +▁sell -1947 +▁designed -1948 +▁above -1949 +ches -1950 +▁maint -1951 +▁skin -1952 +▁text -1953 +▁aff -1954 +▁simple -1955 +eth -1956 +▁assist -1957 +IC -1958 +my -1959 +ued -1960 +▁age -1961 +icult -1962 +▁reason -1963 +inks -1964 +In -1965 +▁size -1966 +▁question -1967 +▁dou -1968 +imate -1969 +▁according -1970 +▁repl -1971 +iod -1972 +ply -1973 +▁Sec -1974 +nding -1975 +▁black -1976 +▁Aust -1977 +head -1978 +▁htt -1979 +edd -1980 +▁pretty -1981 +▁foot -1982 +▁believe -1983 +▁Saturday -1984 +oved -1985 +ables -1986 +▁due -1987 +▁Part -1988 +▁among -1989 +▁select -1990 +AL -1991 +itter -1992 +▁Sund -1993 +▁fire -1994 +cript -1995 +▁phys -1996 +omes -1997 +ental -1998 +ledge -1999 +▁idea -2000 +ety -2001 +▁latest -2002 +▁details -2003 +▁ant -2004 +▁popular -2005 +ole -2006 +▁third -2007 +▁et -2008 +ators -2009 +▁Mr -2010 +pro -2011 +val -2012 +▁management -2013 +aining -2014 +itional -2015 +▁includes -2016 +ruction -2017 +asing -2018 +▁July -2019 +▁energy -2020 +▁items -2021 +ze -2022 +▁weeks -2023 +ouch -2024 +onday -2025 +▁sent -2026 +▁Feb -2027 +▁living -2028 +ites -2029 +▁cult -2030 +▁receive -2031 +▁fre -2032 +▁continue -2033 +▁bad -2034 +▁June -2035 +▁relations -2036 +▁Europe -2037 +vert -2038 +astic -2039 +idence -2040 +▁human -2041 +▁parent -2042 +ulation -2043 +▁Val -2044 +▁His -2045 +▁claim -2046 +aily -2047 +▁Sept -2048 +ufact -2049 +ctions -2050 +elt -2051 +▁Dav -2052 +▁sex -2053 +▁prop -2054 +▁soon -2055 +ung -2056 +▁property -2057 +▁hon -2058 +nov -2059 +▁currently -2060 +▁amount -2061 +▁entire -2062 +new -2063 +▁West -2064 +uation -2065 +▁coming -2066 +ese -2067 +though -2068 +ana -2069 +ogn -2070 +▁Off -2071 +▁kids -2072 +▁TH -2073 +▁Tra -2074 +▁From -2075 +itting -2076 +▁phone -2077 +This -2078 +cast -2079 +▁final -2080 +▁consum -2081 +▁ess -2082 +▁happy -2083 +▁taken -2084 +▁celebr -2085 +▁docu -2086 +▁member -2087 +icro -2088 +.) 
-2089 +▁answ -2090 +▁meas -2091 +AC -2092 +▁wanted -2093 +▁type -2094 +▁software -2095 +selves -2096 +▁experienc -2097 +▁forward -2098 +▁diff -2099 +eds -2100 +▁whether -2101 +▁Us -2102 +▁wide -2103 +▁Read -2104 +▁either -2105 +▁Bu -2106 +ires -2107 +▁El -2108 +▁value -2109 +▁concer -2110 +▁deb -2111 +▁further -2112 +ux -2113 +ilar -2114 +ival -2115 +▁isn -2116 +▁coll -2117 +used -2118 +ams -2119 +aced -2120 +▁par -2121 +▁almost -2122 +▁required -2123 +▁crit -2124 +▁held -2125 +▁white -2126 +arter -2127 +▁date -2128 +▁comfort -2129 +▁quite -2130 +▁trying -2131 +▁provided -2132 +▁summer -2133 +▁Sw -2134 +▁fit -2135 +▁Pa -2136 +▁sugg -2137 +▁needed -2138 +▁favorite -2139 +▁tit -2140 +St -2141 +ees -2142 +▁Sunday -2143 +▁opportunity -2144 +▁Jo -2145 +▁ach -2146 +aching -2147 +uary -2148 +ek -2149 +▁Cor -2150 +▁via -2151 +▁extra -2152 +▁players -2153 +▁April -2154 +▁books -2155 +▁Monday -2156 +▁network -2157 +▁cop -2158 +amer -2159 +ler -2160 +▁example -2161 +▁box -2162 +▁users -2163 +▁, -2164 +itten -2165 +▁seem -2166 +▁period -2167 +▁various -2168 +▁Health -2169 +▁options -2170 +where -2171 +▁running -2172 +gress -2173 +▁style -2174 +▁especially -2175 +▁consider -2176 +▁yourself -2177 +▁Art -2178 +▁dam -2179 +▁safe -2180 +▁previous -2181 +▁swe -2182 +▁ways -2183 +▁version -2184 +▁created -2185 +▁sle -2186 +▁Mon -2187 +▁recently -2188 +▁potential -2189 +OU -2190 +▁issue -2191 +▁common -2192 +ises -2193 +▁di -2194 +▁Inc -2195 +▁stri -2196 +▁ready -2197 +▁attend -2198 +▁morning -2199 +▁regular -2200 +▁insp -2201 +▁else -2202 +▁road -2203 +▁nice -2204 +▁throughout -2205 +▁probably -2206 +▁ensure -2207 +-- -2208 +▁veh -2209 +▁received -2210 +earch -2211 +▁ball -2212 +▁Associ -2213 +▁President -2214 +▁clear -2215 +▁download -2216 +par -2217 +icles -2218 +▁engine -2219 +▁sho -2220 +erc -2221 +▁song -2222 +azing -2223 +▁lo -2224 +▁brand -2225 +▁relationship -2226 +▁takes -2227 +▁reading -2228 +mit -2229 +▁natural -2230 +▁Aut -2231 +▁States -2232 +ades -2233 +amed -2234 
+▁park -2235 +▁House -2236 +ively -2237 +▁shows -2238 +▁asked -2239 +▁medical -2240 +istration -2241 +ague -2242 +▁inj -2243 +▁hit -2244 +▁choose -2245 +▁collect -2246 +▁Direct -2247 +▁Mich -2248 +▁original -2249 +▁cool -2250 +▁spr -2251 +▁couple -2252 +angu -2253 +reme -2254 +ipping -2255 +▁represent -2256 +▁bott -2257 +▁init -2258 +▁release -2259 +▁goal -2260 +▁behind -2261 +ny -2262 +apt -2263 +oid -2264 +▁Face -2265 +▁wonder -2266 +▁Soc -2267 +▁recent -2268 +▁sales -2269 +eter -2270 +▁clients -2271 +▁financial -2272 +aging -2273 +overed -2274 +▁accom -2275 +▁fresh -2276 +▁fast -2277 +▁super -2278 +▁leave -2279 +▁problems -2280 +▁anyone -2281 +▁role -2282 +face -2283 +▁Get -2284 +gs -2285 +hib -2286 +▁Ser -2287 +▁career -2288 +uge -2289 +▁Fin -2290 +bor -2291 +▁Black -2292 +ume -2293 +▁cup -2294 +ried -2295 +ville -2296 +▁model -2297 +▁article -2298 +oura -2299 +▁ful -2300 +uesday -2301 +▁meth -2302 +arth -2303 +▁ground -2304 +▁programs -2305 +▁Up -2306 +▁hol -2307 +▁fail -2308 +na -2309 +▁sun -2310 +aving -2311 +▁weeke -2312 +▁accept -2313 +▁flow -2314 +ada -2315 +ursday -2316 +▁base -2317 +medi -2318 +▁customer -2319 +▁difficult -2320 +OT -2321 +atform -2322 +▁writing -2323 +anced -2324 +urance -2325 +▁looks -2326 +▁PM -2327 +▁tour -2328 +▁polit -2329 +▁likely -2330 +ox -2331 +hel -2332 +oogle -2333 +▁paper -2334 +▁ap -2335 +▁abs -2336 +▁simply -2337 +cing -2338 +name -2339 +verage -2340 +▁inside -2341 +▁manufact -2342 +▁TV -2343 +clus -2344 +▁etc -2345 +▁mix -2346 +▁total -2347 +▁included -2348 +▁po -2349 +idge -2350 +ming -2351 +▁Int -2352 +▁risk -2353 +▁Wed -2354 +adem -2355 +aker -2356 +▁increase -2357 +▁party -2358 +▁changes -2359 +▁ele -2360 +ashing -2361 +▁board -2362 +▁education -2363 +oud -2364 +▁Her -2365 +▁October -2366 +▁action -2367 +▁former -2368 +▁meeting -2369 +Wh -2370 +▁however -2371 +▁News -2372 +▁outside -2373 +ification -2374 +uit -2375 +iple -2376 +▁match -2377 +▁Ac -2378 +▁America -2379 +▁Act -2380 +▁nothing -2381 +▁security -2382 +▁self 
-2383 +ground -2384 +▁contrib -2385 +▁stop -2386 +ester -2387 +▁town -2388 +▁August -2389 +▁matter -2390 +▁position -2391 +▁Af -2392 +▁ple -2393 +▁bed -2394 +▁late -2395 +istrict -2396 +▁Ob -2397 +▁systems -2398 +▁Every -2399 +icated -2400 +adu -2401 +ules -2402 +▁Bus -2403 +▁words -2404 +▁playing -2405 +▁cir -2406 +▁pan -2407 +ST -2408 +▁UK -2409 +wood -2410 +▁sat -2411 +▁impact -2412 +▁anim -2413 +▁mark -2414 +▁private -2415 +▁application -2416 +▁police -2417 +▁knowledge -2418 +▁exist -2419 +▁photos -2420 +▁method -2421 +▁longer -2422 +▁coun -2423 +▁worked -2424 +iddle -2425 +▁national -2426 +▁projects -2427 +ederal -2428 +▁ord -2429 +▁Are -2430 +▁necess -2431 +ude -2432 +▁table -2433 +▁stra -2434 +off -2435 +▁Ag -2436 +empt -2437 +elcome -2438 +▁September -2439 +ecut -2440 +▁activities -2441 +▁worth -2442 +▁recogn -2443 +▁production -2444 +str -2445 +nesday -2446 +▁Department -2447 +based -2448 +aby -2449 +iff -2450 +▁comment -2451 +▁compl -2452 +▁skills -2453 +▁true -2454 +▁general -2455 +▁Austral -2456 +▁January -2457 +iol -2458 +▁round -2459 +▁lives -2460 +▁learning -2461 +▁Tuesday -2462 +▁Thursday -2463 +ID -2464 +che -2465 +▁Then -2466 +▁introdu -2467 +ky -2468 +arden -2469 +▁signific -2470 +ING -2471 +oom -2472 +▁Sal -2473 +▁ill -2474 +▁student -2475 +▁Pat -2476 +▁lay -2477 +▁hair -2478 +▁Free -2479 +▁Nove -2480 +▁computer -2481 +▁squ -2482 +▁purchase -2483 +▁tal -2484 +ham -2485 +▁Also -2486 +ession -2487 +ett -2488 +▁Mus -2489 +▁death -2490 +▁defin -2491 +▁seems -2492 +▁Of -2493 +ci -2494 +▁hands -2495 +izing -2496 +▁communic -2497 +mon -2498 +▁rad -2499 +▁choice -2500 +▁screen -2501 +AM -2502 +▁draw -2503 +▁concern -2504 +▁leading -2505 +▁additional -2506 +▁First -2507 +▁rights -2508 +attle -2509 +▁cell -2510 +▁credit -2511 +▁located -2512 +▁variety -2513 +▁leaders -2514 +▁Facebook -2515 +▁stat -2516 +▁tick -2517 +▁drive -2518 +▁movie -2519 +▁San -2520 +arget -2521 +oring -2522 +▁file -2523 +▁fig -2524 +ipment -2525 +▁hy -2526 +▁bud -2527 +▁image -2528 
+▁determ -2529 +▁amazing -2530 +aign -2531 +▁Sim -2532 +▁suggest -2533 +mercial -2534 +▁chance -2535 +▁Red -2536 +▁associ -2537 +▁rather -2538 +▁practice -2539 +▁built -2540 +▁plans -2541 +▁function -2542 +oph -2543 +▁Har -2544 +▁providing -2545 +iter -2546 +▁cal -2547 +ached -2548 +airs -2549 +light -2550 +ought -2551 +urg -2552 +pm -2553 +▁War -2554 +▁vict -2555 +▁court -2556 +▁aw -2557 +▁saf -2558 +▁cand -2559 +example -2560 +▁Out -2561 +▁touch -2562 +▁Air -2563 +▁teac -2564 +cil -2565 +▁exam -2566 +▁autom -2567 +▁Street -2568 +▁international -2569 +▁loss -2570 +▁weekend -2571 +▁Wind -2572 +▁infl -2573 +▁prior -2574 +▁prevent -2575 +▁allows -2576 +▁arri -2577 +▁Calif -2578 +▁Click -2579 +irth -2580 +ibrary -2581 +▁character -2582 +▁piece -2583 +▁treatment -2584 +cember -2585 +itchen -2586 +olution -2587 +▁http -2588 +ma -2589 +▁similar -2590 +▁Most -2591 +▁moment -2592 +gar -2593 +oke -2594 +ruary -2595 +▁clos -2596 +▁Design -2597 +▁investig -2598 +▁rate -2599 +▁AM -2600 +reg -2601 +▁commit -2602 +▁growth -2603 +imum -2604 +▁norm -2605 +OM -2606 +iber -2607 +▁Dis -2608 +ivery -2609 +▁estab -2610 +▁cause -2611 +▁user -2612 +sp -2613 +▁deg -2614 +▁lost -2615 +▁display -2616 +▁collection -2617 +▁myself -2618 +▁Cr -2619 +▁op -2620 +▁enter -2621 +▁Wednesday -2622 +unt -2623 +▁rout -2624 +ault -2625 +▁decided -2626 +▁decision -2627 +▁sil -2628 +▁inde -2629 +▁Any -2630 +▁higher -2631 +cy -2632 +▁bal -2633 +▁daily -2634 +ha -2635 +ournal -2636 +▁digital -2637 +▁November -2638 +▁purp -2639 +▁Group -2640 +▁released -2641 +▁significant -2642 +▁reported -2643 +LE -2644 +▁Home -2645 +▁woman -2646 +▁Cour -2647 +▁easily -2648 +▁cannot -2649 +▁goes -2650 +▁International -2651 +▁excell -2652 +lin -2653 +▁wall -2654 +▁Thanks -2655 +▁quickly -2656 +▁College -2657 +▁usually -2658 +amb -2659 +▁bag -2660 +▁apply -2661 +▁floor -2662 +▁expected -2663 +iant -2664 +▁involved -2665 +▁Law -2666 +▁dom -2667 +▁attack -2668 +just -2669 +▁boy -2670 +illing -2671 +▁regard -2672 +▁platform -2673 
+▁capt -2674 +▁iP -2675 +▁Net -2676 +▁encoura -2677 +▁protect -2678 +ondon -2679 +▁Cons -2680 +▁agree -2681 +ael -2682 +▁serious -2683 +▁December -2684 +▁safety -2685 +▁roll -2686 +▁saw -2687 +▁dress -2688 +▁Google -2689 +▁gen -2690 +▁parents -2691 +▁mach -2692 +idents -2693 +▁played -2694 +▁Service -2695 +▁immedi -2696 +▁surpr -2697 +mas -2698 +▁warm -2699 +zz -2700 +▁integr -2701 +▁mobile -2702 +▁tast -2703 +ica -2704 +▁February -2705 +▁sn -2706 +▁club -2707 +▁langu -2708 +▁president -2709 +▁sche -2710 +▁related -2711 +hern -2712 +▁shoot -2713 +▁finish -2714 +▁ideas -2715 +▁global -2716 +▁marketing -2717 +▁tools -2718 +▁ep -2719 +▁expert -2720 +band -2721 +▁code -2722 +▁exact -2723 +ospital -2724 +asons -2725 +▁mass -2726 +▁note -2727 +avy -2728 +▁photo -2729 +izes -2730 +▁save -2731 +▁source -2732 +▁ut -2733 +▁option -2734 +▁respect -2735 +▁Brit -2736 +▁Let -2737 +▁feed -2738 +enge -2739 +iding -2740 +▁arch -2741 +▁deep -2742 +▁corre -2743 +▁Ang -2744 +▁announced -2745 +ilies -2746 +▁appe -2747 +edding -2748 +▁Well -2749 +cription -2750 +▁La -2751 +www -2752 +hood -2753 +reng -2754 +▁stock -2755 +▁sens -2756 +▁admin -2757 +▁location -2758 +▁ri -2759 +ellow -2760 +▁gets -2761 +▁David -2762 +▁costs -2763 +▁helps -2764 +▁Av -2765 +ples -2766 +▁materials -2767 +ength -2768 +▁Je -2769 +ipe -2770 +rab -2771 +▁Tex -2772 +▁huge -2773 +▁published -2774 +agn -2775 +like -2776 +AP -2777 +▁send -2778 +▁mother -2779 +▁benefits -2780 +▁English -2781 +enior -2782 +mission -2783 +ography -2784 +▁lab -2785 +oday -2786 +▁Play -2787 +▁fight -2788 +▁Over -2789 +▁hear -2790 +▁weight -2791 +rown -2792 +▁Spr -2793 +ornia -2794 +uel -2795 +vey -2796 +iction -2797 +▁images -2798 +rought -2799 +▁restaur -2800 +key -2801 +▁gar -2802 +▁Book -2803 +▁earn -2804 +ald -2805 +▁ability -2806 +▁interview -2807 +add -2808 +▁Check -2809 +▁Business -2810 +atory -2811 +▁London -2812 +ructure -2813 +▁written -2814 +akers -2815 +▁challeng -2816 +▁standard -2817 +▁gives -2818 +▁giving -2819 +▁ones -2820 
+▁legal -2821 +▁sense -2822 +▁campaign -2823 +▁Sch -2824 +▁dest -2825 +▁innov -2826 +erved -2827 +▁door -2828 +▁patients -2829 +rom -2830 +▁mid -2831 +▁trust -2832 +urt -2833 +▁sus -2834 +▁wasn -2835 +▁Services -2836 +▁center -2837 +▁instead -2838 +aged -2839 +▁Produ -2840 +▁fab -2841 +▁Coun -2842 +▁heat -2843 +▁neg -2844 +▁fine -2845 +▁item -2846 +▁Great -2847 +▁target -2848 +erous -2849 +▁prem -2850 +erve -2851 +▁sold -2852 +▁White -2853 +aught -2854 +▁wish -2855 +▁Trans -2856 +▁parts -2857 +▁write -2858 +▁levels -2859 +▁lic -2860 +▁award -2861 +iring -2862 +arant -2863 +aves -2864 +▁cases -2865 +▁describ -2866 +▁picture -2867 +▁pers -2868 +▁partners -2869 +▁Web -2870 +▁dry -2871 +▁neigh -2872 +irit -2873 +▁Mod -2874 +▁Prof -2875 +▁stuff -2876 +ashington -2877 +ida -2878 +▁pull -2879 +▁conditions -2880 +▁ded -2881 +atives -2882 +▁green -2883 +▁California -2884 +▁broad -2885 +▁effic -2886 +▁Hol -2887 +board -2888 +▁Hall -2889 +put -2890 +rows -2891 +▁Program -2892 +ivity -2893 +▁began -2894 +▁sale -2895 +▁upon -2896 +istic -2897 +▁highly -2898 +▁interesting -2899 +TM -2900 +bit -2901 +OS -2902 +▁vot -2903 +▁fans -2904 +▁stories -2905 +inner -2906 +▁request -2907 +▁contract -2908 +▁remember -2909 +▁slow -2910 +▁Cle -2911 +▁emer -2912 +▁subs -2913 +▁answer -2914 +▁Techn -2915 +anch -2916 +▁comments -2917 +acing -2918 +ocol -2919 +▁bra -2920 +▁Phot -2921 +▁wood -2922 +▁Other -2923 +▁lower -2924 +▁sym -2925 +▁dead -2926 +orge -2927 +▁prim -2928 +orage -2929 +▁modern -2930 +▁player -2931 +▁cat -2932 +coming -2933 +bum -2934 +▁interested -2935 +ooth -2936 +▁reports -2937 +aches -2938 +▁except -2939 +ara -2940 +lev -2941 +▁dise -2942 +▁trip -2943 +▁teams -2944 +▁Jack -2945 +▁Texas -2946 +▁attention -2947 +▁equipment -2948 +▁paint -2949 +sy -2950 +▁fully -2951 +▁wrong -2952 +▁directly -2953 +▁starting -2954 +▁completely -2955 +▁organization -2956 +▁types -2957 +uk -2958 +wide -2959 +▁Green -2960 +mm -2961 +▁resources -2962 +▁Last -2963 +▁www -2964 +ET -2965 +urb -2966 
+ager -2967 +▁document -2968 +▁themselves -2969 +apan -2970 +▁dru -2971 +▁solutions -2972 +▁stru -2973 +▁viol -2974 +ashion -2975 +▁bank -2976 +▁Washington -2977 +▁Loc -2978 +▁Rem -2979 +ament -2980 +▁multiple -2981 +▁Association -2982 +▁band -2983 +▁achieve -2984 +▁condition -2985 +▁gold -2986 +▁businesses -2987 +▁Twitter -2988 +uses -2989 +▁wait -2990 +ule -2991 +▁Go -2992 +ening -2993 +udd -2994 +▁Each -2995 +▁affect -2996 +▁opportunities -2997 +▁vac -2998 +▁Gener -2999 +urer -3000 +▁hop -3001 +EC -3002 +▁sett -3003 +▁policy -3004 +▁Par -3005 +▁led -3006 +ension -3007 +▁thinking -3008 +▁dream -3009 +▁Once -3010 +raz -3011 +rel -3012 +▁groups -3013 +▁planning -3014 +▁commercial -3015 +EO -3016 +He -3017 +ffee -3018 +olf -3019 +▁Spe -3020 +▁separ -3021 +▁applications -3022 +▁qual -3023 +▁streng -3024 +▁approach -3025 +▁families -3026 +▁solution -3027 +▁Del -3028 +▁firm -3029 +▁Class -3030 +▁express -3031 +ores -3032 +▁gave -3033 +▁Found -3034 +enty -3035 +iles -3036 +▁offe -3037 +▁consult -3038 +▁Year -3039 +▁gift -3040 +▁subject -3041 +▁Mem -3042 +AD -3043 +▁Afric -3044 +▁prices -3045 +▁successful -3046 +ties -3047 +▁positive -3048 +▁employees -3049 +arlier -3050 +▁blood -3051 +▁AN -3052 +▁race -3053 +itute -3054 +▁deliver -3055 +oul -3056 +▁join -3057 +ares -3058 +▁itself -3059 +▁King -3060 +▁shot -3061 +▁advice -3062 +▁cert -3063 +▁THE -3064 +▁eye -3065 +riend -3066 +▁hour -3067 +▁defe -3068 +▁saying -3069 +▁healthy -3070 +▁glass -3071 +▁creating -3072 +▁Sub -3073 +▁According -3074 +▁dark -3075 +ration -3076 +▁spent -3077 +▁div -3078 +▁Even -3079 +▁Why -3080 +field -3081 +▁cy -3082 +itely -3083 +ford -3084 +▁Best -3085 +▁cancer -3086 +▁Christmas -3087 +▁effective -3088 +▁serve -3089 +omen -3090 +▁sites -3091 +▁budget -3092 +▁Whe -3093 +▁Road -3094 +▁lif -3095 +▁goals -3096 +▁message -3097 +king -3098 +▁Vis -3099 +▁reve -3100 +mb -3101 +down -3102 +▁Paul -3103 +▁fair -3104 +▁India -3105 +▁average -3106 +▁Dan -3107 +▁fix -3108 +▁circ -3109 +▁Office -3110 +▁Pri 
-3111 +▁condu -3112 +▁East -3113 +▁reach -3114 +elling -3115 +▁Since -3116 +▁cross -3117 +aughter -3118 +▁traditional -3119 +▁extreme -3120 +▁organiz -3121 +▁director -3122 +PS -3123 +▁Hot -3124 +▁implement -3125 +Ch -3126 +▁sometimes -3127 +▁physical -3128 +▁obs -3129 +ipped -3130 +▁camer -3131 +ords -3132 +vis -3133 +▁Oh -3134 +▁opp -3135 +▁adult -3136 +▁terms -3137 +iable -3138 +▁Germ -3139 +▁plant -3140 +▁wonderful -3141 +US -3142 +rote -3143 +▁hor -3144 +▁Many -3145 +▁Rec -3146 +▁aim -3147 +▁attempt -3148 +▁limited -3149 +▁pictures -3150 +tee -3151 +▁Japan -3152 +▁See -3153 +▁Develop -3154 +▁excellent -3155 +▁dro -3156 +urning -3157 +ysis -3158 +▁mount -3159 +BC -3160 +▁emb -3161 +▁Work -3162 +imately -3163 +onse -3164 +▁brought -3165 +uth -3166 +yond -3167 +▁Ann -3168 +▁quarter -3169 +hest -3170 +▁title -3171 +▁section -3172 +ecutive -3173 +▁block -3174 +▁delivery -3175 +▁Mor -3176 +▁became -3177 +▁farm -3178 +▁arr -3179 +▁carry -3180 +▁effort -3181 +▁IN -3182 +▁kitchen -3183 +▁mention -3184 +▁developed -3185 +▁imm -3186 +inary -3187 +▁Use -3188 +iance -3189 +yright -3190 +reci -3191 +▁jud -3192 +▁fish -3193 +▁China -3194 +▁Inter -3195 +▁countries -3196 +estern -3197 +▁progress -3198 +▁necessary -3199 +▁ge -3200 +▁suppl -3201 +▁sweet -3202 +pendent -3203 +▁complex -3204 +ocks -3205 +▁baby -3206 +vest -3207 +▁felt -3208 +mitted -3209 +▁feeling -3210 +▁System -3211 +▁nation -3212 +▁promot -3213 +▁Top -3214 +▁Make -3215 +▁Dem -3216 +▁Good -3217 +hold -3218 +iced -3219 +▁birth -3220 +▁sleep -3221 +▁growing -3222 +▁impress -3223 +porate -3224 +▁Public -3225 +▁places -3226 +ocr -3227 +▁seven -3228 +▁IT -3229 +▁Flor -3230 +ffects -3231 +venue -3232 +▁Mac -3233 +▁war -3234 +▁heard -3235 +itation -3236 +gu -3237 +pite -3238 +▁weather -3239 +▁Lear -3240 +▁Open -3241 +▁region -3242 +▁Michael -3243 +haps -3244 +▁billion -3245 +▁son -3246 +itary -3247 +▁star -3248 +▁Sur -3249 +duc -3250 +▁Today -3251 +▁hotel -3252 +▁wants -3253 +Re -3254 +▁Thank -3255 +▁stick -3256 
+▁college -3257 +▁construction -3258 +IL -3259 +▁bi -3260 +▁album -3261 +▁spend -3262 +▁mat -3263 +▁cold -3264 +▁medic -3265 +▁stage -3266 +▁ver -3267 +▁Port -3268 +▁Director -3269 +▁individuals -3270 +▁double -3271 +nded -3272 +▁Canada -3273 +▁Market -3274 +): -3275 +EL -3276 +aries -3277 +▁Down -3278 +▁convers -3279 +▁Russ -3280 +▁profession -3281 +ying -3282 +▁ble -3283 +▁speed -3284 +▁distrib -3285 +pects -3286 +▁exerc -3287 +rup -3288 +▁ST -3289 +aled -3290 +▁finished -3291 +fl -3292 +▁gas -3293 +istry -3294 +▁suit -3295 +ils -3296 +▁pages -3297 +▁statement -3298 +pre -3299 +ancy -3300 +▁charge -3301 +▁ing -3302 +▁spot -3303 +▁ult -3304 +▁requirements -3305 +▁finally -3306 +▁schools -3307 +▁vehicle -3308 +▁smart -3309 +▁annual -3310 +▁Windows -3311 +". -3312 +ado -3313 +wor -3314 +▁eat -3315 +useum -3316 +▁feet -3317 +▁Board -3318 +▁advant -3319 +ibly -3320 +▁blue -3321 +▁load -3322 +▁aware -3323 +unk -3324 +▁Gold -3325 +▁Research -3326 +▁straight -3327 +▁appl -3328 +arc -3329 +▁Mark -3330 +▁nearly -3331 +ato -3332 +▁Bel -3333 +▁Tom -3334 +▁tried -3335 +▁hous -3336 +▁avoid -3337 +aling -3338 +ports -3339 +▁difference -3340 +▁wrote -3341 +▁William -3342 +▁Sol -3343 +▁pattern -3344 +owl -3345 +ened -3346 +▁James -3347 +▁respond -3348 +▁challenge -3349 +▁Bre -3350 +▁dog -3351 +▁beginning -3352 +ION -3353 +▁Educ -3354 +▁About -3355 +▁helping -3356 +:|| -3357 +▁benefit -3358 +▁insurance -3359 +▁situation -3360 +iment -3361 +▁essential -3362 +▁imag -3363 +ancing -3364 +unte -3365 +▁device -3366 +ceed -3367 +▁Obama -3368 +rast -3369 +▁shop -3370 +ological -3371 +▁Care -3372 +▁Indian -3373 +▁political -3374 +box -3375 +uted -3376 +▁Time -3377 +▁loved -3378 +▁Review -3379 +ube -3380 +▁nut -3381 +▁pow -3382 +overn -3383 +▁wear -3384 +▁Apple -3385 +▁Sl -3386 +▁Mag -3387 +olute -3388 +▁Find -3389 +▁activity -3390 +▁devices -3391 +▁moving -3392 +▁Met -3393 +▁lik -3394 +▁paid -3395 +▁enh -3396 +▁Club -3397 +▁Hel -3398 +▁uses -3399 +▁eight -3400 +▁exhib -3401 +▁Court -3402 
+▁turned -3403 +oms -3404 +oses -3405 +▁posted -3406 +▁towards -3407 +”. -3408 +▁nature -3409 +▁Sk -3410 +▁partner -3411 +asy -3412 +▁investment -3413 +ourney -3414 +▁appreci -3415 +▁offering -3416 +▁temper -3417 +▁contain -3418 +▁largest -3419 +ivil -3420 +▁knew -3421 +▁ahead -3422 +oves -3423 +rench -3424 +idered -3425 +▁retail -3426 +▁hus -3427 +▁eyes -3428 +▁owners -3429 +▁language -3430 +▁Ant -3431 +inger -3432 +▁expand -3433 +house -3434 +ey -3435 +rences -3436 +ios -3437 +▁rent -3438 +ned -3439 +▁cas -3440 +▁connect -3441 +▁wife -3442 +ampions -3443 +▁advert -3444 +▁Rel -3445 +▁Rich -3446 +▁reduce -3447 +▁European -3448 +▁guarant -3449 +ago -3450 +cause -3451 +▁Look -3452 +▁sports -3453 +▁correct -3454 +aly -3455 +anta -3456 +▁categ -3457 +▁client -3458 +▁states -3459 +▁consist -3460 +pri -3461 +▁maybe -3462 +▁named -3463 +▁definitely -3464 +hips -3465 +▁influ -3466 +▁entertain -3467 +erry -3468 +hens -3469 +▁accur -3470 +▁concept -3471 +osing -3472 +ounds -3473 +▁runs -3474 +▁grand -3475 +▁stress -3476 +IP -3477 +change -3478 +▁Super -3479 +▁guide -3480 +▁homes -3481 +▁Have -3482 +▁thous -3483 +last -3484 +▁jobs -3485 +▁offered -3486 +estival -3487 +▁earlier -3488 +▁immediately -3489 +▁doll -3490 +▁numbers -3491 +sych -3492 +▁conc -3493 +iers -3494 +▁decl -3495 +▁Fam -3496 +esome -3497 +▁Rob -3498 +▁rates -3499 +▁Council -3500 +azine -3501 +▁rev -3502 +▁Community -3503 +▁path -3504 +▁collabor -3505 +lying -3506 +roud -3507 +▁Cop -3508 +You -3509 +alt -3510 +orrow -3511 +▁candid -3512 +▁interact -3513 +ails -3514 +▁remain -3515 +▁II -3516 +more -3517 +▁bottom -3518 +sec -3519 +dule -3520 +▁Sum -3521 +▁Cong -3522 +▁belie -3523 +▁drink -3524 +▁pieces -3525 +▁exactly -3526 +asc -3527 +lim -3528 +▁tips -3529 +▁Micro -3530 +▁View -3531 +iation -3532 +▁overall -3533 +▁max -3534 +▁federal -3535 +▁storage -3536 +vin -3537 +icious -3538 +▁Custom -3539 +▁opening -3540 +▁demand -3541 +▁Two -3542 +place -3543 +▁surround -3544 +▁Cur -3545 +▁histor -3546 +▁Bay -3547 
+orial -3548 +▁Rober -3549 +▁adjust -3550 +ulations -3551 +▁shipping -3552 +▁strateg -3553 +▁Internet -3554 +▁active -3555 +▁threat -3556 +ram -3557 +▁Win -3558 +▁looked -3559 +oma -3560 +▁ten -3561 +▁occas -3562 +▁length -3563 +inated -3564 +▁served -3565 +▁conference -3566 +ico -3567 +iny -3568 +▁IS -3569 +▁guys -3570 +▁rock -3571 +▁button -3572 +▁garden -3573 +▁Florida -3574 +▁acqu -3575 +▁Police -3576 +▁easier -3577 +▁Angel -3578 +yd -3579 +order -3580 +undred -3581 +▁Island -3582 +▁father -3583 +oly -3584 +▁bath -3585 +▁speak -3586 +▁attract -3587 +If -3588 +▁normal -3589 +▁thanks -3590 +dom -3591 +umn -3592 +▁Love -3593 +▁thank -3594 +▁bill -3595 +▁People -3596 +▁background -3597 +illa -3598 +rial -3599 +▁born -3600 +arily -3601 +▁girls -3602 +rig -3603 +▁Ev -3604 +▁Det -3605 +▁wedding -3606 +care -3607 +▁lots -3608 +▁damage -3609 +roid -3610 +▁Big -3611 +▁fat -3612 +▁pet -3613 +bl -3614 +ses -3615 +▁Ty -3616 +▁culture -3617 +▁replace -3618 +▁creative -3619 +▁internet -3620 +▁completed -3621 +▁assess -3622 +OL -3623 +▁Call -3624 +▁prec -3625 +aduate -3626 +atever -3627 +mod -3628 +que -3629 +▁Life -3630 +▁Team -3631 +▁wine -3632 +▁Company -3633 +▁husband -3634 +ij -3635 +▁coach -3636 +▁beyond -3637 +aith -3638 +▁cards -3639 +ipp -3640 +▁cash -3641 +▁Child -3642 +▁haven -3643 +▁altern -3644 +ota -3645 +▁Matt -3646 +▁guy -3647 +phone -3648 +▁depend -3649 +▁setting -3650 +leg -3651 +▁bul -3652 +▁Back -3653 +▁Show -3654 +▁miles -3655 +▁er -3656 +antly -3657 +force -3658 +▁transport -3659 +▁Management -3660 +ustain -3661 +body -3662 +ston -3663 +wise -3664 +▁emot -3665 +▁behav -3666 +▁driving -3667 +▁cream -3668 +▁response -3669 +iling -3670 +▁pred -3671 +▁estate -3672 +ously -3673 +het -3674 +▁USA -3675 +oving -3676 +isions -3677 +▁owner -3678 +▁Australia -3679 +friend -3680 +▁Pet -3681 +▁Sun -3682 +▁cho -3683 +error -3684 +▁Contact -3685 +izz -3686 +▁excited -3687 +▁selection -3688 +▁Ir -3689 +ales -3690 +anging -3691 +▁Ret -3692 +▁middle -3693 +▁efforts -3694 
+▁particularly -3695 +▁Plan -3696 +▁Pal -3697 +itect -3698 +icks -3699 +▁Dri -3700 +▁helped -3701 +door -3702 +ustr -3703 +▁Lake -3704 +▁doub -3705 +▁colors -3706 +▁inform -3707 +▁Ve -3708 +aper -3709 +▁files -3710 +▁allowed -3711 +▁lines -3712 +▁existing -3713 +▁Bank -3714 +▁satis -3715 +▁patient -3716 +▁comfortable -3717 +istered -3718 +▁welcome -3719 +▁considered -3720 +▁responsible -3721 +▁clot -3722 +▁drop -3723 +▁truly -3724 +▁coffee -3725 +▁understanding -3726 +DA -3727 +▁plus -3728 +▁Govern -3729 +▁Thom -3730 +▁measure -3731 +set -3732 +▁economic -3733 +▁Yes -3734 +oming -3735 +▁frame -3736 +▁slight -3737 +▁journey -3738 +isl -3739 +▁Dec -3740 +▁indic -3741 +▁degree -3742 +▁ingred -3743 +▁himself -3744 +bon -3745 +▁purpose -3746 +▁tom -3747 +▁surv -3748 +▁changed -3749 +▁liter -3750 +▁mission -3751 +free -3752 +nown -3753 +ences -3754 +onstr -3755 +ona -3756 +▁Although -3757 +EM -3758 +▁pen -3759 +ologies -3760 +▁models -3761 +reed -3762 +▁train -3763 +▁winter -3764 +▁prot -3765 +▁stream -3766 +▁highest -3767 +ads -3768 +see -3769 +encies -3770 +▁prefer -3771 +▁seeing -3772 +▁strugg -3773 +▁evening -3774 +press -3775 +▁Take -3776 +▁artist -3777 +▁talking -3778 +OW -3779 +▁Camp -3780 +▁Phil -3781 +▁afford -3782 +▁Information -3783 +▁Str -3784 +▁sty -3785 +▁Smith -3786 +▁fashion -3787 +▁Republic -3788 +▁gun -3789 +▁disease -3790 +▁pool -3791 +▁absolute -3792 +OV -3793 +▁Sen -3794 +▁shopping -3795 +raw -3796 +oman -3797 +apter -3798 +▁River -3799 +▁Church -3800 +met -3801 +soft -3802 +▁Mart -3803 +▁lack -3804 +▁appoint -3805 +▁heavy -3806 +▁letter -3807 +rem -3808 +▁Color -3809 +▁British -3810 +▁daughter -3811 +▁fem -3812 +▁Rock -3813 +▁cast -3814 +▁brother -3815 +rey -3816 +▁Sing -3817 +▁flav -3818 +porary -3819 +▁occur -3820 +▁smooth -3821 +▁opin -3822 +▁increased -3823 +▁Jes -3824 +▁Music -3825 +▁moved -3826 +▁proud -3827 +▁couldn -3828 +▁launch -3829 +▁analysis -3830 +▁organizations -3831 +dd -3832 +▁PC -3833 +tion -3834 +▁mer -3835 +fit -3836 +▁links 
-3837 +gery -3838 +▁obt -3839 +▁Water -3840 +▁craft -3841 +▁church -3842 +▁compon -3843 +▁Blue -3844 +▁fill -3845 +▁rules -3846 +▁shared -3847 +▁spring -3848 +eria -3849 +uled -3850 +▁mail -3851 +▁Under -3852 +▁sched -3853 +▁Because -3854 +ronic -3855 +chan -3856 +▁Special -3857 +▁reviews -3858 +▁senior -3859 +▁hundred -3860 +IM -3861 +▁onto -3862 +▁whose -3863 +bed -3864 +▁Brown -3865 +net -3866 +▁fan -3867 +icing -3868 +▁Power -3869 +▁decor -3870 +▁secure -3871 +▁machine -3872 +imal -3873 +▁spread -3874 +▁u -3875 +▁frequ -3876 +▁score -3877 +ocolate -3878 +▁spirit -3879 +▁residents -3880 +amic -3881 +▁Hum -3882 +▁trade -3883 +▁science -3884 +vant -3885 +▁fra -3886 +▁Wood -3887 +▁appropri -3888 +▁officials -3889 +▁Sam -3890 +▁unit -3891 +▁died -3892 +hone -3893 +▁gone -3894 +▁manager -3895 +▁pressure -3896 +▁Like -3897 +▁challenges -3898 +TS -3899 +ady -3900 +▁clin -3901 +▁extend -3902 +▁instruct -3903 +▁dedicated -3904 +▁competition -3905 +▁Mount -3906 +▁Char -3907 +▁session -3908 +▁fant -3909 +▁Follow -3910 +▁happened -3911 +rian -3912 +▁Food -3913 +▁Mary -3914 +▁sort -3915 +ulated -3916 +▁initial -3917 +▁Fire -3918 +▁trou -3919 +▁Media -3920 +▁District -3921 +BA -3922 +icon -3923 +▁characters -3924 +▁basic -3925 +▁camera -3926 +▁holiday -3927 +azon -3928 +ategy -3929 +▁Enter -3930 +▁powerful -3931 +▁Institute -3932 +▁produce -3933 +▁beg -3934 +istics -3935 +▁Press -3936 +osition -3937 +▁dating -3938 +ette -3939 +asp -3940 +▁Hist -3941 +▁reasons -3942 +▁increasing -3943 +icken -3944 +▁shown -3945 +▁sugar -3946 +▁incred -3947 +▁extremely -3948 +▁rob -3949 +▁chem -3950 +▁Education -3951 +oos -3952 +▁AC -3953 +inese -3954 +▁volunte -3955 +▁disp -3956 +▁package -3957 +▁payment -3958 +RA -3959 +▁eval -3960 +▁guests -3961 +▁aren -3962 +▁snow -3963 +▁leader -3964 +▁biggest -3965 +▁TO -3966 +▁alone -3967 +▁object -3968 +▁proced -3969 +▁Sa -3970 +rowd -3971 +▁basis -3972 +▁disapp -3973 +▁supply -3974 +▁General -3975 +orney -3976 +▁Star -3977 +ifying -3978 +olic -3979 
+▁laws -3980 +▁breat -3981 +▁graph -3982 +▁solid -3983 +▁forget -3984 +▁continues -3985 +LC -3986 +▁cars -3987 +▁guid -3988 +▁voice -3989 +▁experienced -3990 +▁Lou -3991 +▁mis -3992 +▁brows -3993 +rapy -3994 +▁arrest -3995 +▁passed -3996 +▁schedule -3997 +ken -3998 +omb -3999 +uing -4000 +▁egg -4001 +▁passion -4002 +▁dang -4003 +▁fear -4004 +▁guess -4005 +▁scene -4006 +esterday -4007 +BS -4008 +▁bur -4009 +▁steps -4010 +cel -4011 +▁Mal -4012 +▁beat -4013 +▁military -4014 +Sh -4015 +▁PR -4016 +▁Miss -4017 +gal -4018 +▁gra -4019 +▁names -4020 +▁approx -4021 +▁update -4022 +▁subst -4023 +▁During -4024 +▁protection -4025 +▁Att -4026 +▁Franc -4027 +▁French -4028 +annel -4029 +▁peace -4030 +▁conven -4031 +term -4032 +▁Who -4033 +▁ton -4034 +▁advantage -4035 +state -4036 +▁placed -4037 +▁Commission -4038 +▁pair -4039 +▁notice -4040 +▁strength -4041 +ero -4042 +What -4043 +incip -4044 +using -4045 +▁academ -4046 +▁Arch -4047 +▁epis -4048 +▁adding -4049 +▁waiting -4050 +▁although -4051 +ags -4052 +ideo -4053 +▁League -4054 +IV -4055 +▁Ben -4056 +clusive -4057 +▁Mot -4058 +▁reb -4059 +▁Alex -4060 +▁beauty -4061 +▁scient -4062 +ula -4063 +▁Dig -4064 +▁calls -4065 +▁relax -4066 +▁demonstr -4067 +▁regarding -4068 +amin -4069 +mark -4070 +ovel -4071 +▁income -4072 +▁covered -4073 +▁effects -4074 +ari -4075 +ixt -4076 +▁Sign -4077 +▁Online -4078 +uty -4079 +imin -4080 +▁copy -4081 +iverse -4082 +▁initi -4083 +▁experts -4084 +▁standards -4085 +▁technical -4086 +ros -4087 +okes -4088 +▁Atl -4089 +▁Vol -4090 +ading -4091 +▁manage -4092 +▁Chic -4093 +▁knows -4094 +▁winning -4095 +▁hospital -4096 +▁certainly -4097 +▁Real -4098 +▁batter -4099 +▁workers -4100 +▁connection -4101 +osh -4102 +▁compared -4103 +As -4104 +oe -4105 +▁RE -4106 +▁hom -4107 +ga -4108 +oop -4109 +▁Ins -4110 +▁Form -4111 +▁Development -4112 +▁wild -4113 +▁dinner -4114 +▁fabric -4115 +▁associated -4116 +▁experiences -4117 +▁Pay -4118 +▁doctor -4119 +▁master -4120 +▁cit -4121 +▁cru -4122 +▁wat -4123 +ograp -4124 
+▁vote -4125 +▁posts -4126 +▁finding -4127 +▁Foundation -4128 +▁opened -4129 +▁Profess -4130 +▁reflect -4131 +IG -4132 +▁Carol -4133 +amm -4134 +▁audience -4135 +▁friendly -4136 +cell -4137 +unning -4138 +atically -4139 +mail -4140 +ctors -4141 +▁surface -4142 +▁den -4143 +▁Science -4144 +▁pm -4145 +▁Cap -4146 +itude -4147 +▁trail -4148 +▁artists -4149 +▁traffic -4150 +▁critical -4151 +▁communities -4152 +AA -4153 +uce -4154 +▁NY -4155 +▁Valley -4156 +works -4157 +▁remind -4158 +▁victim -4159 +▁Step -4160 +▁salt -4161 +▁followed -4162 +la -4163 +well -4164 +▁Rad -4165 +iques -4166 +▁Elect -4167 +▁football -4168 +tr -4169 +aming -4170 +▁electric -4171 +aven -4172 +▁Beach -4173 +▁facility -4174 +▁cry -4175 +gency -4176 +▁Disc -4177 +▁keeping -4178 +▁meaning -4179 +▁luck -4180 +▁pros -4181 +▁figure -4182 +▁learned -4183 +yer -4184 +ander -4185 +ulate -4186 +▁tickets -4187 +▁professionals -4188 +antic -4189 +▁laun -4190 +▁taste -4191 +▁instit -4192 +gen -4193 +▁bright -4194 +ech -4195 +arge -4196 +▁produced -4197 +▁watching -4198 +▁flex -4199 +▁catch -4200 +▁monitor -4201 +▁contains -4202 +lor -4203 +▁ter -4204 +There -4205 +ooper -4206 +▁entry -4207 +▁Project -4208 +▁Society -4209 +▁classic -4210 +▁department -4211 +edy -4212 +itar -4213 +▁diagn -4214 +▁lock -4215 +▁classes -4216 +rees -4217 +▁closed -4218 +▁starts -4219 +▁continued -4220 +▁dire -4221 +▁jump -4222 +▁awesome -4223 +▁kept -4224 +▁bought -4225 +▁listed -4226 +▁Christian -4227 +▁Wil -4228 +osure -4229 +▁Whether -4230 +▁neighbor -4231 +▁selected -4232 +▁Town -4233 +▁explore -4234 +▁testing -4235 +▁harm -4236 +▁Date -4237 +▁larger -4238 +▁videos -4239 +▁Another -4240 +▁presented -4241 +fast -4242 +▁Ber -4243 +▁ice -4244 +▁Times -4245 +▁transfer -4246 +▁thousands -4247 +▁developing -4248 +fin -4249 +▁capital -4250 +▁OF -4251 +iller -4252 +▁teaching -4253 +▁Mel -4254 +▁Nov -4255 +▁Long -4256 +▁force -4257 +▁grant -4258 +▁minute -4259 +▁talent -4260 +▁established -4261 +▁fol -4262 +▁Hill -4263 +▁desk -4264 
+standing -4265 +▁England -4266 +▁AP -4267 +enses -4268 +▁announce -4269 +▁exciting -4270 +end -4271 +▁Vir -4272 +acity -4273 +▁Family -4274 +▁street -4275 +▁furn -4276 +▁facilities -4277 +▁Jim -4278 +▁brings -4279 +▁Tim -4280 +▁buying -4281 +▁records -4282 +▁articles -4283 +gn -4284 +▁sto -4285 +▁drug -4286 +▁ideal -4287 +▁library -4288 +▁requires -4289 +noon -4290 +itors -4291 +enance -4292 +▁Scott -4293 +▁micro -4294 +▁Chicago -4295 +win -4296 +rief -4297 +▁sup -4298 +▁rich -4299 +▁virt -4300 +▁novel -4301 +▁Chinese -4302 +▁sharing -4303 +▁updated -4304 +▁mo -4305 +part -4306 +sequ -4307 +▁Start -4308 +▁butter -4309 +▁driver -4310 +▁greater -4311 +riage -4312 +▁Sand -4313 +▁ship -4314 +▁crowd -4315 +▁wouldn -4316 +▁restaurant -4317 +imb -4318 +▁ir -4319 +lands -4320 +▁vision -4321 +▁Note -4322 +▁Exper -4323 +▁ingredients -4324 +ray -4325 +unately -4326 +▁List -4327 +▁poor -4328 +▁Stand -4329 +▁studies -4330 +▁Cup -4331 +overy -4332 +▁loan -4333 +▁Build -4334 +▁Grand -4335 +▁handle -4336 +▁plenty -4337 +▁resident -4338 +outs -4339 +▁bird -4340 +illage -4341 +ka -4342 +▁tree -4343 +▁economy -4344 +▁Central -4345 +▁leaving -4346 +▁serving -4347 +▁Div -4348 +▁sem -4349 +▁Support -4350 +SP -4351 +word -4352 +▁Mex -4353 +iture -4354 +▁beach -4355 +▁famous -4356 +ini -4357 +inn -4358 +▁Mil -4359 +lastname -4360 +▁manufacturer -4361 +▁faith -4362 +▁rooms -4363 +▁shall -4364 +▁recipe -4365 +▁Congress -4366 +CH -4367 +▁station -4368 +UR -4369 +▁react -4370 +▁shape -4371 +pective -4372 +▁origin -4373 +night -4374 +▁Amazon -4375 +▁injury -4376 +▁missing -4377 +reek -4378 +semb -4379 +▁Sil -4380 +▁upgr -4381 +▁Social -4382 +do -4383 +▁Pub -4384 +isher -4385 +▁motor -4386 +▁claims -4387 +▁medium -4388 +▁Bill -4389 +▁Posted -4390 +▁orders -4391 +▁maintain -4392 +rd -4393 +▁Fun -4394 +asure -4395 +▁brain -4396 +▁notes -4397 +▁views -4398 +▁Download -4399 +▁appropriate -4400 +▁boo -4401 +ishes -4402 +point -4403 +▁Offic -4404 +▁meant -4405 +▁older -4406 +▁spons -4407 +▁window 
-4408 +▁sustain -4409 +atab -4410 +▁Jesus -4411 +▁signed -4412 +berg -4413 +▁remove -4414 +cks -4415 +▁ended -4416 +▁changing -4417 +▁strategy -4418 +fr -4419 +cles -4420 +look -4421 +▁map -4422 +▁Union -4423 +outhern -4424 +▁happens -4425 +▁efficient -4426 +▁uns -4427 +going -4428 +▁advance -4429 +▁journal -4430 +ervation -4431 +▁plastic -4432 +▁Fore -4433 +▁stores -4434 +▁independent -4435 +▁iPhone -4436 +iest -4437 +▁useful -4438 +top -4439 +▁CD -4440 +umber -4441 +▁Organ -4442 +▁forms -4443 +▁leaves -4444 +▁Jul -4445 +craft -4446 +▁Light -4447 +▁Academ -4448 +acks -4449 +▁Award -4450 +▁advent -4451 +no -4452 +▁sand -4453 +▁shut -4454 +rehens -4455 +▁agency -4456 +▁repair -4457 +▁evidence -4458 +▁spending -4459 +▁afternoon -4460 +▁tim -4461 +apers -4462 +odes -4463 +rooms -4464 +▁throw -4465 +▁AND -4466 +▁menu -4467 +essions -4468 +▁secret -4469 +▁whatever -4470 +▁Fil -4471 +▁fee -4472 +estic -4473 +iliar -4474 +▁core -4475 +▁pray -4476 +▁sport -4477 +▁operations -4478 +▁combination -4479 +allery -4480 +▁Chris -4481 +▁Before -4482 +▁helpful -4483 +▁reality -4484 +atively -4485 +▁Where -4486 +▁multi -4487 +▁district -4488 +▁prepared -4489 +men -4490 +oyal -4491 +eless -4492 +icted -4493 +▁Week -4494 +▁cris -4495 +▁cab -4496 +ption -4497 +▁adop -4498 +▁tend -4499 +▁Democr -4500 +▁Series -4501 +▁status -4502 +▁balance -4503 +▁Mad -4504 +▁YOU -4505 +▁scen -4506 +▁estim -4507 +alls -4508 +▁flu -4509 +▁Both -4510 +▁flat -4511 +▁Author -4512 +▁joined -4513 +▁designs -4514 +▁remains -4515 +▁ID -4516 +▁Los -4517 +▁ride -4518 +▁corner -4519 +▁rank -4520 +▁eating -4521 +▁memory -4522 +Cl -4523 +mp -4524 +itz -4525 +▁Bet -4526 +▁Mont -4527 +▁caused -4528 +▁operating -4529 +▁Ma -4530 +aser -4531 +▁mist -4532 +▁George -4533 +▁discount -4534 +▁slightly -4535 +▁teachers -4536 +eed -4537 +▁IP -4538 +▁Women -4539 +▁esc -4540 +▁perhaps -4541 +▁primary -4542 +▁numerous -4543 +hem -4544 +▁funds -4545 +▁worry -4546 +▁survey -4547 +▁winner -4548 +▁enjoyed -4549 +▁showing -4550 
+▁exercise -4551 +een -4552 +▁unc -4553 +▁Card -4554 +▁fourth -4555 +▁showed -4556 +▁spl -4557 +uries -4558 +▁anti -4559 +▁Francis -4560 +▁surgery -4561 +▁becoming -4562 +▁properties -4563 +pan -4564 +▁gain -4565 +▁recip -4566 +▁veget -4567 +▁Engine -4568 +▁markets -4569 +▁obvious -4570 +▁committed -4571 +▁suff -4572 +▁theme -4573 +▁focused -4574 +vere -4575 +▁plants -4576 +▁direction -4577 +ius -4578 +▁Tor -4579 +▁listen -4580 +▁managed -4581 +▁kick -4582 +iences -4583 +▁forum -4584 +▁chocolate -4585 +▁shel -4586 +▁limit -4587 +gers -4588 +lets -4589 +iency -4590 +▁legisl -4591 +aked -4592 +▁Its -4593 +▁Jun -4594 +▁busy -4595 +▁rain -4596 +issions -4597 +▁mechan -4598 +▁movement -4599 +▁encourage -4600 +▁rap -4601 +▁cloud -4602 +▁resist -4603 +▁putting -4604 +▁communication -4605 +OP -4606 +cher -4607 +▁bon -4608 +▁Their -4609 +▁raised -4610 +▁animals -4611 +▁assistance -4612 +?? -4613 +obe -4614 +oles -4615 +▁Bob -4616 +▁CEO -4617 +▁Full -4618 +▁Frank -4619 +▁lunch -4620 +▁defense -4621 +ita -4622 +▁analy -4623 +▁relig -4624 +life -4625 +rael -4626 +▁poll -4627 +▁corporate -4628 +▁practices -4629 +▁Technology -4630 +”, -4631 +itness -4632 +▁discover -4633 +▁Microsoft -4634 +", -4635 +gl -4636 +!!! 
-4637 +▁Mike -4638 +▁civil -4639 +▁reached -4640 +▁sources -4641 +bert -4642 +▁util -4643 +igation -4644 +vention -4645 +▁society -4646 +▁yesterday -4647 +orter -4648 +▁mill -4649 +▁chair -4650 +▁Wr -4651 +▁scr -4652 +▁youth -4653 +▁central -4654 +abilities -4655 +▁advanced -4656 +▁Ham -4657 +▁cart -4658 +▁architect -4659 +▁determine -4660 +REE -4661 +▁Fort -4662 +arrant -4663 +▁cleaning -4664 +▁vehicles -4665 +▁firstname -4666 +ena -4667 +ror -4668 +west -4669 +▁Tri -4670 +▁tea -4671 +▁dete -4672 +▁rare -4673 +▁AS -4674 +▁NOT -4675 +▁Mass -4676 +▁actual -4677 +yan -4678 +▁psych -4679 +▁Robert -4680 +▁tables -4681 +▁worksh -4682 +▁methods -4683 +▁leadership -4684 +▁Bur -4685 +▁ath -4686 +▁structure -4687 +kin -4688 +▁vs -4689 +▁pock -4690 +aturing -4691 +▁Commit -4692 +CC -4693 +MS -4694 +iled -4695 +▁Log -4696 +▁Set -4697 +▁fell -4698 +▁register -4699 +?” -4700 +▁repe -4701 +▁battle -4702 +▁format -4703 +▁becomes -4704 +▁willing -4705 +bre -4706 +ifts -4707 +▁colle -4708 +▁charges -4709 +▁funding -4710 +▁updates -4711 +▁thoughts -4712 +▁ju -4713 +▁Tre -4714 +ordin -4715 +▁toward -4716 +▁appears -4717 +▁visitors -4718 +▁fees -4719 +▁incor -4720 +▁sector -4721 +▁Copyright -4722 +▁absolutely -4723 +▁temperature -4724 +▁lose -4725 +▁locations -4726 +▁Keep -4727 +▁Next -4728 +▁colour -4729 +▁filled -4730 +▁songs -4731 +▁Network -4732 +▁Old -4733 +▁instru -4734 +levision -4735 +▁Wall -4736 +▁Trump -4737 +▁brown -4738 +▁Spring -4739 +▁century -4740 +▁extensive -4741 +▁Conference -4742 +kins -4743 +▁Land -4744 +▁Learn -4745 +▁Louis -4746 +▁asking -4747 +▁environmental -4748 +ola -4749 +ship -4750 +▁Way -4751 +▁topic -4752 +▁favour -4753 +▁transl -4754 +▁courses -4755 +▁profile -4756 +▁AL -4757 +▁Ol -4758 +while -4759 +▁Test -4760 +▁south -4761 +▁dur -4762 +▁Medic -4763 +▁Report -4764 +▁documents -4765 +▁previously -4766 +coh -4767 +▁Dou -4768 +▁Oper -4769 +▁adapt -4770 +▁north -4771 +ception -4772 +ipl -4773 +▁Plus -4774 +▁bowl -4775 +▁swim -4776 +ivered -4777 +▁guest 
-4778 +▁refer -4779 +▁visual -4780 +▁readers -4781 +▁anywhere -4782 +▁kid -4783 +▁registered -4784 +otton -4785 +▁Jeff -4786 +▁France -4787 +For -4788 +▁Cre -4789 +▁Lim -4790 +▁lux -4791 +▁sch -4792 +▁polic -4793 +▁charged -4794 +▁expertise -4795 +New -4796 +water -4797 +▁task -4798 +iration -4799 +▁upcoming -4800 +▁UN -4801 +▁wire -4802 +▁allowing -4803 +FL -4804 +▁Ok -4805 +▁selling -4806 +po -4807 +bour -4808 +▁bask -4809 +▁recommended -4810 +▁stre -4811 +▁Hotel -4812 +▁plays -4813 +▁Android -4814 +▁coverage -4815 +icip -4816 +▁Lat -4817 +▁fuel -4818 +▁neck -4819 +▁audio -4820 +▁sounds -4821 +▁Library -4822 +▁population -4823 +list -4824 +umin -4825 +▁Only -4826 +▁Conne -4827 +▁featured -4828 +▁Saf -4829 +▁pal -4830 +▁joint -4831 +▁Medical -4832 +▁princip -4833 +▁smaller -4834 +▁walking -4835 +▁ur -4836 +ulty -4837 +▁thr -4838 +▁Prov -4839 +▁seat -4840 +▁mental -4841 +▁establish -4842 +▁discussion -4843 +▁Jew -4844 +▁tun -4845 +▁apart -4846 +▁trial -4847 +▁parties -4848 +▁NE -4849 +istan -4850 +▁dance -4851 +ferences -4852 +IA -4853 +azz -4854 +ora -4855 +osis -4856 +▁Somet -4857 +▁Watch -4858 +igan -4859 +prise -4860 +▁Main -4861 +▁dogs -4862 +▁radio -4863 +▁despite -4864 +On -4865 +▁Lord -4866 +▁Walk -4867 +▁fold -4868 +▁truck -4869 +▁Africa -4870 +▁Virgin -4871 +▁scheduled -4872 +▁maintenance -4873 +▁Head -4874 +▁inspired -4875 +▁ON -4876 +▁diet -4877 +▁nine -4878 +▁restr -4879 +SA -4880 +▁writer -4881 +▁outdoor -4882 +▁Security -4883 +▁accommod -4884 +▁combined -4885 +▁van -4886 +ki -4887 +▁CA -4888 +▁har -4889 +▁citiz -4890 +▁scored -4891 +aks -4892 +alog -4893 +▁Western -4894 +rehensive -4895 +▁techniques -4896 +OO -4897 +▁Game -4898 +▁Admin -4899 +▁decide -4900 +▁seconds -4901 +▁Soft -4902 +▁Museum -4903 +▁values -4904 +▁removed -4905 +▁provider -4906 +▁sav -4907 +▁earth -4908 +▁raise -4909 +▁accompl -4910 +ownt -4911 +▁metal -4912 +▁stret -4913 +▁researc -4914 +eal -4915 +▁Place -4916 +▁spect -4917 +▁elements -4918 +▁purchased -4919 +▁joy -4920 +▁calc 
-4921 +▁purs -4922 +▁trees -4923 +▁launched -4924 +zen -4925 +▁Hy -4926 +▁Mer -4927 +▁sea -4928 +▁honest -4929 +▁movies -4930 +▁innovative -4931 +An -4932 +IF -4933 +▁panel -4934 +idering -4935 +▁counter -4936 +▁shooting -4937 +▁delicious -4938 +▁approximately -4939 +▁sitting -4940 +gment -4941 +▁killed -4942 +▁separate -4943 +▁edge -4944 +▁Video -4945 +▁Digital -4946 +▁teacher -4947 +▁relevant -4948 +ano -4949 +▁matt -4950 +▁approved -4951 +gage -4952 +▁lovely -4953 +▁parking -4954 +▁consumers -4955 +▁executive -4956 +My -4957 +nel -4958 +van -4959 +▁steel -4960 +▁Israel -4961 +▁Angeles -4962 +▁Manager -4963 +▁magazine -4964 +rs -4965 +ye -4966 +orry -4967 +▁hearing -4968 +▁concerns -4969 +bu -4970 +appy -4971 +igned -4972 +ushed -4973 +▁Charl -4974 +▁Person -4975 +pet -4976 +ellig -4977 +known -4978 +▁chat -4979 +▁conv -4980 +▁Georg -4981 +▁Peter -4982 +ensions -4983 +▁mostly -4984 +▁agreement -4985 +ears -4986 +▁eth -4987 +▁milk -4988 +▁rise -4989 +▁occasion -4990 +ups -4991 +▁Aud -4992 +▁tow -4993 +olars -4994 +▁Cook -4995 +▁Data -4996 +▁Join -4997 +isation -4998 +▁cheese -4999 +▁highlight -5000 +▁generation -5001 +VD -5002 +▁Ext -5003 +▁Ill -5004 +▁Penn -5005 +▁Word -5006 +▁Const -5007 +osit -5008 +▁mur -5009 +▁rid -5010 +▁Room -5011 +▁Thomas -5012 +▁identify -5013 +▁Gal -5014 +▁Pac -5015 +▁Centre -5016 +▁connected -5017 +▁intended -5018 +▁appearance -5019 +TV -5020 +fol -5021 +ring -5022 +orthern -5023 +▁controll -5024 +PA -5025 +ris -5026 +apes -5027 +▁sets -5028 +▁Prote -5029 +▁feels -5030 +▁waste -5031 +▁described -5032 +▁operation -5033 +▁commitment -5034 +▁Mo -5035 +▁Ver -5036 +irmed -5037 +▁truth -5038 +▁Master -5039 +▁academic -5040 +▁delivered -5041 +▁participate -5042 +cm -5043 +▁sympt -5044 +▁Through -5045 +ournament -5046 +!) 
-5047 +ENT -5048 +▁Men -5049 +oston -5050 +▁Lead -5051 +▁push -5052 +▁stars -5053 +▁Indust -5054 +▁Invest -5055 +▁server -5056 +▁Children -5057 +▁familiar -5058 +▁marriage -5059 +osen -5060 +▁Bas -5061 +▁nom -5062 +▁Arts -5063 +▁tough -5064 +▁enhance -5065 +▁capacity -5066 +▁relationships -5067 +UT -5068 +ycl -5069 +▁Upd -5070 +reens -5071 +▁cooking -5072 +▁promote -5073 +den -5074 +elines -5075 +▁landsc -5076 +ker -5077 +alend -5078 +nergy -5079 +▁cells -5080 +▁campus -5081 +▁editor -5082 +mond -5083 +▁mort -5084 +▁optim -5085 +▁cities -5086 +▁Journal -5087 +▁decisions -5088 +▁generally -5089 +▁Fair -5090 +▁signs -5091 +▁Access -5092 +▁wearing -5093 +▁therefore -5094 +▁introduced -5095 +arsh -5096 +berry -5097 +▁Vict -5098 +▁breast -5099 +▁accident -5100 +▁properly -5101 +▁processes -5102 +▁Er -5103 +prene -5104 +▁educational -5105 +▁Ul -5106 +▁Cam -5107 +cohol -5108 +eline -5109 +▁situ -5110 +▁majority -5111 +▁investigation -5112 +anda -5113 +inch -5114 +▁jew -5115 +▁minor -5116 +ya -5117 +burg -5118 +▁arm -5119 +ishing -5120 +▁opinion -5121 +▁detailed -5122 +▁Government -5123 +▁Dev -5124 +▁fly -5125 +▁Hand -5126 +▁Rest -5127 +reprene -5128 +▁technologies -5129 +▁teen -5130 +▁Chief -5131 +▁Earth -5132 +atabase -5133 +▁Global -5134 +▁minimum -5135 +▁category -5136 +▁presence -5137 +IR -5138 +▁Lab -5139 +▁ban -5140 +▁Live -5141 +▁label -5142 +▁calling -5143 +▁returned -5144 +▁emergency -5145 +▁expensive -5146 +▁mentioned -5147 +ef -5148 +▁Tur -5149 +▁feedback -5150 +fortunately -5151 +▁responsibility -5152 +▁Ari -5153 +▁Fund -5154 +▁Ohio -5155 +▁Wild -5156 +ression -5157 +▁Committee -5158 +▁installed -5159 +DF -5160 +▁Mur -5161 +▁ring -5162 +▁square -5163 +▁Johnson -5164 +▁foreign -5165 +▁bringing -5166 +▁hundreds -5167 +▁websites -5168 +▁Americans -5169 +▁installation -5170 +col -5171 +▁Que -5172 +▁plug -5173 +▁female -5174 +▁ourselves -5175 +rag -5176 +razy -5177 +▁Boston -5178 +▁entertainment -5179 +otten -5180 +ternal -5181 +▁invent -5182 +▁arrange -5183 
+▁behavior -5184 +▁exchange -5185 +▁performed -5186 +▁episode -5187 +▁factors -5188 +▁consumer -5189 +▁advertising -5190 +ien -5191 +▁Pack -5192 +▁sizes -5193 +▁begins -5194 +▁satisf -5195 +hab -5196 +text -5197 +▁appeared -5198 +▁Di -5199 +▁Kn -5200 +aded -5201 +▁brief -5202 +▁sides -5203 +▁veter -5204 +▁Squ -5205 +▁flo -5206 +▁teach -5207 +▁units -5208 +▁studio -5209 +uts -5210 +▁Den -5211 +▁coast -5212 +ictions -5213 +emporary -5214 +▁MP -5215 +rist -5216 +▁Adv -5217 +▁Sup -5218 +▁Human -5219 +▁Federal -5220 +AY -5221 +▁elig -5222 +▁icon -5223 +▁tight -5224 +▁caught -5225 +▁transform -5226 +▁confidence -5227 +icians -5228 +▁chief -5229 +▁sauce -5230 +▁thick -5231 +ae -5232 +When -5233 +iser -5234 +▁Tour -5235 +▁fruit -5236 +▁Colorado -5237 +▁honor -5238 +▁holding -5239 +▁reserved -5240 +lock -5241 +▁Wal -5242 +▁Those -5243 +▁adults -5244 +▁topics -5245 +▁policies -5246 +▁supporting -5247 +spe -5248 +uke -5249 +▁https -5250 +▁Contin -5251 +▁ven -5252 +OC -5253 +hew -5254 +cean -5255 +▁alle -5256 +▁meat -5257 +▁ment -5258 +▁achie -5259 +▁chicken -5260 +▁windows -5261 +▁confident -5262 +▁HD -5263 +acle -5264 +▁vary -5265 +▁Price -5266 +rastructure -5267 +▁administration -5268 +▁Pan -5269 +▁motiv -5270 +▁animal -5271 +ifications -5272 +▁supported -5273 +with -5274 +▁Jud -5275 +▁cro -5276 +▁fantastic -5277 +ushing -5278 +▁mouth -5279 +▁sexual -5280 +▁seeking -5281 +SS -5282 +▁meal -5283 +▁Creat -5284 +▁alternative -5285 +arp -5286 +iat -5287 +arks -5288 +oted -5289 +▁Maybe -5290 +▁victory -5291 +ait -5292 +how -5293 +▁Bi -5294 +▁Search -5295 +▁Carolina -5296 +▁Australian -5297 +kes -5298 +ancer -5299 +▁Germany -5300 +▁components -5301 +▁importance -5302 +▁competitive -5303 +vy -5304 +▁sy -5305 +▁Prem -5306 +▁quiet -5307 +▁basket -5308 +▁edition -5309 +paper -5310 +▁tele -5311 +▁sister -5312 +▁dollars -5313 +rier -5314 +▁cheap -5315 +▁leads -5316 +▁thread -5317 +▁apparent -5318 +ste -5319 +▁Jon -5320 +▁rom -5321 +▁rub -5322 +unting -5323 +▁Canad -5324 +▁Sports -5325 
+▁switch -5326 +▁guarantee -5327 +▁Academy -5328 +▁conduct -5329 +▁confirm -5330 +▁transact -5331 +▁conversation -5332 +inct -5333 +▁Lin -5334 +ighter -5335 +▁distance -5336 +▁Tit -5337 +▁Young -5338 +▁recru -5339 +▁centre -5340 +▁measures -5341 +▁worldwide -5342 +Com -5343 +▁Gar -5344 +▁Gen -5345 +▁info -5346 +▁Festival -5347 +▁Students -5348 +.| -5349 +etic -5350 +▁Bal -5351 +▁fif -5352 +▁picked -5353 +iability -5354 +▁remaining -5355 +▁photograph -5356 +weet -5357 +▁Jose -5358 +weight -5359 +▁bread -5360 +▁license -5361 +away -5362 +ucks -5363 +▁impl -5364 +▁flight -5365 +▁totally -5366 +▁Nor -5367 +▁rat -5368 +▁Meet -5369 +▁doubt -5370 +▁prison -5371 +▁unless -5372 +▁tack -5373 +▁Martin -5374 +inations -5375 +NA -5376 +atre -5377 +▁Sar -5378 +▁ang -5379 +▁vir -5380 +achel -5381 +uable -5382 +▁species -5383 +How -5384 +elly -5385 +ersey -5386 +▁restaurants -5387 +▁comprehensive -5388 +asks -5389 +▁seek -5390 +▁doors -5391 +▁contest -5392 +▁agencies -5393 +ailability -5394 +▁Champions -5395 +iano -5396 +verse -5397 +▁Quest -5398 +▁tests -5399 +▁faster -5400 +▁delight -5401 +▁maximum -5402 +▁celebrate -5403 +uzz -5404 +eries -5405 +▁league -5406 +▁clearly -5407 +▁musical -5408 +▁visiting -5409 +▁photograp -5410 +RC -5411 +TH -5412 +Our -5413 +▁Type -5414 +▁forg -5415 +itable -5416 +▁depart -5417 +▁painting -5418 +▁eventually -5419 +pass -5420 +▁Did -5421 +▁dyn -5422 +▁wel -5423 +estyle -5424 +▁noted -5425 +▁planned -5426 +▁election -5427 +▁revealed -5428 +▁considering -5429 +TC -5430 +otic -5431 +▁Inte -5432 +▁propos -5433 +▁prepare -5434 +▁depending -5435 +▁Cred -5436 +▁Using -5437 +▁Energy -5438 +▁arrived -5439 +▁housing -5440 +▁married -5441 +▁university -5442 +igr -5443 +▁Ro -5444 +usion -5445 +▁burn -5446 +▁lived -5447 +▁ticket -5448 +▁Hospital -5449 +▁bike -5450 +▁mine -5451 +▁Jackson -5452 +▁sessions -5453 +erg -5454 +▁Ce -5455 +▁inn -5456 +iminal -5457 +ixture -5458 +orough -5459 +▁scale -5460 +▁Assist -5461 +▁SP -5462 +wing -5463 +▁McC -5464 +▁ign -5465 
+▁ris -5466 +ulous -5467 +▁FREE -5468 +▁apps -5469 +▁otherwise -5470 +▁discovered -5471 +▁Mid -5472 +▁Cost -5473 +▁compar -5474 +▁gather -5475 +▁officer -5476 +mes -5477 +▁Secret -5478 +▁climate -5479 +▁monthly -5480 +▁Japanese -5481 +▁chemical -5482 +▁neighborhood -5483 +▁boys -5484 +▁ends -5485 +▁liqu -5486 +▁evalu -5487 +▁turns -5488 +▁inches -5489 +▁spokes -5490 +▁struct -5491 +▁commission -5492 +▁Kore -5493 +▁weap -5494 +▁symptoms -5495 +ht -5496 +▁Bul -5497 +▁Cat -5498 +agram -5499 +▁freed -5500 +▁missed -5501 +▁cutting -5502 +▁accounts -5503 +▁internal -5504 +▁reliable -5505 +ias -5506 +▁ran -5507 +tered -5508 +▁pump -5509 +▁surf -5510 +related -5511 +▁brands -5512 +▁lights -5513 +▁seemed -5514 +▁appreciate -5515 +▁participants -5516 +otes -5517 +alian -5518 +▁Know -5519 +▁battery -5520 +▁organic -5521 +▁affordable -5522 +edia -5523 +▁hyd -5524 +▁Cert -5525 +▁corn -5526 +▁twice -5527 +▁Applic -5528 +▁Columb -5529 +▁Georgia -5530 +▁cultural -5531 +▁resource -5532 +▁featuring -5533 +hi -5534 +▁Second -5535 +▁automatically -5536 +They -5537 +ician -5538 +▁valid -5539 +▁athlet -5540 +▁paying -5541 +▁submit -5542 +▁African -5543 +▁meetings -5544 +iors -5545 +▁Code -5546 +▁Jones -5547 +▁Andrew -5548 +EE -5549 +▁emp -5550 +▁Share -5551 +▁bigger -5552 +▁regularly -5553 +); -5554 +Ex -5555 +but -5556 +▁Hard -5557 +▁Qual -5558 +▁debt -5559 +▁Middle -5560 +▁failed -5561 +▁supposed -5562 +▁Ep -5563 +▁Help -5564 +▁Steve -5565 +▁storm -5566 +▁accurate -5567 +▁possibly -5568 +GB -5569 +ua -5570 +ban -5571 +▁mel -5572 +▁pod -5573 +▁boost -5574 +▁deals -5575 +▁labor -5576 +▁volume -5577 +▁television -5578 +▁presentation -5579 +cont -5580 +▁fro -5581 +▁draft -5582 +▁fellow -5583 +▁realize -5584 +▁manufacturing -5585 +Pro -5586 +▁Ut -5587 +▁fle -5588 +▁Daniel -5589 +▁concent -5590 +▁Virginia -5591 +▁messages -5592 +?" 
-5593 +▁SH -5594 +ennis -5595 +idden -5596 +pected -5597 +▁fields -5598 +▁revenue -5599 +▁affected -5600 +▁recovery -5601 +EST -5602 +rupt -5603 +▁Boy -5604 +▁Blog -5605 +▁German -5606 +▁covers -5607 +▁shares -5608 +▁proposed -5609 +▁researchers -5610 +No -5611 +roy -5612 +eper -5613 +mosp -5614 +▁die -5615 +rical -5616 +▁Page -5617 +iamond -5618 +alendar -5619 +oration -5620 +▁Rights -5621 +ployment -5622 +▁returns -5623 +▁engineering -5624 +▁Lee -5625 +▁Tem -5626 +▁Farm -5627 +▁Travel -5628 +▁birthday -5629 +▁AD -5630 +case -5631 +▁Rom -5632 +▁aid -5633 +▁ages -5634 +▁Little -5635 +▁confirmed -5636 +▁instructions -5637 +▁amb -5638 +cious -5639 +▁Cast -5640 +▁Trust -5641 +▁dates -5642 +▁tells -5643 +▁answers -5644 +▁creation -5645 +▁interior -5646 +▁protected -5647 +ca -5648 +ters -5649 +▁Tech -5650 +▁breakfast -5651 +▁sad -5652 +▁wal -5653 +▁dish -5654 +▁chart -5655 +▁warrant -5656 +▁industrial -5657 +▁infrastructure -5658 +iner -5659 +▁nor -5660 +which -5661 +▁Orig -5662 +▁Games -5663 +▁Visit -5664 +▁loves -5665 +▁Mexico -5666 +▁county -5667 +▁applied -5668 +▁browser -5669 +▁employee -5670 +ario -5671 +▁nurs -5672 +▁agent -5673 +▁pregn -5674 +▁specifically -5675 +▁Opt -5676 +▁mir -5677 +▁poly -5678 +▁route -5679 +▁desire -5680 +▁issued -5681 +▁choices -5682 +▁decades -5683 +▁drivers -5684 +▁NC -5685 +▁Hen -5686 +▁hook -5687 +▁rapid -5688 +▁furniture -5689 +▁chain -5690 +▁foods -5691 +fection -5692 +▁flowers -5693 +▁reference -5694 +▁twe -5695 +▁hero -5696 +▁jack -5697 +▁affili -5698 +▁element -5699 +▁perfectly -5700 +▁WH -5701 +gend -5702 +▁Joe -5703 +erves -5704 +▁thus -5705 +lights -5706 +▁attorney -5707 +▁standing -5708 +▁exclusive -5709 +ansas -5710 +▁tail -5711 +▁plate -5712 +▁chosen -5713 +▁earned -5714 +▁supports -5715 +upp -5716 +▁CH -5717 +▁anc -5718 +▁yes -5719 +anger -5720 +odies -5721 +▁Made -5722 +▁bond -5723 +▁Broad -5724 +▁talks -5725 +▁Control -5726 +▁Francisco -5727 +▁employment -5728 +hand -5729 +rick -5730 +▁Ken -5731 +hetic -5732 +oking -5733 
+▁mode -5734 +▁vent -5735 +▁Brand -5736 +▁remote -5737 +ibilities -5738 +▁Executive -5739 +anna -5740 +irms -5741 +▁Dom -5742 +▁End -5743 +ospit -5744 +▁Enjoy -5745 +▁agreed -5746 +▁purposes -5747 +▁apartment -5748 +▁incredible -5749 +Al -5750 +▁AT -5751 +▁Lo -5752 +lymp -5753 +▁Bon -5754 +▁wid -5755 +▁Expl -5756 +▁broken -5757 +▁improved -5758 +▁strategies -5759 +UN -5760 +can -5761 +▁DVD -5762 +▁nav -5763 +▁Does -5764 +▁logo -5765 +▁Store -5766 +▁Williams -5767 +▁processing -5768 +▁Hope -5769 +▁Pass -5770 +▁Sher -5771 +▁Current -5772 +▁illustr -5773 +▁hardware -5774 +▁surrounding -5775 +▁Sy -5776 +anges -5777 +▁cake -5778 +▁cute -5779 +▁whom -5780 +▁advis -5781 +▁Product -5782 +▁recorded -5783 +▁disappoint -5784 +BI -5785 +MA -5786 +▁Id -5787 +ench -5788 +hent -5789 +▁Equ -5790 +▁Haw -5791 +▁lit -5792 +▁Coast -5793 +▁quant -5794 +▁reput -5795 +▁rough -5796 +▁premium -5797 +aped -5798 +▁Mic -5799 +adium -5800 +▁golf -5801 +ampion -5802 +▁holds -5803 +▁judge -5804 +▁pleased -5805 +▁accepted -5806 +▁suitable -5807 +umes -5808 +idays -5809 +▁boat -5810 +▁Point -5811 +▁downt -5812 +▁losing -5813 +▁Instead -5814 +▁male -5815 +▁pure -5816 +▁grade -5817 +▁trouble -5818 +uous -5819 +▁rule -5820 +▁Three -5821 +▁wheel -5822 +▁administr -5823 +▁buildings -5824 +lyn -5825 +oga -5826 +uits -5827 +▁usual -5828 +▁History -5829 +▁explain -5830 +▁domestic -5831 +▁concerned -5832 +!” -5833 +xy -5834 +itage -5835 +▁telling -5836 +▁Minister -5837 +▁violence -5838 +▁candidates -5839 +gas -5840 +ums -5841 +▁moist -5842 +▁licens -5843 +▁aspects -5844 +▁Communic -5845 +▁injuries -5846 +▁favourite -5847 +tra -5848 +▁ok -5849 +what -5850 +▁Girl -5851 +person -5852 +▁moments -5853 +▁typically -5854 +otal -5855 +▁pun -5856 +▁tur -5857 +▁Party -5858 +▁error -5859 +▁causes -5860 +▁styles -5861 +▁Italian -5862 +▁awareness -5863 +▁registration -5864 +▁vit -5865 +▁arts -5866 +▁phil -5867 +▁Night -5868 +▁Print -5869 +▁Perform -5870 +rim -5871 +road -5872 +lines -5873 +▁oven -5874 +▁grown -5875 
+▁enable -5876 +▁island -5877 +▁greatest -5878 +vell -5879 +▁Harr -5880 +▁rand -5881 +orable -5882 +▁abuse -5883 +▁shoes -5884 +▁forces -5885 +▁stated -5886 +fficient -5887 +▁surprise -5888 +va -5889 +▁FOR -5890 +▁Key -5891 +▁tag -5892 +▁taxes -5893 +▁photography -5894 +ERS -5895 +hors -5896 +▁jun -5897 +anish -5898 +cluding -5899 +▁closer -5900 +▁citizens -5901 +▁negative -5902 +▁influence -5903 +CA -5904 +bur -5905 +writ -5906 +▁Four -5907 +▁circum -5908 +▁actions -5909 +ria -5910 +▁Def -5911 +▁Dog -5912 +tters -5913 +ulture -5914 +▁retire -5915 +▁script -5916 +▁stopped -5917 +▁stretch -5918 +▁broadcast -5919 +▁Wi -5920 +pond -5921 +▁Drive -5922 +▁Local -5923 +▁gradu -5924 +▁resol -5925 +▁Division -5926 +▁wet -5927 +▁crew -5928 +▁powder -5929 +▁database -5930 +▁tomorrow -5931 +▁sam -5932 +astern -5933 +▁Olymp -5934 +▁leather -5935 +▁practical -5936 +ribe -5937 +▁Bra -5938 +▁Ell -5939 +▁Max -5940 +▁adm -5941 +▁argu -5942 +Un -5943 +▁serves -5944 +▁weekly -5945 +▁alleged -5946 +iami -5947 +udden -5948 +▁shock -5949 +▁Pacific -5950 +▁payments -5951 +▁functions -5952 +▁inspiration -5953 +DS -5954 +▁Gra -5955 +stone -5956 +▁acid -5957 +▁bound -5958 +▁faculty -5959 +And -5960 +yers -5961 +▁tro -5962 +alled -5963 +▁mini -5964 +▁funny -5965 +▁Awards -5966 +▁speech -5967 +▁receiving -5968 +▁authorities -5969 +ava -5970 +hus -5971 +▁Mat -5972 +merce -5973 +▁Ryan -5974 +▁sequ -5975 +▁thin -5976 +lywood -5977 +▁column -5978 +▁designer -5979 +ucle -5980 +▁hits -5981 +▁cable -5982 +forcement -5983 +▁supplies -5984 +▁Available -5985 +▁electronic -5986 +TA -5987 +ERE -5988 +▁rot -5989 +atholic -5990 +▁config -5991 +▁pepper -5992 +▁village -5993 +▁identified -5994 +▁tut -5995 +▁gear -5996 +▁Cross -5997 +▁random -5998 +poration -5999 +▁everyday -6000 +▁committee -6001 +GE -6002 +bol -6003 +oup -6004 +irty -6005 +▁Hor -6006 +▁Oil -6007 +under -6008 +profit -6009 +▁Econom -6010 +▁perman -6011 +▁recognized -6012 +ache -6013 +▁Aff -6014 +itate -6015 +never -6016 +right -6017 +▁Coll 
-6018 +▁Need -6019 +▁grab -6020 +▁atmosp -6021 +▁degrees -6022 +▁printed -6023 +▁convenient -6024 +▁healthcare -6025 +▁impressive -6026 +PM -6027 +mar -6028 +inet -6029 +▁crime -6030 +▁keeps -6031 +▁lessons -6032 +▁Michigan -6033 +Pl -6034 +So -6035 +rip -6036 +▁tab -6037 +▁Bell -6038 +▁Cond -6039 +isters -6040 +▁essay -6041 +▁flour -6042 +▁crisis -6043 +▁height -6044 +▁emotional -6045 +▁determined -6046 +▁Cas -6047 +▁Ref -6048 +▁Tay -6049 +▁voc -6050 +atoes -6051 +etime -6052 +▁Ariz -6053 +▁films -6054 +▁imagine -6055 +▁treated -6056 +▁Sometimes -6057 +▁dangerous -6058 +▁happening -6059 +▁Lt -6060 +▁PS -6061 +aren -6062 +phas -6063 +▁Dun -6064 +▁Try -6065 +▁Small -6066 +▁crazy -6067 +▁Comple -6068 +▁ongoing -6069 +▁champions -6070 +▁explained -6071 +iate -6072 +hered -6073 +inter -6074 +▁Jenn -6075 +▁Mean -6076 +uction -6077 +▁Santa -6078 +▁fixed -6079 +▁sheet -6080 +▁entreprene -6081 +Ar -6082 +▁Run -6083 +▁Sus -6084 +urban -6085 +▁Safety -6086 +▁dropped -6087 +▁Marketing -6088 +cue -6089 +rum -6090 +▁Fed -6091 +▁patterns -6092 +▁resolution -6093 +▁du -6094 +pret -6095 +▁Mach -6096 +▁Canadian -6097 +▁investors -6098 +LS -6099 +All -6100 +aid -6101 +eler -6102 +made -6103 +▁row -6104 +▁worse -6105 +▁Victor -6106 +▁dining -6107 +iversary -6108 +▁subscrib -6109 +▁gro -6110 +anged -6111 +arian -6112 +▁Writ -6113 +▁rear -6114 +▁Guide -6115 +▁command -6116 +▁trading -6117 +▁conducted -6118 +▁tradition -6119 +LA -6120 +mary -6121 +anche -6122 +osoph -6123 +▁Rose -6124 +▁soul -6125 +▁taught -6126 +▁arrested -6127 +▁attended -6128 +▁officers -6129 +▁appointment -6130 +▁collaboration -6131 +Bl -6132 +Con -6133 +▁GM -6134 +▁Kh -6135 +enced -6136 +▁lift -6137 +▁simpl -6138 +▁extended -6139 +lete -6140 +▁der -6141 +▁Priv -6142 +▁cock -6143 +▁grad -6144 +▁roof -6145 +▁Chair -6146 +▁hoping -6147 +▁alcohol -6148 +▁positions -6149 +▁Environment -6150 +▁successfully -6151 +ppers -6152 +oosing -6153 +▁native -6154 +▁tournament -6155 +Don -6156 +inson -6157 +▁grew -6158 +▁wash -6159 
+▁depth -6160 +▁flood -6161 +▁Account -6162 +▁freedom -6163 +▁ordered -6164 +▁eligible -6165 +▁incident -6166 +▁sick -6167 +▁folks -6168 +▁Senate -6169 +▁versions -6170 +iana -6171 +▁Inf -6172 +▁kne -6173 +▁Mult -6174 +▁spin -6175 +▁Richard -6176 +ello -6177 +rate -6178 +▁obtain -6179 +▁severe -6180 +▁Sat -6181 +aints -6182 +▁Turn -6183 +▁Photo -6184 +▁cycle -6185 +▁guard -6186 +▁teeth -6187 +▁noticed -6188 +iki -6189 +▁bat -6190 +▁Area -6191 +▁Paris -6192 +▁advoc -6193 +▁belong -6194 +▁forced -6195 +▁massive -6196 +▁graduate -6197 +▁construct -6198 +Be -6199 +ala -6200 +cers -6201 +essed -6202 +racts -6203 +▁adds -6204 +▁dram -6205 +▁none -6206 +▁houses -6207 +▁improvement -6208 +hire -6209 +real -6210 +rics -6211 +▁Daily -6212 +▁trend -6213 +iveness -6214 +▁Summer -6215 +▁tested -6216 +▁failure -6217 +▁Building -6218 +▁valuable -6219 +▁innovation -6220 +tle -6221 +▁ol -6222 +▁Kent -6223 +▁Which -6224 +▁mixed -6225 +▁shots -6226 +▁yards -6227 +▁cotton -6228 +▁regional -6229 +ayer -6230 +utch -6231 +▁Ash -6232 +▁Die -6233 +rease -6234 +▁Carl -6235 +▁Clean -6236 +▁Right -6237 +▁council -6238 +Is -6239 +▁MS -6240 +▁Box -6241 +▁Rev -6242 +▁thorough -6243 +▁integrated -6244 +▁DC -6245 +▁syn -6246 +▁Size -6247 +▁tiny -6248 +hentic -6249 +▁output -6250 +za -6251 +▁ec -6252 +inem -6253 +▁tank -6254 +▁owned -6255 +▁concert -6256 +▁knowing -6257 +▁routine -6258 +▁turning -6259 +▁efficiency -6260 +erse -6261 +▁drugs -6262 +▁Avenue -6263 +▁facing -6264 +▁guitar -6265 +▁diverse -6266 +▁therapy -6267 +▁clothing -6268 +▁providers -6269 +▁MO -6270 +▁Sn -6271 +▁Ent -6272 +▁Tool -6273 +acking -6274 +▁Select -6275 +▁publish -6276 +▁reduced -6277 +▁interface -6278 +CE -6279 +▁fo -6280 +▁Hon -6281 +osite -6282 +secut -6283 +▁Asia -6284 +▁Though -6285 +▁yellow -6286 +▁follows -6287 +▁description -6288 +▁distribution -6289 +illy -6290 +▁LLC -6291 +▁ped -6292 +abled -6293 +ansion -6294 +▁Training -6295 +▁settings -6296 +▁surprised -6297 +▁effectively -6298 +▁EU -6299 +print -6300 +▁auto 
-6301 +▁dial -6302 +sembly -6303 +▁Miami -6304 +▁silver -6305 +▁mixture -6306 +▁contemporary -6307 +▁expectations -6308 +▁:) -6309 +abet -6310 +▁Ball -6311 +intage -6312 +▁baking -6313 +▁enthus -6314 +▁unable -6315 +▁carried -6316 +▁circumst -6317 +▁intellig -6318 +▁accessible -6319 +▁challenging -6320 +▁perspective -6321 +▁Ira -6322 +▁Low -6323 +▁Want -6324 +letter -6325 +▁bonus -6326 +▁risks -6327 +▁upper -6328 +quality -6329 +▁nearby -6330 +▁pulled -6331 +▁protein -6332 +▁stunning -6333 +▁candidate -6334 +CT -6335 +PR -6336 +▁af -6337 +iece -6338 +ATION -6339 +▁Phys -6340 +▁Italy -6341 +▁stands -6342 +ev -6343 +aze -6344 +claim -6345 +▁Lind -6346 +ington -6347 +▁Beaut -6348 +▁matters -6349 +▁tonight -6350 +▁significantly -6351 +rowse -6352 +▁Nick -6353 +▁laugh -6354 +▁Proper -6355 +▁excess -6356 +▁garlic -6357 +▁univers -6358 +▁witness -6359 +▁approval -6360 +▁medicine -6361 +▁carefully -6362 +sm -6363 +zy -6364 +▁hur -6365 +▁Shop -6366 +▁chapter -6367 +▁complic -6368 +▁joining -6369 +obs -6370 +flow -6371 +oral -6372 +▁Cir -6373 +oured -6374 +▁fulf -6375 +▁equal -6376 +▁kinds -6377 +▁awarded -6378 +▁bedroom -6379 +▁channel -6380 +▁hosting -6381 +▁guidance -6382 +▁vacation -6383 +▁adventure -6384 +▁increases -6385 +▁recording -6386 +▁availability -6387 +▁SU -6388 +▁Dub -6389 +▁Requ -6390 +▁sole -6391 +▁Never -6392 +▁Works -6393 +▁likes -6394 +▁emphas -6395 +▁festival -6396 +▁accessories -6397 +bal -6398 +zer -6399 +▁glad -6400 +▁iron -6401 +▁tall -6402 +▁Heart -6403 +▁loans -6404 +▁Spanish -6405 +UL -6406 +rete -6407 +▁ease -6408 +riends -6409 +▁filed -6410 +▁renew -6411 +clusion -6412 +▁cooper -6413 +▁Republican -6414 +▁exhibition -6415 +▁partnership -6416 +stal -6417 +▁hopes -6418 +▁Credit -6419 +▁Mobile -6420 +▁SE -6421 +▁Rub -6422 +acked -6423 +ether -6424 +folio -6425 +▁bags -6426 +nesota -6427 +orgeous -6428 +▁creates -6429 +▁speaking -6430 +▁lifestyle -6431 +HA -6432 +sen -6433 +you -6434 +▁diss -6435 +▁hang -6436 +▁vend -6437 +▁Connect -6438 +▁Student 
-6439 +To -6440 +▁) -6441 +▁AR -6442 +adow -6443 +▁unf -6444 +▁legs -6445 +▁occup -6446 +▁Disney -6447 +▁appeal -6448 +▁assets -6449 +▁motion -6450 +▁trends -6451 +▁clothes -6452 +▁context -6453 +▁reporting -6454 +▁replacement -6455 +FC -6456 +yth -6457 +onto -6458 +yard -6459 +agues -6460 +▁Email -6461 +▁spaces -6462 +▁entirely -6463 +▁scholars -6464 +▁constantly -6465 +!" -6466 +anny -6467 +ican -6468 +long -6469 +▁arms -6470 +orders -6471 +▁shift -6472 +▁stamp -6473 +▁forest -6474 +▁Members -6475 +▁certific -6476 +▁searching -6477 +▁sustainable -6478 +▁OS -6479 +irts -6480 +onym -6481 +rition -6482 +▁spark -6483 +▁Number -6484 +▁Taylor -6485 +▁engage -6486 +▁manner -6487 +▁conflic -6488 +▁believes -6489 +▁submitted -6490 +II -6491 +bi -6492 +▁LED -6493 +comes -6494 +eding -6495 +▁kill -6496 +▁luxury -6497 +▁Studies -6498 +▁streets -6499 +▁procedures -6500 +ml -6501 +▁pil -6502 +▁fort -6503 +▁Still -6504 +▁sudden -6505 +▁outstanding -6506 +rid -6507 +▁Rh -6508 +foot -6509 +▁odd -6510 +▁cuts -6511 +▁Field -6512 +▁goods -6513 +▁negot -6514 +▁awards -6515 +▁criminal -6516 +▁monitoring -6517 +▁originally -6518 +▁SC -6519 +▁Kim -6520 +ially -6521 +▁Russian -6522 +▁invited -6523 +▁trained -6524 +▁Southern -6525 +▁millions -6526 +▁seriously -6527 +▁performing -6528 +▁transition -6529 +erts -6530 +ikes -6531 +▁Pot -6532 +▁eleg -6533 +▁weak -6534 +▁walls -6535 +▁recycl -6536 +▁refund -6537 +▁unlike -6538 +▁Arizona -6539 +▁capture -6540 +osc -6541 +asts -6542 +emic -6543 +izer -6544 +▁Pop -6545 +▁dim -6546 +▁rac -6547 +athan -6548 +ented -6549 +▁ille -6550 +▁zone -6551 +▁factor -6552 +▁prompt -6553 +▁reward -6554 +friendly -6555 +PC -6556 +ih -6557 +pat -6558 +bing -6559 +▁mal -6560 +▁Very -6561 +▁entr -6562 +▁horse -6563 +▁quote -6564 +▁museum -6565 +▁Mountain -6566 +Le -6567 +Ph -6568 +ba -6569 +▁Ra -6570 +▁Far -6571 +▁anx -6572 +▁vul -6573 +▁Jersey -6574 +▁conver -6575 +▁relief -6576 +▁illness -6577 +▁fighting -6578 +ATE -6579 +icket -6580 +▁blow -6581 +▁remov -6582 
+▁Despite -6583 +▁Seattle -6584 +▁Standard -6585 +▁interests -6586 +▁foundation -6587 +▁cm -6588 +izza -6589 +front -6590 +▁Braz -6591 +▁Kenn -6592 +▁Pract -6593 +▁Should -6594 +▁herself -6595 +▁virtual -6596 +▁younger -6597 +HS -6598 +born -6599 +elry -6600 +▁tip -6601 +▁Easy -6602 +▁Ford -6603 +▁Iraq -6604 +▁moves -6605 +▁pocket -6606 +▁involve -6607 +▁examples -6608 +ani -6609 +rell -6610 +▁rose -6611 +▁smile -6612 +▁pounds -6613 +▁wealth -6614 +▁offices -6615 +▁flexible -6616 +▁Minnesota -6617 +▁transportation -6618 +▁Fre -6619 +▁Ire -6620 +▁Fall -6621 +▁gifts -6622 +▁input -6623 +▁Senior -6624 +▁upload -6625 +▁bathroom -6626 +▁assessment -6627 +▁capabilities -6628 +▁Jr -6629 +▁Ray -6630 +▁Rod -6631 +▁Stat -6632 +▁eggs -6633 +▁hole -6634 +▁pink -6635 +▁directed -6636 +▁identity -6637 +anes -6638 +ifer -6639 +iler -6640 +uter -6641 +▁Luc -6642 +▁Sav -6643 +▁beer -6644 +▁rein -6645 +▁bottle -6646 +▁Finally -6647 +▁airport -6648 +▁founded -6649 +▁clinical -6650 +▁ultimate -6651 +RS -6652 +sey -6653 +▁Army -6654 +▁debut -6655 +aturally -6656 +▁scientific -6657 +At -6658 +▁Ha -6659 +aron -6660 +▁Ask -6661 +▁Jac -6662 +▁sac -6663 +▁Bible -6664 +▁Royal -6665 +▁worst -6666 +illiant -6667 +▁distinct -6668 +▁improving -6669 +car -6670 +ilst -6671 +quir -6672 +▁Est -6673 +▁Kat -6674 +▁Vers -6675 +▁Event -6676 +▁elimin -6677 +▁figures -6678 +▁fishing -6679 +▁forever -6680 +▁copyright -6681 +da -6682 +▁Put -6683 +▁bab -6684 +ashed -6685 +▁Supp -6686 +▁faces -6687 +▁hospit -6688 +▁Country -6689 +▁Software -6690 +▁? 
-6691 +▁Non -6692 +ingly -6693 +▁garage -6694 +▁Instagram -6695 +▁tie -6696 +arrow -6697 +icate -6698 +▁Come -6699 +▁Site -6700 +▁Again -6701 +▁spoke -6702 +▁rating -6703 +▁Charles -6704 +▁visited -6705 +▁residential -6706 +▁Cab -6707 +ylvan -6708 +▁Arab -6709 +▁Fact -6710 +▁hasn -6711 +▁blank -6712 +▁stone -6713 +aration -6714 +▁entered -6715 +▁objects -6716 +▁rig -6717 +▁split -6718 +▁contribute -6719 +▁Unfortunately -6720 +RI -6721 +awn -6722 +uine -6723 +▁Bed -6724 +▁Dist -6725 +season -6726 +▁liked -6727 +▁spots -6728 +▁murder -6729 +▁Atlanta -6730 +▁developers -6731 +▁implementation -6732 +eah -6733 +With -6734 +▁coc -6735 +▁san -6736 +▁sky -6737 +▁Term -6738 +▁pitc -6739 +cluded -6740 +▁Radio -6741 +▁shower -6742 +▁Looking -6743 +▁Systems -6744 +▁baseball -6745 +▁calendar -6746 +▁Professor -6747 +▁procedure -6748 +oes -6749 +▁Ms -6750 +That -6751 +▁Save -6752 +▁cups -6753 +▁vital -6754 +resents -6755 +▁Member -6756 +▁linked -6757 +▁historical -6758 +▁possibility -6759 +Se -6760 +omy -6761 +umps -6762 +▁Mom -6763 +▁Foot -6764 +▁vibr -6765 +▁pitch -6766 +▁flavor -6767 +▁liquid -6768 +▁drawing -6769 +▁fitness -6770 +▁password -6771 +▁household -6772 +▁programme -6773 +▁atmosphere -6774 +▁reputation -6775 +andy -6776 +hell -6777 +ossible -6778 +▁enroll -6779 +▁papers -6780 +▁recipes -6781 +▁attached -6782 +▁mountain -6783 +▁organized -6784 +▁LA -6785 +▁Pow -6786 +▁hall -6787 +▁soph -6788 +▁tiss -6789 +asters -6790 +▁liber -6791 +▁Having -6792 +▁critic -6793 +▁muscle -6794 +▁talked -6795 +▁Administration -6796 +LY -6797 +One -6798 +host -6799 +▁Sem -6800 +▁Van -6801 +▁empt -6802 +▁seed -6803 +Americ -6804 +▁Brazil -6805 +▁Russia -6806 +▁carbon -6807 +▁passing -6808 +▁privacy -6809 +▁seasons -6810 +▁victims -6811 +▁frequently -6812 +▁institutions -6813 +.' 
-6814 +MP -6815 +But -6816 +rad -6817 +▁CO -6818 +▁PA -6819 +▁Space -6820 +▁chose -6821 +▁Living -6822 +▁theory -6823 +▁Shipping -6824 +▁MA -6825 +Read -6826 +▁ads -6827 +enger -6828 +ordan -6829 +▁rail -6830 +▁tech -6831 +▁regul -6832 +▁profit -6833 +▁managing -6834 +▁circumstances -6835 +ras -6836 +adel -6837 +tain -6838 +▁Son -6839 +▁Barb -6840 +▁hurt -6841 +▁proven -6842 +▁Justice -6843 +▁historic -6844 +▁networks -6845 +▁permission -6846 +▁legislation -6847 +▁publication -6848 +phy -6849 +▁Ba -6850 +bury -6851 +▁Cru -6852 +▁Cut -6853 +rible -6854 +▁butt -6855 +▁inch -6856 +▁Image -6857 +▁Express -6858 +▁regulations -6859 +dy -6860 +neys -6861 +ucky -6862 +▁err -6863 +uling -6864 +▁counsel -6865 +ta -6866 +ura -6867 +▁BE -6868 +▁Ur -6869 +olis -6870 +▁Fac -6871 +worth -6872 +▁Prom -6873 +▁skill -6874 +unction -6875 +▁Source -6876 +▁debate -6877 +▁Further -6878 +▁exposure -6879 +ubs -6880 +▁($ -6881 +▁Mir -6882 +▁Nic -6883 +▁Tax -6884 +▁cos -6885 +▁west -6886 +▁Garden -6887 +▁tracks -6888 +▁operate -6889 +RL -6890 +nders -6891 +▁Link -6892 +▁Name -6893 +▁lets -6894 +ffered -6895 +▁breath -6896 +▁qualified -6897 +▁represents -6898 +▁Leg -6899 +▁Oak -6900 +▁Brad -6901 +▁delay -6902 +▁finds -6903 +▁Season -6904 +▁walked -6905 +▁technique -6906 +▁NAS -6907 +▁bow -6908 +▁obl -6909 +▁tou -6910 +▁Anth -6911 +uclear -6912 +▁Choose -6913 +▁saving -6914 +▁authors -6915 +▁Learning -6916 +▁contrast -6917 +ella -6918 +ione -6919 +pons -6920 +▁Ltd -6921 +▁lad -6922 +icial -6923 +▁Scot -6924 +▁Brian -6925 +▁normally -6926 +▁realized -6927 +▁authentic -6928 +zes -6929 +urse -6930 +▁Rog -6931 +eller -6932 +▁fifth -6933 +▁merch -6934 +▁sight -6935 +▁tasks -6936 +▁hosted -6937 +▁reader -6938 +▁causing -6939 +▁savings -6940 +▁downtown -6941 +▁instance -6942 +By -6943 +odd -6944 +▁OR -6945 +▁Tony -6946 +▁mold -6947 +▁casual -6948 +▁execut -6949 +igration -6950 +ographic -6951 +▁anticip -6952 +▁justice -6953 +▁promise -6954 +▁somewhere -6955 +▁Professional -6956 +▁architecture -6957 
+ingu -6958 +stra -6959 +entle -6960 +▁coat -6961 +▁smell -6962 +▁templ -6963 +ultural -6964 +▁sample -6965 +▁consequ -6966 +▁portion -6967 +▁estimated -6968 +Sc -6969 +idi -6970 +▁Pict -6971 +▁trib -6972 +remony -6973 +▁Labor -6974 +▁agric -6975 +▁trick -6976 +▁coordin -6977 +▁default -6978 +▁sending -6979 +▁upgrade -6980 +▁priority -6981 +▁interpret -6982 +▁surprising -6983 +▁volunteers -6984 +ults -6985 +cknow -6986 +▁batt -6987 +▁soil -6988 +▁mainly -6989 +▁manual -6990 +▁matches -6991 +▁gorgeous -6992 +▁shoulder -6993 +▁certified -6994 +▁apparently -6995 +▁continuing -6996 +▁situations -6997 +law -6998 +▁Es -6999 +▁exec -7000 +▁warn -7001 +arters -7002 +▁Stock -7003 +▁banks -7004 +▁bench -7005 +▁facil -7006 +▁lucky -7007 +ylvania -7008 +▁Golden -7009 +▁planet -7010 +▁posting -7011 +▁immediate -7012 +▁guidelines -7013 +bel -7014 +▁PH -7015 +star -7016 +▁Buy -7017 +▁Hou -7018 +words -7019 +▁Wilson -7020 +▁blocks -7021 +▁Financial -7022 +▁discussed -7023 +owa -7024 +ulf -7025 +ulpt -7026 +▁Mix -7027 +▁Mrs -7028 +▁USB -7029 +class -7030 +▁bear -7031 +▁hate -7032 +earing -7033 +▁firms -7034 +▁shops -7035 +▁Policy -7036 +▁Spirit -7037 +▁drinks -7038 +▁scheme -7039 +▁Customer -7040 +▁Medicine -7041 +▁Lar -7042 +anned -7043 +▁fasc -7044 +ealand -7045 +▁charm -7046 +ogether -7047 +respond -7048 +▁ending -7049 +▁terror -7050 +▁attacks -7051 +▁singles -7052 +▁workshop -7053 +▁Engineering -7054 +▁FA -7055 +iger -7056 +▁Ron -7057 +uster -7058 +▁Stay -7059 +▁magn -7060 +▁Sales -7061 +▁layer -7062 +▁prove -7063 +▁teasp -7064 +▁fairly -7065 +▁vulner -7066 +▁Ireland -7067 +▁external -7068 +nam -7069 +▁Yet -7070 +▁hat -7071 +▁vice -7072 +ingers -7073 +▁aspect -7074 +▁capable -7075 +▁Catholic -7076 +▁retirement -7077 +from -7078 +icit -7079 +unes -7080 +▁Cro -7081 +inder -7082 +▁scan -7083 +bridge -7084 +▁Motor -7085 +▁Order -7086 +▁Phone -7087 +▁stuck -7088 +eration -7089 +▁loving -7090 +▁Toronto -7091 +▁closely -7092 +▁injured -7093 +▁listing -7094 +▁Memorial -7095 +▁clicking 
-7096 +▁programming -7097 +aping -7098 +▁bare -7099 +▁Linux -7100 +▁climb -7101 +▁saved -7102 +▁orange -7103 +▁Zealand -7104 +▁proceed -7105 +▁believed -7106 +▁listening -7107 +▁industries -7108 +▁destination -7109 +▁Cy -7110 +▁EV -7111 +rich -7112 +▁Exp -7113 +▁wra -7114 +uting -7115 +▁Conf -7116 +▁Eric -7117 +▁juice -7118 +▁casino -7119 +▁breaking -7120 +▁memories -7121 +▁collected -7122 +▁landscape -7123 +SE -7124 +lo -7125 +▁Ca -7126 +▁FL -7127 +alle -7128 +aska -7129 +▁Ram -7130 +otted -7131 +▁Band -7132 +▁Tenn -7133 +▁terr -7134 +angers -7135 +▁reform -7136 +▁strike -7137 +▁Welcome -7138 +▁doctors -7139 +▁Material -7140 +▁enjoying -7141 +▁religious -7142 +▁spiritual -7143 +▁suggested -7144 +ati -7145 +▁MD -7146 +▁OK -7147 +Tube -7148 +aste -7149 +odge -7150 +▁hell -7151 +▁Roman -7152 +▁blend -7153 +▁forth -7154 +▁meets -7155 +▁assign -7156 +▁winners -7157 +▁machines -7158 +▁alongside -7159 +▁relatively -7160 +equ -7161 +ghan -7162 +▁Fox -7163 +▁Ide -7164 +oster -7165 +cludes -7166 +▁index -7167 +faction -7168 +▁riding -7169 +▁choosing -7170 +▁pleasure -7171 +▁strategic -7172 +▁anniversary -7173 +Ad -7174 +gypt -7175 +▁Dur -7176 +▁gym -7177 +child -7178 +imize -7179 +▁Line -7180 +▁yard -7181 +▁Smart -7182 +▁Think -7183 +▁aside -7184 +▁boxes -7185 +▁newly -7186 +▁prize -7187 +▁treatments -7188 +▁celebration -7189 +▁Subsc -7190 +▁bodies -7191 +▁writers -7192 +▁requests -7193 +▁designers -7194 +▁engagement -7195 +bro -7196 +inte -7197 +amber -7198 +▁Dave -7199 +▁east -7200 +▁Davis -7201 +▁Happy -7202 +▁bunch -7203 +▁pharm -7204 +▁belief -7205 +▁covering -7206 +▁extension -7207 +▁performances -7208 +▁WW -7209 +days -7210 +▁Sky -7211 +▁arg -7212 +▁Bang -7213 +▁elev -7214 +▁Camer -7215 +▁buyers -7216 +▁Meanwhile -7217 +▁brilliant -7218 +De -7219 +ls -7220 +agon -7221 +obby -7222 +▁Dar -7223 +▁NFL -7224 +▁Sep -7225 +ormal -7226 +▁enem -7227 +ensity -7228 +giving -7229 +▁birds -7230 +▁broke -7231 +▁giant -7232 +▁proof -7233 +▁franch -7234 +▁division -7235 +nic -7236 
+inos -7237 +▁Pak -7238 +ashes -7239 +osophy -7240 +▁Asian -7241 +▁Kevin -7242 +lements -7243 +▁acknow -7244 +▁symbol -7245 +▁titles -7246 +sylvania -7247 +▁packaging -7248 +▁platforms -7249 +▁instrument -7250 +▁differences -7251 +oty -7252 +▁raw -7253 +▁unw -7254 +iders -7255 +ureau -7256 +▁Adam -7257 +▁iPad -7258 +esides -7259 +▁meals -7260 +▁river -7261 +▁compat -7262 +▁enables -7263 +▁drinking -7264 +▁volunteer -7265 +’. -7266 +▁PDF -7267 +inton -7268 +▁mile -7269 +▁slic -7270 +▁solo -7271 +▁superv -7272 +▁letters -7273 +▁authority -7274 +.’ -7275 +wan -7276 +▁PL -7277 +alse -7278 +rage -7279 +wart -7280 +▁pip -7281 +▁Bush -7282 +▁Iran -7283 +lisher -7284 +parent -7285 +▁Story -7286 +▁urban -7287 +ainless -7288 +▁consistent -7289 +pes -7290 +▁Uk -7291 +▁|| -7292 +bles -7293 +wich -7294 +▁kit -7295 +ronics -7296 +▁Chall -7297 +▁Model -7298 +▁centers -7299 +▁charity -7300 +▁typical -7301 +▁explains -7302 +▁replaced -7303 +▁newspaper -7304 +▁communications -7305 +GA -7306 +OVID -7307 +▁rug -7308 +▁acts -7309 +▁lapt -7310 +▁vacc -7311 +▁vast -7312 +ateful -7313 +jection -7314 +▁infect -7315 +▁YouTube -7316 +▁mortgage -7317 +▁CN -7318 +leep -7319 +oker -7320 +▁Jay -7321 +▁stim -7322 +▁tape -7323 +▁trim -7324 +▁tooth -7325 +▁dreams -7326 +▁falling -7327 +▁handling -7328 +▁holidays -7329 +▁swimming -7330 +cons -7331 +iley -7332 +page -7333 +▁stir -7334 +▁Return -7335 +▁decade -7336 +▁domain -7337 +▁singer -7338 +▁Perhaps -7339 +▁destroy -7340 +▁dynamic -7341 +▁lighting -7342 +▁proposal -7343 +▁categories -7344 +▁encouraged -7345 +▁membership -7346 +▁personally -7347 +Fi -7348 +acious -7349 +▁Jason -7350 +▁Jordan -7351 +▁Columbia -7352 +▁forecast -7353 +▁informed -7354 +▁wireless -7355 +▁classroom -7356 +▁accomplish -7357 +▁initiative -7358 +▁suggestions -7359 +▁Po -7360 +▁mut -7361 +erman -7362 +▁Bird -7363 +▁Mill -7364 +▁Swed -7365 +▁slee -7366 +▁susp -7367 +▁Egypt -7368 +▁Staff -7369 +▁Treat -7370 +▁recre -7371 +▁solve -7372 +▁agents -7373 +▁combine -7374 +▁founder 
-7375 +▁percentage -7376 +▁Advis -7377 +▁Cancer -7378 +▁arrive -7379 +▁headed -7380 +▁expansion -7381 +▁sensitive -7382 +▁manufacturers -7383 +TER -7384 +uis -7385 +athy -7386 +▁Bad -7387 +▁Ess -7388 +▁magic -7389 +▁penal -7390 +▁Agency -7391 +▁Miller -7392 +▁Gallery -7393 +ounce -7394 +▁bars -7395 +▁embr -7396 +▁tied -7397 +▁Being -7398 +▁crash -7399 +▁flash -7400 +▁filter -7401 +▁Classic -7402 +▁Houston -7403 +▁shouldn -7404 +▁Remember -7405 +▁Transport -7406 +▁participating -7407 +▁ast -7408 +▁Talk -7409 +▁dust -7410 +▁Annual -7411 +▁Recent -7412 +▁slowly -7413 +▁Airport -7414 +▁Kingdom -7415 +▁pricing -7416 +▁travell -7417 +▁Northern -7418 +▁enterprise -7419 +ko -7420 +▁Josh -7421 +▁evol -7422 +▁mood -7423 +▁unus -7424 +▁facts -7425 +▁phones -7426 +▁Consult -7427 +▁ancient -7428 +▁presents -7429 +▁printing -7430 +▁Secretary -7431 +▁permanent -7432 +wis -7433 +onna -7434 +level -7435 +▁hire -7436 +amsung -7437 +rovers -7438 +▁Brook -7439 +▁venue -7440 +▁Joseph -7441 +▁gender -7442 +▁extract -7443 +▁intense -7444 +ervations -7445 +▁Pennsylvania -7446 +▁DI -7447 +..... 
-7448 +abeth -7449 +▁Base -7450 +▁assum -7451 +▁dealing -7452 +▁gallery -7453 +▁genuine -7454 +▁portfolio -7455 +▁enforcement -7456 +FA -7457 +esy -7458 +site -7459 +▁suc -7460 +igate -7461 +uties -7462 +▁Film -7463 +▁gall -7464 +ership -7465 +▁Level -7466 +▁roles -7467 +ologist -7468 +▁Create -7469 +▁watched -7470 +▁producing -7471 +▁IC -7472 +lers -7473 +wear -7474 +▁Dam -7475 +asted -7476 +mates -7477 +▁fest -7478 +making -7479 +▁scenes -7480 +▁constit -7481 +▁carrying -7482 +▁suffered -7483 +▁traveling -7484 +▁attractive -7485 +OD -7486 +Tr -7487 +▁Own -7488 +▁Sea -7489 +iking -7490 +oices -7491 +▁Webs -7492 +▁vari -7493 +ardens -7494 +▁Grant -7495 +ulating -7496 +▁Silver -7497 +▁border -7498 +▁assault -7499 +▁Continue -7500 +▁generate -7501 +▁assistant -7502 +▁Collection -7503 +▁guaranteed -7504 +▁recommendations -7505 +Do -7506 +axy -7507 +bar -7508 +pir -7509 +Book -7510 +▁Sym -7511 +▁Stan -7512 +▁trig -7513 +▁wins -7514 +▁Books -7515 +▁absor -7516 +▁stake -7517 +▁Studio -7518 +▁Quality -7519 +▁chances -7520 +▁Personal -7521 +▁equipped -7522 +▁Ter -7523 +Press -7524 +books -7525 +active -7526 +▁grass -7527 +▁opens -7528 +▁solar -7529 +inating -7530 +▁compens -7531 +▁heading -7532 +▁Everyone -7533 +▁diseases -7534 +▁reducing -7535 +▁Hollywood -7536 +▁languages -7537 +▁professor -7538 +▁incredibly -7539 +boy -7540 +▁rh -7541 +aine -7542 +ilty -7543 +raid -7544 +burgh -7545 +▁Fred -7546 +▁actor -7547 +▁formed -7548 +▁Eastern -7549 +▁booking -7550 +▁podcast -7551 +▁speaker -7552 +▁Experience -7553 +▁interactive -7554 +SC -7555 +Te -7556 +rm -7557 +amel -7558 +▁hel -7559 +▁anyway -7560 +▁lawyer -7561 +▁neighb -7562 +▁cookies -7563 +▁Magazine -7564 +▁Therefore -7565 +acc -7566 +ila -7567 +▁CL -7568 +▁Deb -7569 +asant -7570 +ctive -7571 +▁Bern -7572 +▁lect -7573 +▁Force -7574 +▁Henry -7575 +▁Would -7576 +▁formal -7577 +▁string -7578 +▁filling -7579 +▁Products -7580 +▁purchasing -7581 +▁connections -7582 +alo -7583 +run -7584 +▁Gi -7585 +etch -7586 +game -7587 +phia 
-7588 +shire -7589 +▁narr -7590 +▁alive -7591 +▁pride -7592 +graduate -7593 +▁preferred -7594 +▁Hi -7595 +ials -7596 +▁Ath -7597 +▁Hun -7598 +▁Mov -7599 +stein -7600 +▁Clin -7601 +▁Emer -7602 +▁Guard -7603 +▁Major -7604 +▁phase -7605 +▁limits -7606 +▁marked -7607 +▁writes -7608 +▁defined -7609 +▁deposit -7610 +▁visible -7611 +▁suggests -7612 +oto -7613 +swe -7614 +roke -7615 +▁Tel -7616 +▁Kids -7617 +▁seats -7618 +▁shell -7619 +▁accused -7620 +▁aggress -7621 +▁expressed -7622 +▁basketball -7623 +Fr -7624 +▁EN -7625 +onic -7626 +allas -7627 +▁bact -7628 +lessly -7629 +▁empty -7630 +▁Estate -7631 +▁hotels -7632 +▁nights -7633 +▁racing -7634 +▁Comment -7635 +▁jewelry -7636 +▁substant -7637 +▁primarily -7638 +esh -7639 +imp -7640 +▁CP -7641 +bell -7642 +▁bid -7643 +▁gay -7644 +utter -7645 +▁Past -7646 +▁aims -7647 +▁lady -7648 +▁habit -7649 +▁Father -7650 +▁Histor -7651 +▁Mother -7652 +▁Things -7653 +▁rental -7654 +▁shapes -7655 +▁weapons -7656 +itionally -7657 +▁accuracy -7658 +▁resulting -7659 +▁creativity -7660 +▁specialist -7661 +▁vegetables -7662 +AV -7663 +▁oz -7664 +ogue -7665 +▁Has -7666 +▁lie -7667 +ifies -7668 +inity -7669 +▁cycl -7670 +intend -7671 +▁Based -7672 +▁bills -7673 +limited -7674 +▁remark -7675 +▁rising -7676 +▁engaged -7677 +▁instant -7678 +▁organis -7679 +▁politics -7680 +▁Published -7681 +▁recognition -7682 +ns -7683 +hour -7684 +▁Las -7685 +inois -7686 +uters -7687 +▁Give -7688 +▁Iowa -7689 +▁Marc -7690 +▁Tele -7691 +abetes -7692 +▁Vegas -7693 +▁criteria -7694 +▁suffering -7695 +▁compliance -7696 +essee -7697 +▁rice -7698 +▁marks -7699 +adelphia -7700 +▁Officer -7701 +▁compare -7702 +▁desired -7703 +▁component -7704 +▁highlights -7705 +▁TR -7706 +uana -7707 +▁tub -7708 +oween -7709 +▁dism -7710 +▁Prime -7711 +▁brush -7712 +▁Kansas -7713 +▁dollar -7714 +▁Britain -7715 +▁crucial -7716 +▁graphic -7717 +▁recover -7718 +▁achieved -7719 +▁literally -7720 +▁interviews -7721 +jo -7722 +igs -7723 +lee -7724 +▁Ap -7725 +greg -7726 +▁Map -7727 +▁tap 
-7728 +▁Fast -7729 +▁HERE -7730 +▁duty -7731 +makers -7732 +▁Among -7733 +▁Steel -7734 +▁knock -7735 +▁healing -7736 +▁illegal -7737 +▁admitted -7738 +▁describe -7739 +▁entering -7740 +▁releases -7741 +▁speakers -7742 +▁Solutions -7743 +▁functional -7744 +des -7745 +▁pra -7746 +▁Roll -7747 +▁Cover -7748 +▁Kelly -7749 +athered -7750 +▁intent -7751 +▁Edition -7752 +▁massage -7753 +▁packages -7754 +▁Following -7755 +▁attending -7756 +▁obviously -7757 +li -7758 +uan -7759 +▁EX -7760 +mers -7761 +▁Meth -7762 +▁keys -7763 +▁heads -7764 +holders -7765 +▁Change -7766 +▁Orange -7767 +▁matching -7768 +▁displayed -7769 +▁recognize -7770 +▁wondering -7771 +▁correspond -7772 +isa -7773 +▁CC -7774 +▁IM -7775 +Cont -7776 +orous -7777 +▁Diego -7778 +▁dough -7779 +▁trips -7780 +▁signal -7781 +▁developer -7782 +▁exceptional -7783 +▁increasingly -7784 +%. -7785 +ja -7786 +htt -7787 +▁Ros -7788 +athon -7789 +heast -7790 +▁Dead -7791 +▁puts -7792 +▁till -7793 +▁Nation -7794 +▁alumin -7795 +▁struck -7796 +novation -7797 +▁claimed -7798 +▁farmers -7799 +▁hitting -7800 +▁whenever -7801 +▁officially -7802 +▁introduction -7803 +pson -7804 +▁Isl -7805 +found -7806 +▁Auto -7807 +▁Body -7808 +▁king -7809 +▁mand -7810 +inding -7811 +▁Table -7812 +▁Forest -7813 +▁Valent -7814 +▁narrow -7815 +▁colours -7816 +▁Attorney -7817 +▁networking -7818 +▁necessarily -7819 +▁improvements -7820 +tail -7821 +▁bug -7822 +▁clar -7823 +▁Civil -7824 +utional -7825 +▁hidden -7826 +▁Theatre -7827 +▁texture -7828 +▁checking -7829 +▁constant -7830 +▁licensed -7831 +▁Cry -7832 +▁cust -7833 +▁root -7834 +ickets -7835 +terior -7836 +▁Youth -7837 +▁loose -7838 +▁setup -7839 +▁acting -7840 +▁Chapter -7841 +▁Reading -7842 +▁occurred -7843 +▁struggling -7844 +TP -7845 +tw -7846 +AND -7847 +▁ -7848 +e -7849 +t -7850 +a -7851 +o -7852 +i -7853 +n -7854 +s -7855 +r -7856 +h -7857 +l -7858 +d -7859 +c -7860 +u -7861 +m -7862 +p -7863 +g -7864 +f -7865 +y -7866 +w -7867 +b -7868 +. 
-7869 +v -7870 +, -7871 +k -7872 +T -7873 +I -7874 +S -7875 +A -7876 +- -7877 +C -7878 +0 -7879 +1 -7880 +M -7881 +P -7882 +B -7883 +x -7884 +2 -7885 +W -7886 +D -7887 +R -7888 +E -7889 +H -7890 +F -7891 +L -7892 +O -7893 +N -7894 +’ -7895 +' -7896 +: -7897 +G -7898 +j -7899 +) -7900 +3 -7901 +( -7902 +z -7903 +5 -7904 +q -7905 +" -7906 +U -7907 +4 -7908 +J -7909 +9 -7910 +6 -7911 +8 -7912 +V -7913 +Y -7914 +K -7915 +7 -7916 +! -7917 +| -7918 +/ -7919 +? -7920 +“ -7921 +” -7922 +; -7923 +– -7924 +& -7925 +$ -7926 +— -7927 +Q -7928 +X -7929 +% -7930 +Z -7931 diff --git a/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/requirements.txt b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/requirements.txt new file mode 100644 index 0000000000..0c5eedce7b --- /dev/null +++ b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/requirements.txt @@ -0,0 +1,10 @@ +numpy +tqdm +torch==2.10 +huggingface-hub +kernels +setuptools +typing-extensions==4.15.0 +datasets +tiktoken +sentencepiece \ No newline at end of file diff --git a/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/run_cuda_binary.sh b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/run_cuda_binary.sh new file mode 100644 index 0000000000..473b3388e3 --- /dev/null +++ b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/run_cuda_binary.sh @@ -0,0 +1,72 @@ +RUN_ID=pushing_run_binary_1 \ +DATA_PATH=./data/datasets/fineweb10B_sp8192 \ +TOKENIZER_PATH=./data/tokenizers/fineweb_8192_bpe.model \ +ATTN_PROJ_TYPE=standard \ +LOGIT_HEAD_TYPE=standard \ +TVERSKY_MEMBERSHIP=sigmoid \ +TVERSKY_NUM_FEATURES=0 \ +TVERSKY_FEATURE_POOLS=0 \ +VOCAB_SIZE=8192 \ +BITNET_GROUP_SIZE=128 \ +BIGRAM_HASH=0 \ 
+EMBED_DIM=254 \ +TRAINING_DEPTH_RECURRENCE=0 \ +EVAL_DEPTH_RECURRENCE=0 \ +NUM_LAYERS=15 \ +MODEL_DIM=768 \ +NUM_KV_HEADS=4 \ +NUM_HEADS=8 \ +DIFF_ATTN=0 \ +MLP_MULT=4 \ +MLP_GROUPS=0 \ +MATRIX_OPTIMIZER=muon \ +ADAM_LR=0.05 \ +ADAM_WD=0.05 \ +MUON_BACKEND_STEPS=3 \ +MUON_MOMENTUM=0.95 \ +MUON_MOMENTUM_WARMUP_START=0.85 \ +MUON_MOMENTUM_WARMUP_STEPS=500 \ +MUON_WD=0.0 \ +MATRIX_LR=0.04 \ +SCALAR_LR=0.02 \ +TIED_EMBED_LR=0.02 \ +WARMDOWN_FRACTION=0.2 \ +LOGIT_SOFTCAP=10 \ +QK_GAIN_INIT=2.25 \ +ROPE_TYPE=yarn \ +YARN_MAX_LEN=2048 \ +ROPE_BASE=5000 \ +BATCH_TOKENS_START=0 \ +BATCH_SCHEDULE_FRACTION=0.33 \ +TRAIN_BATCH_TOKENS=524288 \ +SEQ_LEN_START=0 \ +SEQ_SCHEDULE_FRACTION=0.0 \ +TRAIN_SEQ_LEN=1024 \ +SMEAR=1 \ +ITERATIONS=50000 \ +WARMUP_STEPS=5 \ +MAX_WALLCLOCK_SECONDS=0 \ +VAL_LOSS_EVERY=0 \ +TRAIN_LOG_EVERY=500 \ +CHURN_LOG_EVERY=1000 \ +VAL_MAX_TOKENS=0 \ +TIE_EMBEDDINGS=1 \ +UNTIE_AT_FRACTION=0.00 \ +HEAD_LR=0.02 \ +CORR_WEIGHT_LR=0.02 \ +ACTIVATION=relu2 \ +SOFTCAP_TYPE=poly \ +MTP_HEADS=0 \ +REFINER=0 \ +REFINER_KERNEL=3 \ +SLIDING_EVAL=1 \ +SLIDING_EVAL_STRIDE=16 \ +SLIDING_BATCH_SIZE=256 \ +TEMP_SCALING=1 \ +FP_STORAGE=FP8 \ +EMA=0 \ +EMA_DECAY=0.995 \ +EMA_START_FRACTION=0.5 \ +SEED=42 \ +COMPILE_MODE=default \ +OMP_NUM_THREADS=1 torchrun --standalone --nproc_per_node=8 train_gpt_cuda_binary.py diff --git a/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/setup.sh b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/setup.sh new file mode 100644 index 0000000000..93f1c41fea --- /dev/null +++ b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/setup.sh @@ -0,0 +1,143 @@ +#!/bin/bash +# ------------------------------------------------------------------------------- +# Parameter Golf -- Complete Environment Setup Script +# Drop this into the project root and run: bash setup.sh +# 
------------------------------------------------------------------------------- + +set -e + +echo "----------------------------------------------" +echo " Parameter Golf -- Environment Setup" +echo "----------------------------------------------" + +# ------------------------------------------------------------------------------- +# 1. Miniconda +# ------------------------------------------------------------------------------- +echo "" +echo "[1/5] Miniconda..." + +if [ -d "$HOME/miniconda3" ]; then + echo " Already installed -- skipping." +else + wget -q https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O /tmp/miniconda.sh + bash /tmp/miniconda.sh -b + rm /tmp/miniconda.sh + ~/miniconda3/bin/conda init bash + echo " Installed." +fi + +export PATH="$HOME/miniconda3/bin:$PATH" +source ~/miniconda3/etc/profile.d/conda.sh + +echo " Accepting conda TOS..." +~/miniconda3/bin/conda tos accept --override-channels --channel https://repo.anaconda.com/pkgs/main +~/miniconda3/bin/conda tos accept --override-channels --channel https://repo.anaconda.com/pkgs/r +echo " TOS accepted." + +# ------------------------------------------------------------------------------- +# 2. Python Environment +# ------------------------------------------------------------------------------- +echo "" +echo "[2/5] Python 3.13 environment..." + +if conda env list | grep -q "^golf "; then + echo " Environment 'golf' already exists -- skipping." +else + conda create -n golf python=3.13 -y + echo " Created." +fi + +conda activate golf +echo " Activated." + +# ------------------------------------------------------------------------------- +# 3. Requirements +# ------------------------------------------------------------------------------- +echo "" +echo "[3/5] Requirements..." + +if python3 -c "import torch, sentencepiece, numpy" 2>/dev/null; then + echo " Core packages already installed -- skipping." 
+else + pip install --upgrade pip -q + pip install -r requirements.txt -q + echo " Installed." +fi + +# ------------------------------------------------------------------------------- +# 4. FlashAttention-3 +# ------------------------------------------------------------------------------- +echo "" +echo "[4/5] FlashAttention-3..." + +if python3 -c "import flash_attn" 2>/dev/null || python3 -c "import flash_attn_interface" 2>/dev/null; then + echo " Already installed -- skipping." +else + # abi3 wheel -- Python 3.9+ compatible, installs in seconds, no compilation + pip install --no-cache-dir "https://download.pytorch.org/whl/cu128/flash_attn_3-3.0.0-cp39-abi3-manylinux_2_28_x86_64.whl" + echo " Installed." +fi + +# ------------------------------------------------------------------------------- +# 5. Dataset +# ------------------------------------------------------------------------------- +echo "" +echo "[5/5] FineWeb dataset (sp8192, 10 shards)..." + +TRAIN_COUNT=$(ls ./data/datasets/fineweb10B_sp8192/fineweb_train_*.bin 2>/dev/null | wc -l) +echo " Downloading... ($TRAIN_COUNT/10 train shards found)" +hf download sproos/parameter-golf-tokenizers --include "datasets/fineweb10B_sp8192/*" --local-dir ./data +echo " Downloaded." 
+ +# ------------------------------------------------------------------------------- +# Verification +# ------------------------------------------------------------------------------- +echo "" +echo "----------------------------------------------" +echo " Verification" +echo "----------------------------------------------" + +python3 - << 'EOF' +import sys +import torch +import numpy as np +import glob + +print(f"Python : {sys.version.split()[0]}") +print(f"PyTorch : {torch.__version__}") +print(f"CUDA : {torch.cuda.is_available()}") +print(f"GPUs : {torch.cuda.device_count()}") + +if torch.cuda.is_available(): + for i in range(torch.cuda.device_count()): + props = torch.cuda.get_device_properties(i) + print(f" GPU {i} : {props.name} ({props.total_memory // 1024**3}GB)") + +try: + import flash_attn + print(f"FlashAttn : {flash_attn.__version__}") +except ImportError: + try: + import flash_attn_interface + print(f"FlashAttn3 : available") + except ImportError: + print(f"FlashAttn : NOT found") + +train_files = sorted(glob.glob("./data/datasets/fineweb10B_sp8192/fineweb_train_*.bin")) +val_files = sorted(glob.glob("./data/datasets/fineweb10B_sp8192/fineweb_val_*.bin")) +print(f"Train shards : {len(train_files)}") +print(f"Val shards : {len(val_files)}") + +if val_files: + total = sum( + int(np.fromfile(f, dtype=' tuple[bytes, int]: + bits = ((q.reshape(-1).to(torch.int8) + 1) // 2).numpy().astype(np.uint8) + n = len(bits) + pad = (8 - n % 8) % 8 + if pad: + bits = np.concatenate([bits, np.zeros(pad, dtype=np.uint8)]) + groups = bits.reshape(-1, 8) + packed = np.zeros(len(groups), dtype=np.uint8) + for i in range(8): + packed |= groups[:, i] << i + return packed.tobytes(), n + +def unpack_binary(data: bytes, n: int) -> Tensor: + packed = np.frombuffer(data, dtype=np.uint8) + bits = np.zeros((len(packed), 8), dtype=np.int8) + for i in range(8): + bits[:, i] = (packed >> i) & 1 + flat = bits.reshape(-1)[:n] + return torch.from_numpy(flat.astype(np.int8) * 2 - 1) + +# 
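The `pack_binary`/`unpack_binary` pair above operates on torch tensors inside the checkpoint serializer; for review purposes, here is a minimal NumPy-only sketch of the same LSB-first sign-packing round trip (the names `pack_signs`/`unpack_signs` are illustrative, not from the submission):

```python
import numpy as np

def pack_signs(q: np.ndarray) -> tuple[bytes, int]:
    # Map {-1, +1} -> {0, 1}, pad to a multiple of 8, then pack LSB-first
    # (sign j of each group of 8 lands in bit j of one byte).
    bits = ((q.reshape(-1).astype(np.int8) + 1) // 2).astype(np.uint8)
    n = len(bits)
    pad = (8 - n % 8) % 8
    if pad:
        bits = np.concatenate([bits, np.zeros(pad, dtype=np.uint8)])
    groups = bits.reshape(-1, 8)
    packed = np.zeros(len(groups), dtype=np.uint8)
    for i in range(8):
        packed |= groups[:, i] << i
    return packed.tobytes(), n

def unpack_signs(data: bytes, n: int) -> np.ndarray:
    # Inverse: extract bit j of each byte, drop the padding, map back to {-1, +1}.
    packed = np.frombuffer(data, dtype=np.uint8)
    bits = np.zeros((len(packed), 8), dtype=np.int8)
    for i in range(8):
        bits[:, i] = (packed >> i) & 1
    return bits.reshape(-1)[:n] * 2 - 1

rng = np.random.default_rng(0)
q = rng.choice(np.array([-1, 1], dtype=np.int8), size=1000)
data, n = pack_signs(q)
assert len(data) == 125                       # 1000 signs -> 125 bytes
assert (unpack_signs(data, n) == q).all()     # lossless round trip
```

Eight signs collapse to one byte, which is how ~106M parameters fit into a ~15.67 MB artifact: roughly 1 bit per binarised weight plus per-group fp16 scales (group size 128 in this run) and the fp8/fp16 remainder handled by `q_sd`.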
--------------------------------------------------------------------------- +# FP4 quantization (per-row absmax, 2 values packed per byte) +# --------------------------------------------------------------------------- +def quantize_to_int4(t: Tensor) -> tuple[Tensor, Tensor, list]: + t32 = t.float() + orig_shape = t32.shape + if t32.ndim < 2: + t32 = t32.unsqueeze(0) + absmax = t32.abs().amax(dim=-1, keepdim=True).clamp(min=1e-8) + scale = absmax / 7.0 + q = torch.clamp(torch.round(t32 / scale), -7, 7).to(torch.int8) + flat = q.reshape(-1) + if flat.numel() % 2 != 0: + flat = F.pad(flat, (0, 1)) + low = (flat[0::2] + 8).to(torch.uint8) + high = (flat[1::2] + 8).to(torch.uint8) + return low | (high << 4), scale.half().squeeze(-1), list(orig_shape) + +def dequantize_from_int4(packed: Tensor, scale: Tensor, shape: list) -> Tensor: + low = (packed & 0x0F).to(torch.int8) - 8 + high = ((packed >> 4) & 0x0F).to(torch.int8) - 8 + flat = torch.zeros(packed.numel() * 2, dtype=torch.int8) + flat[0::2] = low + flat[1::2] = high + numel = 1 + for s in shape: + numel *= s + flat = flat[:numel].float() + if len(shape) <= 1: + return (flat * scale.float().squeeze()).reshape(shape) + return (flat.reshape(-1, shape[-1]) * scale.float().unsqueeze(-1)).reshape(shape) + +# --------------------------------------------------------------------------- +# State dict serialization (binary + fp16/fp8/fp4) +# --------------------------------------------------------------------------- +def q_sd(state_dict: dict, group_size: int = 64, fp_storage=False, binary_override_names: set | None = None) -> tuple[dict, dict]: + "Binary for large 2D weight matrices, fp16/fp8/fp4 for everything else." 
+ quantized = {} + stats = {"binary_params": 0, "binary_bytes": 0, "fp_params": 0, "fp_bytes": 0} + for name, tensor in state_dict.items(): + if "mtp_heads" in name: + continue + t = tensor.detach().cpu().float().contiguous() + t_orig_shape = list(t.shape) + if t.ndim == 3: + t = t.reshape(t.shape[0], -1) + is_binary_candidate = ( + t.ndim == 2 and t.numel() > 65_536 + and "tok_emb" not in name and "lm_head" not in name and "embed_proj" not in name and "bigram_emb" not in name and "lm_head_correction" not in name and "lm_head_U" not in name and "lm_head_V" not in name + and "prototypes" not in name and "tversky" not in name + ) or (binary_override_names is not None and name in binary_override_names) + if is_binary_candidate: + pad = (group_size - t.shape[1] % group_size) % group_size + t_padded = F.pad(t, (0, pad)) if pad > 0 else t + t_grouped = t_padded.reshape(-1, group_size) + scale = t_grouped.abs().mean(-1, keepdim=True).clamp(min=1e-8).half().float() + q = torch.where(t_grouped >= 0, + torch.ones_like(t_grouped, dtype=torch.int8), + -torch.ones_like(t_grouped, dtype=torch.int8)) + packed_bytes, n_bits = pack_binary(q) + quantized[name] = { + "type": "binary", "packed": packed_bytes, + "scale": scale.half().squeeze(-1), + "shape": list(t.shape), "padded_cols": t_padded.shape[1], + "group_size": group_size, "n_bits": n_bits, + "orig_shape": t_orig_shape, + } + stats["binary_params"] += t.numel() + stats["binary_bytes"] += len(packed_bytes) + scale.numel() * 2 + elif fp_storage == "fp4" and t.ndim == 2: + packed, scale, orig_shape = quantize_to_int4(t) + quantized[name] = {"type": "fp4", "packed": packed, "scale": scale, "shape": orig_shape} + stats["fp_params"] += t.numel() + stats["fp_bytes"] += packed.numel() + scale.numel() * 2 + elif fp_storage and t.ndim == 2: + quantized[name] = {"type": "fp8", "data": t.to(torch.float8_e4m3fn)} + stats["fp_params"] += t.numel() + stats["fp_bytes"] += t.numel() + else: + quantized[name] = {"type": "fp16", "data": 
t.half()} + stats["fp_params"] += t.numel() + stats["fp_bytes"] += t.numel() * 2 + return quantized, stats + +def deq_sd(quantized: dict, target_dtype=torch.bfloat16): + "Reconstruct full-precision state dict from quantized representation." + out = {} + for name, entry in quantized.items(): + if entry["type"] == "binary": + q = unpack_binary(entry["packed"], entry["n_bits"]) + q = q.float().reshape(-1, entry["group_size"]) + scale = entry["scale"].float().unsqueeze(-1) + # No shrinkage correction needed: binary has no zeros, q.abs().mean() == 1.0 always + t = (q * scale).reshape(-1, entry["padded_cols"]) + shape = entry["shape"] + result = t[:shape[0], :shape[1]].to(target_dtype) + orig = entry.get("orig_shape") + out[name] = result.reshape(orig).contiguous() if orig and orig != shape else result.contiguous() + elif entry["type"] == "fp8": + out[name] = entry["data"].to(torch.float32).to(target_dtype).contiguous() + elif entry["type"] == "fp4": + out[name] = dequantize_from_int4(entry["packed"], entry["scale"], entry["shape"]).to(target_dtype).contiguous() + else: + out[name] = entry["data"].to(target_dtype).contiguous() + return out + +# --------------------------------------------------------------------------- +# Binary diagnostics (logged during training) +# --------------------------------------------------------------------------- +_prev_committed: dict = {} +def churn_fn(model: nn.Module, group_size: int = 64): + global _prev_committed + total = flipped = 0 + with torch.no_grad(): + for name, p in model.named_parameters(): + if p.ndim == 2 and ("weight" in name or "prototypes" in name) and p.shape[0] > 1: + w = p.detach().float().reshape(-1, group_size) + q = torch.where(w >= 0, torch.ones_like(w), -torch.ones_like(w)).cpu().numpy() + if name in _prev_committed: + flipped += int(np.sum(q != _prev_committed[name])) + total += q.size + _prev_committed[name] = q + return flipped / max(total, 1) + +# 
--------------------------------------------------------------------------- +# Muon optimizer (Newton-Schulz orthogonalized momentum) +# --------------------------------------------------------------------------- +def ns_orth(G: Tensor, steps: int = 10, eps: float = 1e-7) -> Tensor: + a, b, c = (3.4445, -4.7750, 2.0315) + X = G.bfloat16() + X /= X.norm() + eps + transposed = G.size(0) > G.size(1) + if transposed: + X = X.T + for _ in range(steps): + A = X @ X.T + B = b * A + c * A @ A + X = a * X + B @ X + return X.T if transposed else X + +class Muon(torch.optim.Optimizer): + def __init__(self, params, lr: float, momentum: float, backend_steps: int, nesterov: bool = True, wd: float = 0.0): + super().__init__(params, dict(lr=lr, momentum=momentum, backend_steps=backend_steps, nesterov=nesterov, wd=wd)) + @torch.no_grad() + def step(self, closure=None): + loss = None + if closure is not None: + with torch.enable_grad(): + loss = closure() + distributed = dist.is_available() and dist.is_initialized() + world_size = dist.get_world_size() if distributed else 1 + rank = dist.get_rank() if distributed else 0 + for group in self.param_groups: + params = group["params"] + if not params: + continue + lr, momentum = group["lr"], group["momentum"] + backend_steps, nesterov = group["backend_steps"], group["nesterov"] + total_params = sum(int(p.numel()) for p in params) + updates_flat = torch.zeros(total_params, device=params[0].device, dtype=torch.bfloat16) + curr = 0 + for i, p in enumerate(params): + if i % world_size == rank and p.grad is not None: + g = p.grad + state = self.state[p] + if "momentum_buffer" not in state: + state["momentum_buffer"] = torch.zeros_like(g) + buf = state["momentum_buffer"] + buf.mul_(momentum).add_(g) + if nesterov: + g = g.add(buf, alpha=momentum) + g = F.rms_norm(g.float(), (g.size(-1),)).bfloat16() + g = ns_orth(g, steps=backend_steps) + g *= max(1, g.size(0) / g.size(1)) ** 0.5 + updates_flat[curr:curr + p.numel()] = g.reshape(-1) + curr += 
p.numel() + if distributed: + dist.all_reduce(updates_flat, op=dist.ReduceOp.SUM) + wd = group.get("wd", 0.0) + curr = 0 + for p in params: + g = updates_flat[curr : curr + p.numel()].view_as(p).to(dtype=p.dtype) + if wd > 0: + p.mul_(1 - lr * wd) + p.add_(g, alpha=-lr) + curr += p.numel() + return loss + +# --------------------------------------------------------------------------- +# Data loading +# --------------------------------------------------------------------------- +def ld_shard(file: Path) -> Tensor: + header_bytes = 256 * np.dtype(" Tensor: + chunks = [] + remaining = n + while remaining > 0: + avail = self.tokens.numel() - self.pos + if avail <= 0: + self._advance_file() + continue + k = min(remaining, avail) + chunks.append(self.tokens[self.pos:self.pos + k]) + self.pos += k + remaining -= k + return chunks[0] if len(chunks) == 1 else torch.cat(chunks) + +class DistributedTokenLoader: + def __init__(self, pattern: str, rank: int, world_size: int, device: torch.device): + self.rank, self.world_size, self.device = rank, world_size, device + self.stream = TokenStream(pattern) + def next_batch(self, global_tokens: int, seq_len: int, grad_accum_steps: int) -> tuple[Tensor, Tensor]: + local_tokens = global_tokens // (self.world_size * grad_accum_steps) + per_rank_span = local_tokens + 1 + chunk = self.stream.take(per_rank_span * self.world_size) + start = self.rank * per_rank_span + local = chunk[start:start + per_rank_span].pin_memory().to(self.device, non_blocking=True).to(torch.int64) + x = local[:-1].reshape(-1, seq_len) + y = local[1:].reshape(-1, seq_len) + return x, y +# --------------------------------------------------------------------------- +# Model +# --------------------------------------------------------------------------- +class RMSNorm(nn.Module): + def __init__(self, eps: float | None = None): + super().__init__() + self.eps = eps + def forward(self, x: Tensor) -> Tensor: + return F.rms_norm(x, (x.size(-1),), eps=self.eps) + +def 
apply_qat_ste(w: Tensor, fp_storage: str | bool) -> Tensor: + """Applies Straight-Through Estimator (STE) for FP4 or FP8 simulated quantization.""" + if not fp_storage: + return w + if fp_storage == "fp4": + absmax = w.abs().amax(dim=-1, keepdim=True).clamp(min=1e-8) + scale = absmax / 7.0 + q = torch.clamp(torch.round(w / scale), -7.0, 7.0) + w_sim = q * scale + return (w_sim - w).detach() + w + elif fp_storage is True or fp_storage == "fp8": + w_sim = w.to(torch.float8_e4m3fn).to(w.dtype) + return (w_sim - w).detach() + w + return w + +class QATLinear(nn.Linear): + def __init__(self, in_features: int, out_features: int, bias: bool = False, fp_storage: str | bool = False): + super().__init__(in_features, out_features, bias=bias) + self.fp_storage = fp_storage + def forward(self, x: Tensor) -> Tensor: + w_qat = apply_qat_ste(self.weight, self.fp_storage) + return F.linear(x, w_qat.to(x.dtype), self.bias.to(x.dtype) if self.bias is not None else None) + +class QATEmbedding(nn.Embedding): + def __init__(self, num_embeddings: int, embedding_dim: int, fp_storage: str | bool = False): + super().__init__(num_embeddings, embedding_dim) + self.fp_storage = fp_storage + def forward(self, input: Tensor) -> Tensor: + w_qat = apply_qat_ste(self.weight, self.fp_storage) + return F.embedding(input, w_qat, self.padding_idx, self.max_norm, + self.norm_type, self.scale_grad_by_freq, self.sparse) + +class BinaryLinear(nn.Linear): + def __init__(self, in_features, out_features, bias=False, group_size=64): + super().__init__(in_features, out_features, bias=bias) + self.group_size = group_size + def forward(self, x: Tensor) -> Tensor: + w = self.weight.bfloat16() + g = self.group_size + w_g = w.reshape(-1, g) + scale = w_g.abs().mean(-1, keepdim=True).clamp(min=1e-8) + q = torch.where(w_g >= 0, torch.ones_like(w_g), -torch.ones_like(w_g)) + w_binary = w + ((q * scale).reshape(w.shape) - w).detach() + return F.linear(x, w_binary, + self.bias.to(x.dtype) if self.bias is not None else 
None) + +class NormedBinaryLinear(BinaryLinear): + "Binary linear with RMSNorm on input — for output projections receiving un-normalized activations." + def forward(self, x: Tensor) -> Tensor: + return super().forward(F.rms_norm(x, (x.size(-1),))) + +class GroupedBinaryLinear(nn.Module): + "Grouped linear with binary STE. Weight stored as 2D [groups*group_out, group_in] for binary quantization compatibility." + def __init__(self, in_features, out_features, groups=4, group_size=64, normed=False): + super().__init__() + assert in_features % groups == 0 and out_features % groups == 0 + self.groups = groups + self.group_in = in_features // groups + self.group_out = out_features // groups + self.group_size = group_size + self.normed = normed + self.weight = nn.Parameter(torch.randn(groups * self.group_out, self.group_in) * 0.02) + def forward(self, x: Tensor) -> Tensor: + if self.normed: + x = F.rms_norm(x, (x.size(-1),)) + w = self.weight.bfloat16() + g = self.group_size + w_g = w.reshape(-1, g) + scale = w_g.abs().mean(-1, keepdim=True).clamp(min=1e-8) + q = torch.where(w_g >= 0, torch.ones_like(w_g), -torch.ones_like(w_g)) + w_binary = w + ((q * scale).reshape(w.shape) - w).detach() + w_grouped = w_binary.reshape(self.groups, self.group_out, self.group_in) + bsz = x.shape[:-1] + x_g = x.reshape(*bsz, self.groups, self.group_in) + out = torch.einsum('...gi,goi->...go', x_g, w_grouped) + return out.reshape(*bsz, self.groups * self.group_out) + +class TverskyProjection(nn.Module): + "Tversky similarity: S = θ·f(A∩B) - α·f(A\\B) - β·f(B\\A). Three modes." 
+ def __init__(self, in_features: int, out_features: int, num_features: int = 16, + group_size: int = 64, use_shared_features: bool = False, + membership: str = "sigmoid"): + super().__init__() + self.group_size = group_size + self.num_features = num_features + self.membership_type = membership + self.no_features_mode = (num_features == 0) + if not self.no_features_mode and not use_shared_features: + self.features = nn.Parameter(torch.empty(num_features, in_features).uniform_(-0.02, 0.02)) + else: + self.register_parameter('features', None) + self.prototypes = nn.Parameter(torch.empty(out_features, in_features).uniform_(-0.02, 0.02)) + self.theta = nn.Parameter(torch.tensor(1.0)) + self.alpha = nn.Parameter(torch.tensor(0.5)) + self.beta = nn.Parameter(torch.tensor(0.5)) + + def _binary_ste(self, w: Tensor) -> Tensor: + w_bf16 = w.bfloat16() + g = self.group_size + w_grouped = w_bf16.reshape(-1, g) + scale = w_grouped.abs().mean(-1, keepdim=True).clamp(min=1e-8) + q = torch.where(w_grouped >= 0, torch.ones_like(w_grouped), -torch.ones_like(w_grouped)) + w_binary = w_bf16 + ((q * scale).reshape(w_bf16.shape) - w_bf16).detach() + return w_binary.reshape(w.shape) + + def _membership(self, t: Tensor) -> Tensor: + if self.membership_type == "poly": + return torch.clamp(t * 5.0 / 4.0 + 0.5, 0.0, 1.0) + elif self.membership_type == "tanh": + return (torch.tanh(t * 5.0) + 1.0) * 0.5 + else: + return torch.sigmoid(t * 5.0) + + def forward(self, x: Tensor, shared_features: Tensor | None = None) -> Tensor: + proto = self._binary_ste(self.prototypes) + if self.no_features_mode: + x_f = x @ proto.t() + p_norm = F.normalize(proto, dim=-1) + p_f = p_norm @ p_norm.t() + else: + feat = (shared_features if shared_features is not None else self.features).float() + x_f = x @ feat.t() + p_f = proto @ feat.t() + x_s = self._membership(x_f) + p_s = self._membership(p_f) + x_a = x_f * x_s + p_a = p_f * p_s + t, a, b = self.theta.abs(), self.alpha.abs(), self.beta.abs() + return t * (x_a @ 
p_a.t()) - a * (x_a @ (1 - p_s).t()) - b * ((1 - x_s) @ p_a.t()) + +def restore_low_dim_params_to_fp32(module: nn.Module) -> None: + with torch.no_grad(): + for name, param in module.named_parameters(): + if (param.ndim < 2 or any(p in name for p in CTP)) and param.dtype != torch.float32: + param.data = param.data.float() + +class Rotary(nn.Module): + def __init__(self, dim: int, base: float = 10000.0, no_cache: bool = False, + rope_type: str = "rope", yarn_max_len: int = 4096, train_seq_len: int = 1024): + super().__init__() + self.no_cache = no_cache + inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2, dtype=torch.float32) / dim)) + if rope_type == "yarn": + scale = train_seq_len / yarn_max_len + freq_idx = torch.arange(0, dim, 2, dtype=torch.float32) + ramp = torch.clamp((freq_idx / dim - 0.25) / 0.75, 0.0, 1.0) + inv_freq = inv_freq / (ramp * (1.0 / scale - 1.0) + 1.0) + self.register_buffer("inv_freq", inv_freq, persistent=False) + self._seq_len_cached = 0 + self._cos_cached: Tensor | None = None + self._sin_cached: Tensor | None = None + def forward(self, seq_len, device, dtype): + if self.no_cache: + t = torch.arange(seq_len, device=device, dtype=self.inv_freq.dtype) + freqs = torch.outer(t, self.inv_freq.to(device)) + return freqs.cos()[None, :, None, :].to(dtype=dtype), freqs.sin()[None, :, None, :].to(dtype=dtype) + if ( + self._cos_cached is None + or self._sin_cached is None + or self._seq_len_cached != seq_len + or self._cos_cached.device != device + ): + t = torch.arange(seq_len, device=device, dtype=self.inv_freq.dtype) + freqs = torch.outer(t, self.inv_freq.to(device)) + self._cos_cached = freqs.cos()[None, :, None, :] + self._sin_cached = freqs.sin()[None, :, None, :] + self._seq_len_cached = seq_len + return self._cos_cached.to(dtype=dtype), self._sin_cached.to(dtype=dtype) + +def apply_rotary_emb(x: Tensor, cos: Tensor, sin: Tensor) -> Tensor: + half = x.size(-1) // 2 + x1, x2 = x[..., :half], x[..., half:] + return torch.cat((x1 * cos + x2 * 
sin, x1 * (-sin) + x2 * cos), dim=-1) + +class CausalSelfAttention(nn.Module): + def __init__(self, dim, num_heads, num_kv_heads, rope_base, qk_gain_init, + group_size=64, attn_proj_type="standard", tversky_num_features=16, + tversky_feature_pools=0, no_cache=False, rope_type="rope", + yarn_max_len=4096, train_seq_len=1024, tversky_membership="sigmoid", + diff_attn=False): + super().__init__() + self.num_heads, self.num_kv_heads = num_heads, num_kv_heads + self.head_dim = dim // num_heads + self.diff_attn = diff_attn + self.q_size = self.num_heads * self.head_dim + self.kv_size = self.num_kv_heads * self.head_dim + self.c_qkv = BinaryLinear(dim, self.q_size + 2 * self.kv_size, bias=False, group_size=group_size) + self.proj = NormedBinaryLinear(dim, dim, bias=False, group_size=group_size) if attn_proj_type != "tversky" else None + if self.proj is not None: + self.proj._zero_init = True + self.tversky_proj = TverskyProjection( + dim, dim, num_features=tversky_num_features, group_size=group_size, + use_shared_features=(tversky_feature_pools > 0), + membership=tversky_membership, + ) if attn_proj_type == "tversky" else None + self.shared_features = None + self.q_gain = nn.Parameter(torch.full((num_heads,), qk_gain_init, dtype=torch.float32)) + if diff_attn: + self.diff_lambda = nn.Parameter(torch.full((num_heads,), 0.5, dtype=torch.float32)) + self.rotary = Rotary(self.head_dim, base=rope_base, no_cache=no_cache, + rope_type=rope_type, yarn_max_len=yarn_max_len, + train_seq_len=train_seq_len) + def forward(self, x: Tensor) -> Tensor: + bsz, seqlen, dim = x.shape + qkv_out = self.c_qkv(x) + q_out, k_out, v_out = qkv_out.split([self.q_size, self.kv_size, self.kv_size], dim=-1) + q = q_out.reshape(bsz, seqlen, self.num_heads, self.head_dim) + k = k_out.reshape(bsz, seqlen, self.num_kv_heads, self.head_dim) + v = v_out.reshape(bsz, seqlen, self.num_kv_heads, self.head_dim) + q, k = F.rms_norm(q, (q.size(-1),)), F.rms_norm(k, (k.size(-1),)) + cos, sin = self.rotary(seqlen, 
x.device, q.dtype) + q, k = apply_rotary_emb(q, cos, sin), apply_rotary_emb(k, cos, sin) + q = q * self.q_gain.to(dtype=q.dtype)[None, None, :, None] + if self.diff_attn: + half = self.head_dim // 2 + q1, q2 = q[..., :half], q[..., half:] + k1, k2 = k[..., :half], k[..., half:] + v1, v2 = v[..., :half], v[..., half:] + y1 = flash_attn_func(q1.contiguous(), k1.contiguous(), v1.contiguous(), causal=True) + y2 = flash_attn_func(q2.contiguous(), k2.contiguous(), v2.contiguous(), causal=True) + lam = self.diff_lambda.to(dtype=y1.dtype)[None, None, :, None] + y = torch.cat([y1 - lam * y2, y1 + lam * y2], dim=-1) + else: + y = flash_attn_func( + q.contiguous(), + k.contiguous(), + v.contiguous(), + causal=True + ) + y = y.reshape(bsz, seqlen, dim) + return self.tversky_proj(y, self.shared_features) if self.tversky_proj is not None else self.proj(y) + +class MLP(nn.Module): + def __init__(self, dim, mlp_mult, group_size=64, activation="swiglu", mlp_groups=0): + super().__init__() + hidden = mlp_mult * dim + self.activation = activation + if mlp_groups > 0: + if activation == "swiglu": + self.gate_up = GroupedBinaryLinear(dim, hidden * 2, groups=mlp_groups, group_size=group_size) + else: + self.fc = GroupedBinaryLinear(dim, hidden, groups=mlp_groups, group_size=group_size) + self.proj = GroupedBinaryLinear(hidden, dim, groups=mlp_groups, group_size=group_size, normed=True) + else: + if activation == "swiglu": + self.gate_up = BinaryLinear(dim, hidden * 2, bias=False, group_size=group_size) + else: + self.fc = BinaryLinear(dim, hidden, bias=False, group_size=group_size) + self.proj = NormedBinaryLinear(hidden, dim, bias=False, group_size=group_size) + self.proj._zero_init = True + def forward(self, x: Tensor) -> Tensor: + if self.activation == "swiglu": + gu = self.gate_up(x) + gate, up = gu.chunk(2, dim=-1) + return self.proj(F.silu(gate) * up) + elif self.activation == "relu": + return self.proj(torch.relu(self.fc(x))) + elif self.activation == "leaky_relu": + return 
self.proj(F.leaky_relu(self.fc(x), negative_slope=0.01)) + else: # relu2 + return self.proj(torch.relu(self.fc(x)).square()) + +class SmearModule(nn.Module): + def __init__(self, dim: int): + super().__init__() + self.gate = nn.Parameter(torch.zeros(dim, dtype=torch.float32)) + def forward(self, x: Tensor) -> Tensor: + cumsum = x.cumsum(dim=1) + counts = torch.arange(1, x.size(1) + 1, device=x.device, dtype=x.dtype).view(1, -1, 1) + smeared = cumsum / counts + gate = torch.tanh(self.gate.to(dtype=x.dtype)) + return x + gate * (smeared - x) + +class CausalConvRefiner(nn.Module): + "Causal Conv1d that refines hidden states using local n-gram context." + def __init__(self, dim: int, kernel_size: int = 3): + super().__init__() + self.kernel_size = kernel_size + self.conv = nn.Conv1d(dim, dim, kernel_size, padding=0, bias=False) + self.gate = nn.Parameter(torch.zeros(1, dtype=torch.float32)) + def forward(self, x: Tensor) -> Tensor: + h = x.permute(0, 2, 1) + h = F.pad(h, (self.kernel_size - 1, 0)) + h = self.conv(h) + h = h.permute(0, 2, 1) + return x + torch.tanh(self.gate.to(dtype=x.dtype)) * F.rms_norm(h, (h.size(-1),)) + +class Block(nn.Module): + def __init__(self, dim: int, num_heads: int, num_kv_heads: int, mlp_mult: int, + rope_base: float, qk_gain_init: float, group_size: int=64, + activation: str="swiglu", attn_proj_type: str="standard", + tversky_num_features: int=16, tversky_feature_pools: int=0, no_cache: bool=False, + smear: bool=False, rope_type: str="rope", yarn_max_len: int=4096, + train_seq_len: int=1024, tversky_membership: str="sigmoid", + diff_attn: bool=False, mlp_groups: int=0): + super().__init__() + self.attn_norm = RMSNorm() + self.mlp_norm = RMSNorm() + self.attn = CausalSelfAttention(dim, num_heads, num_kv_heads, rope_base, qk_gain_init, + group_size, attn_proj_type, tversky_num_features, + tversky_feature_pools, no_cache, rope_type, yarn_max_len, + train_seq_len, tversky_membership, diff_attn) + self.mlp = MLP(dim, mlp_mult, group_size, 
activation, mlp_groups) + self.attn_scale = nn.Parameter(torch.ones(dim, dtype=torch.float32)) + self.mlp_scale = nn.Parameter(torch.ones(dim, dtype=torch.float32)) + self.resid_mix = nn.Parameter(torch.stack((torch.ones(dim), torch.zeros(dim))).float()) + self.smear = SmearModule(dim) if smear else None + def forward(self, x: Tensor, x0: Tensor) -> Tensor: + mix = self.resid_mix.to(dtype=x.dtype) + x = mix[0] * x + mix[1] * x0 + n = self.attn_norm(x) + x = x + self.attn_scale.to(dtype=x.dtype) * self.attn(n) + x = x + self.mlp_scale.to(dtype=x.dtype) * self.mlp(self.mlp_norm(x)) + if self.smear is not None: + x = self.smear(x) + return x + +class GPT(nn.Module): + def __init__(self, vocab_size, num_layers, model_dim, num_heads, num_kv_heads, mlp_mult, + tie_embeddings, tied_embed_init_std, logit_softcap, rope_base, qk_gain_init, + group_size: int = 64, activation: str = "swiglu", mtp_heads_count: int = 0, + embed_dim: int = 0, attn_proj_type: str = "standard", logit_head_type: str = "standard", + tversky_num_features: int = 16, tversky_feature_pools: int = 0, + training_depth_recurrence: int=1, fp_storage=False, bigram_hash: bool=False, + softcap_type: str="poly", no_cache: bool=False, + smear: bool=False, rope_type: str="rope", yarn_max_len: int=4096, + train_seq_len: int=1024, tversky_membership: str="sigmoid", + diff_attn=False, mlp_groups=0, refiner=False, refiner_kernel=3): + super().__init__() + self.training_depth_recurrence = training_depth_recurrence + self.fp_storage = fp_storage + self.tie_embeddings = tie_embeddings + self.logit_softcap = logit_softcap + self.softcap_type = softcap_type + self.embed_dim = embed_dim if embed_dim > 0 else model_dim + self.tok_emb = QATEmbedding(vocab_size, self.embed_dim, fp_storage=fp_storage) + self.bigram_emb = QATEmbedding(vocab_size, self.embed_dim, fp_storage=fp_storage) if bigram_hash else None + if self.bigram_emb is not None: + nn.init.zeros_(self.bigram_emb.weight) + self.lm_head_correction = nn.Parameter( + 
torch.zeros(vocab_size, self.embed_dim)) if tie_embeddings == 2 else None + self.embed_proj = QATLinear(self.embed_dim, model_dim, bias=False, fp_storage=fp_storage) if self.embed_dim != model_dim else None + self.embed_proj_rev = QATLinear(model_dim, self.embed_dim, bias=False, fp_storage=fp_storage) if ( + self.embed_dim != model_dim and logit_head_type != "tversky") else None + self.num_encoder_layers = num_layers // 2 + self.num_decoder_layers = num_layers - self.num_encoder_layers + self.num_skip_weights = min(self.num_encoder_layers, self.num_decoder_layers) + self.skip_weights = nn.Parameter(torch.ones(self.num_skip_weights, model_dim, dtype=torch.float32)) + # Shared Tversky feature pools (if enabled and num_features > 0) + if attn_proj_type == "tversky" and tversky_feature_pools > 0 and tversky_num_features > 0: + self.tversky_feature_pools_list = nn.ParameterList([ + nn.Parameter(torch.empty(tversky_num_features, model_dim).uniform_(-0.02, 0.02)) + for _ in range(tversky_feature_pools) + ]) + else: + self.tversky_feature_pools_list = None + self.blocks = nn.ModuleList([ + Block(model_dim, num_heads, num_kv_heads, mlp_mult, rope_base, qk_gain_init, + group_size, activation, attn_proj_type, tversky_num_features, tversky_feature_pools, + no_cache, smear, rope_type, yarn_max_len, train_seq_len, tversky_membership, + diff_attn, mlp_groups) + for _ in range(num_layers) + ]) + # Inject shared feature pool references into attention layers + if self.tversky_feature_pools_list is not None: + for i, block in enumerate(self.blocks): + pool_idx = (i * tversky_feature_pools) // num_layers + block.attn.shared_features = self.tversky_feature_pools_list[pool_idx] + self.final_norm = RMSNorm() + self.refiner = CausalConvRefiner(model_dim, kernel_size=refiner_kernel) if refiner else None + self.mtp_heads = nn.ModuleList([ + nn.Linear(model_dim, vocab_size, bias=False) for _ in range(mtp_heads_count) + ]) + for h in self.mtp_heads: + nn.init.zeros_(h.weight) + 
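The asymmetric split above (`num_encoder_layers = num_layers // 2`) is what produces the 7-encoder/8-decoder layout named in the title. A quick check of the arithmetic for the 15-layer config, assuming nothing beyond integer division:

```python
# Layer split used by the asymmetric U-Net (15-layer config from this run).
num_layers = 15
num_encoder_layers = num_layers // 2                            # 7
num_decoder_layers = num_layers - num_encoder_layers            # 8
num_skip_weights = min(num_encoder_layers, num_decoder_layers)  # 7

assert (num_encoder_layers, num_decoder_layers, num_skip_weights) == (7, 8, 7)
# The extra (8th) decoder layer runs without a skip connection; the forward
# pass guards the pop with `if skips:` so nothing is indexed out of range.
```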
self.logit_head_type = logit_head_type + if logit_head_type == "tversky" and tversky_num_features == 0 and vocab_size > 1024: + raise ValueError( + f"Tversky logit head with no-features mode creates O(V^2) = {vocab_size}x{vocab_size} " + f"matrix per forward pass. Use tversky_num_features > 0 or a smaller vocab." + ) + self.tversky_head = TverskyProjection( + model_dim, vocab_size, num_features=tversky_num_features, + membership=tversky_membership, + ) if logit_head_type == "tversky" else None + self.lm_head = QATLinear(model_dim, vocab_size, bias=False, fp_storage=fp_storage) + self.lm_head._zero_init = True + if self.lm_head is not None and (tie_embeddings or logit_head_type == "tversky"): + self.lm_head.weight.requires_grad_(False) + self.vocab_bias = nn.Parameter(torch.zeros(vocab_size, dtype=torch.float32)) + self._init_weights(tied_embed_init_std) + def _init_weights(self, tied_embed_init_std: float) -> None: + if self.tie_embeddings: + nn.init.normal_(self.tok_emb.weight, mean=0.0, std=tied_embed_init_std) + for module in self.modules(): + if isinstance(module, BinaryLinear) and not getattr(module, "_zero_init", False): + nn.init.normal_(module.weight, mean=0.0, std=0.02) + elif isinstance(module, nn.Linear) and getattr(module, "_zero_init", False): + nn.init.zeros_(module.weight) + def _compute_logits(self, x: Tensor) -> Tensor: + if self.tversky_head is not None: + logits_raw = self.tversky_head(x) + elif self.tie_embeddings: + if self.embed_proj_rev is not None: + proj = self.embed_proj_rev(x) + else: + proj = x + weight = self.tok_emb.weight + if self.lm_head_correction is not None: + weight = weight + self.lm_head_correction + logits_raw = F.linear(proj, weight.to(x.dtype)) + else: + logits_raw = self.lm_head(x) + return logits_raw + self.vocab_bias.to(x.dtype) + def _softcap(self, logits: Tensor) -> Tensor: + s = self.logit_softcap + if self.softcap_type == "tanh": + return s * torch.tanh(logits / s) + x_sc = torch.clamp(logits / s, -2.0, 2.0) + x2 = 
x_sc * x_sc + return s * torch.clamp(x_sc * (1.0 - x2 / 3.0 + x2 * x2 / 15.0), -1.0, 1.0) + def forward(self, input_ids: Tensor, target_ids: Tensor, reduction: str = "mean", temperature: float = 1.0) -> Tensor: + x = self.tok_emb(input_ids).float() + if self.bigram_emb is not None: + prev = F.pad(input_ids[:, :-1], (1, 0), value=0) + x = x + self.bigram_emb(prev).float() + if self.embed_proj is not None: + x = self.embed_proj(x) + x = F.rms_norm(x, (x.size(-1),)) + x0 = x + # U-Net style encoder/decoder with skip connections + skips = [] + for i in range(self.num_encoder_layers): + for _ in range(max(1, self.training_depth_recurrence)): + x = self.blocks[i](x, x0) + skips.append(x) + for i in range(self.num_decoder_layers): + bi = self.num_encoder_layers + i + if skips: + x = x + self.skip_weights[i].to(dtype=x.dtype) * skips.pop() + for _ in range(max(1, self.training_depth_recurrence)): + x = self.blocks[bi](x, x0) + x_normed = self.final_norm(x) + if self.refiner is not None: + x_normed = self.refiner(x_normed) + # Standard training/eval path + x_flat = x_normed.reshape(-1, x_normed.size(-1)) + targets = target_ids.reshape(-1) + logits = self._softcap(self._compute_logits(x_flat)) + # Calibration temperature scaling (used by find_temp during the temperature search) + if temperature != 1.0: + logits = logits / temperature + if reduction == "none": + return F.cross_entropy(logits.float(), targets, reduction="none").reshape(input_ids.shape) + # Fused CE + Z-loss: single logsumexp computation + logits_f = logits.float() + lse = torch.logsumexp(logits_f, dim=-1) + target_logits = logits_f.gather(1, targets.unsqueeze(1)).squeeze(1) + main_loss = (lse - target_logits).mean() + 1e-4 * (lse ** 2).mean() + # Multi-token prediction auxiliary loss (training only) + if self.training and len(self.mtp_heads) > 0: + mtp_loss = torch.zeros((), device=main_loss.device) + for k, head in enumerate(self.mtp_heads): + shift = k + 2 + if target_ids.shape[1] > shift: + mtp_tgt = target_ids[:, shift:].reshape(-1) + mtp_in = x_normed[:, :target_ids.shape[1] - shift, :].reshape(-1, x_normed.shape[-1]) + mtp_loss = mtp_loss +
F.cross_entropy(head(mtp_in).float(), mtp_tgt, reduction="mean") + main_loss = main_loss + 0.1 * mtp_loss / len(self.mtp_heads) + return main_loss + +# --------------------------------------------------------------------------- +# Validation +# --------------------------------------------------------------------------- +def build_luts(sp, vocab_size: int, device: torch.device): + sp_vocab_size = int(sp.vocab_size()) + table_size = max(sp_vocab_size, vocab_size) + base_bytes_np = np.zeros((table_size,), dtype=np.int16) + has_leading_space_np = np.zeros((table_size,), dtype=np.bool_) + is_boundary_token_np = np.ones((table_size,), dtype=np.bool_) + for token_id in range(sp_vocab_size): + if sp.is_control(token_id) or sp.is_unknown(token_id) or sp.is_unused(token_id): + continue + is_boundary_token_np[token_id] = False + if sp.is_byte(token_id): + base_bytes_np[token_id] = 1 + continue + piece = sp.id_to_piece(token_id) + if piece.startswith("\u2581"): + has_leading_space_np[token_id] = True + piece = piece[1:] + base_bytes_np[token_id] = len(piece.encode("utf-8")) + return ( + torch.tensor(base_bytes_np, dtype=torch.int16, device=device), + torch.tensor(has_leading_space_np, dtype=torch.bool, device=device), + torch.tensor(is_boundary_token_np, dtype=torch.bool, device=device), + ) + +def ld_val(pattern, seq_len, max_tok=int(os.environ.get("VAL_MAX_TOKENS", 500000))): + files = sorted(glob.glob(pattern)) + assert files, f"No files: {pattern}" + tok = torch.cat([ld_shard(Path(p)) for p in files]).contiguous() + if max_tok > 0: tok = tok[:max_tok + 1] + u = ((tok.numel() - 1) // seq_len) * seq_len + return tok[:u + 1] + +def eval_val(args, model, rank, world_size, device, grad_accum_steps, val_tokens, + base_bytes_lut, has_leading_space_lut, is_boundary_token_lut, temperature: float = 1.0): + local_batch_tokens = args.val_batch_size // (world_size * grad_accum_steps) + local_batch_seqs = max(1, local_batch_tokens // args.train_seq_len) + total_seqs = 
(val_tokens.numel() - 1) // args.train_seq_len + seq_start = (total_seqs * rank) // world_size + seq_end = (total_seqs * (rank + 1)) // world_size + loss_sum = torch.zeros((), device=device, dtype=torch.float64) + token_count = torch.zeros((), device=device, dtype=torch.float64) + byte_count = torch.zeros((), device=device, dtype=torch.float64) + model.eval() + with torch.inference_mode(): + for batch_start in range(seq_start, seq_end, local_batch_seqs): + batch_end = min(batch_start + local_batch_seqs, seq_end) + raw_start = batch_start * args.train_seq_len + raw_end = batch_end * args.train_seq_len + 1 + local = val_tokens[raw_start:raw_end].to(device=device, dtype=torch.int64) + x, y = local[:-1].reshape(-1, args.train_seq_len), local[1:].reshape(-1, args.train_seq_len) + with torch.autocast(device_type="cuda", dtype=torch.bfloat16): + batch_loss = model(x, y, temperature=temperature).detach() + n = float(y.numel()) + loss_sum += batch_loss.to(torch.float64) * n + token_count += n + prev_ids, tgt_ids = x.reshape(-1), y.reshape(-1) + tok_bytes = base_bytes_lut[tgt_ids].to(torch.int16) + tok_bytes += (has_leading_space_lut[tgt_ids] & ~is_boundary_token_lut[prev_ids]).to(torch.int16) + byte_count += tok_bytes.to(torch.float64).sum() + if dist.is_available() and dist.is_initialized(): + for t in (loss_sum, token_count, byte_count): + dist.all_reduce(t, op=dist.ReduceOp.SUM) + val_loss = loss_sum / token_count + bpb = (val_loss.item() / math.log(2.0)) * (token_count.item() / byte_count.item()) + model.train() + return float(val_loss.item()), float(bpb) + +def eval_val_sliding(args, model, rank, world_size, device, grad_accum_steps, val_tokens, + base_bytes_lut, has_leading_space_lut, is_boundary_token_lut, + stride: int = 64, temperature: float = 1.0): + seq_len = args.train_seq_len + batch_size = args.sliding_batch_size + total_tokens = val_tokens.numel() - 1 + loss_sum = torch.zeros((), device=device, dtype=torch.float64) + token_count = torch.zeros((), 
device=device, dtype=torch.float64) + byte_count = torch.zeros((), device=device, dtype=torch.float64) + all_starts = list(range(0, total_tokens - seq_len, stride)) + my_starts = all_starts[rank::world_size] + model.eval() + with torch.inference_mode(): + for i in range(0, len(my_starts), batch_size): + batch_starts = my_starts[i:i + batch_size] + starts_t = torch.tensor(batch_starts, dtype=torch.int64) + offsets = torch.arange(seq_len + 1, dtype=torch.int64) + indices = starts_t.unsqueeze(1) + offsets.unsqueeze(0) + local_batch = val_tokens[indices].to(device=device, dtype=torch.int64, non_blocking=True) + x = local_batch[:, :-1] + y = local_batch[:, 1:] + with torch.autocast(device_type="cuda", dtype=torch.bfloat16): + per_token_loss = model(x, y, reduction="none", temperature=temperature).detach() + for b, start in enumerate(batch_starts): + score_from = 0 if start == 0 else seq_len - stride + scored = per_token_loss[b, score_from:] + sx, sy = x[b, score_from:], y[b, score_from:] + loss_sum += scored.to(torch.float64).sum() + token_count += scored.numel() + tok_bytes = base_bytes_lut[sy].to(torch.int16) + tok_bytes += (has_leading_space_lut[sy] & ~is_boundary_token_lut[sx]).to(torch.int16) + byte_count += tok_bytes.to(torch.float64).sum() + if dist.is_available() and dist.is_initialized(): + for t in (loss_sum, token_count, byte_count): + dist.all_reduce(t, op=dist.ReduceOp.SUM) + val_loss = loss_sum / token_count + bpb = (val_loss.item() / math.log(2.0)) * (token_count.item() / byte_count.item()) + model.train() + return float(val_loss.item()), float(bpb) + +# --------------------------------------------------------------------------- +# Temperature scaling +# --------------------------------------------------------------------------- +def find_temp(args, base_model, rank, world_size, device, grad_accum_steps, + calibration_tokens, base_bytes_lut, has_leading_space_lut, + is_boundary_token_lut): + best_t, best_loss = 1.0, float("inf") + for t in [0.90, 0.95, 
1.00, 1.05, 1.10]: + loss, _ = eval_val(args, base_model, rank, world_size, device, grad_accum_steps, + calibration_tokens, base_bytes_lut, has_leading_space_lut, + is_boundary_token_lut, temperature=t) + if loss < best_loss: + best_loss = loss + best_t = t + return best_t + +# --------------------------------------------------------------------------- +# Training +# --------------------------------------------------------------------------- +def main() -> None: + args = Hyperparameters() + code = Path(__file__).read_text(encoding="utf-8") + if args.matrix_optimizer != "adamw": + global ns_orth + ns_orth = torch.compile(ns_orth) + distributed = "RANK" in os.environ and "WORLD_SIZE" in os.environ + rank = int(os.environ.get("RANK", "0")) + world_size = int(os.environ.get("WORLD_SIZE", "1")) + local_rank = int(os.environ.get("LOCAL_RANK", "0")) + grad_accum_steps = max(1, 8 // world_size) + grad_scale = 1.0 / grad_accum_steps + if not torch.cuda.is_available(): + raise RuntimeError("CUDA is required") + device = torch.device("cuda", local_rank) + torch.cuda.set_device(device) + if distributed: + dist.init_process_group(backend="nccl", device_id=device) + dist.barrier() + master_process = rank == 0 + torch.backends.cuda.matmul.allow_tf32 = True + torch.backends.cudnn.allow_tf32 = True + os.makedirs("logs/cuda/", exist_ok=True) + logfile = f"logs/cuda/{args.run_id}.txt" if master_process else None + if master_process: + print(logfile) + def log0(msg: str, console: bool = True) -> None: + if not master_process: + return + if console: + print(msg) + if logfile: + with open(logfile, "a", encoding="utf-8") as f: + print(msg, file=f) + log0(code, console=False) + log0("=" * 100, console=False) + log0(f"Python {sys.version}", console=False) + log0(f"PyTorch {torch.__version__}", console=False) + random.seed(args.seed) + np.random.seed(args.seed) + torch.manual_seed(args.seed) + torch.cuda.manual_seed_all(args.seed) + sp = 
spm.SentencePieceProcessor(model_file=args.tokenizer_path) + val_tokens = ld_val(args.val_files, args.train_seq_len) + base_bytes_lut, has_leading_space_lut, is_boundary_token_lut = build_luts( + sp, args.vocab_size, device) + + # --- Model --- + base_model = GPT( + vocab_size=args.vocab_size, num_layers=args.num_layers, model_dim=args.model_dim, + num_heads=args.num_heads, num_kv_heads=args.num_kv_heads, mlp_mult=args.mlp_mult, + tie_embeddings=args.tie_embeddings, tied_embed_init_std=args.tied_embed_init_std, + logit_softcap=args.logit_softcap, rope_base=args.rope_base, qk_gain_init=args.qk_gain_init, + group_size=args.bitnet_group_size, activation=args.activation_type, mtp_heads_count=args.mtp_heads_count, + embed_dim=args.embed_dim, attn_proj_type=args.attn_proj_type, logit_head_type=args.logit_head_type, + tversky_num_features=args.tversky_num_features, tversky_feature_pools=args.tversky_feature_pools, + training_depth_recurrence=args.training_depth_recurrence, fp_storage=args.fp_storage, + bigram_hash=args.bigram_hash, softcap_type=args.softcap_type, no_cache=(args.compile_mode == "reduce-overhead"), + smear=args.smear, rope_type=args.rope_type, yarn_max_len=args.yarn_max_len, train_seq_len=args.train_seq_len, + tversky_membership=args.tversky_membership, diff_attn=args.diff_attn, + refiner=args.refiner, refiner_kernel=args.refiner_kernel, mlp_groups=args.mlp_groups, + ).to(device).bfloat16() + for module in base_model.modules(): + if isinstance(module, nn.Linear): + module.float() + restore_low_dim_params_to_fp32(base_model) + if base_model.lm_head is not None and (args.tie_embeddings or args.logit_head_type == "tversky"): + base_model.lm_head.weight.requires_grad_(False) + torch._dynamo.config.optimize_ddp = False + compiled_model = torch.compile(base_model, mode=args.compile_mode if args.compile_mode != "default" else None) + use_find_unused = args.untie_at_fraction > 0 or args.mtp_heads_count > 0 or not args.tie_embeddings + model = DDP(compiled_model, 
device_ids=[local_rank], broadcast_buffers=False, + find_unused_parameters=use_find_unused, + static_graph=not use_find_unused, + gradient_as_bucket_view=True) if distributed else compiled_model + + # --- Optimizers --- + _excl = {"tok_emb.weight", "lm_head.weight", "lm_head_correction"} + all_other_params = [(n, p) for n, p in base_model.named_parameters() + if not any(eh in n for eh in _excl)] + matrix_params = [p for n, p in all_other_params + if p.ndim == 2 and not any(pat in n for pat in CTP)] + scalar_params = [p for n, p in all_other_params + if p.ndim < 2 or any(pat in n for pat in CTP)] + token_lr = args.tied_embed_lr if args.tie_embeddings else args.embed_lr + opt_tok = torch.optim.Adam( + [{"params": [base_model.tok_emb.weight], "lr": token_lr, "base_lr": token_lr}], + betas=(args.beta1, args.beta2), eps=args.adam_eps, fused=True) + if args.matrix_optimizer == "adamw": + opt_muon = torch.optim.AdamW( + [{"params": matrix_params, "lr": args.adam_lr, "base_lr": args.adam_lr}], + betas=(args.beta1, args.beta2), eps=args.adam_eps, weight_decay=args.adam_wd, fused=True) + else: + opt_muon = Muon(matrix_params, lr=args.matrix_lr, momentum=args.muon_momentum, + backend_steps=args.muon_backend_steps, wd=args.muon_wd) + for g in opt_muon.param_groups: + g["base_lr"] = args.matrix_lr + opt_scalar = torch.optim.Adam( + [{"params": scalar_params, "lr": args.scalar_lr, "base_lr": args.scalar_lr}], + betas=(args.beta1, args.beta2), eps=args.adam_eps, fused=True) + opt_head = torch.optim.Adam( + [{"params": [base_model.lm_head.weight], "lr": 0.0, "base_lr": 0.0}], + betas=(args.beta1, args.beta2), eps=args.adam_eps, fused=True) + optimizers = [opt for opt in [opt_tok, opt_muon, opt_scalar, opt_head] if opt is not None] + if base_model.lm_head_correction is not None: + opt_corr = torch.optim.Adam( + [{"params": [base_model.lm_head_correction], + "lr": args.corr_weight_lr, "base_lr": args.corr_weight_lr}], + betas=(args.beta1, args.beta2), eps=args.adam_eps, fused=True) 
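The optimizer split above keys off tensor rank: 2-D weight matrices go to the matrix optimizer (Muon or AdamW), everything else to plain Adam. A minimal sketch of that partition rule on a toy module (the script's additional exclusions, the `CTP` name patterns and the embedding/head weights, are omitted here):

```python
import torch.nn as nn

# Rank-2 weights -> matrix optimizer; 1-D params (biases, norm gains) -> Adam.
model = nn.Sequential(nn.Linear(8, 8, bias=True), nn.LayerNorm(8))
matrix_params = [p for _, p in model.named_parameters() if p.ndim == 2]
scalar_params = [p for _, p in model.named_parameters() if p.ndim < 2]

assert len(matrix_params) == 1   # the 8x8 Linear weight
assert len(scalar_params) == 3   # Linear bias + LayerNorm weight and bias
```

Orthogonalizing optimizers like Muon only make sense for matrix-shaped parameters, which is why the 1-D gains and biases are routed to Adam instead.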
+    optimizers.append(opt_corr)
+
+    # --- Log all hyperparameters ---
+    log0("--- Hyperparameters ---", console=False)
+    log0(" ".join(f"{a}={getattr(args,a)}" for a in sorted(dir(args)) if not a.startswith("_") and a not in ("train_files","val_files") and not callable(getattr(args,a))), console=False)
+    n_params = sum(p.numel() for p in base_model.parameters())
+    log0(f"params:{n_params} L:{args.num_layers} d:{args.model_dim} h:{args.num_heads} kv:{args.num_kv_heads} ws:{world_size} ga:{grad_accum_steps} s:{args.seed}")
+    # --- Data loader & helpers ---
+    train_loader = DistributedTokenLoader(args.train_files, rank, world_size, device)
+    def zero_grad_all():
+        for opt in optimizers:
+            opt.zero_grad(set_to_none=True)
+    max_wallclock_ms = 1000.0 * args.max_wallclock_seconds if args.max_wallclock_seconds > 0 else None
+    def lr_mul(step: int, elapsed_ms: float):
+        if args.warmdown_fraction <= 0:
+            return 1.0
+        if max_wallclock_ms is None:
+            warmdown_start = int(args.iterations * (1.0 - args.warmdown_fraction))
+            return max((args.iterations - step) / max(args.iterations * args.warmdown_fraction, 1), 0.0) if step >= warmdown_start else 1.0
+        warmdown_ms = max_wallclock_ms * args.warmdown_fraction
+        remaining_ms = max(max_wallclock_ms - elapsed_ms, 0.0)
+        return remaining_ms / max(warmdown_ms, 1e-9) if remaining_ms <= warmdown_ms else 1.0
+    _seq_switched = False
+    _batch_switched = False
+    active_seq_len = args.seq_len_start if args.seq_len_start > 0 else args.train_seq_len
+    active_batch_tokens = args.batch_tokens_start if args.batch_tokens_start > 0 else args.train_batch_tokens
+    # --- Compiler warmup ---
+    if args.warmup_steps > 0:
+        _ms = {n: t.detach().cpu().clone() for n, t in base_model.state_dict().items()}
+        _os = [copy.deepcopy(o.state_dict()) for o in optimizers]
+        model.train()
+        for ws in range(args.warmup_steps):
+            zero_grad_all()
+            for mi in range(grad_accum_steps):
+                if distributed: model.require_backward_grad_sync = mi == grad_accum_steps - 1
+                x, y = train_loader.next_batch(active_batch_tokens, active_seq_len, grad_accum_steps)
+                torch.compiler.cudagraph_mark_step_begin()
+                with torch.autocast(device_type="cuda", dtype=torch.bfloat16): loss = model(x, y)
+                (loss * grad_scale).backward()
+            for o in optimizers: o.step()
+            zero_grad_all()
+            log0(f"warmup:{ws+1}/{args.warmup_steps}")
+        base_model.load_state_dict(_ms, strict=True)
+        for o, s in zip(optimizers, _os): o.load_state_dict(s)
+        zero_grad_all()
+        train_loader = DistributedTokenLoader(args.train_files, rank, world_size, device)
+
+    # --- EMA model ---
+    ema_model = None
+    _ema_started = False
+    _ema_steps = 0
+    if args.ema:
+        ema_model = copy.deepcopy(base_model)
+        for p in ema_model.parameters():
+            p.requires_grad_(False)
+
+    # --- Main training loop ---
+    training_time_ms = 0.0
+    stop_after_step: int | None = None
+    _untied = False
+    train_loss = torch.zeros((), device=device)
+    torch.cuda.synchronize()
+    t0 = time.perf_counter()
+    step = 0
+    while True:
+        last_step = step == args.iterations or (stop_after_step is not None and step >= stop_after_step)
+        if last_step or (args.val_loss_every > 0 and step % args.val_loss_every == 0):
+            torch.cuda.synchronize()
+            training_time_ms += 1000.0 * (time.perf_counter() - t0)
+            val_loss, val_bpb = eval_val(args, model, rank, world_size, device, grad_accum_steps,
+                                         val_tokens, base_bytes_lut, has_leading_space_lut, is_boundary_token_lut)
+            log0(f"step:{step}/{args.iterations} val_loss:{val_loss:.4f} val_bpb:{val_bpb:.4f} "
+                 f"train_time:{training_time_ms:.0f}ms")
+            torch.cuda.synchronize()
+            t0 = time.perf_counter()
+        if last_step:
+            if stop_after_step is not None and step < args.iterations:
+                log0(f"stopping_early: wallclock_cap train_time:{training_time_ms:.0f}ms step:{step}/{args.iterations}")
+            break
+        elapsed_ms = training_time_ms + 1000.0 * (time.perf_counter() - t0)
+        scale = lr_mul(step, elapsed_ms)
+        # Sequence length schedule
+        if args.seq_len_start > 0 and not _seq_switched:
+            if max_wallclock_ms is not None:
+                should_switch_seq = elapsed_ms >= args.seq_schedule_fraction * max_wallclock_ms
+            else:
+                should_switch_seq = step >= int(args.iterations * args.seq_schedule_fraction)
+            if should_switch_seq:
+                active_seq_len = args.train_seq_len
+                _seq_switched = True
+                torch._dynamo.reset()
+                train_loader = DistributedTokenLoader(args.train_files, rank, world_size, device)
+                log0(f"step:{step} seq_len_switch:{args.seq_len_start}->{active_seq_len}")
+
+        # Batch size schedule
+        if args.batch_tokens_start > 0 and not _batch_switched:
+            if max_wallclock_ms is not None:
+                should_switch_batch = elapsed_ms >= args.batch_schedule_fraction * max_wallclock_ms
+            else:
+                should_switch_batch = step >= int(args.iterations * args.batch_schedule_fraction)
+            if should_switch_batch:
+                active_batch_tokens = args.train_batch_tokens
+                _batch_switched = True
+                log0(f"step:{step} batch_switch:{args.batch_tokens_start}->{active_batch_tokens}")
+        zero_grad_all()
+        train_loss.zero_()
+        for micro in range(grad_accum_steps):
+            if distributed:
+                model.require_backward_grad_sync = micro == grad_accum_steps - 1
+            x, y = train_loader.next_batch(active_batch_tokens, active_seq_len, grad_accum_steps)
+            torch.compiler.cudagraph_mark_step_begin()
+            with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
+                loss = model(x, y)
+            train_loss.add_(loss.detach())
+            (loss * grad_scale).backward()
+        train_loss /= grad_accum_steps
+
+        # Untie lm_head at configured fraction of training
+        if args.untie_at_fraction > 0:
+            if max_wallclock_ms is not None:
+                should_untie = not _untied and elapsed_ms >= args.untie_at_fraction * max_wallclock_ms
+            else:
+                should_untie = not _untied and step >= int(args.iterations * args.untie_at_fraction)
+            if should_untie and base_model.tie_embeddings:
+                with torch.no_grad():
+                    base_weight = base_model.tok_emb.weight.float()
+                    if base_model.lm_head_correction is not None:
+                        base_weight = base_weight + base_model.lm_head_correction.float()
+                    if base_model.embed_proj_rev is not None:
+                        full_weight = base_weight @ base_model.embed_proj_rev.weight.float()
+                    else:
+                        full_weight = base_weight
+                    base_model.lm_head.weight.copy_(full_weight)
+                base_model.tie_embeddings = False
+                base_model.lm_head.weight.requires_grad_(True)
+                for g in opt_head.param_groups:
+                    g["lr"] = g["base_lr"] = args.head_lr
+                _untied = True
+                torch._dynamo.reset()
+                log0(f"step:{step} untied lm_head (head_lr={args.head_lr})")
+
+        # Muon momentum warmup
+        if args.matrix_optimizer != "adam":
+            frac = min(step / args.muon_momentum_warmup_steps, 1.0) if args.muon_momentum_warmup_steps > 0 else 1.0
+            for g in opt_muon.param_groups:
+                g["momentum"] = (1 - frac) * args.muon_momentum_warmup_start + frac * args.muon_momentum
+
+        # LR scheduling
+        for opt in optimizers:
+            for g in opt.param_groups:
+                g["lr"] = g["base_lr"] * scale
+            opt.step()
+        zero_grad_all()
+        # EMA update
+        if ema_model is not None:
+            if not _ema_started:
+                if max_wallclock_ms is not None:
+                    should_start_ema = elapsed_ms >= args.ema_start_fraction * max_wallclock_ms
+                else:
+                    should_start_ema = step >= int(args.iterations * args.ema_start_fraction)
+                if should_start_ema:
+                    _ema_started = True
+                    _ema_steps = 0
+                    with torch.no_grad():
+                        for ep, bp in zip(ema_model.parameters(), base_model.parameters()):
+                            ep.data.copy_(bp.data)
+                    log0(f"step:{step} ema_started")
+            if _ema_started:
+                _ema_steps += 1
+                decay = min(args.ema_decay, (1.0 + _ema_steps) / (10.0 + _ema_steps))
+                with torch.no_grad():
+                    for ep, bp in zip(ema_model.parameters(), base_model.parameters()):
+                        ep.data.mul_(decay).add_(bp.data, alpha=1.0 - decay)
+        step += 1
+        approx_ms = training_time_ms + 1000.0 * (time.perf_counter() - t0)
+
+        if args.train_log_every > 0 and step % args.train_log_every == 0:
+            log0(f"step:{step}/{args.iterations} loss:{train_loss.item():.4f} t:{approx_ms:.0f}ms avg:{approx_ms/step:.1f}ms")
+        if args.churn_log_every > 0 and step % args.churn_log_every == 0:
+            log0(f"step:{step} churn:{churn_fn(base_model, args.bitnet_group_size):.4f}")
+        # Wallclock cap sync
+        if stop_after_step is None and max_wallclock_ms is not None and step % 10 == 0:
+            reached_cap = approx_ms >= max_wallclock_ms
+            if distributed:
+                cap_t = torch.tensor(int(reached_cap), device=device)
+                dist.all_reduce(cap_t, op=dist.ReduceOp.MAX)
+                reached_cap = bool(cap_t.item())
+            if reached_cap:
+                stop_after_step = step
+
+    # --- Serialization ---
+    if master_process:
+        sd = (ema_model if ema_model is not None and _ema_started else base_model).state_dict()
+        if base_model.tie_embeddings or args.logit_head_type == "tversky":
+            sd.pop("lm_head.weight", None)
+
+        # Compute binary overrides for no-features Tversky prototypes
+        binary_overrides = set()
+        for n, m in base_model.named_modules():
+            if isinstance(m, TverskyProjection) and m.no_features_mode:
+                binary_overrides.add(n + ".prototypes")
+        binary_overrides = binary_overrides or None
+        q_obj, q_stats = q_sd(sd, group_size=args.bitnet_group_size, fp_storage=args.fp_storage, binary_override_names=binary_overrides)
+        buf = io.BytesIO()
+        torch.save(q_obj, buf)
+        final_blob = lzma.compress(buf.getvalue(), preset=9)
+        with open("final_model.binary.ptz", "wb") as f:
+            f.write(final_blob)
+        artifact_bytes = len(final_blob)
+        code_bytes = len(code.encode("utf-8"))
+        total = artifact_bytes + code_bytes
+        log0(f"artifact:{artifact_bytes/1e6:.2f}MB binary:{q_stats['binary_params']}({q_stats['binary_bytes']}B) fp:{q_stats['fp_params']}({q_stats['fp_bytes']}B) code:{code_bytes}")
+        log0(f"budget:{total}/{16000000} ({total/1e6:.2f}/{16.00:.2f}MB) {'FITS' if total <= 16000000 else 'OVER'}")
+    if args.eval_depth_recurrence > 0:
+        base_model.training_depth_recurrence = args.eval_depth_recurrence
+        log0(f"eval_depth_recurrence:{args.eval_depth_recurrence}")
+
+    # --- All ranks load roundtrip weights and evaluate ---
+    if distributed:
+        dist.barrier()
+    with open("final_model.binary.ptz", "rb") as f:
+        loaded = torch.load(io.BytesIO(lzma.decompress(f.read())), map_location="cpu", weights_only=False)
+    base_model.load_state_dict(deq_sd(loaded), strict=False)
+    if ema_model is not None:
+        ema_model.load_state_dict(deq_sd(loaded), strict=False)
+    torch._dynamo.reset()
+    q_val_loss, q_val_bpb = eval_val(args, model, rank, world_size, device, grad_accum_steps,
+                                     val_tokens, base_bytes_lut, has_leading_space_lut, is_boundary_token_lut)
+    log0(f"final_binary_roundtrip val_loss:{q_val_loss:.4f} val_bpb:{q_val_bpb:.4f}")
+
+    opt_temp = 1.0
+    if args.temp_scaling:
+        torch.cuda.synchronize()
+        t_temp = time.perf_counter()
+        calibration_tokens = train_loader.stream.take(65536).to(device)
+        opt_temp = find_temp(args, base_model, rank, world_size, device, grad_accum_steps,
+                             calibration_tokens, base_bytes_lut, has_leading_space_lut,
+                             is_boundary_token_lut)
+        torch.cuda.synchronize()
+        temp_time_ms = 1000.0 * (time.perf_counter() - t_temp)
+        log0(f"temp_scaling optimal_T:{opt_temp:.2f} eval_time:{temp_time_ms:.0f}ms")
+
+    if args.sliding_eval:
+        torch.cuda.synchronize()
+        t_sliding = time.perf_counter()
+        sw_loss, sw_bpb = eval_val_sliding(args, base_model, rank, world_size, device, grad_accum_steps,
+                                           val_tokens, base_bytes_lut, has_leading_space_lut,
+                                           is_boundary_token_lut, stride=args.sliding_eval_stride,
+                                           temperature=opt_temp)
+        torch.cuda.synchronize()
+        sliding_time_ms = 1000.0 * (time.perf_counter() - t_sliding)
+        log0(f"final_sliding val_loss:{sw_loss:.4f} val_bpb:{sw_bpb:.4f} "
+             f"(stride={args.sliding_eval_stride}, T={opt_temp:.2f}) eval_time:{sliding_time_ms:.0f}ms")
+
+    if distributed:
+        dist.destroy_process_group()
+
+
+if __name__ == "__main__":
+    main()

From 5b0266d23fdf53f36f6bc5da44aa6d7cb0d61f65 Mon Sep 17 00:00:00 2001
From: Ciprian-Florin Ifrim <94687473+CiprianFlorin-Ifrim@users.noreply.github.com>
Date: Tue, 24 Mar 2026 20:21:00 +0000
Subject: [PATCH 2/2] Updated README.md for Non-record submission.
---
 .../README.md | 16 ++++++++--------
 1 file changed, 8 insertions(+), 8 deletions(-)

diff --git a/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/README.md b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/README.md
index 2f7f235d78..e121c6c254 100644
--- a/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/README.md
+++ b/records/track_non_record_16mb/2026-03-24_106M_Binary_Asymmetric_UNet_FP8_15L_8192BPE_YaRN_NeoMuon_Smear/README.md
@@ -44,21 +44,21 @@ The results document linked here and in my repo showcases all methods and sweeps
 ### Architecture
 - **Binary quantisation:** 1 bit/param packs 60% more parameters per MB than ternary (1.6 bits/param), allowing 15 layers vs 10 within similar budget
-- **4x relu² MLP:** same as ternary — relu² strictly dominates relu; 4x width outperforms 3x even with fewer layers at matched budget
+- **4x relu² MLP:** relu² strictly dominates relu; 4x width outperforms 3x even with fewer layers at matched budget
 - **SmearGate:** blends each position with causal cumulative mean; adds 22ms/step overhead but provides -0.007 bpb at scale. Viable here because the run is not wallclock-constrained
 
 ### Training
-- **NeoMuon** with 3 Newton-Schulz steps: same optimizer as ternary, effective for binary STE as well
-- **50,000 steps unconstrained:** binary converges slower than ternary — at 4,000 steps (the 10-minute equivalent) binary lags by 0.025 bpb. Extended training closes the gap and surpasses ternary
-- **524k batch tokens:** same optimal batch size as ternary
+- **NeoMuon** optimizer with 3 Newton-Schulz steps
+- **50,000 steps unconstrained:** binary converges slower than ternary (see my other submission, #640): at 4,000 steps (the 10-minute equivalent) binary lags by 0.025 bpb. Extended training closes the gap and surpasses ternary, showing that with unlimited compute these models can be quite powerful.
+- **524k batch tokens** per step
 
 ### Evaluation
-- **Temperature scaling (T=0.90):** same auto-calibrated grid as ternary
-- **Sliding window (stride=16):** same evaluation protocol
+- **Temperature scaling (T=0.90):** auto-calibrated via grid search
+- **Sliding window (stride=16):** evaluation protocol used for the reported val_bpb
 
 ### Compression
 - **Bit-packing + LZMA (preset=9):** binary weights pack at exactly 1 bit/param before LZMA entropy coding
-- **FP8 QAT (e4m3):** same as ternary for non-binary parameters. Clean roundtrip — binary has no zero state, so `mean(|Q|)=1.0` always; no shrinkage correction needed
+- **FP8 QAT (e4m3):** for non-binary parameters. Clean roundtrip: binary has no zero state, so `mean(|Q|)=1.0` always; no shrinkage correction needed
 - **No EMA:** despite clean binary roundtrip math, EMA still hurts quality by 0.03 bpb in practice
 
 ## Setup and Run
@@ -159,4 +159,4 @@ OMP_NUM_THREADS=1 torchrun --standalone --nproc_per_node=8 train_gpt_cuda_binary
 - [x] No test-time training on validation data
 - [x] No network calls during evaluation
 - [x] No external compute
-- [ ] Train time <=600s — **non-record submission** (7,763s / 50,000 steps)
+- [x] Train time: **non-record submission** (7,763s / 2.2h / 50,000 steps)
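
The "Bit-packing + LZMA (preset=9)" compression described above can be sketched in isolation. This is a minimal standalone illustration of the idea, not the submission's actual `q_sd`/`deq_sd` serialization code; `pack_binary` and `unpack_binary` are hypothetical helper names, and the random stand-in weights are used only to show the exact roundtrip.

```python
# Minimal sketch: store {-1, +1} weights as sign bits (1 bit/param),
# then entropy-code the packed bytes with LZMA preset=9.
import lzma
import numpy as np

def pack_binary(w: np.ndarray) -> bytes:
    """Pack a {-1, +1} array into 1 bit per parameter."""
    bits = (w.ravel() > 0).astype(np.uint8)  # +1 -> 1, -1 -> 0
    return np.packbits(bits).tobytes()

def unpack_binary(blob: bytes, n: int) -> np.ndarray:
    """Recover the first n {-1, +1} values from a packed bit blob."""
    bits = np.unpackbits(np.frombuffer(blob, dtype=np.uint8))[:n]
    return bits.astype(np.float32) * 2.0 - 1.0  # 1 -> +1, 0 -> -1

rng = np.random.default_rng(42)
w = rng.choice([-1.0, 1.0], size=106_000).astype(np.float32)  # stand-in weights
packed = pack_binary(w)                 # 106_000 / 8 = 13_250 bytes
blob = lzma.compress(packed, preset=9)  # entropy coding, as for the artifact
restored = unpack_binary(lzma.decompress(blob), w.size)
assert np.array_equal(restored, w)      # exact roundtrip: no zero state to lose
```

Random signs carry a full bit of entropy each, so LZMA gains nothing on this toy array; any further savings on the real artifact come from residual structure in the trained tensors and the non-binary FP8 parameters stored alongside them.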