Improve environment setup & fix path errors & exec #12

Closed
30 changes: 21 additions & 9 deletions README.md
@@ -9,13 +9,23 @@ See more details in our [blog post](https://blog.openai.com/better-language-mode
## Installation

Download the model data (needs [gsutil](https://cloud.google.com/storage/docs/gsutil_install)):
+```bash
+./download_model.sh 117M
```
-sh download_model.sh 117M

Create virtual environment with [miniconda](https://conda.io/projects/conda/en/latest/user-guide/install/index.html#regular-installation):
+```bash
+conda create -y -n gpt-2 python=3.6
+```

-Install python packages:
+And activate it:
+```bash
+conda activate gpt-2
+```
-pip3 install -r requirements.txt

+Then install python packages:
+```bash
+pip install -r requirements.txt
+```

## Unconditional sample generation
@@ -24,25 +34,27 @@ pip3 install -r requirements.txt
| --- |

To generate unconditional samples from the small model:
+```bash
+python generate_unconditional_samples.py | tee samples
```
-python3 src/generate_unconditional_samples.py | tee samples
-```

There are various flags for controlling the samples:
-```
-python3 src/generate_unconditional_samples.py --top_k 40 --temperature 0.7 | tee samples
+```bash
+python generate_unconditional_samples.py --top_k 40 --temperature 0.7 | tee samples
+```
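The `--top_k` and `--temperature` flags above correspond to the usual sampling transforms: temperature rescales the logits, and top-k masks everything outside the k most likely tokens before sampling. The repository's real implementation is TensorFlow code in `src/sample.py`; the pure-Python sketch below is illustrative only, and its function name and signature are invented for this example.

```python
import math
import random

def sample_logits(logits, top_k=0, temperature=1.0, rng=random.random):
    """Sample an index from raw logits after temperature scaling and
    top-k filtering. top_k=0 means no truncation, mirroring the
    convention used by src/sample.py."""
    scaled = [l / temperature for l in logits]
    if top_k > 0:
        # Keep only the k largest logits; mask the rest out entirely.
        cutoff = sorted(scaled, reverse=True)[top_k - 1]
        scaled = [l if l >= cutoff else float("-inf") for l in scaled]
    # Softmax over the surviving logits (max-subtraction for stability;
    # exp(-inf) cleanly evaluates to 0.0).
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling from the resulting distribution.
    r, acc = rng(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1
```

With `--top_k 40 --temperature 0.7`, as in the README example, the distribution is both truncated and sharpened, which tends to trade diversity for coherence.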

While we have not yet released GPT-2 itself, you can see some unconditional samples from it (with default settings of temperature 1 and no truncation) in `gpt2-samples.txt`.

## Conditional sample generation

To give the model custom prompts, you can use:
-```
-python3 src/interactive_conditional_samples.py
+```bash
+python interactive_conditional_samples.py
+```

## Future work

We may release code for evaluating the models on various benchmarks.

We are still considering release of the larger models.

1 change: 1 addition & 0 deletions download_model.sh
100644 → 100755
@@ -1,3 +1,4 @@
+#!/usr/bin/env bash
model=$1

mkdir -p models/$model
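The mode change to 100755 together with the new shebang line is what makes the README's `./download_model.sh 117M` invocation work: the kernel needs the executable bit to run the file directly, and the shebang tells it which interpreter to use. A throwaway demonstration of that mechanism, using a hypothetical temp script (assumes a POSIX shell and an exec-permitted temp directory):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Create a throwaway script with a shebang, as download_model.sh now has.
tmp=$(mktemp)
printf '#!/usr/bin/env bash\necho ready\n' > "$tmp"

# Without the executable bit (the old 100644 mode), direct invocation fails...
chmod 644 "$tmp"
"$tmp" 2>/dev/null && echo "unexpected" || echo "not executable yet"

# ...and after chmod +x (the 100644 -> 100755 change), it runs directly:
chmod +x "$tmp"
out=$("$tmp")
echo "$out"
rm -f "$tmp"
```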
2 changes: 0 additions & 2 deletions src/generate_unconditional_samples.py → generate_unconditional_samples.py
100755 → 100644
@@ -1,5 +1,3 @@
-#!/usr/bin/env python3
-
import fire
import json
import os
2 changes: 0 additions & 2 deletions src/interactive_conditional_samples.py → interactive_conditional_samples.py
100755 → 100644
@@ -1,5 +1,3 @@
-#!/usr/bin/env python3
-
import fire
import json
import os
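Conversely, the shebang lines (and the 100755 bits) can be dropped from the two sample scripts because the updated README runs them through the interpreter explicitly (`python generate_unconditional_samples.py`), in which case the kernel never reads a shebang and the executable bit is irrelevant. A small sketch of that, assuming `python3` is on `PATH`:

```shell
#!/usr/bin/env bash
set -euo pipefail

# A hypothetical .py file with no shebang and no executable bit...
tmp=$(mktemp)
printf 'print("hello from plain .py")\n' > "$tmp"
chmod 644 "$tmp"

# ...still runs fine when handed to the interpreter explicitly:
out=$(python3 "$tmp")
echo "$out"
rm -f "$tmp"
```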
Empty file added src/__init__.py
Empty file.
2 changes: 1 addition & 1 deletion src/sample.py
@@ -1,6 +1,6 @@
import tensorflow as tf

-import model
+from src import model

def top_k_logits(logits, k):
if k == 0:
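Together with the new empty `src/__init__.py`, this import change keeps `model` resolvable once the entry scripts run from the repository root: the root directory is on `sys.path`, so `src` becomes an importable package, while a bare `import model` no longer finds anything. A self-contained sketch of those mechanics using a throwaway package (the directory layout and `NAME` constant here are invented for illustration):

```python
import os
import sys
import tempfile

# Build a miniature version of the new layout: repo root on sys.path,
# model.py living inside a src/ package (src/__init__.py makes it one).
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "src"))
open(os.path.join(root, "src", "__init__.py"), "w").close()
with open(os.path.join(root, "src", "model.py"), "w") as f:
    f.write("NAME = 'model'\n")
sys.path.insert(0, root)

from src import model          # the new package-qualified import resolves
print(model.NAME)

try:
    import model               # the old bare import no longer does
except ImportError as e:
    print("bare import fails:", type(e).__name__)
```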