@turboderp I noticed you guys had some trouble running our inference code. The main blocker is usually installing fast-hadamard-transform due to a ninja issue. The way I set up the environment is:
- Create a new conda env
- Install torch and ninja=1.11.1 (not 1.11.4 or whatever the latest is)
- Clone fast-hadamard-transform, cd into it, and run `python setup.py install` (alternatively, `pip install -e .` also works)
- Install the QTIP kernels: cd into the qtip-kernels folder and run `python setup.py install`
- Install everything else in requirements.txt
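The steps above can be sketched as a shell session. This is a minimal sketch, not an official script: the env name, Python version, and repository URL are assumptions not stated in the comment.

```shell
# Sketch of the setup steps above. Env name, Python version, and the
# fast-hadamard-transform URL are assumptions, not from the original comment.
conda create -n qtip python=3.10 -y
conda activate qtip

# Pin ninja to 1.11.1; the author reports newer versions break the
# fast-hadamard-transform build
pip install torch ninja==1.11.1

# Build fast-hadamard-transform from source
git clone https://github.com/Dao-AILab/fast-hadamard-transform
cd fast-hadamard-transform
python setup.py install   # or: pip install -e .
cd ..

# Build the QTIP inference kernels
cd qtip-kernels
python setup.py install
cd ..

# Install the remaining Python dependencies
pip install -r requirements.txt
```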
To generate text with our kernels, run `interactive_gen.py` with the commands in the repo. To evaluate perplexity and zeroshot performance, use the scripts in the eval folder.
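For reference, a hedged sketch of the usage step; the exact flags for `interactive_gen.py` and the script names under eval/ are in the repo, not guessed here.

```shell
# Interactive text generation (see the repo's README for the exact
# model arguments; none are assumed here)
python interactive_gen.py

# Perplexity and zero-shot evaluation scripts live under eval/
ls eval/
```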
Also, feel free to reach out to me ([email protected]) if you have any questions about QTIP. We're excited to see our stuff adopted in bigger projects!