RUN.CU

Caution

This tool is currently under heavy development.

Motivation

In the process of writing CUDA code, I often found my local GPU resources insufficient for the tasks at hand. While renting cloud GPU instances might seem like a viable solution, it can lead to unnecessary costs, especially when the instances remain active for longer durations than required.

To address this issue, I developed this tool, which allows users to submit their local CUDA files to the cloud for compilation and execution. The main advantage of this tool is that it spins up an instance only when needed, optimizing the pay-as-you-go model. By using this tool, users can efficiently run their CUDA programs on powerful remote GPUs without the financial burden of maintaining a cloud instance when it is not in use.

Installation

git clone https://github.com/junaire/run.cu
cd run.cu
pip install --user .

Usage

To use this tool, you must have a .rcc.toml file in your home directory containing the following credentials:

[credentials.autodl]
username = 15012341234
password = "XXXXXXX"

You also need to create at least one instance in the AutoDL console.

Syntax

Compile and run a local CUDA file

rcc examples/sgemm.cu

Compile and run a remote CUDA file via URL

rcc https://raw.githubusercontent.com/junaire/run.cu/master/examples/sgemm.cu

Pass arguments to the executable

rcc examples/sgemm.cu --args 1 2 3

Pass flags to the compilation process; note that they must come last

rcc examples/gemm.cu --args 2048 1024 512 --flags -lcublas
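To illustrate how `--args` reach your program: below is a minimal sketch of a CUDA file you could submit. The file name `add.cu` is hypothetical (it is not one of the repository's examples); the point is that arguments passed via `--args` arrive in `main` as ordinary `argv` entries on the remote machine.

```cuda
// add.cu — hypothetical example: vector addition whose problem size
// comes from the command line, e.g. `rcc add.cu --args 1000000`.
#include <cstdio>
#include <cstdlib>

__global__ void add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main(int argc, char **argv) {
    // The value forwarded by `--args` shows up here; default to 2^20.
    int n = argc > 1 ? std::atoi(argv[1]) : 1 << 20;
    size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    add<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f (n = %d)\n", c[0], n);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Running `rcc add.cu --args 1000000` would then compile this file with nvcc on the remote instance and execute it with `1000000` as its first argument.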

About

Compile & run a single CUDA file on the cloud GPUs
