[IMPORTANT] torchchat sunset #1543

Open
Jack-Khuu opened this issue May 20, 2025 · 3 comments

@Jack-Khuu
Contributor

As of May 19th 2025, we are halting active development on torchchat.

The original intent of torchchat was to both demonstrate how to run LLM inference using PyTorch and improve the performance and functionality of the entire PyTorch ecosystem.

Since torchchat’s launch, we’ve seen vLLM become the dominant player for server-side LLM inference. We’re ecstatic to have vLLM join the PyTorch Ecosystem and recommend folks use them for hosting LLMs in server production environments. Given the growth of vLLM and others, we do not see the need to maintain an active demonstration of how to run LLM inference using PyTorch.

We are very proud of the performance and functionality improvements we saw in the PyTorch ecosystem over the last year, including:

  • LLM inference performance increased by multiples on every device we support (CUDA, CPU, MPS, ARM, etc.)
  • Working code demonstrated how to run LLM inference in all the major execution modes (Eager, Compile, AOTI, and ET), giving users a starting point for using PyTorch for LLM inference from servers to embedded devices and everything in between
  • Quantization expanded to support the most popular schemes and bit sizes
  • torchchat became the testing ground for new advancements (experimental torchao kernels, MPS compile, AOTI packaging)

There’s still plenty of exciting work to do across the LLM Inference space and PyTorch will stay invested in improving things.
We appreciate and thank everyone in the community for all that you’ve contributed.

Thanks to our contributors:
@mikekgfb @Jack-Khuu @metascroy @malfet @larryliu0820 @kirklandsign @swolchok @vmpuri @kwen2501 @Gasoonjia @orionr @guangy10 @byjlw @lessw2020 @mergennachin @GregoryComer @shoumikhin @kimishpatel @manuelcandales @lucylq @desertfire @gabe-l-hart @seemethere @iseeyuan @jerryzh168 @leseb @yanbing-j @mreso @fduwjj @Olivia-liu @angelayi @JacobSzwejbka @ali-khosh @nlpfollower @songhappy @HDCharles @jenniew @silverguo @zhenyan-zhang-meta @ianbarber @dbort @kit1980 @mcr229 @georgehong @krammnic @xuedinge233 @anirudhs001 @shreyashah1903 @soumith @TheBetterSolution @codereba @jackzhxng @KPCOFGS @kuizhiqing @kartikayk @nobelchowdary @mike94043 @vladoovtcharov @prideout @sanchitintel @cbilgin @jeffdaily @infil00p @msaroufim @zhxchen17 @vmoens @wjunLu

-PyTorch Team

@Jack-Khuu Jack-Khuu self-assigned this May 20, 2025
@Gasoonjia
Contributor

It was my pleasure to contribute to such a wonderful project. Hoping to see new things in the future!

@KPCOFGS
Contributor

KPCOFGS commented May 20, 2025

Likewise, I am very glad that I could contribute something to open-source projects like this one.

3 participants