
Conversation

pacwoodson

Adding support for the AI-Mask extension as a local model provider, in addition to Ollama

AI-Mask is a wrapper on top of libraries such as web-llm and transformers.js, and it enables executing models directly in the browser. The extension caches the models and provides them to whatever web app needs them, like this one.

A live demo of this is available here (install the extension first).
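
For context, here is a minimal sketch of how a web page might request a completion through the extension. This is a hypothetical illustration: the `@ai-mask/sdk` package name, the `AIMaskClient` class, its methods, and the model id are assumptions inferred from the description above, not verified from this PR.

```ts
// Hypothetical sketch -- the package, class, and method names below are
// assumptions, not confirmed from this PR. The idea: the extension downloads
// and caches the model once, and any web app talks to it through a thin
// client instead of bundling web-llm or transformers.js itself.
import { AIMaskClient } from '@ai-mask/sdk'

async function generateLocally(prompt: string): Promise<string> {
  // Verify the AI-Mask extension is installed before trying to use it.
  if (!(await AIMaskClient.isExtensionAvailable())) {
    throw new Error('AI-Mask extension not installed')
  }

  const client = new AIMaskClient()

  // Ask the extension to run a chat completion on a locally cached model.
  return client.chat(
    { messages: [{ role: 'user', content: prompt }] },
    { modelId: 'gemma-2b-it-q4f32_1' }, // hypothetical model id
  )
}
```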

@pacwoodson pacwoodson changed the title Local, in-browser inference with AI-Mask In-browser inference with AI-Mask Mar 22, 2024
@mckaywrigley
Owner

Looks interesting but going to hold off for now.

If this project becomes more popular we'll certainly reconsider support!

@pacwoodson
Author

I was hoping to make AI-Mask better known by integrating it into some great open-source projects that would benefit from it, like this one.

Could you elaborate on what's keeping you from integrating it? Do you have specific requirements it would need to meet?
