Add a new plugin to provide IMagicProvider#5

Merged
Zsailer merged 6 commits into jupyter-ai-contrib:main from jtpio:magic-provider
Feb 11, 2025

Conversation

@jtpio (Member) commented Feb 7, 2025

This should help with #4

Currently the extension relies on making a request to /api/ai/magic.

This PR moves the request part to a separate plugin, so it can be more easily swapped out by other extensions.
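As a rough sketch of the pattern this PR introduces (the actual `IMagicProvider` token and its signature live in the extension and may differ; the interface, class, and field names below are illustrative assumptions), the request logic can be wrapped in a provider object that other extensions can replace:

```typescript
// Hypothetical shape of the provider contract; the real IMagicProvider
// token exported by the extension may look different.
interface IMagicProvider {
  // Send the magic request and resolve with the model's reply.
  requestMagic(prompt: string): Promise<string>;
}

// Default provider: delegates to the server extension endpoint,
// mirroring the previously hard-coded POST to /api/ai/magic.
class ServerMagicProvider implements IMagicProvider {
  constructor(private baseUrl: string) {}

  async requestMagic(prompt: string): Promise<string> {
    const response = await fetch(`${this.baseUrl}/api/ai/magic`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ prompt })
    });
    if (!response.ok) {
      throw new Error(`Magic request failed: ${response.status}`);
    }
    // Assumed response shape: { reply: string }.
    return (await response.json()).reply;
  }
}
```

With the request behind a token like this, a different extension can provide its own implementation and the rest of the frontend stays unchanged.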

github-actions bot commented Feb 7, 2025

Binder: Launch a Binder on branch jtpio/jupyterlab-magic-wand/magic-provider

@jtpio jtpio changed the title Add a new plugin to provide a IMagicProvider Add a new plugin to provide IMagicProvider Feb 7, 2025
@Zsailer (Collaborator) commented Feb 7, 2025

This is great! Thank you @jtpio.

This makes sense to me.

For context, I'm hoping to make the backend more general to other "intelligence" features. The design here is straightforward but powerful: the client POSTs to the server, which has access to agents and models. The Jupyter event that comes back is a list of JupyterLab commands that drive the UI. In summary, "natural language prompt to JupyterLab commands". JupyterLab commands just need to be passed to the LLM as context, and this simple bridge can be used by all sorts of JupyterLab activities.

What I'd really like to do is offer a way for JupyterLab plugins to register (1) JupyterLab commands and (2) custom input schemas with server-side agents that will be passed to models as context. That way, any plugin can easily register the new UX it brings and offer it to the agents and LLMs for use.

I say all this to say, this API will probably change a bit going forward. As long as you're okay with this being a little experimental for now, I think exposing this token is great!
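The "prompt to JupyterLab commands" bridge described above might produce a payload along these lines. The field names and event shape are illustrative guesses, not the extension's actual schema; the command ids are standard JupyterLab notebook commands, though a model could emit any registered command:

```typescript
// Illustrative shapes only: the real Jupyter event schema may differ.
interface UICommand {
  id: string;                      // a JupyterLab command id
  args?: Record<string, unknown>;  // arguments passed to that command
}

interface MagicEvent {
  commands: UICommand[];
}

// Example event: the model asks the frontend to insert, fill, and run a cell.
const event: MagicEvent = {
  commands: [
    { id: 'notebook:insert-cell-below' },
    { id: 'notebook:replace-selection', args: { text: 'print("hello")' } },
    { id: 'notebook:run-cell' }
  ]
};

// The frontend would then execute each command in order, roughly:
//   for (const { id, args } of event.commands) {
//     await app.commands.execute(id, args);
//   }
```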

@jtpio (Member, Author) commented Feb 10, 2025

Thanks @Zsailer, sounds good!

> As long as you're okay with this being a little experimental for now, I think exposing this token is great!

I think it's fine, all of this is still in early stages anyway. The main motivation is to avoid depending on a specific backend to enable the AI features, for example:

  • leveraging local LLMs, for example ChromeAI in the browser or a local Ollama instance directly
  • reusing almost all of the frontend plugins and components while letting folks provide a different server extension (or use a different server, for example jupyverse), if they want to
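As a sketch of the first bullet, an alternative extension could provide the same contract backed by a local model instead of the server extension. Everything here is an assumption for illustration: the interface mirrors the hypothetical provider shape above, and the request follows Ollama's documented `/api/generate` endpoint, but none of this is the extension's actual code:

```typescript
// Same hypothetical contract as the server-backed provider, so the
// rest of the frontend does not need to change.
interface IMagicProvider {
  requestMagic(prompt: string): Promise<string>;
}

// Illustrative local provider talking to an Ollama instance.
class LocalOllamaProvider implements IMagicProvider {
  constructor(
    private endpoint = 'http://localhost:11434/api/generate',
    private model = 'llama3'
  ) {}

  async requestMagic(prompt: string): Promise<string> {
    // Ollama's generate API accepts { model, prompt, stream } and, with
    // stream: false, returns a JSON body with a `response` field.
    const response = await fetch(this.endpoint, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ model: this.model, prompt, stream: false })
    });
    if (!response.ok) {
      throw new Error(`Local model request failed: ${response.status}`);
    }
    return (await response.json()).response;
  }
}
```

A plugin providing this class for the token would let the magic features run entirely against a local model, with no dedicated server extension involved.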

@Zsailer (Collaborator) commented Feb 10, 2025

This is great @jtpio. If you would like to take this out of draft state, I can review, merge, and release.

@jtpio (Member, Author) commented Feb 10, 2025

Sure, I pushed a couple more changes. This is the kind of thing that is still likely to change quickly anyway, so happy to iterate on it.

Also feel free to make some edits to the PR directly if needed.

@jtpio jtpio marked this pull request as ready for review February 10, 2025 20:33
@Zsailer Zsailer added the enhancement New feature or request label Feb 11, 2025
@Zsailer Zsailer merged commit 0901abd into jupyter-ai-contrib:main Feb 11, 2025
5 checks passed
@jtpio jtpio deleted the magic-provider branch February 11, 2025 07:08