Fix broken Vega/Altair plot rendering. #47

Conversation
I looked a bit closer into this, and the situation is messy: JupyterLab writes the vega-lite MIME type as regular JSON, while nteract writes it as a stringified JSON. JupyterLab seems to be able to open and properly display either style; nteract can only open and display the regular-JSON form (yes, nteract cannot open a file that it wrote itself, because it writes stringified JSON but can't read it back). My sense is that the ideal implementation would probably a) write regular embedded JSON, like JupyterLab, and b) be able to read either style, given that there are probably notebooks out there by now that have been written by different front ends.

For a very simple Julia notebook that just creates one vega-lite plot, the file written by JupyterLab looks like this (what I called regular JSON):

{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"using VegaLite"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"application/vnd.vegalite.v4+json": {
"data": {
"values": [
{
"x": 0.0038454847328945885,
"y": 0.8999489758945327
},
{
"x": 0.7396718227389907,
"y": 0.20696230977496288
}
]
},
"encoding": {
"x": {
"field": "x",
"title": null,
"type": "quantitative"
},
"y": {
"field": "y",
"title": null,
"type": "quantitative"
}
},
"mark": "point"
}
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"@vlplot(:point, rand(2), rand(2))"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Julia 1.6.0",
"language": "julia",
"name": "julia-1.6"
},
"language_info": {
"file_extension": ".jl",
"mimetype": "application/julia",
"name": "julia",
"version": "1.6.0"
}
},
"nbformat": 4,
"nbformat_minor": 4
}

The same code in nteract produces this (what I called stringified JSON):

{
"cells": [
{
"cell_type": "code",
"source": [
"using VegaLite"
],
"outputs": [],
"execution_count": 1,
"metadata": {
"collapsed": true,
"jupyter": {
"source_hidden": false,
"outputs_hidden": false
},
"nteract": {
"transient": {
"deleting": false
}
},
"execution": {
"iopub.status.busy": "2021-06-22T00:19:03.447Z",
"iopub.execute_input": "2021-06-22T00:19:04.046Z",
"iopub.status.idle": "2021-06-22T00:19:08.220Z"
}
}
},
{
"cell_type": "code",
"source": [
"@vlplot(:point, rand(2), rand(2))"
],
"outputs": [
{
"output_type": "execute_result",
"execution_count": 2,
"data": {
"application/vnd.vegalite.v4+json": "{\"mark\":\"point\",\"encoding\":{\"x\":{\"field\":\"x\",\"title\":null,\"type\":\"quantitative\"},\"y\":{\"field\":\"y\",\"title\":null,\"type\":\"quantitative\"}},\"data\":{\"values\":[{\"x\":0.3886860314749825,\"y\":0.7599543600413519},{\"x\":0.4839360629916083,\"y\":0.7166149015288332}]}}",
},
"metadata": {}
}
],
"execution_count": 2,
"metadata": {
"collapsed": true,
"jupyter": {
"source_hidden": false,
"outputs_hidden": false
},
"nteract": {
"transient": {
"deleting": false
}
},
"execution": {
"iopub.status.busy": "2021-06-22T00:19:12.057Z",
"iopub.execute_input": "2021-06-22T00:19:12.371Z",
"iopub.status.idle": "2021-06-22T00:19:17.816Z"
}
}
}
],
"metadata": {
"kernel_info": {
"name": "python3"
},
"language_info": {
"file_extension": ".jl",
"name": "julia",
"mimetype": "application/julia",
"version": "1.6.0"
},
"kernelspec": {
"argv": [
"python",
"-m",
"ipykernel_launcher",
"-f",
"{connection_file}"
],
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"nteract": {
"version": "0.28.0"
}
},
"nbformat": 4,
"nbformat_minor": 0
}

In both cases I manually removed other MIME types to keep things a bit shorter.
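Since both styles exist in the wild, a reader has to detect whether the MIME value is already an object or still a JSON string before rendering. Here is a minimal TypeScript sketch of that normalization (the function name and types are mine, not anything from this repository):

// Hypothetical helper: accept a vega-lite MIME value that is either an
// already-parsed object (JupyterLab style) or a stringified JSON payload
// (nteract style) and always return the parsed spec object.
function normalizeVegaLiteOutput(value: unknown): Record<string, unknown> {
    if (typeof value === 'string') {
        // nteract style: the whole spec was serialized into one JSON string.
        return JSON.parse(value);
    }
    // JupyterLab style: the spec is already embedded as regular JSON.
    return value as Record<string, unknown>;
}

// Both forms yield the same spec:
const fromJupyterLab = normalizeVegaLiteOutput({ mark: 'point', data: { values: [] } });
const fromNteract = normalizeVegaLiteOutput('{"mark":"point","data":{"values":[]}}');
console.log(fromJupyterLab.mark === fromNteract.mark); // true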
@joshua-maros Thanks for this PR. I've updated it and we'll try to get this merged in the next few days.

@davidanthoff Thanks for the detail. I've tested this fix and it now works with both regular JSON and stringified JSON (i.e. tested with your sample output).
@@ -35,7 +35,8 @@ module.exports = {
             memoryLimit: 9096
         }),
         new DefinePlugin({
-            scriptUrl: 'import.meta.url'
+            scriptUrl: 'import.meta.url',
+            'process.env': '{}' // utils references `process.env.xxx`
Do we happen to know more specifically what utils needs? What if something else is also looking at process.env?
What if something else is also looking at process.env?

If something tries to use it, the variable will simply not be defined. In a web worker we don't have access to `process`, hence I defined `process.env` as an empty object (similar to the env variable not being defined). I believe that is the right thing: generally env variables are not defined in most places, unless you need a specific behavior.
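To illustrate (my own sketch, not code from this repository): DefinePlugin performs a textual substitution at build time, so with `'process.env': '{}'` a read like `process.env.SOME_FLAG` in the bundled worker behaves like a property lookup on an empty object instead of throwing a ReferenceError for `process`:

// Roughly what the emitted worker code is equivalent to after the substitution
// (the variable and flag names here are illustrative only):
const env: Record<string, string | undefined> = {}; // stands in for the literal `{}` that DefinePlugin injects
const debugFlag = env.SOME_FLAG === 'true'; // undefined === 'true' -> false, and no ReferenceError
console.log(debugFlag);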
Do we happen to know more specifically what utils needs?

Yes, documented further below where `util` is used (`util.promisify`).
@@ -50,7 +51,7 @@ module.exports = {
         fallback: {
             fs: false,
             path: require.resolve('path-browserify'),
-            util: require.resolve('util')
+            util: require.resolve('util') // vega uses `util.promisify` (we need something that works outside node)
Documented which part of `util` is used.
@IanMatthewHuff /cc
The Vega transform expects a serialized JSON string, but it was being passed a regular JS object, causing this issue.
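A minimal sketch of the fix as I understand it (the helper name is mine): whichever form the notebook stored, the value handed to the Vega transform should end up as a serialized JSON string.

// Hypothetical helper: make sure the Vega/Vega-Lite transform receives the
// serialized spec it expects, whether the notebook stored the output as a
// plain object (JupyterLab style) or as a string (nteract style).
function toVegaSpecString(value: unknown): string {
    return typeof value === 'string' ? value : JSON.stringify(value);
}

console.log(toVegaSpecString({ mark: 'point' })); // '{"mark":"point"}'
console.log(toVegaSpecString('{"mark":"point"}')); // passed through unchanged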