Cache, state, lifecycle events, and the future of Turbolinks 3 #551
While working on the docs I realized that the partial replacement feature currently in master introduced significant issues with page caching and lifecycle events. Below is a summary of the problem and a list of possible solutions.
I apologize for the long description. This is a defining issue for Turbolinks 3 that needs careful consideration.
Problem
Say I have this:
```html
<body>
  <textarea data-rte></textarea>
</body>
```

```coffee
$(document).on 'ready page:load', ->
  $('[data-rte]').each ->
    fn = -> # some function
    $(this).wrap('<div/>').on('click', fn)
```

After the page loads, the resulting markup is this:

```html
<body>
  <div>
    <textarea data-rte></textarea>
  </div>
</body>
```

Now say I:
- type "foo" in the textarea
- visit another page
- click the back button
Behavior in v2.5:

- the `<body>` element is stored in the cache; its state (event listener and textarea value) is kept in memory
- on popstate, the previous `<body>` is reassigned to the document, and:
  - the wrapping `<div>` is present
  - `fn` is still attached
  - the textarea value is still "foo"
- (this actually leads to a memory leak, because when the `<body>` is evicted from the cache, jQuery still holds a reference to `fn`)
Behavior in master:

- the `<body>` element's `outerHTML` is stored in the cache; its state is cleaned up
- on popstate, the previous `<body>` is parsed and reassigned to the document, and:
  - the wrapping `<div>` is present
  - `fn` is not attached to the click event
  - the textarea value is blank (because the value of form elements is stored in memory, not in the DOM, which only stores their initial / hard-coded value)
- (no memory leak)
(Side note: the reason why we now serialize the <body> element instead of keeping it in memory is that it isn't guaranteed to be replaced on Turbolinks.visit anymore. If we didn't and you did Turbolinks.visit("#{currentUrl}?q=test", change: ['container']), the container element would be replaced and we'd have no way to bring the old one back unless we somehow managed to keep track of the diff across pages + the old nodes in memory.)
Now, one way to get around this problem might be to bind to page:change (which also fires on history back/forward) instead of page:load, but then here's what happens after popstate:
- a new `fn` is attached
- the wrapping `<div>` is wrapped in another `<div>`
- the textarea value is still lost

The DOM transformation is applied a second time.
So then we might do something like this:
```coffee
$(document).on 'ready page:load', ->
  $('[data-rte]').each ->
    $(this).wrap('<div/>')

$(document).on 'page:change', ->
  $('[data-rte]').each ->
    fn = ->
    $(this).on('click', fn)
```

Ignoring the fact that this isn't how people write JavaScript, here comes another problem: say I partial-replace another element on the page (with `Turbolinks.visit(url, change: ['id'])`):
- both `page:load` and `page:change` are triggered on the `document`
- the callbacks add a duplicate `<div>` and `fn` callback to the existing `data-rte` element
To alleviate this problem, #537 started passing the affected nodes to these two events, allowing you to do this:
```coffee
$(document).on 'ready page:load', (event) ->
  event.data.forEach (node) ->
    $('[data-rte]', node).each ->
      $(this).wrap('<div/>')

$(document).on 'page:change', (event) ->
  event.data.forEach (node) ->
    $('[data-rte]', node).each ->
      fn = ->
      $(this).on('click', fn)
```

... which, in practice, is impossible to pull off (race conditions).
tl;dr: current master will break everyone's back button unless they write insanely complex JavaScript.
Paths forward
(1) Get rid of the cache entirely
This is what we do at Shopify / Turbograft.
Upsides:
- greater simplicity
- zero breakage
- no breaking change
Downsides:
- slower back/forward navigation
- loses state
Since a new page is loaded on history back/forward, we can keep using `page:load` the same way we would with Turbolinks 2.5 (no need to split DOM transformations / event listeners in `page:load` and `page:update`). For plugins that aren't idempotent, Turbograft also attaches the affected nodes to the `page:load` event (since it fires on partial replacement), so you'd end up with this:

```coffee
$(document).on 'ready page:load', (event) ->
  $('foo').each -> $(this).idemPotentPlugin()
  event.data.forEach (node) ->
    $('bar', node).each -> $(this).nonIdemPotentPlugin()
```

To help with the "loses state" issue, we could keep the values of form elements in memory and try to re-apply them on back/forward (each element would need a unique id).
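As a minimal sketch of that idea, assuming every form element we care about carries a unique id (the function names and the events they're wired to here are placeholders, not a settled API):

```coffee
# Remember form values per URL and re-apply them when a page comes
# back from the cache. Assumes elements have unique ids; the events
# used below are illustrative only.
formState = {}

saveFormState = (url) ->
  formState[url] = {}
  $('input[id], textarea[id], select[id]').each ->
    formState[url][this.id] = $(this).val()

restoreFormState = (url) ->
  return unless formState[url]
  $("##{id}").val(value) for id, value of formState[url]

# Hypothetical wiring: snapshot before leaving a page, restore once it
# has been put back in place.
$(document).on 'page:before-change', -> saveFormState(location.href)
$(document).on 'page:restore',       -> restoreFormState(location.href)
```

The hard part is deciding exactly when to snapshot and when to restore, which ties back into the lifecycle questions above.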
(2) Cache before executing JS
Like (1) but we cache the untouched <body>'s outerHTML of each page (before any JS is executed on it), like an HTTP cache.
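As a rough sketch of the idea (all names here are hypothetical, and `cacheUntouchedBody` would have to run against the parsed response before it is swapped in and before any script touches it):

```coffee
# Store each page's untouched <body> outerHTML, keyed by URL, before
# any page:load handler gets a chance to mutate it.
pageCache = {}

cacheUntouchedBody = (url, doc) ->
  # `doc` is the document parsed from the XHR response (or the initial
  # page), captured before it is made live.
  pageCache[url] = doc.body.outerHTML

restoreFromCache = (url) ->
  return false unless html = pageCache[url]
  document.body.outerHTML = html
  # page:load handlers then re-run against this clean markup, exactly
  # as they did on the first visit.
  true
```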
Upsides:
- less breakage
- faster back/forward navigation than (1)
Downsides:
- there might be some issues going back to initial pages if code transforms the DOM before we get a chance to cache it (for XHRs it's easy since we control the response)
- the server would need to render the full page all the time. Doing `render :view, change: :key, layout: false` and updating the URL at the same time wouldn't be possible, since we can't make an untouched `<body>` out of that (this was a requested feature that I was planning to address by making `render :view, change: :key` update the current URL on GET requests; the use case being that of a search form which updates a container and the URL at the same time)
- loses state
(3) Cache after executing JS (current master)
To make my example work without split callbacks, we'd need to write the code like this:
```coffee
$(document).on 'ready page:load', ->
  $('[data-rte]').each ->
    return if this.loaded
    this.loaded = true
    fn = ->
    $(this).on('click', fn)
    $(this).wrap('<div/>') if parentNotADiv()
```

... and make one breaking change in Turbolinks: fire `page:load` on history back/forward.
Upsides:
- keep new feature
Downsides:
- breaking change
- loses some state
- unrealistic: all `page:load` callbacks would need to be both idempotent and able to re-apply themselves on an already-transformed DOM
(Side note: if we go with (1), (2) or (3), I would drop the transition cache, since its speed benefit is significantly reduced by the fact that we have to re-instantiate all the state on the page before loading the real new page.)
(4) Cache nodes and "reverse-apply" partial replacements
Like I briefly explained above, one solution might be to keep the changed nodes in memory (like Turbolinks 2.5 except it wouldn't always be the <body>). So for example when this happens: Turbolinks.visit(url, change: ['container']), we would keep the previous container elements in memory and put them back when you hit back.
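Roughly, the bookkeeping might look like the sketch below; everything in it (`replacedNodes`, `rememberBeforeReplace`, `reverseApply`) is a hypothetical name meant to show the shape of the idea, not an actual implementation.

```coffee
# Keep a reference to the nodes a partial replacement is about to swap
# out, keyed by history entry, so that popstate can put them back.
replacedNodes = {}   # history state id -> { elementId: previous node }

rememberBeforeReplace = (stateId, ids) ->
  replacedNodes[stateId] ?= {}
  for id in ids
    replacedNodes[stateId][id] = document.getElementById(id)

reverseApply = (stateId) ->
  return unless nodes = replacedNodes[stateId]
  for id, oldNode of nodes
    current = document.getElementById(id)
    current.parentNode.replaceChild(oldNode, current) if current
```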
Upsides:
- not a big breaking change (same JS as in (1))
- less breakage
Downsides:
- unproven and potentially complex solution
(5) Drop partial replacement, keep Turbograft separate
If neither (1) nor (2) is an option, and nobody is up for exploring (4), I would go with this, since I don't think (3) is acceptable.
Again, sorry for the wall of text. Please let me know what you think.