[Fizz] Enable the progressiveChunkSize option #33027
Conversation
We still need to complete the fallback in case we outline the boundary.
```diff
@@ -348,6 +349,7 @@ export opaque type Request = {
   pendingRootTasks: number, // when this reaches zero, we've finished at least the root boundary.
   completedRootSegment: null | Segment, // Completed but not yet flushed root segments.
   completedPreambleSegments: null | Array<Array<Segment>>, // contains the ready-to-flush segments that make up the preamble
+  byteSize: number, // counts the number of bytes accumulated in the shell
```
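A rough sketch of what this counter is for (hypothetical names, not the actual Fizz code): each time a segment completes, its emitted chunks are measured and added to the request-wide total.

```js
// Sketch only: assumes a completed segment exposes its already-emitted chunks,
// which may be strings or binary (Uint8Array) depending on the stream config.
const encoder = new TextEncoder();

function accumulateSegmentBytes(request, segment) {
  let size = 0;
  for (const chunk of segment.chunks) {
    size +=
      typeof chunk === 'string'
        ? encoder.encode(chunk).byteLength // assumption: string chunks need encoding
        : chunk.byteLength; // binary chunks already know their byte length
  }
  request.byteSize += size; // feeds the progressiveChunkSize heuristic
}
```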
Seems this isn't doing anything for now. But your comment didn't indicate that you have plans for tracking the size of the root. I suppose if the root was tiny we ought to still include even large boundaries because there isn't anything else being blocked that is meaningful?
I'm using it in #33029
LGTM
Since the very beginning we have had the `progressiveChunkSize` option, but we never actually took advantage of it because we didn't count the bytes that we emitted. This starts counting the bytes by taking a pass over the added chunks each time a segment completes.

That allows us to outline a Suspense boundary to stream in late even if it is already loaded by the time back-pressure lets it flush, and in a `prerender`. Meaning it gets inserted with a script.

The effect can be seen in the fixture, where large HTML content can block the initial paint (thanks to [`rel="expect"`](#33016), but also nested Suspense boundaries). Before this fix, the paint would be blocked until the large content loaded. This lets us paint the fallback first in the case that the raw bytes of the content take a while to download.

You can set it to `Infinity` to opt out, e.g. if you want to ensure there are never any scripts. It's always set to `Infinity` in `renderToHTML` and the legacy `renderToString`.

One downside is that if we might choose to outline a boundary, we need to let its fallback complete. We don't currently discount the size of the fallback but just consider them additive, even though in theory the fallback itself could also add significant size, or even more than the content. It should maybe really be considered the delta, but that would require us to track the size of the fallback separately, which is tricky.

One problem with the current heuristic is that we only consider the size of the boundary content itself down to the next boundary. If you have a lot of small boundaries adding up, it'll never kick in. I intend to address that in a follow-up.

DiffTrain build for [8e9a5fc](8e9a5fc)
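For illustration, a minimal usage sketch with the Node streaming API, assuming `App` and `response` are your own component and HTTP response (the value is in bytes; the default and exact behavior may vary by release):

```js
import {renderToPipeableStream} from 'react-dom/server';
import App from './App'; // assumption: your root component

// Sketch: lower the threshold so large, already-loaded boundaries get
// outlined (streamed in later via a script) instead of blocking first paint.
const {pipe} = renderToPipeableStream(<App />, {
  progressiveChunkSize: 5 * 1024, // assumption: tune per app
  onShellReady() {
    pipe(response); // assumption: `response` is the HTTP response stream
  },
});

// Opt out of size-based outlining entirely, e.g. if you never want the
// completion scripts this would insert:
// renderToPipeableStream(<App />, {progressiveChunkSize: Infinity});
```

Per the description above, `renderToHTML` and the legacy `renderToString` always behave as if the option were `Infinity`.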
…pletion (#33029)

Follow up to #33027. This enhances the heuristic so that we accumulate the size of the currently written boundaries, starting from the size of the root (minus the preamble) for the shell. This ensures that if you have many small boundaries, they don't all continue to get inlined. For example, you can wrap each paragraph in a document in a Suspense boundary to regain document streaming capabilities if that's what you want.

However, one consideration is whether it's worth producing a fallback at all. Maybe if it's just `null` it's free, but if it's a whole alternative page, then it's not. It's possible to have completely useless Suspense boundaries, such as when you nest several directly inside each other. So this uses a limit of at least 500 bytes of the content itself for it to be worth outlining at all. That limit also can't be too big, because then, for example, a long list of paragraphs could never be outlined.

In the fixture I straddle this limit so some paragraphs are too small to be considered. An unfortunate effect of that is that you can end up with some of them not being outlined, which means that they appear out of order. SuspenseList is supposed to address that, but it's unfortunate. The limit is still fairly high, though, so it's unlikely that by default you'd start outlining anything within the viewport at all. I had to reduce the `progressiveChunkSize` by an order of magnitude in my fixture to try it out properly.
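To make the "paragraph per boundary" idea above concrete, a hypothetical sketch (component names are made up, not from the fixture):

```js
import {Suspense} from 'react';
import Paragraph from './Paragraph'; // assumption: may suspend on its data

// Each paragraph gets its own boundary. With the accumulated-size heuristic,
// paragraphs under the ~500 byte minimum stay inlined; larger ones become
// candidates for outlining once the running total passes progressiveChunkSize.
export default function Article({paragraphs}) {
  return (
    <article>
      {paragraphs.map((text, i) => (
        <Suspense key={i} fallback={<p className="placeholder" />}>
          <Paragraph text={text} />
        </Suspense>
      ))}
    </article>
  );
}
```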