
Conversation

@cpsievert

Type of changes

  • Bug fix
  • New feature
  • Documentation / docstrings
  • Tests
  • Other

Checklist

  • I've run the latest black with default args on new code.
  • I've updated CHANGELOG.md and CONTRIBUTORS.md where appropriate.
  • I've added tests for new code.
  • I accept that @willmcgugan may be pedantic in the code review.

Description

As described in #3263, Live(vertical_overflow="visible") duplicates content when an update exceeds the height of the console. This is especially unfortunate for a few reasons:

  • It seems there is no way to get rid of the duplicated content (i.e., transient=True doesn't help to avoid this).
  • There are no other vertical_overflow options that allow the "newest" content to remain visible.
  • In today's Gen AI world, where Markdown often arrives in chunks, it's really useful to have a live display where a (possibly long) Markdown string accumulates over time.

This PR proposes a new option vertical_overflow="crop_above" which does the reverse of vertical_overflow="crop" (it displays only the bottom portion of the content instead of the top). It has the nice behavior of always making the "newest" content visible, but without the downside of duplicated content. Here's a demo:

import requests
import time

from rich.live import Live
from rich.markdown import Markdown

readme = requests.get(
    "https://raw.githubusercontent.com/posit-dev/py-shiny/refs/heads/main/README.md"
)
readme_chunks = readme.text.replace("\n", " \n ").split(" ")[:200]

content = ""
with Live(auto_refresh=False, vertical_overflow="crop_above") as live:
    for chunk in readme_chunks:
        content += chunk + " "
        time.sleep(0.01)
        live.update(Markdown(content), refresh=True)

(Demo video: rich-crop-above.mp4)

And note that if you change vertical_overflow="crop_above" to vertical_overflow="visible", this is the behavior:

(Demo video: rich-visible.mp4)
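For reference, the difference between the existing "crop" and the proposed "crop_above" can be sketched on plain lists of rendered lines. This is illustrative only, not Rich's actual rendering code, and the helper names are made up:

```python
def crop(lines: list[str], height: int) -> list[str]:
    # Existing "crop" behavior: keep the top of the content.
    return lines[:height]

def crop_above(lines: list[str], height: int) -> list[str]:
    # Proposed "crop_above" behavior: keep the bottom,
    # so the newest lines stay visible as content grows.
    return lines[-height:] if len(lines) > height else lines

rendered = [f"line {i}" for i in range(10)]
print(crop(rendered, 4))        # ['line 0', 'line 1', 'line 2', 'line 3']
print(crop_above(rendered, 4))  # ['line 6', 'line 7', 'line 8', 'line 9']
```

Because the slice is recomputed on every refresh, the display never re-emits lines that have already scrolled off, which is what avoids the duplication seen with "visible".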

I'm happy to write tests or anything else you need if you like this overall direction.

@LeoYoung-code

👍

@david-zlai

would love to see this!

@willmcgugan - any chance we could get a review?

@timesler

This feature would be amazing for CLI LLM chat apps!

@timesler

@willmcgugan sorry to spam you. I wanted to frame why I think this is super important. LLM agents are proliferating pretty wildly at the moment, and one of the quickest ways to build really effective UIs for them is via CLIs. IMO, Rich is the best tool for building beautiful CLI apps (in Python or otherwise), and I think this specific feature is the single biggest blocker to it being near perfect for that use case.

@willmcgugan
Member

@willmcgugan left a review comment

Going to need an update to the docs (live.rst)

@willmcgugan
Member

You would be better off using Textual for LLM output. Better Markdown rendering as well.

@willmcgugan
Member

Rich just isn't the right tool for this kind of interface. Use Textual for Markdown streaming, and scrollable windows.

5 participants