ChatInterface flickers

I am seeing screen flicker when running a callback on ChatInterface. I have a screen recording at this link. Has anyone come across this?

Using version 1.3.6


Thanks! Do you have a code snippet that I can try to reproduce?


Thanks for looking. Here is the whole script. I am using the mlx framework and Phi-2.

import panel as pn
import mlx.core as mx
from llms.phi2 import phi2 as llm

pn.extension()

model_path = "phi-2"
seed = 1234
mx.random.seed(seed)
max_tokens = 2048
temp = 0.0

temp_widget = pn.widgets.FloatSlider(name="Temperature", start=0.0, end=1.0, value=temp, sizing_mode="stretch_width")
max_tokens_widget = pn.widgets.IntSlider(name="Max Tokens", start=1, end=2048, value=max_tokens, sizing_mode="stretch_width")
seed_widget = pn.widgets.IntInput(name="Random Seed", value=seed, sizing_mode="stretch_width")

@pn.cache
def load_model(model_path: str):
    model, tokenizer = llm.load_model(model_path)
    return model, tokenizer

def llm_callback(contents: str, user: str, instance: pn.chat.ChatInterface):   
    contents = f'INPUT: {contents}\nOUTPUT:'
    
    prompt = tokenizer(
        contents,
        return_tensors="np",
        return_attention_mask=False,
    )["input_ids"]

    max_tokens = max_tokens_widget.value
    temp = temp_widget.value
    seed = seed_widget.value
    if temp != 0:
        seed = mx.random.randint(0, 2**30).item()
        seed_widget.value = seed

    mx.random.seed(seed)
    prompt = mx.array(prompt)

    message = ""
    for token, _ in zip(llm.generate(prompt, model, temp), range(max_tokens)):
        mx.eval(token)
        if token.item() == tokenizer.eos_token_id:
            break
        s = tokenizer.decode(token.item())
        message += s
        yield message

model, tokenizer = load_model(model_path)
 
chat_interface = pn.chat.ChatInterface(
    callback=llm_callback, callback_exception='verbose', callback_user="Phi-2", reset_on_send=True
)

sidebar = pn.WidgetBox(
    seed_widget, temp_widget, max_tokens_widget
)

pn.template.FastListTemplate(
    main=[chat_interface],
    sidebar=[sidebar],
    sidebar_width=200
).servable()
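For reference, the streaming part of the callback boils down to a generator that yields a cumulative string, which is what makes ChatInterface update a single growing chat bubble. Here is a minimal mlx-free sketch of that pattern (the token stream is faked here, just to show the shape):

```python
def fake_token_stream():
    # Stand-in for llm.generate(); yields decoded token strings.
    for piece in ["Hello", ",", " world", "!"]:
        yield piece

def stream_reply(contents: str):
    # Same shape as llm_callback above: accumulate the decoded tokens
    # and re-yield the full message each time, so the chat message is
    # updated in place rather than appended as separate messages.
    message = ""
    for s in fake_token_stream():
        message += s
        yield message

# Each yielded value is the whole message so far:
chunks = list(stream_reply("hi"))
```

Passing a generator like this as `callback=` is what drives the incremental updates you see in the recording.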

Thanks for providing that. Do you know where I can get config.json?

  File "/Users/ahuang/repos/mlx-examples/llms/phi2/phi2.py", line 162, in load_model
    with open(model_path / "config.json", "r") as f:
FileNotFoundError: [Errno 2] No such file or directory: 'mlx_model/config.json'

Sorry. Yes, instructions for downloading the model weights and config are below.

If you run these commands and put the script in the mlx-examples directory, it should work.

pip install mlx
pip install transformers huggingface_hub hf_transfer
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples

# Download model
export HF_HUB_ENABLE_HF_TRANSFER=1
huggingface-cli download --local-dir-use-symlinks False --local-dir llms/phi2 mlx-community/phi-2

Thanks! I can’t seem to reproduce; what version of Panel do you have? I’m using main, which is close to pip install panel==1.3.6
(screen recording GIF attached showing no flicker)

Thanks for looking. I am using 1.3.6. Let me try restarting my laptop.

If you still encounter this issue, could you share more information about your system, like which browser and OS you are using?

Thanks everyone. I resolved the problem by using a fresh python environment. Something Panel depends on was causing the flicker, but I don't know what it was.
