
fix: lagging issue in ask AI during message streaming #16201

Merged
FelixMalfait merged 7 commits into main from fix/ai-chat-scroll-issue on Dec 1, 2025

Conversation

@abdulrahmancodes (Contributor) commented Nov 30, 2025

Changes

1. Fixed streaming lag with throttle parameter

Problem: The AI chat lagged during message streaming because the messages array was updated on every streamed chunk, forcing all messages to re-render at that same rate.

Solution: Added the experimental_throttle: 100 parameter to the useChat hook configuration. This throttles message updates during streaming to prevent excessive re-renders and improve performance.
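For illustration, the batching effect of a 100 ms throttle can be shown with a standalone, deterministic sketch. The real throttling lives inside the AI SDK's `useChat`; `makeThrottledApplier` and the injected clock below are hypothetical names used only to demonstrate the principle:

```typescript
// Simplified sketch of how a 100 ms throttle batches rapid updates.
// `now` is injected so the example is deterministic (hypothetical helper,
// not part of the AI SDK).
type Update = string;

const makeThrottledApplier = (intervalMs: number, now: () => number) => {
  let lastFlush = -Infinity;
  let pending: Update | null = null;
  const applied: Update[] = [];

  return {
    // Called for every streamed chunk; only flushes once per interval.
    push(update: Update) {
      pending = update;
      if (now() - lastFlush >= intervalMs) {
        applied.push(pending);
        pending = null;
        lastFlush = now();
      }
    },
    // Flush whatever is still pending when the stream ends.
    flush() {
      if (pending !== null) {
        applied.push(pending);
        pending = null;
      }
    },
    applied,
  };
};

// Simulate 30 chunks arriving every 10 ms.
let clock = 0;
const applier = makeThrottledApplier(100, () => clock);
for (let i = 1; i <= 30; i++) {
  clock = i * 10;
  applier.push(`chunk-${i}`);
}
applier.flush();
```

In this simulation only four state applications happen instead of thirty, which is the re-render reduction the throttle parameter relies on.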

2. Cleaned up useAgentChat hook return values

Context: The useAgentChat hook primarily forwards values from the underlying useChat hook, so there wasn't much room for restructuring the "umbrella hook" pattern. It was, however, returning some values that nothing consumed.

Solution:

  • Removed input and handleInputChange from the useAgentChat hook return. These weren't needed since input state is already managed directly via Recoil state (agentChatInputState) in components.
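Since Recoil itself needs a React tree to run, the shared-atom pattern behind `agentChatInputState` can be sketched with a dependency-free stand-in. `AtomLike` and the consumer names below are illustrative only, not the real Recoil API:

```typescript
// Dependency-free stand-in for a Recoil atom (sketch only; the real state
// is the `agentChatInputState` atom in twenty-front).
type Listener<T> = (value: T) => void;

class AtomLike<T> {
  private listeners = new Set<Listener<T>>();
  constructor(private value: T) {}
  get(): T {
    return this.value;
  }
  set(next: T): void {
    this.value = next;
    this.listeners.forEach((listener) => listener(next));
  }
  subscribe(listener: Listener<T>): () => void {
    this.listeners.add(listener);
    return () => this.listeners.delete(listener);
  }
}

// Both the input field and the send button read the same shared atom,
// so the hook no longer needs to return `input` / `handleInputChange`.
const agentChatInput = new AtomLike<string>('');

const seenBySendButton: string[] = [];
agentChatInput.subscribe((value) => seenBySendButton.push(value));

agentChatInput.set('Hello agent');
const sendDisabled = agentChatInput.get().trim().length === 0;
```

Because consumers subscribe to the atom directly, removing the forwarded values from the hook return changes no behavior.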

Note

Throttle message streaming updates and switch AI chat input management from context to Recoil; minor scroll behavior tweak.

  • AI Chat performance:
    • Add experimental_throttle: 100 to useChat in useAgentChat to reduce re-render frequency during streaming.
  • State management:
    • Migrate input handling to Recoil via agentChatInputState; remove input and handleInputChange from AgentChatContext and useAgentChat returns.
    • Update AIChatTab and SendMessageButton to read/write input from Recoil and adjust hotkeys/disabled state accordingly.
  • UX behavior:
    • Remove smooth scroll behavior in useAgentChatScrollToBottom.
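The scroll tweak amounts to calling `scrollTo` without `behavior: 'smooth'`, so each update jumps straight to the bottom instead of queuing an animation. A minimal sketch against a stubbed container (`FakeScrollContainer` is hypothetical; the real hook operates on a DOM ref):

```typescript
// Stand-in for the chat scroll container (hypothetical shape; the real
// hook runs against a browser element).
interface ScrollArgs {
  top: number;
  behavior?: 'smooth' | 'instant';
}

class FakeScrollContainer {
  scrollTop = 0;
  scrollHeight = 5000;
  calls: ScrollArgs[] = [];
  scrollTo(args: ScrollArgs) {
    this.calls.push(args);
    this.scrollTop = args.top;
  }
}

// Omitting `behavior` means the jump is instant, so rapid streaming
// updates cannot pile up queued smooth-scroll animations.
const scrollToBottom = (el: FakeScrollContainer) => {
  el.scrollTo({ top: el.scrollHeight });
};

const container = new FakeScrollContainer();
scrollToBottom(container);
```

Each invocation lands at the bottom immediately, which pairs with the throttle: at most one instant jump per batched update.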

Written by Cursor Bugbot for commit 911b341. This will update automatically on new commits. Configure here.

greptile-apps bot (Contributor) commented Nov 30, 2025

Greptile Overview

Greptile Summary

Fixed lagging during AI message streaming by adding 100ms throttle to message updates and removing smooth scroll animation.

The changes address performance issues during streaming by:

  • Adding experimental_throttle: 100 to the useChat hook in useAgentChat.ts:113 to batch state updates every 100ms instead of on every chunk
  • Removing behavior: 'smooth' from scrollTo() in useAgentChatScrollToBottom.ts:18 to eliminate animation overhead during rapid scroll updates

The throttle reduces the frequency of React re-renders during streaming, while the instant scroll prevents animation queuing that caused visual lag. These optimizations work together since AgentChatMessagesEffect.tsx triggers scrollToBottom() on every message update via useEffect.

Confidence Score: 5/5

  • Safe to merge - minimal changes with clear performance benefits
  • Both changes are non-breaking optimizations: throttling is a standard performance pattern from the AI SDK, and removing smooth scroll only changes animation behavior without affecting functionality. No logic changes or new dependencies introduced.
  • No files require special attention

Important Files Changed

File Analysis

| Filename | Score | Overview |
| --- | --- | --- |
| packages/twenty-front/src/modules/ai/hooks/useAgentChat.ts | 5/5 | Added `experimental_throttle: 100` to reduce re-render frequency during streaming |
| packages/twenty-front/src/modules/ai/hooks/useAgentChatScrollToBottom.ts | 5/5 | Removed `behavior: 'smooth'` from `scrollTo` to eliminate animation overhead |

Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant AgentChat
    participant useChat
    participant Stream
    participant ScrollEffect
    participant DOM

    User->>AgentChat: Send message
    AgentChat->>useChat: sendMessage()
    useChat->>Stream: Start streaming response

    loop Every 100ms (throttled)
        Stream->>useChat: Chunk received
        useChat->>AgentChat: Update messages state
        AgentChat->>ScrollEffect: Messages changed
        ScrollEffect->>DOM: scrollTo({top, behavior: instant})
    end

    Stream->>useChat: Stream complete
    useChat->>AgentChat: Final message state
    AgentChat->>User: Display complete response
```

@greptile-apps greptile-apps bot left a comment

2 files reviewed, no comments

github-actions bot (Contributor) commented Nov 30, 2025

🚀 Preview Environment Ready!

Your preview environment is available at: http://bore.pub:51890

This environment will automatically shut down when the PR is closed or after 5 hours.

abdulrahmancodes and others added 5 commits December 1, 2025 17:16
…ed AIChatTab and SendMessageButton components to utilize agentChatInputState for input value and change handling, improving state consistency across components.
…context structure for improved clarity and maintainability.
@FelixMalfait FelixMalfait merged commit 19fc201 into main Dec 1, 2025
65 of 69 checks passed
@FelixMalfait FelixMalfait deleted the fix/ai-chat-scroll-issue branch December 1, 2025 14:31
github-actions bot commented Dec 1, 2025

Thanks @abdulrahmancodes for your contribution!
This marks your 85th PR on the repo. You're top 1% of all our contributors 🎉

NotYen pushed a commit to NotYen/twenty-ym that referenced this pull request Dec 4, 2025
Co-authored-by: Félix Malfait <felix.malfait@gmail.com>
