
Conversation

@ad-astra-video (Contributor) commented Aug 20, 2025

Add basic plumbing to send data back through the trickle protocol.

Dependent on go-livepeer PR livepeer/go-livepeer#3689

while not self.stop_event.is_set():
    # Wait for 333ms or until stop event is set
    try:
        await asyncio.wait_for(self.stop_event.wait(), timeout=0.333)
Collaborator

The timeout value (naming?) should be a constant, but I think it might also be beneficial as a configurable option.

@ad-astra-video (Contributor, Author) Aug 20, 2025

I think configurable would be good as well. Added an option on StreamProcessor and TrickleClient that defaults to 333ms (3x as fast as a segment). a26d25c
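For illustration, a minimal sketch of how such a configurable interval might be wired in (the parameter and constant names here are assumptions, not the exact API from the commit):

import asyncio

# Assumed default: 333ms, i.e. roughly three polls per segment.
DEFAULT_DATA_PUBLISH_INTERVAL = 0.333

class TrickleClient:
    def __init__(self, data_publish_interval: float = DEFAULT_DATA_PUBLISH_INTERVAL):
        self.data_publish_interval = data_publish_interval
        self.stop_event = asyncio.Event()

    async def _publish_loop(self):
        while not self.stop_event.is_set():
            try:
                # Wake up when stopped or after the configured interval elapses.
                await asyncio.wait_for(self.stop_event.wait(), timeout=self.data_publish_interval)
            except asyncio.TimeoutError:
                pass  # Interval elapsed; flush any pending data here.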

Note that go-livepeer does not deserialize it; it just shuffles it along with a scanner.Scan() that reads until a newline char is found.
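To illustrate the framing implication (a sketch, not the PR's exact serialization code): because the Go side only splits on newlines, each payload sent back should serialize to exactly one line.

import json

def frame_data_event(event: dict) -> bytes:
    # go-livepeer reads with scanner.Scan(), splitting on newlines,
    # so each event must be a single compact JSON line.
    line = json.dumps(event, separators=(",", ":"))
    return line.encode("utf-8") + b"\n"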

@ad-astra-video ad-astra-video marked this pull request as ready for review August 20, 2025 23:45
@eliteprox (Collaborator) left a comment

This change lgtm. Only one small suggestion. Can we log when data_queue is full?

Comment on lines 125 to 126
except queue.Full:
    pass
Collaborator

Can we add one warning log line here for when the queue is full?

Contributor Author

That is the output queue, right? Not the data queue. Happy to add that in a separate PR while also changing it to either an asyncio.Queue or a deque.

Collaborator

Ah, yes. It's a deque. Apparently this will not throw a queue.Full error, but a discard can be logged once the length hits maxlen: https://docs.python.org/3.10/library/collections.html?highlight=deque#collections.deque.

Sure, let's handle this in another PR.

I'm thinking we can add something like this or go back to a regular queue if catching full is more performant:

if len(self.data_queue) == self.data_queue.maxlen:
    # The bounded deque will drop the oldest item on append; log it first.
    print("Discarding message:", self.data_queue[0])
self.data_queue.append(new_item)
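A slightly fuller sketch of that idea with a warning log instead of print (the class and method names here are hypothetical, not from the PR):

import logging
from collections import deque

logger = logging.getLogger(__name__)

class DataOutput:
    def __init__(self, maxlen: int = 100):
        # Bounded deque: appending past maxlen silently drops the oldest item,
        # so log before that happens instead of catching queue.Full.
        self.data_queue = deque(maxlen=maxlen)

    def enqueue(self, new_item):
        if len(self.data_queue) == self.data_queue.maxlen:
            logger.warning("data_queue full, discarding oldest message: %s", self.data_queue[0])
        self.data_queue.append(new_item)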

@eliteprox eliteprox merged commit 7ef8af5 into main Aug 25, 2025
1 check passed