While working on the chat completion detection streaming implementation, I realized that we can simplify some things related to the batcher. As the `ChatCompletionBatcher` implementation ended up being simpler than expected, we do not actually need generic batches. It's essentially the same as the `MaxProcessedIndexBatcher` (a single batcher could actually be used for both), but we can keep separate implementations in case chat completion batching requirements evolve.

Changes related to this:
- Remove `Batch` from `DetectionBatcher` and the `Batch` generics throughout
- Add a `Batch = (u32, Chunk, Detections)` type alias; we can use this for all batchers (rough sketch below)
- Consolidate `DetectionBatchStream` handling and drop the `process_detection_stream()` functions; everything goes through `process_batch_detection_stream()` now rather than separate functions (`DetectionBatchStream` returned a generic type, while `DetectionStream` returned a static type, even if they are the same)
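To make the intended shape concrete, here is a rough sketch of the simplified surface. This is illustrative only: the trait's method set, the pop method name, and the placeholder `Chunk`/`Detections` definitions below are assumptions, not the crate's actual code.

```rust
// Illustrative stand-ins for the crate's real types.
#[derive(Debug, Clone)]
pub struct Chunk;
#[derive(Debug, Clone, Default)]
pub struct Detections;

/// One concrete batch shape shared by all batchers, replacing the
/// per-batcher `Batch` generics that were previously threaded through
/// `DetectionBatchStream` and the processing functions.
pub type Batch = (u32, Chunk, Detections);

/// Simplified batcher trait (sketch): no `Batch` generics, and plain
/// `u32` input indices instead of an `InputId` alias.
pub trait DetectionBatcher {
    /// Push detections produced for a chunk of the given input.
    fn push(&mut self, input_id: u32, chunk: Chunk, detections: Detections);

    /// Pop the next ready batch, if any (method name assumed here).
    fn pop_batch(&mut self) -> Option<Batch>;
}
```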
Other changes:

- Remove `detector_id` from `DetectionStream` and `DetectionBatcher::push()`; it's redundant as `detector_id` is set in `Detection` (see the sketch below)
- Remove `NoopBatcher` (not needed for testing)
- Remove the `InputId` type alias and just use `u32`
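On the `detector_id` removal, a small sketch of why the extra argument was redundant; the field names and `push()` signatures here are illustrative, not the crate's exact definitions.

```rust
/// Each detection already records which detector produced it, so
/// `DetectionBatcher::push()` does not need a separate `detector_id`
/// argument. Field names below are illustrative.
#[derive(Debug, Clone)]
pub struct Detection {
    pub detector_id: String,
    pub score: f64,
    // ...other fields elided
}

// Before (redundant detector_id on push, signature assumed):
//   batcher.push(detector_id, input_id, chunk, detections);
// After:
//   batcher.push(input_id, chunk, detections);
```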