Merge remote-tracking branch 'origin/DST-688-batch-processing-limitations' into DST-709-ui-feedback-batch-processing

fg-nava committed Jan 10, 2025
2 parents 51061e5 + 91e3d27 commit 61482e4
Showing 1 changed file with 1 addition and 3 deletions.
app/src/batch_process.py (1 addition, 3 deletions)
@@ -18,9 +18,7 @@ async def batch_process(file_path: str, engine: ChatEngineInterface) -> str:
     # Process questions sequentially to avoid thread-safety issues with LiteLLM
     # Previous parallel implementation caused high CPU usage due to potential thread-safety
     # concerns in the underlying LLM client libraries
-    processed_data = []
-    for q in questions:
-        processed_data.append(_process_question(q, engine))
+    processed_data = [_process_question(q, engine) for q in questions]
 
     # Update rows with processed data while preserving original order
     for row, data in zip(rows, processed_data, strict=True):
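For context, below is a minimal, self-contained sketch of the sequential pattern this commit settles on. The ChatEngineInterface protocol, the body of _process_question, and the row/answer layout are illustrative stand-ins (assumptions, not the repository's actual definitions); only the sequential list comprehension and the order-preserving zip(..., strict=True) mirror the diff above.

# Sketch only: stand-in types and helpers, not the actual repo code.
from typing import Protocol


class ChatEngineInterface(Protocol):
    # Assumed interface: a single synchronous call per question.
    def generate(self, question: str) -> str: ...


def _process_question(question: str, engine: ChatEngineInterface) -> str:
    # One call per question; running these one at a time avoids sharing the
    # underlying LLM client across threads.
    return engine.generate(question)


def batch_process(questions: list[str], rows: list[dict], engine: ChatEngineInterface) -> list[dict]:
    # Sequential: each question finishes before the next one starts.
    processed_data = [_process_question(q, engine) for q in questions]

    # Preserve the original row order; strict=True raises if lengths diverge.
    for row, data in zip(rows, processed_data, strict=True):
        row["answer"] = data
    return rows


if __name__ == "__main__":
    class EchoEngine:
        def generate(self, question: str) -> str:
            return f"answer to: {question}"

    rows = [{"question": "q1"}, {"question": "q2"}]
    questions = [r["question"] for r in rows]
    print(batch_process(questions, rows, EchoEngine()))

The list comprehension keeps the change behavior-identical to the removed loop while making the one-question-at-a-time intent explicit.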
