ChatOllama doesn't accept num_batch parameter #26661
ekinsenler started this discussion in Ideas
Replies: 0 comments
Feature request
ChatOllama() doesn't parse the num_batch parameter that is supported by Ollama.
Motivation
To be able to utilize all available memory.
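As background for the request: the Ollama server itself accepts num_batch inside the "options" object of its REST API requests, so the gap is only in the ChatOllama wrapper. A minimal sketch of a request payload that sets num_batch directly (the model name here is hypothetical, and this bypasses ChatOllama entirely):

```python
import json

# Workaround sketch (not ChatOllama): the Ollama REST API accepts
# num_batch inside the "options" object of a /api/generate request.
payload = {
    "model": "llama3",     # hypothetical model name for illustration
    "prompt": "Hello",
    "options": {
        "num_batch": 512,  # batch size used during prompt processing
    },
}

# Serialize the payload as it would be sent to http://localhost:11434/api/generate
body = json.dumps(payload)
print(body)
```

If ChatOllama forwarded unknown model keyword arguments into this "options" object, num_batch (and similar runtime parameters) would work without wrapper changes for each one.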
Proposal (If applicable)
No response