Class OllamaRequestResponse
java.lang.Object
com.developer.nefarious.zjoule.plugin.chat.ollama.OllamaRequestResponse
Represents the response received from the Ollama API after a chat request.
This class contains the details of the generated chat response, including the model used, the message content, the completion status, evaluation metrics, and timing information. It is designed for serialization and deserialization using the Gson library.
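Illustrative example (not part of the generated Javadoc): a minimal sketch of deserializing an Ollama chat response into this class with Gson. The sample JSON values are made up, and the assumption that Ollama's snake_case field names map onto this class (for example via @SerializedName annotations or a Gson naming policy) is the sketch's, not the library's.

    import com.developer.nefarious.zjoule.plugin.chat.ollama.OllamaRequestResponse;
    import com.google.gson.Gson;

    public class OllamaResponseParseExample {
        public static void main(String[] args) {
            // Sample payload shaped like an Ollama /api/chat response (illustrative values only).
            String json = "{"
                    + "\"model\":\"llama3\","
                    + "\"created_at\":\"2024-01-01T00:00:00Z\","
                    + "\"done\":true,"
                    + "\"done_reason\":\"stop\","
                    + "\"total_duration\":1500000000,"
                    + "\"eval_count\":42"
                    + "}";

            // Gson maps the JSON onto the documented fields; how the snake_case keys map to
            // the class (e.g. @SerializedName annotations) is an assumption of this sketch.
            OllamaRequestResponse response = new Gson().fromJson(json, OllamaRequestResponse.class);

            System.out.println(response.getModel());      // expected "llama3" under that assumption
            System.out.println(response.isDone());        // expected true
            System.out.println(response.getEvalCount());  // expected 42
        }
    }
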
Constructor Summary

Constructors:
OllamaRequestResponse()

Method Summary

Modifier and Type | Method | Description
String | getCreatedAt() | Retrieves the timestamp when the response was created.
String | getDoneReason() | Retrieves the reason why the response generation was completed.
int | getEvalCount() | Retrieves the number of tokens evaluated in the main evaluation phase.
long | getEvalDuration() | Retrieves the duration taken for the main evaluation phase.
long | getLoadDuration() | Retrieves the duration taken to load the model before response generation.
OllamaChatMessage | getMessage() | Retrieves the generated chat message.
String | getModel() | Retrieves the name of the AI model used.
int | getPromptEvalCount() | Retrieves the number of tokens evaluated in the prompt phase.
long | getPromptEvalDuration() | Retrieves the duration taken to evaluate the prompt.
long | getTotalDuration() | Retrieves the total duration taken to generate the response.
boolean | isDone() | Checks if the response generation is complete.
void | setCreatedAt(String createdAt) | Sets the timestamp when the response was created.
void | setDone(boolean done) | Sets whether the response generation is complete.
void | setDoneReason(String doneReason) | Sets the reason why the response generation was completed.
void | setEvalCount(int evalCount) | Sets the number of tokens evaluated in the main evaluation phase.
void | setEvalDuration(long evalDuration) | Sets the duration taken for the main evaluation phase.
void | setLoadDuration(long loadDuration) | Sets the duration taken to load the model before response generation.
void | setMessage(OllamaChatMessage message) | Sets the generated chat message.
void | setModel(String model) | Sets the name of the AI model used.
void | setPromptEvalCount(int promptEvalCount) | Sets the number of tokens evaluated in the prompt phase.
void | setPromptEvalDuration(long promptEvalDuration) | Sets the duration taken to evaluate the prompt.
void | setTotalDuration(long totalDuration) | Sets the total duration taken to generate the response.

Constructor Details

OllamaRequestResponse
public OllamaRequestResponse()

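Illustrative example: the no-argument constructor and the setters can be used to build a response by hand, for instance as a unit-test fixture. Serializing it back to JSON is shown only as a sketch; the emitted field names depend on the class's Gson configuration, and all values below are made up.

    import com.developer.nefarious.zjoule.plugin.chat.ollama.OllamaRequestResponse;
    import com.google.gson.Gson;

    public class OllamaResponseFixtureExample {
        public static void main(String[] args) {
            // Populate a response manually, e.g. as a test fixture (illustrative values).
            OllamaRequestResponse response = new OllamaRequestResponse();
            response.setModel("llama3");
            response.setCreatedAt("2024-01-01T00:00:00Z");
            response.setDone(true);
            response.setDoneReason("stop");
            response.setTotalDuration(1_500_000_000L);
            response.setEvalCount(42);

            // Serialize back to JSON; the field names follow the class's Gson mapping.
            System.out.println(new Gson().toJson(response));
        }
    }
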
Method Details

getModel
public String getModel()
Retrieves the name of the AI model used.
Returns:
the model name as a String.

setModel
public void setModel(String model)
Sets the name of the AI model used.
Parameters:
model - the model name to set.

getCreatedAt
public String getCreatedAt()
Retrieves the timestamp when the response was created.
Returns:
the creation timestamp as a String.

setCreatedAt
public void setCreatedAt(String createdAt)
Sets the timestamp when the response was created.
Parameters:
createdAt - the creation timestamp to set.

getMessage
public OllamaChatMessage getMessage()
Retrieves the generated chat message.
Returns:
the chat message as an OllamaChatMessage.

setMessage
public void setMessage(OllamaChatMessage message)
Sets the generated chat message.
Parameters:
message - the chat message to set.

getDoneReason
public String getDoneReason()
Retrieves the reason why the response generation was completed.
Returns:
the completion reason as a String.

setDoneReason
public void setDoneReason(String doneReason)
Sets the reason why the response generation was completed.
Parameters:
doneReason - the completion reason to set.

isDone
public boolean isDone()
Checks if the response generation is complete.
Returns:
true if the response is fully generated, false otherwise.

setDone
public void setDone(boolean done)
Sets whether the response generation is complete.
Parameters:
done - true if the response is fully generated, false otherwise.

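Illustrative sketch (not part of this class's API): when the Ollama chat endpoint streams its reply, each newline-delimited JSON chunk can be deserialized into an OllamaRequestResponse, and isDone() distinguishes intermediate chunks from the final one that carries the evaluation and timing metrics. The BufferedReader source and the JSON field mapping are assumptions of the sketch, not documented behavior of the plugin.

    import com.developer.nefarious.zjoule.plugin.chat.ollama.OllamaRequestResponse;
    import com.google.gson.Gson;
    import java.io.BufferedReader;
    import java.io.IOException;

    public class StreamingDoneCheck {

        // Reads newline-delimited JSON chunks (e.g. from a streaming HTTP response body)
        // and returns the final chunk, identified by isDone() == true.
        static OllamaRequestResponse readUntilDone(BufferedReader body) throws IOException {
            Gson gson = new Gson();
            String line;
            while ((line = body.readLine()) != null) {
                OllamaRequestResponse chunk = gson.fromJson(line, OllamaRequestResponse.class);
                if (chunk != null && chunk.isDone()) {
                    return chunk; // final chunk: metrics such as getTotalDuration() are populated
                }
                // intermediate chunk: chunk.getMessage() holds a partial message
            }
            return null; // stream ended without a done-flagged chunk
        }
    }
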
getTotalDuration
public long getTotalDuration()
Retrieves the total duration taken to generate the response.
Returns:
the total duration in nanoseconds, as reported by the Ollama API.

setTotalDuration
public void setTotalDuration(long totalDuration)
Sets the total duration taken to generate the response.
Parameters:
totalDuration - the total duration in nanoseconds.

getLoadDuration
public long getLoadDuration()
Retrieves the duration taken to load the model before response generation.
Returns:
the load duration in nanoseconds.

setLoadDuration
public void setLoadDuration(long loadDuration)
Sets the duration taken to load the model before response generation.
Parameters:
loadDuration - the load duration in nanoseconds.

getPromptEvalCount
public int getPromptEvalCount()
Retrieves the number of tokens evaluated in the prompt phase.
Returns:
the prompt evaluation token count.

setPromptEvalCount
public void setPromptEvalCount(int promptEvalCount)
Sets the number of tokens evaluated in the prompt phase.
Parameters:
promptEvalCount - the prompt evaluation token count to set.

getPromptEvalDuration
public long getPromptEvalDuration()
Retrieves the duration taken to evaluate the prompt.
Returns:
the prompt evaluation duration in nanoseconds.

setPromptEvalDuration
public void setPromptEvalDuration(long promptEvalDuration)
Sets the duration taken to evaluate the prompt.
Parameters:
promptEvalDuration - the prompt evaluation duration in nanoseconds.

getEvalCount
public int getEvalCount()
Retrieves the number of tokens evaluated in the main evaluation phase.
Returns:
the evaluation token count.

setEvalCount
public void setEvalCount(int evalCount)
Sets the number of tokens evaluated in the main evaluation phase.
Parameters:
evalCount - the evaluation token count to set.

getEvalDuration
public long getEvalDuration()
Retrieves the duration taken for the main evaluation phase.
Returns:
the evaluation duration in nanoseconds.

setEvalDuration
public void setEvalDuration(long evalDuration)
Sets the duration taken for the main evaluation phase.
Parameters:
evalDuration - the evaluation duration in nanoseconds.
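Illustrative sketch: the token counts and durations can be combined into throughput figures. This assumes the durations hold the nanosecond values reported by the Ollama API and that the response has already been deserialized (see the example near the top of this page); the helper class and method names are the sketch's own.

    import com.developer.nefarious.zjoule.plugin.chat.ollama.OllamaRequestResponse;

    public class ThroughputExample {

        // Tokens generated per second during the main evaluation phase.
        static double evalTokensPerSecond(OllamaRequestResponse response) {
            long nanos = response.getEvalDuration();
            return nanos > 0 ? response.getEvalCount() / (nanos / 1_000_000_000.0) : 0.0;
        }

        // Prompt tokens processed per second during the prompt evaluation phase.
        static double promptTokensPerSecond(OllamaRequestResponse response) {
            long nanos = response.getPromptEvalDuration();
            return nanos > 0 ? response.getPromptEvalCount() / (nanos / 1_000_000_000.0) : 0.0;
        }
    }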