LlmInferenceResponse

class oci.generative_ai_inference.models.LlmInferenceResponse(**kwargs)

Bases: object

The base class for inference responses.

Attributes

RUNTIME_TYPE_COHERE
    A constant which can be used with the runtime_type property of a LlmInferenceResponse.
RUNTIME_TYPE_LLAMA
    A constant which can be used with the runtime_type property of a LlmInferenceResponse.
runtime_type
    [Required] Gets the runtime_type of this LlmInferenceResponse.

Methods
__init__(**kwargs)
    Initializes a new LlmInferenceResponse object with values from keyword arguments.
get_subtype(object_dictionary)
    Given the hash representation of a subtype of this class, use the info in the hash to return the class of the subtype.
RUNTIME_TYPE_COHERE = 'COHERE'

A constant which can be used with the runtime_type property of a LlmInferenceResponse. This constant has a value of 'COHERE'.

RUNTIME_TYPE_LLAMA = 'LLAMA'

A constant which can be used with the runtime_type property of a LlmInferenceResponse. This constant has a value of 'LLAMA'.
__init__(**kwargs)

Initializes a new LlmInferenceResponse object with values from keyword arguments. This class has subclasses, and if you are using this class as input to a service operation you should favor using a subclass over the base class.

The following keyword arguments are supported (corresponding to the getters/setters of this class):

Parameters:
    runtime_type (str) -- The value to assign to the runtime_type property of this LlmInferenceResponse. Allowed values for this property are: 'COHERE', 'LLAMA', 'UNKNOWN_ENUM_VALUE'. Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
static get_subtype(object_dictionary)

Given the hash representation of a subtype of this class, use the info in the hash to return the class of the subtype.
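As a rough illustration of the dispatch pattern behind get_subtype, the sketch below maps a response dictionary to a concrete subclass name based on its runtime_type field. This is not the SDK's actual source; the subclass names CohereLlmInferenceResponse and LlamaLlmInferenceResponse are assumptions inferred from the COHERE and LLAMA runtime-type constants documented above.

```python
def get_subtype(object_dictionary):
    """Illustrative sketch: pick the concrete subclass name from the
    'runtime_type' key of a deserialized response dictionary."""
    runtime_type = object_dictionary.get('runtime_type')
    if runtime_type == 'COHERE':
        return 'CohereLlmInferenceResponse'
    if runtime_type == 'LLAMA':
        return 'LlamaLlmInferenceResponse'
    # Fall back to the base class when the type is missing or unrecognized.
    return 'LlmInferenceResponse'
```

This kind of discriminator-based dispatch lets the client deserialize a polymorphic service payload into the right model class without the caller inspecting the payload themselves.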
runtime_type

[Required] Gets the runtime_type of this LlmInferenceResponse. The runtime of the provided model.

Allowed values for this property are: 'COHERE', 'LLAMA', 'UNKNOWN_ENUM_VALUE'. Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.

Returns: The runtime_type of this LlmInferenceResponse.
Return type: str
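The UNKNOWN_ENUM_VALUE mapping described above can be sketched as follows. This is a minimal illustration of the documented behavior, not the SDK's implementation: values the client does not recognize are surfaced as 'UNKNOWN_ENUM_VALUE' instead of raising, so older client code keeps working if the service introduces new runtime types.

```python
# Allowed values per the documentation above.
ALLOWED_RUNTIME_TYPES = {'COHERE', 'LLAMA'}

def map_runtime_type(value):
    """Return the value unchanged if recognized; otherwise map it to
    the 'UNKNOWN_ENUM_VALUE' sentinel, as the docs describe."""
    return value if value in ALLOWED_RUNTIME_TYPES else 'UNKNOWN_ENUM_VALUE'
```

Callers should therefore treat 'UNKNOWN_ENUM_VALUE' as a possible value of runtime_type rather than assuming only the documented constants can appear.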