LlmInferenceRequest

class oci.generative_ai_inference.models.LlmInferenceRequest(**kwargs)

Bases: object

The base class for the inference requests.

Attributes

RUNTIME_TYPE_COHERE A constant which can be used with the runtime_type property of a LlmInferenceRequest.
RUNTIME_TYPE_LLAMA A constant which can be used with the runtime_type property of a LlmInferenceRequest.
runtime_type [Required] Gets the runtime_type of this LlmInferenceRequest.

Methods

__init__(**kwargs) Initializes a new LlmInferenceRequest object with values from keyword arguments.
get_subtype(object_dictionary) Given the hash representation of a subtype of this class, use the info in the hash to return the class of the subtype.
RUNTIME_TYPE_COHERE = 'COHERE'

A constant which can be used with the runtime_type property of a LlmInferenceRequest. This constant has a value of “COHERE”.

RUNTIME_TYPE_LLAMA = 'LLAMA'

A constant which can be used with the runtime_type property of a LlmInferenceRequest. This constant has a value of “LLAMA”.

__init__(**kwargs)

Initializes a new LlmInferenceRequest object with values from keyword arguments. This class has subclasses, and if you are using this class as input to a service operation, you should favor using a subclass over the base class.

The following keyword arguments are supported (corresponding to the getters/setters of this class):

Parameters: runtime_type (str) – The value to assign to the runtime_type property of this LlmInferenceRequest. Allowed values for this property are: “COHERE”, “LLAMA”
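The construction pattern above can be sketched with a minimal, self-contained example of a discriminated model hierarchy: a base class that carries runtime_type plus a subclass that pins the discriminator. The subclass name CohereLlmInferenceRequest and the prompt attribute follow the SDK's naming conventions but are illustrative assumptions here, not the SDK's actual code.

```python
# Sketch of the base-class-plus-subclass pattern used by LlmInferenceRequest.
# Names mirror the SDK's conventions; this is an illustration, not the SDK source.

class LlmInferenceRequest:
    # Discriminator constants, as documented on the base class.
    RUNTIME_TYPE_COHERE = 'COHERE'
    RUNTIME_TYPE_LLAMA = 'LLAMA'

    def __init__(self, **kwargs):
        # The base class only knows about the discriminator property.
        self.runtime_type = kwargs.get('runtime_type')


class CohereLlmInferenceRequest(LlmInferenceRequest):
    # Hypothetical subclass: pins runtime_type to 'COHERE' and adds
    # a model-specific field (prompt is an assumption for illustration).
    def __init__(self, **kwargs):
        kwargs['runtime_type'] = LlmInferenceRequest.RUNTIME_TYPE_COHERE
        super().__init__(**kwargs)
        self.prompt = kwargs.get('prompt')


req = CohereLlmInferenceRequest(prompt='Hello')
print(req.runtime_type)  # COHERE
```

Constructing the subclass rather than the base class is what sets the discriminator correctly, which is why service operations favor the subclasses.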
static get_subtype(object_dictionary)

Given the hash representation of a subtype of this class, use the info in the hash to return the class of the subtype.
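The dispatch performed by get_subtype can be sketched as follows: the discriminator key of the wire-format dictionary selects the name of the concrete subclass. The key name 'runtimeType' and the returned class names follow the SDK's conventions but are assumptions in this sketch.

```python
# Minimal sketch of get_subtype-style dispatch: pick a subclass name
# from the discriminator field of a deserialized dictionary.
# Key and class names mirror SDK conventions; they are illustrative.

def get_subtype(object_dictionary):
    runtime_type = object_dictionary['runtimeType']
    if runtime_type == 'COHERE':
        return 'CohereLlmInferenceRequest'
    if runtime_type == 'LLAMA':
        return 'LlamaLlmInferenceRequest'
    return 'LlmInferenceRequest'  # fall back to the base class

print(get_subtype({'runtimeType': 'LLAMA'}))  # LlamaLlmInferenceRequest
```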

runtime_type

[Required] Gets the runtime_type of this LlmInferenceRequest. The runtime of the provided model.

Allowed values for this property are: “COHERE”, “LLAMA”

Returns: The runtime_type of this LlmInferenceRequest.
Return type: str