Package | Description |
---|---|
com.oracle.bmc.generativeaiinference.model |
Subclasses of ServingMode in com.oracle.bmc.generativeaiinference.model

Modifier and Type | Class and Description |
---|---|
class | DedicatedServingMode: The model is served in dedicated mode, through an endpoint hosted on a dedicated AI cluster. |
class | OnDemandServingMode: The model is served on demand on shared infrastructure. |
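Either subclass is normally created through its builder. A minimal sketch in Java, assuming OnDemandServingMode exposes a modelId builder field and DedicatedServingMode an endpointId field (neither field is documented on this page); the OCIDs are placeholders:

```java
import com.oracle.bmc.generativeaiinference.model.DedicatedServingMode;
import com.oracle.bmc.generativeaiinference.model.OnDemandServingMode;
import com.oracle.bmc.generativeaiinference.model.ServingMode;

public class ServingModeExamples {

    // On-demand serving: the request targets a pretrained model on shared infrastructure.
    static ServingMode onDemand() {
        return OnDemandServingMode.builder()
                .modelId("ocid1.generativeaimodel.oc1..example")       // placeholder model OCID
                .build();
    }

    // Dedicated serving: the request targets an endpoint hosted on a dedicated AI cluster.
    static ServingMode dedicated() {
        return DedicatedServingMode.builder()
                .endpointId("ocid1.generativeaiendpoint.oc1..example") // placeholder endpoint OCID
                .build();
    }
}
```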
Methods in com.oracle.bmc.generativeaiinference.model that return ServingMode

Modifier and Type | Method and Description |
---|---|
ServingMode | EmbedTextDetails.getServingMode() |
ServingMode | GenerateTextDetails.getServingMode() |
ServingMode | SummarizeTextDetails.getServingMode() |
ServingMode | ChatDetails.getServingMode() |
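Each request-details class exposes the configured mode through getServingMode(). A short sketch, assuming DedicatedServingMode provides a getEndpointId() accessor (not listed on this page):

```java
import com.oracle.bmc.generativeaiinference.model.ChatDetails;
import com.oracle.bmc.generativeaiinference.model.DedicatedServingMode;
import com.oracle.bmc.generativeaiinference.model.ServingMode;

public class InspectServingMode {

    // Reports whether a ChatDetails instance targets a dedicated endpoint
    // or on-demand shared infrastructure.
    static void describe(ChatDetails details) {
        ServingMode mode = details.getServingMode();
        if (mode instanceof DedicatedServingMode) {
            System.out.println("Dedicated endpoint: "
                    + ((DedicatedServingMode) mode).getEndpointId()); // assumed accessor
        } else {
            System.out.println("On-demand serving mode: " + mode);
        }
    }
}
```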
Methods in com.oracle.bmc.generativeaiinference.model with parameters of type ServingMode

Modifier and Type | Method and Description |
---|---|
EmbedTextDetails.Builder | EmbedTextDetails.Builder.servingMode(ServingMode servingMode) |
GenerateTextDetails.Builder | GenerateTextDetails.Builder.servingMode(ServingMode servingMode) |
SummarizeTextDetails.Builder | SummarizeTextDetails.Builder.servingMode(ServingMode servingMode) |
ChatDetails.Builder | ChatDetails.Builder.servingMode(ServingMode servingMode) |
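The servingMode(ServingMode) builder methods accept either subclass. A sketch for ChatDetails, assuming the builder also offers compartmentId(...) and chatRequest(...) setters mirroring the constructor parameters listed below, and that a CohereChatRequest with a message field exists in the same package (none of these are confirmed on this page):

```java
import com.oracle.bmc.generativeaiinference.model.ChatDetails;
import com.oracle.bmc.generativeaiinference.model.CohereChatRequest;
import com.oracle.bmc.generativeaiinference.model.OnDemandServingMode;

public class BuildChatDetails {

    // Assembles a chat request against an on-demand model; all OCIDs are placeholders.
    static ChatDetails example() {
        return ChatDetails.builder()
                .compartmentId("ocid1.compartment.oc1..example")       // assumed setter
                .servingMode(OnDemandServingMode.builder()             // parameter type from this page
                        .modelId("ocid1.generativeaimodel.oc1..example")
                        .build())
                .chatRequest(CohereChatRequest.builder()               // assumed request subtype
                        .message("Summarize the quarterly report in two sentences.")
                        .build())
                .build();
    }
}
```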
Constructors in com.oracle.bmc.generativeaiinference.model with parameters of type ServingMode

Constructor and Description |
---|
ChatDetails(String compartmentId, ServingMode servingMode, BaseChatRequest chatRequest) Deprecated. |
EmbedTextDetails(List<String> inputs, ServingMode servingMode, String compartmentId, Boolean isEcho, EmbedTextDetails.Truncate truncate, EmbedTextDetails.InputType inputType) Deprecated. |
GenerateTextDetails(String compartmentId, ServingMode servingMode, LlmInferenceRequest inferenceRequest) Deprecated. |
SummarizeTextDetails(String input, ServingMode servingMode, String compartmentId, Boolean isEcho, Double temperature, String additionalCommand, SummarizeTextDetails.Length length, SummarizeTextDetails.Format format, SummarizeTextDetails.Extractiveness extractiveness) Deprecated. |
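Because the constructors above are deprecated, the builders are the usual way to assemble these request objects. A sketch for EmbedTextDetails, assuming inputs(...) and compartmentId(...) builder setters mirror the constructor parameters (only servingMode(...) is confirmed by the table above):

```java
import java.util.Arrays;

import com.oracle.bmc.generativeaiinference.model.EmbedTextDetails;
import com.oracle.bmc.generativeaiinference.model.OnDemandServingMode;

public class BuildEmbedTextDetails {

    // Builds an embedding request for two input passages against an on-demand model.
    static EmbedTextDetails example() {
        return EmbedTextDetails.builder()
                .inputs(Arrays.asList("first passage", "second passage"))  // assumed setter
                .servingMode(OnDemandServingMode.builder()
                        .modelId("ocid1.generativeaimodel.oc1..example")   // placeholder model OCID
                        .build())
                .compartmentId("ocid1.compartment.oc1..example")           // assumed setter
                .build();
    }
}
```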
Copyright © 2016–2024. All rights reserved.