• Performs inferencing using the specified model and prompt, with optional settings.

    Parameters

    • model: string

      The model to use for inferencing.

    • prompt: string

      The prompt to use for inferencing.

    • Optional options: InferencingOptions

      Optional settings for inferencing.

    Returns InferenceResult

    The result of the inference.