> ## ⚠️ 🔑 **Important: Do Not Store OpenAI API Keys Directly in Code!** 🔑⚠️
>
> When working with OpenAI's API, it's essential for security reasons not to embed your API keys directly within the codebase of your application. Embedding keys directly in your source code can expose them to unauthorized users and malicious actors, leading to potential misuse.
>
> **Best Practices:**
>
> 1. **Environment Variables:** Store your API keys in environment variables and access them in your code. This way, they aren't hard-coded into your application and can be managed securely.
> 2. **Secrets Management:** Use a secure secrets management tool or service to store and retrieve your API keys. Tools like AWS Secrets Manager, HashiCorp Vault, Firebase, CloudKit, or Azure Key Vault are designed to store, manage, and retrieve sensitive data. This way, the user does not have on-device access to the keys.
> 3. **.gitignore:** If you store your API keys in a configuration file, always ensure that this file is added to `.gitignore` or a similar mechanism to prevent it from being accidentally committed to a source control system.
> 4. **Regularly Rotate Keys:** Rotate your API keys periodically to mitigate the potential impact of a leak.
>
> Remember, security is of utmost importance. Always prioritize the safe handling of sensitive information like API keys!
>
> For more information, check out [this article](https://nshipster.com/secrets/) all about API key management for Swift developers.
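As a concrete sketch of the environment-variable approach (point 1 above), the key can be read at runtime instead of being compiled in. The variable name `OPENAI_API_KEY` here is a common convention, not something the SDK requires:

```swift
import Foundation

// Reads the API key from the process environment instead of hard-coding it.
// The variable name OPENAI_API_KEY is a convention, not an SDK requirement.
func loadAPIKey(
    from environment: [String: String] = ProcessInfo.processInfo.environment
) -> String? {
    environment["OPENAI_API_KEY"]
}
```

At startup you might `guard let apiKey = loadAPIKey() else { ... }` and pass the result to the client initializer, so the key never appears in source control.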
### Image
Dive into the next evolution of AI-powered image generation with DALL-E 2 (and coming soon, DALL-E 3), brought to you by OpenAI. Building upon the legacy of its predecessor, DALL-E 2 offers enhanced capabilities to materialize intricate visuals from textual prompts, creating a seamless bridge between language and imagery. Developers can harness this groundbreaking technology to enrich user experiences, craft dynamic content, and inspire new avenues of creativity. Revolutionize your applications and projects by integrating the cutting-edge power of DALL-E 2.
There are three endpoints provided by OpenAI to interact with DALL-E 2:
- `createImage` is the main endpoint, allowing direct image generation from a text prompt. Here's an example of how to use the endpoint, along with the corresponding output:
```swift
do {
let imageParam = ImageParameters(
// A text description of the desired image(s).
prompt: "An armchair in the shape of an avocado",
// The size of the generated images.
resolution: .large,
// The format in which the generated images are returned.
responseFormat: .base64Json
)
let result = try await openAI.createImage(
parameters: imageParam
)
let b64Image = result.data[0].image
let image = try openAI.decodeBase64Image(b64Image)
} catch {
// Insert your own error handling method here.
}
```
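With `responseFormat: .base64Json`, each result is a base64-encoded payload that `decodeBase64Image` turns back into an image. Under the hood that conversion is ordinary Foundation base64 decoding; the sketch below stops at `Data` to stay platform-neutral and is not OpenAIKit's actual implementation:

```swift
import Foundation

// Illustrative only: OpenAIKit's decodeBase64Image additionally wraps
// the decoded bytes in a platform image type.
enum ImageDecodingError: Error { case invalidBase64 }

func decodeBase64ImageData(_ base64: String) throws -> Data {
    guard let data = Data(base64Encoded: base64) else {
        throw ImageDecodingError.invalidBase64
    }
    return data
}
```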
- `createImageEdit` lets the developer edit an input image by supplying the original image, a mask whose fully transparent areas mark the regions to edit, and a prompt describing the desired result. Here's an example of how to use the endpoint, along with the corresponding output:
```swift
do {
let imageEditParam = try ImageEditParameters(
// The image to edit.
image: image,
// An additional image whose fully transparent areas indicate where the image should be edited.
mask: mask,
// A text description of the desired image(s).
prompt: "The Mona Lisa wearing a beret, in the style of Leonardo DaVinci",
// The size of the generated images.
resolution: .large,
// The format in which the generated images are returned.
responseFormat: .base64Json
)
let imageResponse = try await openAI.generateImageEdits(
parameters: imageEditParam
)
let image = try openAI.decodeBase64Image(imageResponse.data[0].image)
} catch {
// Insert your own error handling method here.
}
```
*(Example images omitted: **Original**, **Mask**, **Edit**)*
- `createImageVariation` allows the developer to create variations of a given input image. Here's an example of how to use the endpoint, along with the corresponding output:
```swift
do {
let imageVariationParam = try ImageVariationParameters(
// The image to use as the basis for the variation(s).
image: image,
// The size of the generated images.
resolution: .large,
// The format in which the generated images are returned.
responseFormat: .base64Json
)
let variationResponse = try await openAI.generateImageVariations(
parameters: imageVariationParam
)
self.image = try openAI.decodeBase64Image(
variationResponse.data[0].image
)
} catch {
// Insert your own error handling method here.
}
```
*(Example images omitted: **Original**, **Variation**)*
### Chat
ChatGPT, built on OpenAI's GPT-4 architecture, is a cutting-edge conversational AI model. It provides developers with a robust tool for integrating advanced natural language processing capabilities into applications. Using ChatGPT can enhance user interactions, improve efficiency, and offer AI-driven solutions in various use cases. Incorporate GPT-4's strength into your projects for tangible results.
There is a single endpoint for this feature; however, this SDK splits it into two functions with three separate features in total:
- `generateChatCompletion` allows the developer to generate chat completions using the models provided by OpenAI, or the developer's own fine-tuned models. Here's an example of how to use the endpoint, along with the corresponding output:
```swift
do {
let chat: [ChatMessage] = [
ChatMessage(role: .system, content: "You are a helpful assistant."),
ChatMessage(role: .user, content: "Who won the world series in 2020?"),
ChatMessage(role: .assistant, content: "The Los Angeles Dodgers won the World Series in 2020."),
ChatMessage(role: .user, content: "Where was it played?")
]
let chatParameters = ChatParameters(
model: .gpt4, // ID of the model to use.
messages: chat // A list of messages comprising the conversation so far.
)
let chatCompletion = try await openAI.generateChatCompletion(
parameters: chatParameters
)
if let message = chatCompletion.choices[0].message {
let content = message.content
}
} catch {
// Insert your own error handling method here.
}
```
```
ChatResponse(
id: "chatcmpl-88eG5VruffcNHPNVGBKGVAV5HGk4j",
object: OpenAIKit.OpenAIObject.chatCompletion,
created: 1697072069,
choices: [
OpenAIKit.ChatChoice(
message: Optional(
OpenAIKit.ChatMessage(
id: "250FDA2D-2F38-4E6F-B97E-DAD74FED1FB6",
role: OpenAIKit.ChatRole.assistant,
content: Optional(
"The 2020 World Series was played at Globe Life Field in Arlington, Texas."
),
functionCall: nil
)
),
delta: nil,
index: 0,
logprobs: nil,
finishReason: Optional("stop")
)
],
usage: Optional(
OpenAIKit.Usage(
promptTokens: 53,
completionTokens: 17,
totalTokens: 70
)
)
)
```
The developer is also able to use function calls to let the model invoke various functions (e.g., fetching weather info or uploading files). Here's an example of how to use the parameter, along with the corresponding response and example usage with a local function.
```swift
do {
let functions: [Function] = [
Function(
name: "getCurrentWeather",
description: "Get the current weather in a given location",
parameters: Parameters(
type: "object",
properties: [
"location": ParameterDetail(
type: "string",
description: "The city and state, e.g. San Francisco, CA"
),
"unit": ParameterDetail(
type: "string", enumValues: ["fahrenheit", "celsius"]
)
],
required: ["location"]
)
)
]
let messages: [ChatMessage] = [
ChatMessage(role: .user, content: "What's the weather like in Boston?")
]
let chatParameters = ChatParameters(
model: .gpt4, // ID of the model to use.
messages: messages, // A list of messages comprising the conversation so far.
functionCall: "auto", // Controls how the model calls functions.
functions: functions // A list of functions the model may generate JSON inputs for.
)
let chatCompletion = try await openAI.generateChatCompletion(
parameters: chatParameters
)
} catch {
// Insert your own error handling method here.
}
```
```
ChatResponse(
id: "chatcmpl-88eVjsHEPtDDiSEuCexsqO8iuhnfG",
object: OpenAIKit.OpenAIObject.chatCompletion,
created: 1697073039,
choices: [
OpenAIKit.ChatChoice(
message: Optional(
OpenAIKit.ChatMessage(
id: "DCE5EECB-9521-481D-9E75-C7FF9390E4CF",
role: OpenAIKit.ChatRole.assistant,
content: nil,
functionCall: Optional(
OpenAIKit.FunctionCall(
arguments: "{\n\"location\": \"Boston, MA\"\n}",
name: "getCurrentWeather"
)
)
)
),
delta: nil,
index: 0,
logprobs: nil,
finishReason: Optional("function_call")
)
],
usage: Optional(OpenAIKit.Usage(promptTokens: 81, completionTokens: 16, totalTokens: 97))
)
```
```swift
func getCurrentWeather(location: String, unit: TemperatureUnit = .fahrenheit) -> WeatherInfo {
return WeatherInfo(location: location, temperature: "72", unit: unit, forecast: ["sunny", "windy"])
}
if let message = chatCompletion.choices[0].message, let functionCall = message.functionCall {
let jsonString = functionCall.arguments
if let data = jsonString.data(using: .utf8) {
do {
if
let json = try JSONSerialization.jsonObject(with: data, options: []) as? [String: Any],
let location = json["location"] as? String
{
self.weatherInfo = self.getCurrentWeather(location: location)
}
} catch {
// Insert your own error handling method here.
}
}
}
```
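Since `functionCall.arguments` is a JSON string matching the schema declared in the `Function`, a `Codable` struct is a type-safe alternative to the `JSONSerialization` approach above. The `WeatherArguments` type here is a hypothetical helper, not part of the SDK:

```swift
import Foundation

// Hypothetical helper mirroring the JSON schema declared in the Function:
// "location" is required, "unit" is optional.
struct WeatherArguments: Codable {
    let location: String
    let unit: String?
}

// Decodes the model's JSON arguments string into the typed struct.
func parseWeatherArguments(_ arguments: String) throws -> WeatherArguments {
    try JSONDecoder().decode(WeatherArguments.self, from: Data(arguments.utf8))
}
```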
- `generateChatCompletionStreaming` allows the developer to stream chat completion data from the endpoint. Here's an example of how to use the endpoint, along with the corresponding output:
```swift
do {
let chat: [ChatMessage] = [
ChatMessage(role: .system, content: "You are a helpful assistant."),
ChatMessage(role: .user, content: "Who won the world series in 2020?"),
ChatMessage(role: .assistant, content: "The Los Angeles Dodgers won the World Series in 2020."),
ChatMessage(role: .user, content: "Where was it played?")
]
let chatParameters = ChatParameters(model: .chatGPTTurbo, messages: chat)
let stream = try openAI.generateChatCompletionStreaming(
parameters: chatParameters
)
} catch {
// Insert your own error handling method here.
}
```
```
ChatResponse(
id: "chatcmpl-88enklY0vmc4fNkM1mJQCkzW6hcST",
object: OpenAIKit.OpenAIObject.chatCompletionChunk,
created: 1697074156,
choices: [
OpenAIKit.ChatChoice(
message: nil,
delta: Optional(
OpenAIKit.ChatDelta(
role: Optional(OpenAIKit.ChatRole.assistant),
content: Optional("")
)
),
index: 0,
logprobs: nil,
finishReason: nil
)
],
usage: nil
)
ChatResponse(
id: "chatcmpl-88enklY0vmc4fNkM1mJQCkzW6hcST",
object: OpenAIKit.OpenAIObject.chatCompletionChunk,
created: 1697074156,
choices: [
OpenAIKit.ChatChoice(
message: nil,
delta: Optional(
OpenAIKit.ChatDelta(
role: nil,
content: Optional("The")
)
),
index: 0,
logprobs: nil,
finishReason: nil
)
],
usage: nil
)
// ...
ChatResponse(
id: "chatcmpl-88enklY0vmc4fNkM1mJQCkzW6hcST",
object: OpenAIKit.OpenAIObject.chatCompletionChunk,
created: 1697074156,
choices: [
OpenAIKit.ChatChoice(
message: nil,
delta: Optional(
OpenAIKit.ChatDelta(
role: nil,
content: Optional(".")
)
),
index: 0,
logprobs: nil,
finishReason: nil
)
],
usage: nil
)
ChatResponse(
id: "chatcmpl-88enklY0vmc4fNkM1mJQCkzW6hcST",
object: OpenAIKit.OpenAIObject.chatCompletionChunk,
created: 1697074156,
choices: [
OpenAIKit.ChatChoice(
message: nil,
delta: Optional(
OpenAIKit.ChatDelta(
role: nil,
content: nil
)
),
index: 0,
logprobs: nil,
finishReason: Optional("stop")
)
],
usage: nil
)
```
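Each streamed chunk carries a `delta` rather than a full message, so the client reassembles the reply by concatenating the non-nil delta contents. The pattern is sketched below over a simulated stream of `content` values rather than real `ChatResponse` chunks:

```swift
import Foundation

// Simulated stream of delta `content` values, standing in for the chunks
// yielded by the streaming endpoint (some chunks carry nil content).
func makeDeltaStream(_ chunks: [String?]) -> AsyncStream<String?> {
    AsyncStream { continuation in
        for chunk in chunks { continuation.yield(chunk) }
        continuation.finish()
    }
}

// Concatenates the non-nil delta contents into the final message text.
func collectMessage(from stream: AsyncStream<String?>) async -> String {
    var message = ""
    for await content in stream {
        if let content { message += content }
    }
    return message
}
```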
### Audio
Whisper is OpenAI's speech-to-text AI model, designed for accurate transcription of spoken content. By converting audio into text, it offers developers a straightforward tool for tasks like transcription services, voice commands, vocal language translations, or audio indexing. Implementing Whisper can help streamline processes, make applications more accessible, and leverage voice data efficiently.
There are two main endpoints that use the Whisper model:
- `createTranscription` is the main endpoint, allowing developers to transcribe spoken audio into text. Here's an example of how to use the endpoint, and the corresponding returned item:
```swift
do {
let audioParameters = TranscriptionParameters(file: audio)
let transcriptionCompletion = try await openAI.createTranscription(parameters: audioParameters)
} catch {
// Insert your own error handling method here.
}
```
```
Arousing from the most profound of slumbers, we break the gossamer web of some dream. Yet in a second afterward, so frail may that web have been, we remember not that we have dreamed. In the return to life
from the swoon there are two stages, first, that of the sense of mental or spiritual, secondly, that of the sense of physical existence. It seems probable that if, upon reaching the second stage, we could
recall the impressions of the first, we should find these impressions eloquent in memories of the gulf beyond. And that gulf is what? How at least shall we distinguish its shadows from those of the tomb?
```
- `createTranslation` is the second endpoint, allowing developers to translate non-English audio into transcribed English text. Here's an example of how to use the endpoint, and the corresponding returned item:
```swift
do {
let audioParameters = TranscriptionParameters(file: audio)
let transcriptionCompletion = try await openAI.createTranslation(parameters: audioParameters)
} catch {
// Insert your own error handling method here.
}
```
```
In the night, when I was out of my mind, and I, pained, numb and tired, thought about the beauty and goodness of a long-forgotten lesson, and my head began to shake, I suddenly heard something tick that scared
me out of my sleep. Who could it be in that weather? A visitor is knocking, I thought, at my room in that weather. That's all it is, and nothing more.
```
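Both endpoints take the audio as file data (assuming the `file:` parameter above accepts raw `Data`). Loading it from disk is plain Foundation I/O; a small guard against unsupported formats can fail fast before the network call. The extension list reflects the formats the Whisper endpoints document accepting at the time of writing:

```swift
import Foundation

enum AudioLoadError: Error { case unsupportedFormat(String) }

// File formats the Whisper endpoints accept at the time of writing.
let supportedAudioExtensions: Set<String> = [
    "flac", "m4a", "mp3", "mp4", "mpeg", "mpga", "ogg", "wav", "webm"
]

// Loads an audio file into Data, rejecting extensions Whisper does not accept.
func loadAudioData(at path: String) throws -> Data {
    let url = URL(fileURLWithPath: path)
    guard supportedAudioExtensions.contains(url.pathExtension.lowercased()) else {
        throw AudioLoadError.unsupportedFormat(url.pathExtension)
    }
    return try Data(contentsOf: url)
}
```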
### Utilities
#### Embeddings
Embeddings utilize OpenAI's GPT models to generate dense vector representations for text. These embeddings capture semantic information, enabling developers to perform tasks like similarity search, clustering, or fine-tuning on specific tasks. By integrating GPT Embeddings into applications, developers can enhance text analysis and retrieval capabilities, leading to more effective data processing and insights.
Here's an example of how to use the endpoint, and the expected output:
```swift
do {
let embeddingsParam = EmbeddingsParameters(model: "text-similarity-ada-002", input: input)
self.embeddingsResponse = try await openAI.createEmbeddings(parameters: embeddingsParam)
} catch {
// Insert your own error handling method here.
}
```
```
OpenAIKit.EmbeddingsResponse(
object: OpenAIKit.OpenAIObject.list,
data: [
OpenAIKit.EmbeddingsData(
object: OpenAIKit.OpenAIObject.embedding,
embedding: [
0.0028667077,
0.018867997,
-0.030135695,
// ...
-0.004177677,
-0.015615467,
-0.008131327
],
index: 0
)
],
model: "text-similarity-ada:002",
usage: OpenAIKit.EmbeddingsUsage(
promptTokens: 8,
totalTokens: 8
)
)
```
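The similarity-search use case mentioned above typically ranks embedding vectors by cosine similarity, which can be computed directly from the `embedding` arrays in the response:

```swift
import Foundation

// Cosine similarity between two embedding vectors: 1 for identical
// directions, 0 for orthogonal ones.
func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    precondition(a.count == b.count, "Vectors must have the same dimension")
    let dot = zip(a, b).map(*).reduce(0, +)
    let magnitudeA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let magnitudeB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (magnitudeA * magnitudeB)
}
```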
#### Models
The models endpoint allows developers to fetch the currently available models from OpenAI, along with their own fine-tuned models.
There are two endpoints available for use with this feature:
- `listModels` fetches the entire list of models available to the developer. Here's an example usage, along with the corresponding output:
```swift
do {
let modelsResponse = try await openAI.listModels()
} catch {
// Insert your own error handling method here.
}
```
```
ListModelResponse(
object: OpenAIKit.OpenAIObject.list,
data: [
OpenAIKit.Model(
id: "text-search-babbage-doc-001",
object: OpenAIKit.OpenAIObject.model,
created: 1651172509,
ownedBy: "openai-dev"
),
OpenAIKit.Model(
id: "curie-search-query",
object: OpenAIKit.OpenAIObject.model,
created: 1651172509,
ownedBy: "openai-dev"
),
OpenAIKit.Model(
id: "text-search-babbage-query-001",
object: OpenAIKit.OpenAIObject.model,
created: 1651172509,
ownedBy: "openai-dev"
),
OpenAIKit.Model(
id: "babbage",
object: OpenAIKit.OpenAIObject.model,
created: 1649358449,
ownedBy: "openai"
),
OpenAIKit.Model(
id: "gpt-3.5-turbo-instruct-0914",
object: OpenAIKit.OpenAIObject.model,
created: 1694122472,
ownedBy: "system"
)
// ...
]
)
```
- `retrieveModel` fetches a single model, given the model's ID. Here's an example usage, along with the corresponding output:
```swift
do {
let model = try await openAI.retrieveModel(modelId: "text-davinci-001")
} catch {
// Insert your own error handling method here.
}
```
```
OpenAIKit.Model(
id: "text-davinci-001",
object: OpenAIKit.OpenAIObject.model,
created: 1649364042,
ownedBy: "openai"
)
```
### Completion
#### ⚠️ Deprecation Notice for Completion API ⚠️
As of July 6, 2023, OpenAI has announced the deprecation of the older models in the Completions API, which are set to retire at the beginning of 2024. It's highly recommended to transition to the Chat Completions API which provides a more structured prompt interface and multi-turn conversation capabilities. The Chat Completions API has proven to handle a vast majority of previous use cases and new conversational needs with higher flexibility and specificity, significantly enhancing the developer experience. For more details, refer to the [official announcement](https://openai.com/blog/gpt-4-api-general-availability).
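Migrating a legacy Completions prompt usually means wrapping it in a single user message, optionally preceded by a system message carrying any standing instructions. The `ChatMessage` struct below is a stand-in defined only to illustrate the mapping; in real code, use the SDK type with its `ChatRole` enum:

```swift
// Stand-in for OpenAIKit's ChatMessage, for illustration only.
struct ChatMessage: Equatable {
    let role: String
    let content: String
}

// Maps a legacy completion prompt onto the chat message shape.
func migrateCompletionPrompt(_ prompt: String, system: String? = nil) -> [ChatMessage] {
    var messages: [ChatMessage] = []
    if let system {
        messages.append(ChatMessage(role: "system", content: system))
    }
    messages.append(ChatMessage(role: "user", content: prompt))
    return messages
}
```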
### Migration From 2.x to 3.x
OpenAIKit 3.x aligns with the modern OpenAI SDK shape and adopts Swift 6.2 concurrency checks.
Core usage is now centered around namespaced resources (for example `openAI.responses.create(...)`), while some legacy flat methods are still available for compatibility during migration.
#### 1) Move from flat client methods to namespaced resources
```swift
// ❌ 2.x style
let audio = try await openAI.createSpeech(parameters: speechParameters)
// ✅ 3.x style
let audio = try await openAI.audio.speech.create(parameters: speechParameters)
```
#### 2) Prefer typed, resource-based APIs
```swift
// ✅ 3.x responses API
let response = try await openAI.responses.create(
parameters: ResponseCreateParameters(
model: "gpt-5.2",
input: "Summarize this document"
)
)
// ✅ 3.x uploads API
let upload = try await openAI.uploads.create(
parameters: UploadCreateParameters(
bytes: fileSize,
filename: "training.jsonl",
purpose: "fine-tune"
)
)
// ✅ 3.x webhook verification
let event = try openAI.webhooks.unwrap(
payload: rawBody,
headers: requestHeaders,
as: MyWebhookEvent.self
)
```
#### 3) Prefer typed `Encodable` payloads over dictionary bodies
3.x request models use `Encodable` throughout, which improves compile-time safety and keeps payload mapping consistent.
#### 4) Use richer client configuration when needed
```swift
let openAI = OpenAI(
.init(
apiKey: apiKey,
organizationId: organizationID,
projectId: projectID,
webhookSecret: webhookSecret,
requestOptions: .init(timeoutInterval: 600, maxRetries: 2)
)
)
```
#### 5) Swift 6.2 concurrency
OpenAIKit 3.x is built and tested in Swift 6.2 language mode. If your package graph still targets an older Swift language version, update your `Package.swift` toolchain/language settings before migrating.
### Migration From 1.x to 2.x
As of 2.x, chat completion models are specified with an enum rather than a raw string for the default models. Here is an example of how to migrate from the older `ChatParameters` initializer to the 2.x initializer:
```swift
// ❌ 1.x implementation
let chatParameters = ChatParameters(model: "gpt4", messages: chat)
// ✅ 2.x implementation
let chatParameters = ChatParameters(model: .gpt4, messages: chat)
```
If you need a custom string for fine-tuned models, use the `customModel` parameter:
```swift
// ✅ 2.x implementation for custom model IDs
let chatParameters = ChatParameters(customModel: "INSERT-MODEL-ID", messages: chat)
```
## Example Projects
(TBD)
## Development And Testing
We welcome contributions! Please open an issue if you encounter any bugs or glitches while using OpenAIKit, and open a pull request to add any additional features to OpenAIKit.
## License
OpenAIKit is released under the MIT license, and any use of OpenAI's REST endpoints is subject to the [Usage policies](https://beta.openai.com/docs/usage-policies) set by OpenAI.
```
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
```
================================================
FILE: SECURITY.md
================================================
# Security Policy
## Supported Versions
| Version | Supported |
| ------- | ------------------ |
| 2.0.1 | :white_check_mark: |
| 2.0 | :x: |
| 1.2 | :x: |
| 1.1.3 | :x: |
| 1.1.2 | :x: |
| 1.1.1 | :x: |
| 1.1.0 | :x: |
| 1.0.0 | :x: |
## Reporting a Vulnerability
Please create a GitHub issue or a GitHub pull request with a write-up of the security vulnerability.
================================================
FILE: Sources/OpenAIKit/Client/AudioSpeechResource+API.swift
================================================
import Foundation
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension AudioSpeechResource {
func create(parameters: SpeechParameters) async throws -> Data {
let url = try client.getServerUrl(path: "/audio/speech")
return try await OpenAIKitSession.shared.decodeRawUrl(
with: url,
configuration: client.configuration,
body: parameters,
acceptHeader: "application/octet-stream"
)
}
}
================================================
FILE: Sources/OpenAIKit/Client/BatchesResource+API.swift
================================================
import Foundation
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension BatchesResource {
func create(parameters: BatchCreateParameters) async throws -> BatchObject {
let url = try client.getServerUrl(path: "/batches")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func list() async throws -> BatchListResponse {
let url = try client.getServerUrl(path: "/batches")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func retrieve(id: String) async throws -> BatchObject {
let url = try client.getServerUrl(path: "/batches/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func cancel(id: String) async throws -> BatchObject {
let url = try client.getServerUrl(path: "/batches/\(id)/cancel")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .post,
bodyRequired: false
)
}
}
================================================
FILE: Sources/OpenAIKit/Client/BetaResources+API.swift
================================================
import Foundation
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension BetaAssistantsResource {
func create(parameters: AssistantCreateParameters) async throws -> AssistantObject {
let url = try client.getServerUrl(path: "/assistants")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func list() async throws -> AssistantListResponse {
let url = try client.getServerUrl(path: "/assistants")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func retrieve(id: String) async throws -> AssistantObject {
let url = try client.getServerUrl(path: "/assistants/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func update(id: String, parameters: AssistantUpdateParameters) async throws -> AssistantObject {
let url = try client.getServerUrl(path: "/assistants/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func delete(id: String) async throws -> DeleteObject {
let url = try client.getServerUrl(path: "/assistants/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .delete,
bodyRequired: false
)
}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension BetaThreadsResource {
func create(parameters: ThreadCreateParameters = .init()) async throws -> ThreadObject {
let url = try client.getServerUrl(path: "/threads")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func retrieve(id: String) async throws -> ThreadObject {
let url = try client.getServerUrl(path: "/threads/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func update(id: String, parameters: ThreadUpdateParameters) async throws -> ThreadObject {
let url = try client.getServerUrl(path: "/threads/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func delete(id: String) async throws -> DeleteObject {
let url = try client.getServerUrl(path: "/threads/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .delete,
bodyRequired: false
)
}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension BetaThreadRunsResource {
func create(threadID: String, parameters: ThreadRunCreateParameters) async throws -> ThreadRunObject {
let url = try client.getServerUrl(path: "/threads/\(threadID)/runs")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func list(threadID: String) async throws -> ThreadRunListResponse {
let url = try client.getServerUrl(path: "/threads/\(threadID)/runs")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func retrieve(threadID: String, runID: String) async throws -> ThreadRunObject {
let url = try client.getServerUrl(path: "/threads/\(threadID)/runs/\(runID)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func cancel(threadID: String, runID: String) async throws -> ThreadRunObject {
let url = try client.getServerUrl(path: "/threads/\(threadID)/runs/\(runID)/cancel")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .post,
bodyRequired: false
)
}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension BetaThreadRunStepsResource {
func list(threadID: String, runID: String) async throws -> ThreadRunStepListResponse {
let url = try client.getServerUrl(path: "/threads/\(threadID)/runs/\(runID)/steps")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func retrieve(threadID: String, runID: String, stepID: String) async throws -> ThreadRunStepObject {
let url = try client.getServerUrl(path: "/threads/\(threadID)/runs/\(runID)/steps/\(stepID)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
}
================================================
FILE: Sources/OpenAIKit/Client/ContainersResource+API.swift
================================================
import Foundation
private struct EmptyContainerContentBody: Encodable {
init() {}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension ContainersResource {
func create(parameters: ContainerCreateParameters) async throws -> ContainerObject {
let url = try client.getServerUrl(path: "/containers")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func list() async throws -> ContainerListResponse {
let url = try client.getServerUrl(path: "/containers")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func retrieve(id: String) async throws -> ContainerObject {
let url = try client.getServerUrl(path: "/containers/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func delete(id: String) async throws -> DeleteObject {
let url = try client.getServerUrl(path: "/containers/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .delete,
bodyRequired: false
)
}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension ContainerFilesResource {
func create(containerID: String, parameters: ContainerFileCreateParameters) async throws -> ContainerFileObject {
let url = try client.getServerUrl(path: "/containers/\(containerID)/files")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func list(containerID: String) async throws -> ContainerFileListResponse {
let url = try client.getServerUrl(path: "/containers/\(containerID)/files")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func retrieve(containerID: String, fileID: String) async throws -> ContainerFileObject {
let url = try client.getServerUrl(path: "/containers/\(containerID)/files/\(fileID)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func delete(containerID: String, fileID: String) async throws -> DeleteObject {
let url = try client.getServerUrl(path: "/containers/\(containerID)/files/\(fileID)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .delete,
bodyRequired: false
)
}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension ContainerFileContentResource {
func retrieve(containerID: String, fileID: String) async throws -> Data {
let url = try client.getServerUrl(path: "/containers/\(containerID)/files/\(fileID)/content")
return try await OpenAIKitSession.shared.decodeRawUrl(
with: url,
configuration: client.configuration,
body: EmptyContainerContentBody(),
method: .get
)
}
}
================================================
FILE: Sources/OpenAIKit/Client/ConversationsResource+API.swift
================================================
import Foundation
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension ConversationsResource {
func create(parameters: ConversationCreateParameters = .init()) async throws -> ConversationObject {
let url = try client.getServerUrl(path: "/conversations")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func list() async throws -> ConversationListResponse {
let url = try client.getServerUrl(path: "/conversations")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func retrieve(id: String) async throws -> ConversationObject {
let url = try client.getServerUrl(path: "/conversations/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func delete(id: String) async throws -> DeleteObject {
let url = try client.getServerUrl(path: "/conversations/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .delete,
bodyRequired: false
)
}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension ConversationItemsResource {
func list(conversationID: String) async throws -> ConversationItemListResponse {
let url = try client.getServerUrl(path: "/conversations/\(conversationID)/items")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func create(
conversationID: String,
parameters: ConversationItemCreateParameters
) async throws -> ConversationItemObject {
let url = try client.getServerUrl(path: "/conversations/\(conversationID)/items")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func retrieve(conversationID: String, itemID: String) async throws -> ConversationItemObject {
let url = try client.getServerUrl(path: "/conversations/\(conversationID)/items/\(itemID)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func delete(conversationID: String, itemID: String) async throws -> DeleteObject {
let url = try client.getServerUrl(path: "/conversations/\(conversationID)/items/\(itemID)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .delete,
bodyRequired: false
)
}
}
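Because `create(parameters:)` defaults its parameters, the full conversation lifecycle can be exercised without constructing any request body. A sketch assuming a configured `openAI` client, and assuming `ConversationObject` exposes `.id` and `DeleteObject` exposes `.deleted` (neither type is shown in this excerpt):

```swift
import OpenAIKit

// Create, re-fetch, and delete a conversation using only defaulted parameters.
let conversation = try await openAI.conversations.create()
let fetched = try await openAI.conversations.retrieve(id: conversation.id)
let deleted = try await openAI.conversations.delete(id: fetched.id)
print("Deleted:", deleted.deleted)
```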
================================================
FILE: Sources/OpenAIKit/Client/EvalsResource+API.swift
================================================
import Foundation
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension EvalsResource {
func create(parameters: EvalCreateParameters) async throws -> EvalObject {
let url = try client.getServerUrl(path: "/evals")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func list() async throws -> EvalListResponse {
let url = try client.getServerUrl(path: "/evals")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func retrieve(id: String) async throws -> EvalObject {
let url = try client.getServerUrl(path: "/evals/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func delete(id: String) async throws -> DeleteObject {
let url = try client.getServerUrl(path: "/evals/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .delete,
bodyRequired: false
)
}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension EvalRunsResource {
func create(evalID: String, parameters: EvalRunCreateParameters) async throws -> EvalRunObject {
let url = try client.getServerUrl(path: "/evals/\(evalID)/runs")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func list(evalID: String) async throws -> EvalRunListResponse {
let url = try client.getServerUrl(path: "/evals/\(evalID)/runs")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func retrieve(evalID: String, runID: String) async throws -> EvalRunObject {
let url = try client.getServerUrl(path: "/evals/\(evalID)/runs/\(runID)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func cancel(evalID: String, runID: String) async throws -> EvalRunObject {
let url = try client.getServerUrl(path: "/evals/\(evalID)/runs/\(runID)/cancel")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .post,
bodyRequired: false
)
}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension EvalRunOutputItemsResource {
func list(evalID: String, runID: String) async throws -> EvalRunOutputItemListResponse {
let url = try client.getServerUrl(path: "/evals/\(evalID)/runs/\(runID)/output_items")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
}
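The evals resources nest the same way the REST paths do: runs hang off an eval, and output items hang off a run. A sketch for inspecting an existing eval, assuming a configured `openAI` client and assuming the list responses expose their elements as `.data` (a naming convention not confirmed by this excerpt):

```swift
import OpenAIKit

// Walk from an eval to its runs, then to the first run's output items.
let runs = try await openAI.evals.runs.list(evalID: "eval_abc123")
if let run = runs.data.first {
    let items = try await openAI.evals.runs.outputItems.list(
        evalID: "eval_abc123",
        runID: run.id
    )
    print("Run \(run.id) produced \(items.data.count) output items")
}
```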
================================================
FILE: Sources/OpenAIKit/Client/FineTuningResource+API.swift
================================================
import Foundation
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension FineTuningJobsResource {
func create(parameters: FineTuningJobCreateParameters) async throws -> FineTuningJobObject {
let url = try client.getServerUrl(path: "/fine_tuning/jobs")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func list() async throws -> FineTuningJobListResponse {
let url = try client.getServerUrl(path: "/fine_tuning/jobs")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func retrieve(id: String) async throws -> FineTuningJobObject {
let url = try client.getServerUrl(path: "/fine_tuning/jobs/\(id)")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func cancel(id: String) async throws -> FineTuningJobObject {
let url = try client.getServerUrl(path: "/fine_tuning/jobs/\(id)/cancel")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .post,
bodyRequired: false
)
}
func listEvents(jobID: String) async throws -> FineTuningJobEventListResponse {
let url = try client.getServerUrl(path: "/fine_tuning/jobs/\(jobID)/events")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
func listCheckpoints(jobID: String) async throws -> FineTuningCheckpointListResponse {
let url = try client.getServerUrl(path: "/fine_tuning/jobs/\(jobID)/checkpoints")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .get,
bodyRequired: false
)
}
}
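A common pattern with these endpoints is to retrieve a job and then pull its event log to see training progress. A sketch assuming a configured `openAI` client, and assuming `FineTuningJobObject` exposes `.id` and the event list response exposes `.data` (not shown in this excerpt):

```swift
import OpenAIKit

// Check on a fine-tuning job and print its recent events.
let job = try await openAI.fineTuning.jobs.retrieve(id: "ftjob-abc123")
let events = try await openAI.fineTuning.jobs.listEvents(jobID: job.id)
for event in events.data {
    print(event)
}
```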
================================================
FILE: Sources/OpenAIKit/Client/RealtimeResource+API.swift
================================================
import Foundation
public struct RealtimeConnectionOptions: Sendable {
public var model: String?
public var callID: String?
public init(model: String? = nil, callID: String? = nil) {
self.model = model
self.callID = callID
}
}
public struct RealtimeConnectionEvent: Sendable {
public let type: String
public let payload: String?
public init(type: String, payload: String? = nil) {
self.type = type
self.payload = payload
}
}
/// A lightweight, in-process stand-in for a realtime connection.
/// In this implementation `receive()` returns a synthetic `"connected"`
/// event rather than reading from a live transport, and `send(_:)` is a
/// no-op; a real transport can be swapped in behind the same interface.
public actor RealtimeConnection {
    private var isClosed = false
    private let options: RealtimeConnectionOptions
    init(options: RealtimeConnectionOptions) {
        self.options = options
    }
    /// Returns a synthetic event while the connection is open; `nil` once closed.
    public func receive() -> RealtimeConnectionEvent? {
        guard !isClosed else { return nil }
        return RealtimeConnectionEvent(type: "connected", payload: options.model)
    }
    /// Placeholder: the event is accepted but not transmitted anywhere.
    public func send(_ event: RealtimeConnectionEvent) {
        _ = event
    }
    public func close() {
        isClosed = true
    }
}
/// Vends ``RealtimeConnection`` instances for a fixed set of options.
public struct RealtimeConnectionManager {
    private let options: RealtimeConnectionOptions
    init(options: RealtimeConnectionOptions) {
        self.options = options
    }
    public func enter() async -> RealtimeConnection {
        RealtimeConnection(options: options)
    }
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension RealtimeResource {
func connect(options: RealtimeConnectionOptions = .init()) -> RealtimeConnectionManager {
RealtimeConnectionManager(options: options)
}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension RealtimeCallsResource {
func create(parameters: RealtimeCallCreateParameters = .init()) async throws -> RealtimeCallObject {
let url = try client.getServerUrl(path: "/realtime/calls")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func accept(callID: String) async throws -> RealtimeCallObject {
let url = try client.getServerUrl(path: "/realtime/calls/\(callID)/accept")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .post,
bodyRequired: false
)
}
func hangup(callID: String) async throws -> RealtimeCallObject {
let url = try client.getServerUrl(path: "/realtime/calls/\(callID)/hangup")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .post,
bodyRequired: false
)
}
func refer(callID: String, parameters: RealtimeCallReferParameters) async throws -> RealtimeCallObject {
let url = try client.getServerUrl(path: "/realtime/calls/\(callID)/refer")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func reject(callID: String) async throws -> RealtimeCallObject {
let url = try client.getServerUrl(path: "/realtime/calls/\(callID)/reject")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
method: .post,
bodyRequired: false
)
}
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension RealtimeClientSecretsResource {
func create(parameters: RealtimeClientSecretCreateParameters = .init()) async throws -> RealtimeClientSecretResponse {
let url = try client.getServerUrl(path: "/realtime/client_secrets")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
}
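The `connect(options:)` entry point and the connection types above fit together as follows. This sketch only assumes that the `OpenAI` client exposes the realtime namespace as `openAI.realtime` (the client definition is outside this excerpt); every call shown is defined above, and note that `receive()` currently yields a synthetic event rather than live transport data:

```swift
import OpenAIKit

// Open a local realtime connection handle and close it when done.
let manager = openAI.realtime.connect(options: .init(model: "gpt-realtime"))
let connection = await manager.enter()
if let event = await connection.receive() {
    print("First event:", event.type)   // a synthetic "connected" event in this placeholder
}
await connection.close()
```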
================================================
FILE: Sources/OpenAIKit/Client/ResourceNamespaces.swift
================================================
import Foundation
/// Base class for every API resource namespace. Each resource keeps an
/// `unowned` reference to the owning `OpenAI` client, so a resource must not
/// outlive the client that created it.
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public class OpenAIResource {
    internal unowned let client: OpenAI
    internal init(_ client: OpenAI) {
        self.client = client
    }
}
// MARK: Top-level resources
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class CompletionsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ChatResource: OpenAIResource {
public lazy var completions = ChatCompletionsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class EmbeddingsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class FilesResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ImagesResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class AudioResource: OpenAIResource {
public lazy var speech = AudioSpeechResource(client)
public lazy var transcriptions = AudioTranscriptionsResource(client)
public lazy var translations = AudioTranslationsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ModerationsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ModelsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class FineTuningResource: OpenAIResource {
public lazy var jobs = FineTuningJobsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class VectorStoresResource: OpenAIResource {
public lazy var files = VectorStoreFilesResource(client)
public lazy var fileBatches = VectorStoreFileBatchesResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class WebhooksResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BatchesResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class UploadsResource: OpenAIResource {
public lazy var parts = UploadPartsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ResponsesResource: OpenAIResource {
public lazy var inputItems = ResponseInputItemsResource(client)
public lazy var inputTokens = ResponseInputTokensResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class RealtimeResource: OpenAIResource {
public lazy var calls = RealtimeCallsResource(client)
public lazy var clientSecrets = RealtimeClientSecretsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ConversationsResource: OpenAIResource {
public lazy var items = ConversationItemsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class EvalsResource: OpenAIResource {
public lazy var runs = EvalRunsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ContainersResource: OpenAIResource {
public lazy var files = ContainerFilesResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class SkillsResource: OpenAIResource {
public lazy var versions = SkillVersionsResource(client)
public lazy var content = SkillContentResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class VideosResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaResource: OpenAIResource {
public lazy var assistants = BetaAssistantsResource(client)
public lazy var threads = BetaThreadsResource(client)
public lazy var chatkit = BetaChatKitResource(client)
public lazy var realtime = BetaRealtimeResource(client)
}
// MARK: Nested resources
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ChatCompletionsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class AudioSpeechResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class AudioTranscriptionsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class AudioTranslationsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class VectorStoreFilesResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class VectorStoreFileBatchesResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class UploadPartsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ResponseInputItemsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ResponseInputTokensResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class RealtimeCallsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class RealtimeClientSecretsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ConversationItemsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class EvalRunsResource: OpenAIResource {
public lazy var outputItems = EvalRunOutputItemsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class EvalRunOutputItemsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ContainerFilesResource: OpenAIResource {
public lazy var content = ContainerFileContentResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class ContainerFileContentResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class SkillVersionsResource: OpenAIResource {
public lazy var content = SkillVersionContentResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class SkillVersionContentResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class SkillContentResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaAssistantsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaThreadsResource: OpenAIResource {
public lazy var messages = BetaThreadMessagesResource(client)
public lazy var runs = BetaThreadRunsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaThreadMessagesResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaThreadRunsResource: OpenAIResource {
public lazy var steps = BetaThreadRunStepsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaThreadRunStepsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaChatKitResource: OpenAIResource {
public lazy var sessions = BetaChatKitSessionsResource(client)
public lazy var threads = BetaChatKitThreadsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaChatKitSessionsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaChatKitThreadsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaRealtimeResource: OpenAIResource {
public lazy var sessions = BetaRealtimeSessionsResource(client)
public lazy var transcriptionSessions = BetaRealtimeTranscriptionSessionsResource(client)
}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaRealtimeSessionsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class BetaRealtimeTranscriptionSessionsResource: OpenAIResource {}
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public final class FineTuningJobsResource: OpenAIResource {}
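The classes above form a tree that mirrors the REST paths: each `lazy var` is created on first access and every leaf shares the same unowned client. A sketch of how the tree reads at the call site, assuming the `OpenAI` client exposes these namespaces as stored properties (the client definition is outside this excerpt):

```swift
import OpenAIKit

// Each property chain below mirrors a REST path segment for segment.
_ = openAI.chat.completions           // /chat/completions
_ = openAI.fineTuning.jobs            // /fine_tuning/jobs
_ = openAI.beta.threads.runs.steps    // /threads/{id}/runs/{id}/steps
_ = openAI.containers.files.content   // /containers/{id}/files/{id}/content
```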
================================================
FILE: Sources/OpenAIKit/Client/ResponsesResource+API.swift
================================================
import Foundation
@available(macOS 10.15, iOS 13, watchOS 6, tvOS 13, *)
public extension ResponsesResource {
func create(parameters: ResponseCreateParameters) async throws -> ResponseObject {
let url = try client.getServerUrl(path: "/responses")
return try await OpenAIKitSession.shared.decodeUrl(
with: url,
configuration: client.configuration,
body: parameters
)
}
func stream(parameters: ResponseCreateParameters) throws -> AsyncThrowingStream