Full Code of jamesrochabrun/SwiftOpenAI for AI

Repository: jamesrochabrun/SwiftOpenAI
Branch: main
Commit: bc6b84767c3a
Files: 219
Total size: 1.3 MB

Directory structure:
gitextract_oinuzyhk/

├── .github/
│   └── workflows/
│       └── ci.yml
├── .gitignore
├── CONTRIBUTING.md
├── Examples/
│   ├── RealtimeExample/
│   │   ├── README.md
│   │   └── RealtimeExample.swift
│   └── SwiftOpenAIExample/
│       ├── SwiftOpenAIExample/
│       │   ├── AIProxyIntroView.swift
│       │   ├── ApiKeyIntroView.swift
│       │   ├── Assets.xcassets/
│       │   │   ├── AccentColor.colorset/
│       │   │   │   └── Contents.json
│       │   │   ├── AppIcon.appiconset/
│       │   │   │   └── Contents.json
│       │   │   └── Contents.json
│       │   ├── Assistants/
│       │   │   ├── AssistantConfigurationDemoView.swift
│       │   │   ├── AssistantConfigurationProvider.swift
│       │   │   ├── AssistantStreamDemoScreen.swift
│       │   │   ├── AssistantThreadConfigurationProvider.swift
│       │   │   └── AssistantsListDemoView.swift
│       │   ├── AudioDemo/
│       │   │   ├── AudioDemoView.swift
│       │   │   └── AudioProvider.swift
│       │   ├── ChatDemo/
│       │   │   ├── ChatDemoView.swift
│       │   │   └── ChatProvider.swift
│       │   ├── ChatFunctionsCall/
│       │   │   ├── ChatMessageDisplayModel.swift
│       │   │   ├── ChatMessageView.swift
│       │   │   ├── Completion/
│       │   │   │   ├── ChatFunctionCallDemoView.swift
│       │   │   │   └── ChatFunctionCallProvider.swift
│       │   │   └── Stream/
│       │   │       ├── ChatFunctionsCallStreamProvider.swift
│       │   │       └── ChatFunctionsCalllStreamDemoView.swift
│       │   ├── ChatStreamFluidConversationDemo/
│       │   │   ├── ChatFluidConversationProvider.swift
│       │   │   └── ChatStreamFluidConversationDemoView.swift
│       │   ├── ChatStructureOutputTool/
│       │   │   ├── ChatStructureOutputToolDemoView.swift
│       │   │   └── ChatStructuredOutputToolProvider.swift
│       │   ├── ChatStructuredOutputs/
│       │   │   ├── ChatStructuredOutputDemoView.swift
│       │   │   └── ChatStructuredOutputProvider.swift
│       │   ├── EmbeddingsDemo/
│       │   │   ├── EmbeddingsDemoView.swift
│       │   │   └── Embeddingsprovider.swift
│       │   ├── Files/
│       │   │   ├── AttachmentView.swift
│       │   │   ├── FileAttachmentView.swift
│       │   │   ├── FilesPicker.swift
│       │   │   └── FilesPickerProvider.swift
│       │   ├── FilesDemo/
│       │   │   ├── FilesDemoView.swift
│       │   │   └── FilesProvider.swift
│       │   ├── FineTuningDemo/
│       │   │   ├── FineTuningJobDemoView.swift
│       │   │   └── FineTuningJobProvider.swift
│       │   ├── ImagesDemo/
│       │   │   ├── ImagesDemoView.swift
│       │   │   └── ImagesProvider.swift
│       │   ├── LocalChatDemo/
│       │   │   └── LocalChatDemoView.swift
│       │   ├── LocalHostEntryView.swift
│       │   ├── ModelsDemo/
│       │   │   ├── ModelsDemoView.swift
│       │   │   └── ModelsProvider.swift
│       │   ├── ModerationsDemo/
│       │   │   ├── ModerationDemoView.swift
│       │   │   └── ModerationProvider.swift
│       │   ├── OptionsListView.swift
│       │   ├── PredictedOutputsDemo/
│       │   │   └── ChatPredictedOutputDemoView.swift
│       │   ├── Preview Content/
│       │   │   └── Preview Assets.xcassets/
│       │   │       └── Contents.json
│       │   ├── Resources/
│       │   │   ├── WorldCupData.jsonl
│       │   │   ├── german.m4a
│       │   │   └── narcos.m4a
│       │   ├── ResponseAPIDemo/
│       │   │   ├── ResponseStreamDemoView.swift
│       │   │   └── ResponseStreamProvider.swift
│       │   ├── ServiceSelectionView.swift
│       │   ├── SharedModels/
│       │   │   └── ChatDisplayMessage.swift
│       │   ├── SharedUI/
│       │   │   ├── ChatDisplayMessageView.swift
│       │   │   ├── ChatMessageLoadingView.swift
│       │   │   ├── LoadingView.swift
│       │   │   ├── Theme/
│       │   │   │   ├── Sizes.swift
│       │   │   │   └── ThemeColor.swift
│       │   │   └── URLImageView.swift
│       │   ├── SwiftOpenAIExample.entitlements
│       │   ├── SwiftOpenAIExampleApp.swift
│       │   ├── Utilities/
│       │   │   └── ContentLoader.swift
│       │   └── Vision/
│       │       ├── ChatVisionDemoView.swift
│       │       └── ChatVisionProvider.swift
│       ├── SwiftOpenAIExample.xcodeproj/
│       │   ├── project.pbxproj
│       │   ├── project.xcworkspace/
│       │   │   └── contents.xcworkspacedata
│       │   └── xcshareddata/
│       │       └── xcschemes/
│       │           └── SwiftOpenAIExample.xcscheme
│       ├── SwiftOpenAIExampleTests/
│       │   └── SwiftOpenAIExampleTests.swift
│       └── SwiftOpenAIExampleUITests/
│           ├── SwiftOpenAIExampleUITests.swift
│           └── SwiftOpenAIExampleUITestsLaunchTests.swift
├── LICENSE
├── Package.swift
├── README.md
├── Sources/
│   └── OpenAI/
│       ├── AIProxy/
│       │   ├── AIProxyCertificatePinning.swift
│       │   ├── AIProxyService.swift
│       │   └── Endpoint+AIProxy.swift
│       ├── Azure/
│       │   ├── AzureOpenAIAPI.swift
│       │   ├── AzureOpenAIConfiguration.swift
│       │   └── DefaultOpenAIAzureService.swift
│       ├── LocalModelService/
│       │   ├── LocalModelAPI.swift
│       │   └── LocalModelService.swift
│       ├── Private/
│       │   ├── Audio/
│       │   │   ├── AudioPCMPlayer.swift
│       │   │   ├── AudioPCMPlayerError.swift
│       │   │   ├── AudioUtils.swift
│       │   │   ├── MicrophonePCMSampleVendor.swift
│       │   │   ├── MicrophonePCMSampleVendorAE.swift
│       │   │   ├── MicrophonePCMSampleVendorAT.swift
│       │   │   ├── MicrophonePCMSampleVendorCommon.swift
│       │   │   └── MicrophonePCMSampleVendorError.swift
│       │   ├── Networking/
│       │   │   ├── AsyncHTTPClientAdapter.swift
│       │   │   ├── Endpoint.swift
│       │   │   ├── HTTPClient.swift
│       │   │   ├── MultipartFormDataBuilder.swift
│       │   │   ├── OpenAIAPI.swift
│       │   │   └── URLSessionHTTPClientAdapter.swift
│       │   └── Realtime/
│       │       ├── OpenAIRealtimeSession.swift
│       │       └── RealtimeActor.swift
│       └── Public/
│           ├── Parameters/
│           │   ├── Assistant/
│           │   │   └── AssistantParameters.swift
│           │   ├── Audio/
│           │   │   ├── AudioSpeechParameters.swift
│           │   │   ├── AudioTranscriptionParameters.swift
│           │   │   └── AudioTranslationParameters.swift
│           │   ├── Batch/
│           │   │   └── BatchParameter.swift
│           │   ├── Chat/
│           │   │   └── ChatCompletionParameters.swift
│           │   ├── Conversations/
│           │   │   ├── CreateConversationItemsParameter.swift
│           │   │   ├── CreateConversationParameter.swift
│           │   │   ├── GetConversationItemParameter.swift
│           │   │   ├── GetConversationItemsParameter.swift
│           │   │   └── UpdateConversationParameter.swift
│           │   ├── Embedding/
│           │   │   └── EmbeddingParameter.swift
│           │   ├── File/
│           │   │   └── FileParameter.swift
│           │   ├── FineTuning/
│           │   │   └── FineTuningJobParameters.swift
│           │   ├── Image/
│           │   │   ├── Dalle.swift
│           │   │   ├── ImageCreateParameters.swift
│           │   │   ├── ImageEditParameters.swift
│           │   │   └── ImageVariationParameters.swift
│           │   ├── ImageGen/
│           │   │   ├── CreateImageEditParameters.swift
│           │   │   ├── CreateImageParameters.swift
│           │   │   └── CreateImageVariationParameters.swift
│           │   ├── Message/
│           │   │   ├── MessageParameter.swift
│           │   │   └── ModifyMessageParameters.swift
│           │   ├── Model.swift
│           │   ├── Moderation/
│           │   │   └── ModerationParameter.swift
│           │   ├── MultipartFormDataParameters.swift
│           │   ├── Realtime/
│           │   │   ├── OpenAIRealtimeConversationItemCreate.swift
│           │   │   ├── OpenAIRealtimeInputAudioBufferAppend.swift
│           │   │   ├── OpenAIRealtimeResponseCreate.swift
│           │   │   ├── OpenAIRealtimeSessionConfiguration.swift
│           │   │   └── OpenAIRealtimeSessionUpdate.swift
│           │   ├── Response/
│           │   │   ├── Conversation.swift
│           │   │   ├── GetInputItemsParameter.swift
│           │   │   ├── GetResponseParameter.swift
│           │   │   ├── InputType.swift
│           │   │   ├── ModelResponseParameter.swift
│           │   │   ├── Prompt.swift
│           │   │   ├── ResponseInclude.swift
│           │   │   └── StreamOptions.swift
│           │   ├── Runs/
│           │   │   ├── CreateThreadAndRunParameter.swift
│           │   │   ├── ModifyRunParameters.swift
│           │   │   ├── RunParameter.swift
│           │   │   └── RunToolsOutputParameter.swift
│           │   ├── Threads/
│           │   │   ├── CreateThreadParameters.swift
│           │   │   └── ModifyThreadParameters.swift
│           │   ├── VectorStore/
│           │   │   └── VectorStoreParameter.swift
│           │   ├── VectorStoreFileBatch/
│           │   │   └── VectorStoreFileBatchParameter.swift
│           │   └── VectorStoreFiles/
│           │       └── VectorStoreFileParameter.swift
│           ├── ResponseModels/
│           │   ├── Assistants/
│           │   │   ├── AssistantObject.swift
│           │   │   ├── AssistantStreamEvent.swift
│           │   │   └── AssistantStreamEventObject.swift
│           │   ├── Audio/
│           │   │   ├── AudioObject.swift
│           │   │   └── AudioSpeechObject.swift
│           │   ├── Batch/
│           │   │   └── BatchObject.swift
│           │   ├── Chat/
│           │   │   ├── ChatCompletionChunkObject.swift
│           │   │   ├── ChatCompletionObject.swift
│           │   │   └── ChatUsage.swift
│           │   ├── Conversations/
│           │   │   └── ConversationModel.swift
│           │   ├── Delta.swift
│           │   ├── Embedding/
│           │   │   └── EmbeddingObject.swift
│           │   ├── File/
│           │   │   └── FileObject.swift
│           │   ├── FineTuning/
│           │   │   ├── FineTuningJobEventObject.swift
│           │   │   └── FineTuningJobObject.swift
│           │   ├── Image/
│           │   │   └── ImageObject.swift
│           │   ├── ImageGen/
│           │   │   └── CreateImageResponse.swift
│           │   ├── Messages/
│           │   │   ├── MessageContent.swift
│           │   │   ├── MessageDeltaObject.swift
│           │   │   └── MessageObject.swift
│           │   ├── Model/
│           │   │   └── ModelObject.swift
│           │   ├── Moderation/
│           │   │   └── ModerationObject.swift
│           │   ├── OpenAIErrorResponse.swift
│           │   ├── OpenAIResponse.swift
│           │   ├── Realtime/
│           │   │   ├── OpenAIRealtimeInputAudioBufferSpeechStarted.swift
│           │   │   ├── OpenAIRealtimeMessage.swift
│           │   │   └── OpenAIRealtimeResponseFunctionCallArgumentsDone.swift
│           │   ├── Response/
│           │   │   ├── OutputItem.swift
│           │   │   ├── ResponseModel.swift
│           │   │   └── ResponseStreamEvent.swift
│           │   ├── Runs/
│           │   │   ├── RunObject.swift
│           │   │   ├── RunStepDeltaObject.swift
│           │   │   ├── RunStepDetails.swift
│           │   │   └── RunStepObject.swift
│           │   ├── Threads/
│           │   │   └── ThreadObject.swift
│           │   ├── VectorStore/
│           │   │   └── VectorStoreObject.swift
│           │   ├── VectorStoreFile/
│           │   │   └── VectorStoreFileObject.swift
│           │   └── VectorStoreFileBatch/
│           │       └── VectorStoreFileBatchObject.swift
│           ├── Service/
│           │   ├── DefaultOpenAIService.swift
│           │   ├── OpenAIService.swift
│           │   └── OpenAIServiceFactory.swift
│           └── Shared/
│               ├── AudioController.swift
│               ├── DeletionStatus.swift
│               ├── ExpirationPolicy.swift
│               ├── FileCount.swift
│               ├── IncompleteDetails.swift
│               ├── JSONSchema.swift
│               ├── LastError.swift
│               ├── MessageAttachment.swift
│               ├── OpenAIError.swift
│               ├── OpenAIJSONValue.swift
│               ├── Reasoning.swift
│               ├── ResponseFormat.swift
│               ├── TextConfiguration.swift
│               ├── Tool.swift
│               ├── ToolCall.swift
│               ├── ToolChoice.swift
│               ├── ToolChoiceMode.swift
│               ├── ToolResources.swift
│               ├── TruncationStrategy.swift
│               └── Usage.swift
├── Tests/
│   └── OpenAITests/
│       ├── InputTypeTests.swift
│       ├── ModelResponseParameterTests.swift
│       ├── OpenAITests.swift
│       ├── OutputItemTests.swift
│       ├── ResponseModelValidationTests.swift
│       └── ResponseStreamEventTests.swift
└── rules.swiftformat

================================================
FILE CONTENTS
================================================

================================================
FILE: .github/workflows/ci.yml
================================================
# This workflow will build a Swift project
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-swift

name: CI tests

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

jobs:
  build_and_test_macos:
    runs-on: macos-latest
    steps:
    - uses: actions/checkout@v4
    - name: Get swift version
      run: swift --version
    - name: Build
      run: swift build -q
    - name: Run tests
      run: swift test -q

  build_and_test_linux:
    runs-on: ubuntu-latest
    container:
      image: swift:6.0.1-jammy
    steps:
    - name: Install dependencies
      run: |
        apt-get update
        apt-get install -y curl git
    - uses: actions/checkout@v4
    - name: Get swift version
      run: swift --version
    - name: Build
      run: swift build -q
    - name: Run tests
      run: swift test -q

  lint:
    runs-on: macos-latest
    steps:
    - uses: actions/checkout@v4
    - name: Set up Homebrew
      id: set-up-homebrew
      uses: Homebrew/actions/setup-homebrew@master
    - name: Install swiftformat
      run: brew install swiftformat
    - name: Check formatting
      run: swiftformat --config rules.swiftformat --lint .

================================================
FILE: .gitignore
================================================
.DS_Store
/.build
/Packages
xcuserdata/
DerivedData/
.swiftpm/configuration/registries.json
.swiftpm/xcode/package.xcworkspace/contents.xcworkspacedata
.netrc
Package.resolved

# Xcode Swift Package Manager
**/xcshareddata/swiftpm/
**/project.xcworkspace/xcshareddata/swiftpm/
**/xcshareddata/IDEWorkspaceChecks.plist


================================================
FILE: CONTRIBUTING.md
================================================

### Linting:
```bash
brew install swiftformat
swiftformat --config rules.swiftformat .
```

================================================
FILE: Examples/RealtimeExample/README.md
================================================
# OpenAI Realtime API Example

This example demonstrates how to use SwiftOpenAI's Realtime API for bidirectional voice conversations with OpenAI's GPT-4o models.

## Features

- Real-time bidirectional audio streaming
- Voice Activity Detection (VAD) for automatic turn-taking
- Audio transcription of both user and AI speech
- Function calling support
- Interrupt handling when user starts speaking

## Requirements

- iOS 15+, macOS 12+, watchOS 9+
- Microphone permissions
- OpenAI API key

## Setup

### 1. Add Microphone Permission

Add the following to your `Info.plist`:

```xml
<key>NSMicrophoneUsageDescription</key>
<string>We need access to your microphone for voice conversations with AI</string>
```

### 2. macOS Sandbox Configuration

If targeting macOS, enable the following in your target's Signing & Capabilities:

- **App Sandbox**:
  - Outgoing Connections (Client) ✓
  - Audio Input ✓
- **Hardened Runtime**:
  - Audio Input ✓
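
If you prefer to edit your target's `.entitlements` file directly rather than using the capability checkboxes, the settings above correspond to these standard Apple sandbox entitlement keys (a sketch; merge them into your target's existing entitlements):

```xml
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.network.client</key>
<true/>
<key>com.apple.security.device.audio-input</key>
<true/>
```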

## Usage

### Basic Example

```swift
import SwiftUI
import OpenAI

struct ContentView: View {
    let realtimeManager = RealtimeManager()
    @State private var isActive = false

    var body: some View {
        Button(isActive ? "Stop" : "Start") {
            isActive.toggle()
            if isActive {
                Task {
                    try? await realtimeManager.startConversation()
                }
            } else {
                Task {
                    await realtimeManager.stopConversation()
                }
            }
        }
    }
}

@RealtimeActor
final class RealtimeManager {
    // Allows creating the manager from outside the RealtimeActor
    // (e.g. from a SwiftUI view), matching RealtimeExample.swift.
    nonisolated init() { }

    private var session: OpenAIRealtimeSession?
    private var audioController: AudioController?

    func startConversation() async throws {
        // Initialize service
        let service = OpenAIServiceFactory.service(apiKey: "your-api-key")

        // Configure session
        let config = OpenAIRealtimeSessionConfiguration(
            inputAudioFormat: .pcm16,
            inputAudioTranscription: .init(model: "whisper-1"),
            instructions: "You are a helpful assistant",
            modalities: [.audio, .text],
            outputAudioFormat: .pcm16,
            voice: "shimmer"
        )

        // Create session
        session = try await service.realtimeSession(
            model: "gpt-4o-mini-realtime-preview-2024-12-17",
            configuration: config
        )

        // Setup audio
        audioController = try await AudioController(modes: [.playback, .record])

        // Handle microphone input
        Task {
            let micStream = try audioController!.micStream()
            for await buffer in micStream {
                if let base64Audio = AudioUtils.base64EncodeAudioPCMBuffer(from: buffer) {
                    await session?.sendMessage(
                        OpenAIRealtimeInputAudioBufferAppend(audio: base64Audio)
                    )
                }
            }
        }

        // Handle AI responses
        Task {
            for await message in session!.receiver {
                switch message {
                case .responseAudioDelta(let audio):
                    audioController?.playPCM16Audio(base64String: audio)
                case .inputAudioBufferSpeechStarted:
                    audioController?.interruptPlayback()
                default:
                    break
                }
            }
        }
    }

    func stopConversation() {
        audioController?.stop()
        session?.disconnect()
    }
}
```

## Configuration Options

### Voice Options

- `alloy` - Neutral and balanced
- `echo` - Friendly and warm
- `shimmer` - Gentle and calming

### Turn Detection

#### Server VAD (Voice Activity Detection)

```swift
turnDetection: .init(type: .serverVAD(
    prefixPaddingMs: 300,  // Audio to include before speech
    silenceDurationMs: 500, // Silence duration to detect end
    threshold: 0.5         // Activation threshold (0.0-1.0)
))
```

#### Semantic VAD

```swift
turnDetection: .init(type: .semanticVAD(
    eagerness: .medium  // .low, .medium, or .high
))
```

### Modalities

```swift
modalities: [.audio, .text]  // Both audio and text
modalities: [.text]          // Text only (disables audio)
```

## Handling Different Events

```swift
for await message in session.receiver {
    switch message {
    case .error(let error):
        print("Error: \(error ?? "Unknown")")

    case .sessionCreated:
        print("Session started")

    case .sessionUpdated:
        // Trigger first response if AI speaks first
        await session.sendMessage(OpenAIRealtimeResponseCreate())

    case .responseAudioDelta(let base64Audio):
        audioController.playPCM16Audio(base64String: base64Audio)

    case .inputAudioBufferSpeechStarted:
        // User started speaking, interrupt AI
        audioController.interruptPlayback()

    case .responseTranscriptDone(let transcript):
        print("AI said: \(transcript)")

    case .inputAudioTranscriptionCompleted(let transcript):
        print("User said: \(transcript)")

    case .responseFunctionCallArgumentsDone(let name, let args, let callId):
        print("Function \(name) called with: \(args)")
        // Handle function call and return result

    default:
        break
    }
}
```

## Function Calling

Add tools to your configuration:

```swift
let config = OpenAIRealtimeSessionConfiguration(
    tools: [
        .init(
            name: "get_weather",
            description: "Get the current weather for a location",
            parameters: [
                "type": "object",
                "properties": [
                    "location": [
                        "type": "string",
                        "description": "City name"
                    ]
                ],
                "required": ["location"]
            ]
        )
    ],
    toolChoice: .auto
)
```

Handle function calls in the message loop:

```swift
case .responseFunctionCallArgumentsDone(let name, let args, let callId):
    // Parse arguments and execute function
    let result = handleFunction(name: name, args: args)

    // Return result to OpenAI
    await session.sendMessage(
        OpenAIRealtimeConversationItemCreate(
            item: .init(role: "function", text: result)
        )
    )
```
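
The `handleFunction` helper above is left to you. A minimal sketch, assuming `args` arrives as a JSON-encoded string (as the Realtime API delivers tool-call arguments) and using the hypothetical `get_weather` tool from the configuration above:

```swift
import Foundation

// Hypothetical dispatcher for tool calls. Decode the JSON argument
// string, then branch on the tool name.
func handleFunction(name: String, args: String) -> String {
    guard
        let data = args.data(using: .utf8),
        let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any]
    else {
        return "Could not parse arguments"
    }
    switch name {
    case "get_weather":
        let location = json["location"] as? String ?? "unknown"
        // Replace with a real weather lookup.
        return "It is sunny in \(location)."
    default:
        return "Unknown function: \(name)"
    }
}
```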

## Troubleshooting

### No Audio Output

- Check that `.playback` mode is included in AudioController initialization
- Verify audio permissions are granted
- Ensure `outputAudioFormat` is set to `.pcm16`

### No Microphone Input

- Check that `.record` mode is included in AudioController initialization
- Verify microphone permissions in Info.plist
- Check System Settings > Privacy & Security > Microphone

### WebSocket Connection Fails

- Verify API key is correct
- Check that `openai-beta: realtime=v1` header is included (SwiftOpenAI handles this automatically)
- Ensure you're using a compatible model (gpt-4o-mini-realtime-preview or newer)

## Resources

- [OpenAI Realtime API Documentation](https://platform.openai.com/docs/api-reference/realtime)
- [SwiftOpenAI GitHub](https://github.com/jamesrochabrun/SwiftOpenAI)


================================================
FILE: Examples/RealtimeExample/RealtimeExample.swift
================================================
//
//  RealtimeExample.swift
//  SwiftOpenAI
//
//  Example implementation of OpenAI Realtime API for bidirectional voice conversation
//

import AVFoundation
import OpenAI
import SwiftUI

// MARK: - RealtimeExampleView

struct RealtimeExampleView: View {
  let realtimeManager = RealtimeManager()

  var body: some View {
    VStack {
      Button(isRealtimeActive ? "Stop OpenAI Realtime" : "Start OpenAI Realtime") {
        isRealtimeActive.toggle()
      }
    }
  }

  @State private var isRealtimeActive = false {
    willSet {
      if newValue {
        startRealtime()
      } else {
        stopRealtime()
      }
    }
  }

  private func startRealtime() {
    Task {
      do {
        try await realtimeManager.startConversation()
      } catch {
        print("Could not start OpenAI realtime: \(error.localizedDescription)")
      }
    }
  }

  private func stopRealtime() {
    Task {
      await realtimeManager.stopConversation()
    }
  }

}

// MARK: - RealtimeManager

@RealtimeActor
final class RealtimeManager {
  nonisolated init() { }

  func startConversation() async throws {
    // Initialize the OpenAI service with your API key
    let service = OpenAIServiceFactory.service(apiKey: "your-api-key-here")

    // Set to false if you want your user to speak first
    let aiSpeaksFirst = true

    let audioController = try await AudioController(modes: [.playback, .record])
    let micStream = try audioController.micStream()

    // Configure the realtime session
    let configuration = OpenAIRealtimeSessionConfiguration(
      inputAudioFormat: .pcm16,
      inputAudioTranscription: .init(model: "whisper-1"),
      instructions: "You are a helpful, witty, and friendly AI assistant. " +
        "Your voice and personality should be warm and engaging, " +
        "with a lively and playful tone. Talk quickly.",
      maxResponseOutputTokens: .int(4096),
      modalities: [.audio, .text],
      outputAudioFormat: .pcm16,
      temperature: 0.7,
      turnDetection: .init(
        type: .semanticVAD(eagerness: .medium)),
      voice: "shimmer")

    // Create the realtime session
    let realtimeSession = try await service.realtimeSession(
      model: "gpt-4o-mini-realtime-preview-2024-12-17",
      configuration: configuration)

    // Send audio from the microphone to OpenAI once OpenAI is ready for it
    var isOpenAIReadyForAudio = false
    Task {
      for await buffer in micStream {
        if
          isOpenAIReadyForAudio,
          let base64Audio = AudioUtils.base64EncodeAudioPCMBuffer(from: buffer)
        {
          await realtimeSession.sendMessage(
            OpenAIRealtimeInputAudioBufferAppend(audio: base64Audio))
        }
      }
    }

    // Listen for messages from OpenAI
    Task {
      for await message in realtimeSession.receiver {
        switch message {
        case .error(let errorMessage):
          print("Received error from OpenAI: \(errorMessage ?? "Unknown error")")
          realtimeSession.disconnect()

        case .sessionUpdated:
          if aiSpeaksFirst {
            await realtimeSession.sendMessage(OpenAIRealtimeResponseCreate())
          } else {
            isOpenAIReadyForAudio = true
          }

        case .responseAudioDelta(let base64String):
          audioController.playPCM16Audio(base64String: base64String)

        case .inputAudioBufferSpeechStarted:
          // User started speaking, interrupt AI playback
          audioController.interruptPlayback()

        case .responseCreated:
          isOpenAIReadyForAudio = true

        case .responseTranscriptDone(let transcript):
          print("AI said: \(transcript)")

        case .inputAudioTranscriptionCompleted(let transcript):
          print("User said: \(transcript)")

        case .responseFunctionCallArgumentsDone(let name, let arguments, let callId):
          print("Function call: \(name) with args: \(arguments)")
          // Handle function calls here

        default:
          break
        }
      }
    }

    self.realtimeSession = realtimeSession
    self.audioController = audioController
  }

  func stopConversation() {
    audioController?.stop()
    realtimeSession?.disconnect()
    audioController = nil
    realtimeSession = nil
  }

  private var realtimeSession: OpenAIRealtimeSession?
  private var audioController: AudioController?

}

// MARK: - Basic Usage Example

// To use the Realtime API:
//
// 1. Add NSMicrophoneUsageDescription to your Info.plist:
//   <key>NSMicrophoneUsageDescription</key>
//   <string>We need access to your microphone for voice conversations with AI</string>
//
// 2. On macOS, enable the following in your target's Signing & Capabilities:
//   - App Sandbox > Outgoing Connections (client)
//   - App Sandbox > Audio Input
//   - Hardened Runtime > Audio Input
//
// 3. Initialize the service with your API key:
//   let service = OpenAIServiceFactory.service(apiKey: "your-api-key")
//
// 4. Create a session configuration:
//   let config = OpenAIRealtimeSessionConfiguration(
//       inputAudioFormat: .pcm16,
//       instructions: "You are a helpful assistant",
//       modalities: [.audio, .text],
//       outputAudioFormat: .pcm16,
//       voice: "shimmer"
//   )
//
// 5. Create the realtime session:
//   let session = try await service.realtimeSession(
//       model: "gpt-4o-mini-realtime-preview-2024-12-17",
//       configuration: config
//   )
//
// 6. Set up audio controller:
//   let audioController = try await AudioController(modes: [.playback, .record])
//
// 7. Stream microphone audio to OpenAI:
//   for await buffer in try audioController.micStream() {
//       if let base64Audio = AudioUtils.base64EncodeAudioPCMBuffer(from: buffer) {
//           await session.sendMessage(
//               OpenAIRealtimeInputAudioBufferAppend(audio: base64Audio)
//           )
//       }
//   }
//
// 8. Listen for and play responses:
//   for await message in session.receiver {
//       switch message {
//       case .responseAudioDelta(let base64Audio):
//           audioController.playPCM16Audio(base64String: base64Audio)
//       default:
//           break
//       }
//   }


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/AIProxyIntroView.swift
================================================
//
//  AIProxyIntroView.swift
//  SwiftOpenAIExample
//
//  Created by Lou Zell on 3/27/24.
//

import SwiftOpenAI
import SwiftUI

struct AIProxyIntroView: View {
  var body: some View {
    NavigationStack {
      VStack {
        Spacer()
        VStack(spacing: 24) {
          TextField("Enter partial key", text: $partialKey)
          TextField("Enter your service's URL", text: $serviceURL)
        }
        .padding()
        .textFieldStyle(.roundedBorder)

        Text("You receive a partial key and service URL when you configure an app in the AIProxy dashboard")
          .font(.caption)

        NavigationLink(destination: OptionsListView(
          openAIService: aiproxyService,
          options: OptionsListView.APIOption.allCases.filter { $0 != .localChat }))
        {
          Text("Continue")
            .padding()
            .padding(.horizontal, 48)
            .foregroundColor(.white)
            .background(
              Capsule()
                .foregroundColor(canProceed ? Color(red: 64 / 255, green: 195 / 255, blue: 125 / 255) : .gray.opacity(0.2)))
        }
        .disabled(!canProceed)
        Spacer()
        Group {
          Text(
            "AIProxy keeps your OpenAI API key secure. To configure AIProxy for your project, or to learn more about how it works, please see the docs at ") +
            Text("[this link](https://www.aiproxy.pro/docs).")
        }
        .font(.caption)
      }
      .padding()
      .navigationTitle("AIProxy Configuration")
    }
  }

  @State private var partialKey = ""
  @State private var serviceURL = ""

  private var canProceed: Bool {
    !(partialKey.isEmpty || serviceURL.isEmpty)
  }

  private var aiproxyService: OpenAIService {
    OpenAIServiceFactory.service(
      aiproxyPartialKey: partialKey,
      aiproxyServiceURL: serviceURL != "" ? serviceURL : nil)
  }
}

#Preview {
  AIProxyIntroView()
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ApiKeyIntroView.swift
================================================
//
//  ApiKeyIntroView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/19/23.
//

import SwiftOpenAI
import SwiftUI

struct ApiKeyIntroView: View {
  var body: some View {
    NavigationStack {
      VStack {
        Spacer()
        VStack(spacing: 24) {
          TextField("Enter API Key", text: $apiKey)
          TextField("Enter Organization ID (Optional)", text: $organizationIdentifier)
            .onChange(of: organizationIdentifier) { _, newValue in
              if !newValue.isEmpty {
                localOrganizationID = newValue
              }
            }
        }
        .padding()
        .textFieldStyle(.roundedBorder)
        NavigationLink(destination: OptionsListView(
          openAIService: OpenAIServiceFactory.service(apiKey: apiKey, organizationID: localOrganizationID, debugEnabled: true),
          options: OptionsListView.APIOption.allCases.filter { $0 != .localChat }))
        {
          Text("Continue")
            .padding()
            .padding(.horizontal, 48)
            .foregroundColor(.white)
            .background(
              Capsule()
                .foregroundColor(apiKey.isEmpty ? .gray.opacity(0.2) : Color(red: 64 / 255, green: 195 / 255, blue: 125 / 255)))
        }
        .disabled(apiKey.isEmpty)
        Spacer()
        Group {
          Text("If you don't have a valid API KEY yet, you can visit ") +
            Text("[this link](https://platform.openai.com/account/api-keys)") + Text(" to get started.")
        }
        .font(.caption)
      }
      .padding()
      .navigationTitle("Enter OpenAI API KEY")
    }
  }

  @State private var apiKey = ""
  @State private var organizationIdentifier = ""
  @State private var localOrganizationID: String? = nil
}

#Preview {
  ApiKeyIntroView()
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assets.xcassets/AccentColor.colorset/Contents.json
================================================
{
  "colors" : [
    {
      "idiom" : "universal"
    }
  ],
  "info" : {
    "author" : "xcode",
    "version" : 1
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assets.xcassets/AppIcon.appiconset/Contents.json
================================================
{
  "images" : [
    {
      "idiom" : "universal",
      "platform" : "ios",
      "size" : "1024x1024"
    },
    {
      "idiom" : "mac",
      "scale" : "1x",
      "size" : "16x16"
    },
    {
      "idiom" : "mac",
      "scale" : "2x",
      "size" : "16x16"
    },
    {
      "idiom" : "mac",
      "scale" : "1x",
      "size" : "32x32"
    },
    {
      "idiom" : "mac",
      "scale" : "2x",
      "size" : "32x32"
    },
    {
      "idiom" : "mac",
      "scale" : "1x",
      "size" : "128x128"
    },
    {
      "idiom" : "mac",
      "scale" : "2x",
      "size" : "128x128"
    },
    {
      "idiom" : "mac",
      "scale" : "1x",
      "size" : "256x256"
    },
    {
      "idiom" : "mac",
      "scale" : "2x",
      "size" : "256x256"
    },
    {
      "idiom" : "mac",
      "scale" : "1x",
      "size" : "512x512"
    },
    {
      "idiom" : "mac",
      "scale" : "2x",
      "size" : "512x512"
    }
  ],
  "info" : {
    "author" : "xcode",
    "version" : 1
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assets.xcassets/Contents.json
================================================
{
  "info" : {
    "author" : "xcode",
    "version" : 1
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assistants/AssistantConfigurationDemoView.swift
================================================
//
//  AssistantConfigurationDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/17/23.
//

import Foundation
import SwiftOpenAI
import SwiftUI

// MARK: - AssistantFunctionCallDefinition

enum AssistantFunctionCallDefinition: String, CaseIterable {
  case createImage = "create_image"

  var functionTool: AssistantObject.Tool {
    switch self {
    case .createImage:
      .init(type: .function, function: .init(
        name: rawValue,
        strict: nil,
        description: "call this function if the request asks to generate an image",
        parameters: .init(
          type: .object,
          properties: [
            "prompt": .init(type: .string, description: "The exact prompt passed in."),
            "count": .init(type: .integer, description: "The number of images requested"),
          ],
          required: ["prompt", "count"])))
    }
  }
}

// MARK: - AssistantConfigurationDemoView

struct AssistantConfigurationDemoView: View {
  init(service: OpenAIService) {
    self.service = service
    _provider = State(initialValue: AssistantConfigurationProvider(service: service))
  }

  var isCodeInterpreterOn: Binding<Bool> {
    Binding(
      get: {
        parameters.tools.contains { $0.displayToolType == .codeInterpreter }
      },
      set: { newValue in
        if newValue {
          parameters.tools.append(AssistantObject.Tool(type: .codeInterpreter))
        } else {
          parameters.tools.removeAll { $0.displayToolType == .codeInterpreter }
        }
      })
  }

  var isDalleToolOn: Binding<Bool> {
    Binding(
      get: {
        parameters.tools.contains { $0.displayToolType == .function }
      },
      set: { newValue in
        if newValue {
          parameters.tools.append(AssistantFunctionCallDefinition.createImage.functionTool)
        } else {
          parameters.tools.removeAll { $0.displayToolType == .function }
        }
      })
  }

  var isFileSearchOn: Binding<Bool> {
    Binding(
      get: {
        parameters.tools.contains { $0.displayToolType == .fileSearch }
      },
      set: { newValue in
        if newValue {
          parameters.tools.append(AssistantObject.Tool(type: .fileSearch))
        } else {
          parameters.tools.removeAll { $0.displayToolType == .fileSearch }
        }
      })
  }

  var body: some View {
    ScrollView {
      VStack(spacing: 16) {
        avatarView
        inputViews
        capabilities
        footerActions
        knowledge
      }
      .padding()
    }.sheet(isPresented: $showAvatarFlow) {
      AssistantsListDemoView(assistants: provider.assistants, service: service)
    }
  }

  var footerActions: some View {
    HStack {
      Button("Save") {
        Task {
          try await provider.createAssistant(parameters: parameters)
        }
      }
      Button("Delete") {
        Task {
          for assistant in provider.assistants {
            try await provider.deleteAssistant(id: assistant.id)
          }
        }
      }
      Button("Show Assistants") {
        Task {
          try await provider.listAssistants()
          showAvatarFlow = true
        }
      }
    }
    .buttonStyle(.borderedProminent)
  }

  @ViewBuilder
  var avatarView: some View {
    if isAvatarLoading {
      Circle()
        .stroke(.gray, style: StrokeStyle(lineWidth: 1, dash: [5, 5]))
        .frame(width: 100, height: 100)
        .overlay(
          Image(systemName: "rays")
            .resizable()
            .frame(width: 20, height: 20)
            .tint(.gray)
            .symbolEffect(.variableColor.iterative.dimInactiveLayers))
    } else if let avatarURL = provider.avatarURL {
      URLImageView(url: avatarURL)
        .clipShape(Circle())
        .overlay(Circle().stroke(Color.white, lineWidth: 1))
        .shadow(radius: 10)
    } else {
      Circle()
        .stroke(.gray, style: StrokeStyle(lineWidth: 1, dash: [5, 5]))
        .frame(width: 100, height: 100)
        .overlay(
          Menu(content: {
            Button {
              Task {
                isAvatarLoading = true
                defer { isAvatarLoading = false } // ensure isAvatarLoading is reset when the task exits
                let prompt = parameters.description ?? "Some random image for an avatar"
                try await provider.createAvatar(prompt: prompt)
              }
            } label: {
              Text("Use DALL·E")
            }
          }, label: {
            Image(systemName: "plus")
              .resizable()
              .frame(width: 20, height: 20)
              .tint(.gray)
          }))
    }
  }

  var inputViews: some View {
    VStack(spacing: 16) {
      InputView(title: "Name") {
        TextField("", text: $parameters.name.orEmpty, axis: .vertical)
      }
      InputView(title: "Description") {
        TextField("", text: $parameters.description.orEmpty, axis: .vertical)
      }
      InputView(title: "Instructions") {
        ZStack {
          RoundedRectangle(cornerRadius: 4)
            .stroke(.gray.opacity(0.3))
          TextEditor(text: $parameters.instructions.orEmpty)
            .foregroundStyle(.primary)
            .clipShape(RoundedRectangle(cornerRadius: 4))
            .frame(minHeight: 100)
        }
      }
    }
    .textFieldStyle(.roundedBorder)
  }

  var capabilities: some View {
    InputView(title: "Capabilities") {
      VStack(spacing: 16) {
        CheckboxRow(title: "Code interpreter", isChecked: isCodeInterpreterOn)
        CheckboxRow(title: "File Search", isChecked: isFileSearchOn)
        CheckboxRow(title: "DALL·E Image Generation", isChecked: isDalleToolOn)
      }
    }
    .inputViewStyle(.init(verticalPadding: 16.0))
  }

  // TODO: Add a demo to create a vector store and add files to it.
  var knowledge: some View {
    FilesPicker(
      service: service,
      sectionTitle: "Knowledge",
      actionTitle: "Upload files",
      fileIDS: $fileIDS,
      actions: $filePickerInitialActions)
  }

  @State private var provider: AssistantConfigurationProvider
  @State private var parameters = AssistantParameters(action: .create(model: Model.gpt41106Preview.value))
  @State private var isAvatarLoading = false
  @State private var showAvatarFlow = false
  @State private var fileIDS = [String]()
  /// Used mostly to display already uploaded files, if any.
  @State private var filePickerInitialActions = [FilePickerAction]()

  private let service: OpenAIService
}

extension Binding where Value == String? {
  var orEmpty: Binding<String> {
    Binding<String>(
      get: { self.wrappedValue ?? "" },
      set: { self.wrappedValue = $0 })
  }
}

#Preview {
  AssistantConfigurationDemoView(service: OpenAIServiceFactory.service(apiKey: ""))
}

// MARK: - InputView

struct InputView<Content: View>: View {
  let content: Content
  let title: String

  init(title: String, @ViewBuilder content: () -> Content) {
    self.title = title
    self.content = content()
  }

  var body: some View {
    VStack(alignment: .leading, spacing: style.verticalPadding) {
      Text(title)
        .font(.headline)
      content
    }
  }

  @Environment(\.inputViewStyle) private var style: InputViewStyle
}

// MARK: - InputViewStyle

struct InputViewStyle {
  let verticalPadding: CGFloat

  init(verticalPadding: CGFloat = 8.0) {
    self.verticalPadding = verticalPadding
  }
}

// MARK: - InputViewStyleKey

struct InputViewStyleKey: EnvironmentKey {
  static let defaultValue = InputViewStyle()
}

extension EnvironmentValues {
  var inputViewStyle: InputViewStyle {
    get { self[InputViewStyleKey.self] }
    set { self[InputViewStyleKey.self] = newValue }
  }
}

extension View {
  func inputViewStyle(_ style: InputViewStyle) -> some View {
    environment(\.inputViewStyle, style)
  }
}

// MARK: - CheckboxView

struct CheckboxView: View {
  @Binding var isChecked: Bool

  var body: some View {
    Button(action: {
      withAnimation {
        isChecked.toggle()
      }
    }) {
      Image(systemName: isChecked ? "checkmark.square" : "square")
    }
    .buttonStyle(PlainButtonStyle())
  }
}

// MARK: - CheckboxRow

struct CheckboxRow: View {
  let title: String
  @Binding var isChecked: Bool

  var body: some View {
    HStack {
      CheckboxView(isChecked: $isChecked)
      Text(title)
      Spacer()
    }
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assistants/AssistantConfigurationProvider.swift
================================================
//
//  AssistantConfigurationProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/17/23.
//

import Foundation
import SwiftOpenAI

@Observable
class AssistantConfigurationProvider {
  // MARK: - Initializer

  init(service: OpenAIService) {
    self.service = service
  }

  var assistant: AssistantObject?
  var assistants = [AssistantObject]()
  var avatarURL: URL?
  var assistantDeletionStatus: DeletionStatus?

  func listAssistants()
    async throws
  {
    do {
      let assistants = try await service.listAssistants(limit: nil, order: nil, after: nil, before: nil)
      self.assistants = assistants.data
    } catch {
      debugPrint("\(error)")
    }
  }

  func deleteAssistant(
    id: String)
    async throws
  {
    do {
      assistantDeletionStatus = try await service.deleteAssistant(id: id)
    } catch {
      debugPrint("\(error)")
    }
  }

  func createAssistant(
    parameters: AssistantParameters)
    async throws
  {
    do {
      assistant = try await service.createAssistant(parameters: parameters)
    } catch {
      debugPrint("\(error)")
    }
  }

  func createAvatar(
    prompt: String)
    async throws
  {
    do {
      let avatarURLs = try await service.createImages(parameters: .init(prompt: prompt, model: .dallE3)).data?.compactMap(\.url)
      avatarURL = URL(string: avatarURLs?.first ?? "")
    } catch {
      debugPrint("\(error)")
    }
  }

  // TODO: Create demo for this.
  func createVStore() async throws {
    _ = try await service.createVectorStore(parameters: .init(name: "Personal Data"))
  }

  // MARK: - Private Properties

  private let service: OpenAIService
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assistants/AssistantStreamDemoScreen.swift
================================================
//
//  AssistantStreamDemoScreen.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 3/19/24.
//

import SwiftOpenAI
import SwiftUI

// MARK: - AssistantStartThreadScreen

public struct AssistantStartThreadScreen: View {
  init(assistant: AssistantObject, service: OpenAIService) {
    self.assistant = assistant
    self.service = service
    _threadProvider = State(initialValue: AssistantThreadConfigurationProvider(service: service))
  }

  public var body: some View {
    ScrollView {
      VStack {
        Text(assistant.name ?? "No name")
          .font(.largeTitle).bold()
        Text("For the function call demo, type: Create an image of a cow.")
          .font(.caption)
        switch tutorialStage {
        case .crateThread:
          createThreadView

        case .createMessage(let threadID):
          createMessageView(threadID: threadID)

        case .createRunAndStream(let message):
          createRunAndStreamView(threadID: message.threadID)

        case .showStream(let threadID):
          showStreamView(threadID: threadID)
        }
      }
      .padding()
    }
  }

  enum TutorialState {
    case crateThread
    case createMessage(threadID: String)
    case createRunAndStream(message: MessageObject)
    case showStream(threadID: String)
  }

  let assistant: AssistantObject
  let service: OpenAIService

  var createThreadView: some View {
    Button {
      Task {
        try await threadProvider.createThread()
        if let threadID = threadProvider.thread?.id {
          tutorialStage = .createMessage(threadID: threadID)
        }
      }
    } label: {
      Text("Step 1: Create a thread")
    }
  }

  func createMessageView(threadID: String) -> some View {
    VStack(alignment: .leading, spacing: 20) {
      Text("Nice! Thread created with id:")
        .font(.title).bold()
      Text("\(threadID)")
      Text("Step 2: Type a message in the text field and press the send button ✈️").font(.title)
      Text("eg: Briefly explain SwiftUI state.")
      HStack(spacing: 4) {
        TextField("Enter prompt", text: $prompt, axis: .vertical)
          .textFieldStyle(.roundedBorder)
          .padding()
        Button {
          Task {
            try await threadProvider.createMessage(
              threadID: threadID,
              parameters: .init(role: .user, content: .stringContent(prompt)))
            if let message = threadProvider.message {
              tutorialStage = .createRunAndStream(message: message)
            }
          }
        } label: {
          Image(systemName: "paperplane")
        }
      }
    }
    .padding()
  }

  func createRunAndStreamView(threadID: String) -> some View {
    VStack(spacing: 20) {
      Text("Nice! Message created in thread:")
        .font(.title2).bold()
      Text("\(threadID)")
        .font(.body)
      Text("Step 3: Run and Stream the message")
        .font(.title2)

      Button {
        Task {
          tutorialStage = .showStream(threadID: threadID)
          try await threadProvider.createRunAndStreamMessage(
            threadID: threadID,
            parameters: .init(assistantID: assistant.id))
        }
      } label: {
        Text("Run and Stream the message")
      }
      .buttonStyle(.borderedProminent)
      ChatStreamView(provider: threadProvider, prompt: prompt, assistantName: assistant.name)
    }
  }

  func showStreamView(threadID: String) -> some View {
    VStack {
      TextField("Enter prompt", text: $prompt, axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .padding()
      Button {
        Task {
          try await threadProvider.createMessage(
            threadID: threadID,
            parameters: .init(role: .user, content: .stringContent(prompt)))
          threadProvider.messageText = ""
          threadProvider.toolOuptutMessage = ""
          try await threadProvider.createRunAndStreamMessage(
            threadID: threadID,
            parameters: .init(assistantID: assistant.id))
        }
      } label: {
        Text("Run and Stream the message")
      }
      .buttonStyle(.borderedProminent)
      ChatStreamView(provider: threadProvider, prompt: prompt, assistantName: assistant.name)
    }
  }

  @State private var threadProvider: AssistantThreadConfigurationProvider
  @State private var prompt = ""

  @State private var tutorialStage = TutorialState.crateThread
}

// MARK: - ChatStreamView

struct ChatStreamView: View {
  let provider: AssistantThreadConfigurationProvider
  let prompt: String
  let assistantName: String?

  var body: some View {
    VStack(spacing: 24) {
      VStack(alignment: .leading, spacing: 16) {
        Text("User:")
          .font(.title2)
          .bold()
        Text(prompt)
      }
      .frame(maxWidth: .infinity, alignment: .leading)

      VStack(alignment: .leading, spacing: 16) {
        Text("\(assistantName ?? "Assistant"):")
          .font(.title2)
          .bold()
        if !provider.toolOuptutMessage.isEmpty {
          Text("Code Interpreter")
            .foregroundColor(.mint)
            .fontDesign(.monospaced)
            .bold()
            .font(.title3)
          Text(LocalizedStringKey(provider.toolOuptutMessage))
            .fontDesign(.monospaced)
        }
        if !provider.messageText.isEmpty {
          Text("Message")
            .font(.title3)
            .foregroundColor(.mint)
            .fontDesign(.monospaced)
            .bold()
          Text(provider.messageText)
            .font(.body)
        }
        if !provider.functionCallOutput.isEmpty {
          Text("Function Call")
            .font(.title3)
            .foregroundColor(.pink)
            .fontDesign(.monospaced)
            .bold()
          Text(provider.functionCallOutput)
            .font(.body)
        }
      }
      .frame(maxWidth: .infinity, alignment: .leading)
    }
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assistants/AssistantThreadConfigurationProvider.swift
================================================
//
//  AssistantThreadConfigurationProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 3/19/24.
//

import Foundation
import SwiftOpenAI

@Observable
class AssistantThreadConfigurationProvider {
  // MARK: - Initializer

  init(service: OpenAIService) {
    self.service = service
  }

  var thread: ThreadObject?
  var message: MessageObject?
  var runObject: RunObject?
  var messageText = ""
  var toolOuptutMessage = ""
  var functionCallOutput = ""

  func createThread()
    async throws
  {
    do {
      thread = try await service.createThread(parameters: .init())
    } catch {
      print("THREAD ERROR: \(error)")
    }
  }

  func createMessage(
    threadID: String,
    parameters: MessageParameter)
    async throws
  {
    do {
      message = try await service.createMessage(threadID: threadID, parameters: parameters)
    } catch {
      print("THREAD ERROR: \(error)")
    }
  }

  func createRunAndStreamMessage(
    threadID: String,
    parameters: RunParameter)
    async throws
  {
    do {
      let stream = try await service.createRunStream(threadID: threadID, parameters: parameters)
      for try await result in stream {
        switch result {
        case .threadMessageDelta(let messageDelta):
          let content = messageDelta.delta.content.first
          switch content {
          case .imageFile, .imageUrl, nil:
            break
          case .text(let textContent):
            messageText += textContent.text.value
          }

        case .threadRunStepDelta(let runStepDelta):
          let toolCall = runStepDelta.delta.stepDetails.toolCalls?.first?.toolCall
          switch toolCall {
          case .codeInterpreterToolCall(let toolCall):
            toolOuptutMessage += toolCall.input ?? ""
          case .fileSearchToolCall(let toolCall):
            print("PROVIDER: File search tool call \(toolCall)")
          case .functionToolCall(let toolCall):
            functionCallOutput += toolCall.arguments
          case nil:
            print("PROVIDER: tool call nil")
          }

        case .threadRunCompleted(let runObject):
          print("PROVIDER: the run is completed - \(runObject)")

        default: break
        }
      }
    } catch {
      print("THREAD ERROR: \(error)")
    }
  }

  // MARK: - Private Properties

  private let service: OpenAIService
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assistants/AssistantsListDemoView.swift
================================================
//
//  AssistantsListDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 3/19/24.
//

import SwiftOpenAI
import SwiftUI

// MARK: - AssistantObject + Identifiable

extension AssistantObject: Identifiable { }

// MARK: - AssistantsListDemoView

public struct AssistantsListDemoView: View {
  public var body: some View {
    NavigationView {
      ForEach(assistants) { assistant in
        NavigationLink(destination: AssistantStartThreadScreen(assistant: assistant, service: service)) {
          VStack(alignment: .leading) {
            Text(assistant.name ?? "No name")
              .font(.title).bold()
            Text(assistant.description ?? "No Description")
              .font(.subheadline).fontWeight(.medium)
            Text(assistant.id)
              .font(.caption).fontWeight(.bold)
          }
          .padding()
          .frame(maxWidth: .infinity, alignment: .leading)
          .background {
            RoundedRectangle(cornerRadius: 25.0)
              .fill(.mint)
          }
          .padding()
        }
      }
    }
  }

  let assistants: [AssistantObject]
  let service: OpenAIService
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/AudioDemo/AudioDemoView.swift
================================================
//
//  AudioDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/19/23.
//

import SwiftOpenAI
import SwiftUI

struct AudioDemoView: View {
  init(service: OpenAIService) {
    _audioProvider = State(initialValue: AudioProvider(service: service))
  }

  var textArea: some View {
    HStack(spacing: 4) {
      TextField("Enter message to convert to speech", text: $prompt, axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .padding()
      Button {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is set to false when the function exits
          try await audioProvider.speech(parameters: .init(model: .tts1, input: prompt, voice: .shimmer))
        }
      } label: {
        Image(systemName: "paperplane")
      }
      .buttonStyle(.bordered)
    }
    .padding()
  }

  var transcriptionView: some View {
    VStack {
      Text("Tap this button to use the transcription API; an `m4a` file has been added to the app's bundle.")
        .font(.callout)
        .padding()
      Button("Transcript") {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is set to false when the function exits
          /// ['flac', 'm4a', 'mp3', 'mp4', 'mpeg', 'mpga', 'oga', 'ogg', 'wav', 'webm'] (supported formats)
          let data = try contentLoader.loadBundledContent(fromFileNamed: "narcos", ext: "m4a")
          try await audioProvider.transcript(parameters: .init(fileName: "narcos.m4a", file: data))
        }
      }
      .buttonStyle(.borderedProminent)
      Text(audioProvider.transcription)
        .padding()
    }
  }

  var translationView: some View {
    VStack {
      Text("Tap this button to use the translation API; an `m4a` file in German has been added to the app's bundle.")
        .font(.callout)
        .padding()
      Button("Translate") {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is set to false when the function exits
          /// ['flac', 'm4a', 'mp3', 'mp4', 'mpeg', 'mpga', 'oga', 'ogg', 'wav', 'webm'] (supported formats)
          let data = try contentLoader.loadBundledContent(fromFileNamed: "german", ext: "m4a")
          try await audioProvider.translate(parameters: .init(fileName: "german.m4a", file: data))
        }
      }
      .buttonStyle(.borderedProminent)
      Text(audioProvider.translation)
        .padding()
    }
  }

  var body: some View {
    ScrollView {
      VStack {
        VStack {
          Text("Add a text to convert to speech")
          textArea
        }
        transcriptionView
          .padding()
        Divider()
        translationView
          .padding()
      }
    }.overlay(
      Group {
        if isLoading {
          ProgressView()
        } else {
          EmptyView()
        }
      })
      .safeAreaPadding()
  }

  @State private var audioProvider: AudioProvider
  @State private var isLoading = false
  @State private var prompt = ""

  private let contentLoader = ContentLoader()
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/AudioDemo/AudioProvider.swift
================================================
//
//  AudioProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/19/23.
//

import AVFoundation
import SwiftOpenAI
import SwiftUI

@Observable
class AudioProvider {
  init(service: OpenAIService) {
    self.service = service
  }

  var transcription = ""
  var translation = ""
  var speechErrorMessage = ""
  var audioPlayer: AVAudioPlayer?

  func transcript(
    parameters: AudioTranscriptionParameters)
    async throws
  {
    do {
      transcription = try await service.createTranscription(parameters: parameters).text
    } catch {
      transcription = "\(error)"
    }
  }

  func translate(
    parameters: AudioTranslationParameters)
    async throws
  {
    do {
      translation = try await service.createTranslation(parameters: parameters).text
    } catch {
      translation = "\(error)"
    }
  }

  func speech(
    parameters: AudioSpeechParameters)
    async throws
  {
    do {
      let speech = try await service.createSpeech(parameters: parameters).output
      playAudio(from: speech)
    } catch let error as APIError {
      speechErrorMessage = error.displayDescription
    } catch {
      speechErrorMessage = "\(error)"
    }
  }

  private let service: OpenAIService

  private func playAudio(from data: Data) {
    do {
      // Initialize the audio player with the data
      audioPlayer = try AVAudioPlayer(data: data)
      audioPlayer?.prepareToPlay()
      audioPlayer?.play()
    } catch {
      // Handle errors
      print("Error playing audio: \(error.localizedDescription)")
    }
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatDemo/ChatDemoView.swift
================================================
//
//  ChatDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/19/23.
//

import SwiftOpenAI
import SwiftUI

struct ChatDemoView: View {
  init(service: OpenAIService, customModel: String? = nil) {
    self.customModel = customModel
    _chatProvider = State(initialValue: ChatProvider(service: service))
  }

  enum ChatConfig {
    case chatCompletion
    case chatCompeltionStream
  }

  let customModel: String?

  var body: some View {
    ScrollView {
      VStack {
        picker
        textArea
        Text(chatProvider.errorMessage)
          .foregroundColor(.red)
        switch selectedSegment {
        case .chatCompeltionStream:
          streamedChatResultView
        case .chatCompletion:
          chatCompletionResultView
        }
      }
    }
    .overlay(
      Group {
        if isLoading {
          ProgressView()
        } else {
          EmptyView()
        }
      })
  }

  var picker: some View {
    Picker("Options", selection: $selectedSegment) {
      Text("Chat Completion").tag(ChatConfig.chatCompletion)
      Text("Chat Completion stream").tag(ChatConfig.chatCompeltionStream)
    }
    .pickerStyle(SegmentedPickerStyle())
    .padding()
  }

  var textArea: some View {
    HStack(spacing: 4) {
      TextField("Enter prompt", text: $prompt, axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .padding()
      Button {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is set to false when the task exits

          let content = ChatCompletionParameters.Message.ContentType.text(prompt)
          prompt = ""
          let model: Model =
            if let customModel, !customModel.isEmpty {
              .custom(customModel)
            } else {
              .gpt4o
            }

          let parameters = ChatCompletionParameters(
            messages: [.init(
              role: .user,
              content: content)],
            model: model)
          switch selectedSegment {
          case .chatCompletion:
            try await chatProvider.startChat(parameters: parameters)
          case .chatCompeltionStream:
            try await chatProvider.startStreamedChat(parameters: parameters)
          }
        }
      } label: {
        Image(systemName: "paperplane")
      }
      .buttonStyle(.bordered)
    }
    .padding()
  }

  /// stream = `false`
  var chatCompletionResultView: some View {
    ForEach(Array(chatProvider.messages.enumerated()), id: \.offset) { _, val in
      VStack(spacing: 0) {
        Text("\(val)")
      }
    }
  }

  /// stream = `true`
  var streamedChatResultView: some View {
    VStack {
      Button("Cancel stream") {
        chatProvider.cancelStream()
      }
      Text(chatProvider.message)
    }
  }

  @State private var chatProvider: ChatProvider
  @State private var isLoading = false
  @State private var prompt = ""
  @State private var selectedSegment = ChatConfig.chatCompeltionStream
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatDemo/ChatProvider.swift
================================================
//
//  ChatProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/19/23.
//

import SwiftOpenAI
import SwiftUI

@Observable
class ChatProvider {
  init(service: OpenAIService) {
    self.service = service
  }

  var messages = [String]()
  var errorMessage = ""
  var message = ""
  var usage: ChatUsage?

  func startChat(
    parameters: ChatCompletionParameters)
    async throws
  {
    do {
      let response = try await service.startChat(parameters: parameters)
      let choices = response.choices
      let chatUsage = response.usage
      let logprobs = choices?.compactMap(\.logprobs)
      dump(logprobs)
      messages = choices?.compactMap(\.message?.content) ?? []
      dump(chatUsage)
      usage = chatUsage
    } catch APIError.responseUnsuccessful(let description, let statusCode) {
      self.errorMessage = "Network error with status code: \(statusCode) and description: \(description)"
    } catch {
      errorMessage = error.localizedDescription
    }
  }

  func startStreamedChat(
    parameters: ChatCompletionParameters)
    async throws
  {
    streamTask = Task {
      do {
        let stream = try await service.startStreamedChat(parameters: parameters)
        for try await result in stream {
          let content = result.choices?.first?.delta?.content ?? ""
          self.message += content
        }
      } catch APIError.responseUnsuccessful(let description, let statusCode) {
        self.errorMessage = "Network error with status code: \(statusCode) and description: \(description)"
      } catch {
        self.errorMessage = error.localizedDescription
      }
    }
  }

  func cancelStream() {
    streamTask?.cancel()
  }

  private let service: OpenAIService
  private var streamTask: Task<Void, Never>?
}
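
// Note: `startStreamedChat` wraps the stream in an unstructured `Task`, so it
// returns before the stream finishes; callers observe progress through the
// `message` property rather than by awaiting completion. A usage sketch,
// assuming an already-configured `OpenAIService` and request parameters:
//
//   let provider = ChatProvider(service: service)
//   try await provider.startStreamedChat(parameters: parameters)
//   // Later, to stop receiving tokens:
//   provider.cancelStream()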


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/ChatMessageDisplayModel.swift
================================================
//
//  ChatMessageDisplayModel.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/13/23.
//

import Foundation
import SwiftOpenAI

struct ChatMessageDisplayModel: Identifiable {
  init(
    id: UUID = UUID(),
    content: DisplayContent,
    origin: MessageOrigin)
  {
    self.id = id
    self.content = content
    self.origin = origin
  }

  enum DisplayContent: Equatable {
    case content(DisplayMessageType)
    case error(String)

    static func ==(lhs: DisplayContent, rhs: DisplayContent) -> Bool {
      switch (lhs, rhs) {
      case (.content(let a), .content(let b)):
        a == b
      case (.error(let a), .error(let b)):
        a == b
      default:
        false
      }
    }

    struct DisplayMessageType: Equatable {
      var text: String?
      var urls: [URL]? = nil
    }
  }

  enum MessageOrigin {
    case received(ReceivedSource)
    case sent

    enum ReceivedSource {
      case gpt
      case dalle
    }
  }

  let id: UUID
  var content: DisplayContent
  let origin: MessageOrigin
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/ChatMessageView.swift
================================================
//
//  ChatMessageView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/13/23.
//

import Foundation
import SwiftUI

struct ChatMessageView: View {
  let message: ChatMessageDisplayModel

  @ViewBuilder
  var header: some View {
    switch message.origin {
    case .received(let source):
      switch source {
      case .gpt:
        headerWith("wand.and.stars", title: "CHATGPT")
      case .dalle:
        EmptyView()
      }

    case .sent:
      headerWith("person.circle", title: "USER")
    }
  }

  var body: some View {
    VStack(alignment: .leading, spacing: 8) {
      header
      Group {
        switch message.content {
        case .content(let mediaType):
          VStack(alignment: .leading, spacing: Sizes.spacingMedium) {
            imagesFrom(urls: mediaType.urls ?? [])
            chatMessageViewWith(mediaType.text)
          }
          .transition(.opacity)

        case .error(let error):
          Text(error)
            .padding()
            .font(.callout)
            .background(
              RoundedRectangle(cornerRadius: 20)
                .foregroundColor(.red.opacity(0.7)))
        }
      }
      .padding(.leading, 23)
    }
  }

  @ViewBuilder
  func chatMessageViewWith(
    _ text: String?)
    -> some View
  {
    if let text {
      if text.isEmpty {
        LoadingView()
      } else {
        Text(text)
          .font(.body)
      }
    } else {
      EmptyView()
    }
  }

  func headerWith(
    _ systemImageName: String,
    title: String)
    -> some View
  {
    HStack {
      Image(systemName: systemImageName)
        .resizable()
        .frame(width: 16, height: 16)
      Text(title)
        .font(.caption2)
    }
    .foregroundColor(.gray.opacity(0.9))
  }

  func imagesFrom(
    urls: [URL])
    -> some View
  {
    ScrollView(.horizontal, showsIndicators: false) {
      HStack(spacing: 8) {
        ForEach(urls, id: \.self) { url in
          URLImageView(url: url)
        }
      }
    }
  }
}

#Preview {
  VStack {
    ChatMessageView(message: .init(
      content: .content(.init(text: "What is the capital of Peru, and what is its population?")),
      origin: .sent))
    ChatMessageView(message: .init(content: .content(.init(text: "Lima, and its 28 million inhabitants.")), origin: .received(.gpt)))
    ChatMessageView(message: .init(
      content: .content(.init(
        text: "The image you requested is ready 🐱",
        urls: [
          URL(
            string: "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg")!,
        ])),
      origin: .received(.dalle)))
    ChatMessageView(message: .init(content: .content(.init(text: "")), origin: .received(.gpt)))
  }
  .padding()
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/Completion/ChatFunctionCallDemoView.swift
================================================
//
//  ChatFunctionCallDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/14/23.
//

import SwiftOpenAI
import SwiftUI

struct ChatFunctionCallDemoView: View {
  init(service: OpenAIService, customModel: String? = nil) {
    self.customModel = customModel
    _chatProvider = State(initialValue: ChatFunctionCallProvider(service: service, customModel: customModel))
  }

  let customModel: String?

  var body: some View {
    ScrollViewReader { proxy in
      VStack {
        List(chatProvider.chatDisplayMessages) { message in
          ChatMessageView(message: message)
            .listRowSeparator(.hidden)
        }
        .listStyle(.plain)
        .onChange(of: chatProvider.chatDisplayMessages.last?.content) {
          let lastMessage = chatProvider.chatDisplayMessages.last
          if let id = lastMessage?.id {
            proxy.scrollTo(id, anchor: .bottom)
          }
        }
        textArea
      }
    }
  }

  var textArea: some View {
    HStack(spacing: 0) {
      VStack(alignment: .leading, spacing: 0) {
        textField
          .padding(.vertical, Sizes.spacingExtraSmall)
          .padding(.horizontal, Sizes.spacingSmall)
      }
      .padding(.vertical, Sizes.spacingExtraSmall)
      .padding(.horizontal, Sizes.spacingExtraSmall)
      .background(
        RoundedRectangle(cornerRadius: 20)
          .stroke(.gray, lineWidth: 1))
      .padding(.horizontal, Sizes.spacingMedium)
      textAreaSendButton
    }
    .padding(.horizontal)
    .disabled(isLoading)
  }

  var textField: some View {
    TextField(
      "How can I help you today?",
      text: $prompt,
      axis: .vertical)
  }

  var textAreaSendButton: some View {
    Button {
      Task {
        /// Loading UI
        isLoading = true
        defer { isLoading = false }
        // Clears text field.
        let userPrompt = prompt
        prompt = ""
        try await chatProvider.startChat(prompt: userPrompt)
      }
    } label: {
      Image(systemName: "paperplane")
    }
    .buttonStyle(.bordered)
    .tint(ThemeColor.tintColor)
    .disabled(prompt.isEmpty)
  }

  @State private var chatProvider: ChatFunctionCallProvider
  @State private var isLoading = false
  @State private var prompt = ""
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/Completion/ChatFunctionCallProvider.swift
================================================
//
//  ChatFunctionCallProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/14/23.
//

import SwiftOpenAI
import SwiftUI

// MARK: - FunctionCallDefinition

enum FunctionCallDefinition: String, CaseIterable {
  case createImage = "create_image"

  // Add more functions if needed; parallel function calling is supported.

  var functionTool: ChatCompletionParameters.Tool {
    switch self {
    case .createImage:
      .init(function: .init(
        name: rawValue,
        strict: nil,
        description: "Call this function when the request asks to generate an image",
        parameters: .init(
          type: .object,
          properties: [
            "prompt": .init(type: .string, description: "The exact prompt passed in."),
            "count": .init(type: .integer, description: "The number of images requested"),
          ],
          required: ["prompt", "count"])))
    }
  }
}

// MARK: - ChatFunctionCallProvider

@Observable
class ChatFunctionCallProvider {
  init(service: OpenAIService, customModel: String? = nil) {
    self.service = service
    self.customModel = customModel
  }

  // MARK: - Initializer

  let customModel: String?

  // MARK: - Public Properties

  /// To be used for UI purposes.
  var chatDisplayMessages = [ChatMessageDisplayModel]()

  @MainActor
  func generateImage(arguments: String) async throws -> String {
    // Force-unwrapped for demo brevity; validate the arguments in production code.
    let dictionary = arguments.toDictionary()!
    let prompt = dictionary["prompt"] as! String

    let assistantMessage = ChatMessageDisplayModel(
      content: .content(.init(text: "Generating images...")),
      origin: .received(.gpt))
    updateLastAssistantMessage(assistantMessage)

    let urls = try await service.createImages(parameters: .init(prompt: prompt, model: .dallE3)).data?.compactMap(\.url)
      .compactMap { URL(string: $0) } ?? []

    let dalleAssistantMessage = ChatMessageDisplayModel(
      content: .content(.init(text: nil, urls: urls)),
      origin: .received(.dalle))
    updateLastAssistantMessage(dalleAssistantMessage)

    return prompt
  }

  // MARK: - Public Methods

  func startChat(
    prompt: String)
    async throws
  {
    defer {
      chatMessageParameters = []
    }

    await startNewUserDisplayMessage(prompt)

    await startNewAssistantEmptyDisplayMessage()

    /// # Step 1: send the conversation and available functions to the model
    let userMessage = createUserMessage(prompt)
    chatMessageParameters.append(userMessage)

    let tools = FunctionCallDefinition.allCases.map(\.functionTool)

    let model: Model =
      if let customModel, !customModel.isEmpty {
        .custom(customModel)
      } else {
        .gpt41106Preview
      }

    let parameters = ChatCompletionParameters(
      messages: chatMessageParameters,
      model: model,
      toolChoice: ToolChoice.auto,
      tools: tools)

    do {
      let chat = try await service.startChat(parameters: parameters)

      guard let assistantMessage = chat.choices?.first?.message else { return }

      let content = assistantMessage.content ?? ""

      await updateLastAssistantMessage(.init(content: .content(.init(text: content)), origin: .received(.gpt)))

      /// # Step 2: check if the model wanted to call a function
      if let toolCalls = assistantMessage.toolCalls {
        /// # Step 3: call the function
        availableFunctions = [.createImage: generateImage(arguments:)]
        // Append the `assistantMessage` into the `chatMessageParameters` to extend the conversation
        let parameterAssistantMessage = ChatCompletionParameters.Message(
          role: .assistant,
          content: .text(content), toolCalls: assistantMessage.toolCalls)

        chatMessageParameters.append(parameterAssistantMessage)

        /// # Step 4: send the info for each function call and function response to the model
        for toolCall in toolCalls {
          let name = toolCall.function.name
          let id = toolCall.id
          // Force-unwrapped for demo brevity; handle unknown function names in production code.
          let functionToCall = availableFunctions[FunctionCallDefinition(rawValue: name!)!]!
          let arguments = toolCall.function.arguments
          let content = try await functionToCall(arguments)
          let toolMessage = ChatCompletionParameters.Message(
            role: .tool,
            content: .text(content),
            name: name,
            toolCallID: id)
          chatMessageParameters.append(toolMessage)
        }

        /// # Step 5: get a new response from the model where it can see the function response
        await continueChat()
      }

    } catch let error as APIError {
      // If an error occurs, update the UI to display the error message.
      await updateLastAssistantMessage(.init(content: .error("\(error.displayDescription)"), origin: .received(.gpt)))
    }
  }

  func continueChat() async {
    let systemMessage = ChatCompletionParameters.Message(
      role: .system,
      content: .text(
        "You are an artist powered by AI. If the messages include a tool message, weight it more heavily when creating your response. When you provide an image, always respond in readable language and never include image URLs. Add an emoji to your responses when it makes sense, do not describe the image, and always offer more help."))

    chatMessageParameters.insert(systemMessage, at: 0)

    let model: Model =
      if let customModel, !customModel.isEmpty {
        .custom(customModel)
      } else {
        .gpt41106Preview
      }

    let paramsForChat = ChatCompletionParameters(
      messages: chatMessageParameters,
      model: model)
    do {
      let chat = try await service.startChat(parameters: paramsForChat)
      guard let assistantMessage = chat.choices?.first?.message else { return }
      await updateLastAssistantMessage(.init(content: .content(.init(text: assistantMessage.content)), origin: .received(.gpt)))
    } catch {
      // If an error occurs, update the UI to display the error message.
      await updateLastAssistantMessage(.init(content: .error("\(error)"), origin: .received(.gpt)))
    }
  }

  func createUserMessage(
    _ prompt: String)
    -> ChatCompletionParameters.Message
  {
    .init(role: .user, content: .text(prompt))
  }

  // MARK: - Private Properties

  private let service: OpenAIService
  private var lastDisplayedMessageID: UUID?
  /// To be used for a new request
  private var chatMessageParameters = [ChatCompletionParameters.Message]()
  private var availableFunctions = [FunctionCallDefinition: @MainActor (String) async throws -> String]()

  // MARK: - Private Methods

  @MainActor
  private func startNewUserDisplayMessage(_ prompt: String) {
    let startingMessage = ChatMessageDisplayModel(
      content: .content(.init(text: prompt)),
      origin: .sent)
    addMessage(startingMessage)
  }

  @MainActor
  private func startNewAssistantEmptyDisplayMessage() {
    let newMessage = ChatMessageDisplayModel(
      content: .content(.init(text: "")),
      origin: .received(.gpt))
    addMessage(newMessage)
  }

  @MainActor
  private func updateLastAssistantMessage(
    _ message: ChatMessageDisplayModel)
  {
    guard let id = lastDisplayedMessageID, let index = chatDisplayMessages.firstIndex(where: { $0.id == id }) else { return }

    var lastMessage = chatDisplayMessages[index]

    switch message.content {
    case .content(let newMedia):
      switch lastMessage.content {
      case .content(let lastMedia):
        var updatedMedia = lastMedia
        if
          let newText = newMedia.text,
          var lastMediaText = lastMedia.text
        {
          lastMediaText += newText
          updatedMedia.text = lastMediaText
        } else {
          updatedMedia.text = ""
        }
        if let urls = newMedia.urls {
          updatedMedia.urls = urls
        }
        lastMessage.content = .content(updatedMedia)

      case .error:
        break
      }

    case .error:
      lastMessage.content = message.content
    }

    chatDisplayMessages[index] = ChatMessageDisplayModel(
      id: id,
      content: lastMessage.content,
      origin: message.origin)
  }

  @MainActor
  private func addMessage(_ message: ChatMessageDisplayModel) {
    let newMessageId = message.id
    lastDisplayedMessageID = newMessageId
    withAnimation {
      chatDisplayMessages.append(message)
    }
  }
}

extension String {
  fileprivate func toDictionary() -> [String: Any]? {
    guard let jsonData = data(using: .utf8) else {
      print("Failed to convert JSON string to Data.")
      return nil
    }

    do {
      return try JSONSerialization.jsonObject(with: jsonData, options: []) as? [String: Any]
    } catch {
      print("Failed to deserialize JSON: \(error.localizedDescription)")
      return nil
    }
  }
}
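
// A safer alternative to the force-unwrapped dictionary parsing used in
// `generateImage(arguments:)` is to decode the tool call arguments into a typed
// model. This is a sketch only; `CreateImageArguments` is a hypothetical helper
// and not part of SwiftOpenAI.
private struct CreateImageArguments: Decodable {
  let prompt: String
  let count: Int?
}

extension String {
  /// Attempts to decode a function call `arguments` JSON string into a typed value.
  fileprivate func decodedCreateImageArguments() -> CreateImageArguments? {
    try? JSONDecoder().decode(CreateImageArguments.self, from: Data(utf8))
  }
}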


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/Stream/ChatFunctionsCallStreamProvider.swift
================================================
//
//  ChatFunctionsCallStreamProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/6/23.
//

import SwiftOpenAI
import SwiftUI

// MARK: - FunctionCallStreamedResponse

/// A demo of how to implement parallel function calling with the chat completion API when `stream` is `true`.

struct FunctionCallStreamedResponse {
  let name: String
  let id: String
  let toolCall: ToolCall
  var argument: String
}

// MARK: - ChatFunctionsCallStreamProvider

@Observable
class ChatFunctionsCallStreamProvider {
  init(service: OpenAIService, customModel: String? = nil) {
    self.service = service
    self.customModel = customModel
  }

  // MARK: - Initializer

  let customModel: String?

  // MARK: - Public Properties

  /// To be used for UI purposes.
  var chatDisplayMessages = [ChatMessageDisplayModel]()

  @MainActor
  func generateImage(arguments: String) async throws -> String {
    // Force-unwrapped for demo brevity; validate the arguments in production code.
    let dictionary = arguments.toDictionary()!
    let prompt = dictionary["prompt"] as! String
    let count = (dictionary["count"] as? Int) ?? 1

    // TODO: Improve the loading state
    let assistantMessage = ChatMessageDisplayModel(
      content: .content(.init(text: "Generating images...")),
      origin: .received(.gpt))
    updateLastAssistantMessage(assistantMessage)

    let urls = try await service.legacyCreateImages(
      parameters: .init(prompt: prompt, model: .dalle2(.small), numberOfImages: count)).data.compactMap(\.url)

    let dalleAssistantMessage = ChatMessageDisplayModel(
      content: .content(.init(text: nil, urls: urls)),
      origin: .received(.dalle))
    updateLastAssistantMessage(dalleAssistantMessage)

    return prompt
  }

  // MARK: - Public Methods

  func chat(
    prompt: String)
    async throws
  {
    defer {
      functionCallsMap = [:]
      chatMessageParameters = []
    }

    await startNewUserDisplayMessage(prompt)

    await startNewAssistantEmptyDisplayMessage()

    let systemMessage = ChatCompletionParameters.Message(
      role: .system,
      content: .text(
        "You are an artist powered by AI. If the messages include a tool message, weight it more heavily when creating your response. When you provide an image, always respond in readable language and never include image URLs. Add an emoji to your responses when it makes sense, do not describe the image, and always offer more help."))
    chatMessageParameters.append(systemMessage)

    /// # Step 1: send the conversation and available functions to the model
    let userMessage = createUserMessage(prompt)
    chatMessageParameters.append(userMessage)

    let tools = FunctionCallDefinition.allCases.map(\.functionTool)

    let model: Model =
      if let customModel, !customModel.isEmpty {
        .custom(customModel)
      } else {
        .gpt35Turbo1106
      }

    let parameters = ChatCompletionParameters(
      messages: chatMessageParameters,
      model: model,
      toolChoice: ToolChoice.auto,
      tools: tools)

    do {
      // Begin the chat stream with the updated parameters.
      let stream = try await service.startStreamedChat(parameters: parameters)
      for try await result in stream {
        // Process the first choice from each stream result, if one exists.
        if let choice = result.choices?.first {
          /// Because we are using the stream API, we must accumulate the values
          /// that arrive incrementally before we can construct a valid tool call response.
          /// This is not needed when `stream` is set to `false` in the completion request.
          /// # Step 2: check if the model wanted to call a function
          if let toolCalls = choice.delta?.toolCalls {
            /// # Step 3: Define the available functions to be called
            availableFunctions = [.createImage: generateImage(arguments:)]

            mapStreamedToolCallsResponse(toolCalls)
          }

          /// The streamed content to display
          if let newContent = choice.delta?.content {
            await updateLastAssistantMessage(.init(
              content: .content(.init(text: newContent)),
              origin: .received(.gpt)))
          }
        }
      }
      // # extend conversation with assistant's reply
      // Append the `assistantMessage` into the `chatMessageParameters` to extend the conversation
      if !functionCallsMap.isEmpty {
        let assistantMessage = createAssistantMessage()
        chatMessageParameters.append(assistantMessage)
        /// # Step 4: send the info for each function call and function response to the model
        let toolMessages = try await createToolsMessages()
        chatMessageParameters.append(contentsOf: toolMessages)

        // Lastly call the chat again
        await continueChat()
      }

    } catch {
      // If an error occurs, update the UI to display the error message.
      await updateLastAssistantMessage(.init(content: .error("\(error)"), origin: .received(.gpt)))
    }
  }

  func mapStreamedToolCallsResponse(
    _ toolCalls: [ToolCall])
  {
    for toolCall in toolCalls {
      // Intentionally force-unwrapped to surface errors quickly in this demo; handle properly in production code.
      let function = FunctionCallDefinition.allCases[toolCall.index!]
      if var streamedFunctionCallResponse = functionCallsMap[function] {
        streamedFunctionCallResponse.argument += toolCall.function.arguments
        functionCallsMap[function] = streamedFunctionCallResponse
      } else {
        let streamedFunctionCallResponse = FunctionCallStreamedResponse(
          name: toolCall.function.name!,
          id: toolCall.id!,
          toolCall: toolCall,
          argument: toolCall.function.arguments)
        functionCallsMap[function] = streamedFunctionCallResponse
      }
    }
  }

  func createUserMessage(
    _ prompt: String)
    -> ChatCompletionParameters.Message
  {
    .init(role: .user, content: .text(prompt))
  }

  func createAssistantMessage() -> ChatCompletionParameters.Message {
    var toolCalls = [ToolCall]()
    for (_, functionCallStreamedResponse) in functionCallsMap {
      let toolCall = functionCallStreamedResponse.toolCall
      // Intentionally force-unwrapped to surface errors quickly in this demo; handle properly in production code.
      let messageToolCall = ToolCall(
        id: toolCall.id!,
        function: .init(arguments: toolCall.function.arguments, name: toolCall.function.name!))
      toolCalls.append(messageToolCall)
    }
    return .init(role: .assistant, content: .text(""), toolCalls: toolCalls)
  }

  func createToolsMessages() async throws
    -> [ChatCompletionParameters.Message]
  {
    var toolMessages = [ChatCompletionParameters.Message]()
    for (key, functionCallStreamedResponse) in functionCallsMap {
      let name = functionCallStreamedResponse.name
      let id = functionCallStreamedResponse.id
      let functionToCall = availableFunctions[key]!
      let arguments = functionCallStreamedResponse.argument
      let content = try await functionToCall(arguments)
      let toolMessage = ChatCompletionParameters.Message(
        role: .tool,
        content: .text(content),
        name: name,
        toolCallID: id)
      toolMessages.append(toolMessage)
    }
    return toolMessages
  }

  func continueChat() async {
    let model: Model =
      if let customModel, !customModel.isEmpty {
        .custom(customModel)
      } else {
        .gpt41106Preview
      }

    let paramsForChat = ChatCompletionParameters(
      messages: chatMessageParameters,
      model: model)
    do {
      // Begin the chat stream with the updated parameters.
      let stream = try await service.startStreamedChat(parameters: paramsForChat)
      for try await result in stream {
        // Extract the first choice from the stream result; if none exists, stop processing.
        guard let choice = result.choices?.first else { return }

        /// The streamed content to display
        if let newContent = choice.delta?.content {
          await updateLastAssistantMessage(.init(content: .content(.init(text: newContent)), origin: .received(.gpt)))
        }
      }
    } catch {
      // If an error occurs, update the UI to display the error message.
      await updateLastAssistantMessage(.init(content: .error("\(error)"), origin: .received(.gpt)))
    }
  }

  // MARK: - Private Properties

  private let service: OpenAIService
  private var lastDisplayedMessageID: UUID?
  /// To be used for a new request
  private var chatMessageParameters = [ChatCompletionParameters.Message]()
  private var functionCallsMap = [FunctionCallDefinition: FunctionCallStreamedResponse]()
  private var availableFunctions = [FunctionCallDefinition: @MainActor (String) async throws -> String]()

  // MARK: - Private Methods

  @MainActor
  private func startNewUserDisplayMessage(_ prompt: String) {
    let startingMessage = ChatMessageDisplayModel(
      content: .content(.init(text: prompt)),
      origin: .sent)
    addMessage(startingMessage)
  }

  @MainActor
  private func startNewAssistantEmptyDisplayMessage() {
    let newMessage = ChatMessageDisplayModel(
      content: .content(.init(text: "")),
      origin: .received(.gpt))
    addMessage(newMessage)
  }

  @MainActor
  private func updateLastAssistantMessage(
    _ message: ChatMessageDisplayModel)
  {
    guard let id = lastDisplayedMessageID, let index = chatDisplayMessages.firstIndex(where: { $0.id == id }) else { return }

    var lastMessage = chatDisplayMessages[index]

    switch message.content {
    case .content(let newMedia):
      switch lastMessage.content {
      case .content(let lastMedia):
        var updatedMedia = lastMedia
        if
          let newText = newMedia.text,
          var lastMediaText = lastMedia.text
        {
          lastMediaText += newText
          updatedMedia.text = lastMediaText
        } else {
          updatedMedia.text = ""
        }
        if let urls = newMedia.urls {
          updatedMedia.urls = urls
        }
        lastMessage.content = .content(updatedMedia)

      case .error:
        break
      }

    case .error:
      lastMessage.content = message.content
    }

    chatDisplayMessages[index] = ChatMessageDisplayModel(
      id: id,
      content: lastMessage.content,
      origin: message.origin)
  }

  @MainActor
  private func addMessage(_ message: ChatMessageDisplayModel) {
    let newMessageId = message.id
    lastDisplayedMessageID = newMessageId
    withAnimation {
      chatDisplayMessages.append(message)
    }
  }
}

extension String {
  fileprivate func toDictionary() -> [String: Any]? {
    guard let jsonData = data(using: .utf8) else {
      print("Failed to convert JSON string to Data.")
      return nil
    }

    do {
      return try JSONSerialization.jsonObject(with: jsonData, options: []) as? [String: Any]
    } catch {
      print("Failed to deserialize JSON: \(error.localizedDescription)")
      return nil
    }
  }
}
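
// Streamed tool calls arrive as fragments: the first delta carries the call's
// `id` and function `name`; later deltas append slices of the JSON `arguments`
// string. Joining the slices in arrival order yields parseable JSON. A minimal
// sketch with made-up fragments:
private func exampleAccumulatedArguments() -> [String: Any]? {
  let fragments = ["{\"prompt\": \"a ", "cat\", \"count\"", ": 1}"]
  // Equivalent to the incremental `argument +=` accumulation in `mapStreamedToolCallsResponse`.
  return fragments.joined().toDictionary()
}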


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/Stream/ChatFunctionsCalllStreamDemoView.swift
================================================
//
//  ChatFunctionsCalllStreamDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/6/23.
//

import SwiftOpenAI
import SwiftUI

struct ChatFunctionsCalllStreamDemoView: View {
  init(service: OpenAIService, customModel: String? = nil) {
    self.customModel = customModel
    _chatProvider = State(initialValue: ChatFunctionsCallStreamProvider(service: service, customModel: customModel))
  }

  let customModel: String?

  var body: some View {
    ScrollViewReader { proxy in
      VStack {
        List(chatProvider.chatDisplayMessages) { message in
          ChatMessageView(message: message)
            .listRowSeparator(.hidden)
        }
        .listStyle(.plain)
        .onChange(of: chatProvider.chatDisplayMessages.last?.content) {
          let lastMessage = chatProvider.chatDisplayMessages.last
          if let id = lastMessage?.id {
            proxy.scrollTo(id, anchor: .bottom)
          }
        }
        textArea
      }
    }
  }

  var textArea: some View {
    HStack(spacing: 0) {
      VStack(alignment: .leading, spacing: 0) {
        textField
          .padding(.vertical, Sizes.spacingExtraSmall)
          .padding(.horizontal, Sizes.spacingSmall)
      }
      .padding(.vertical, Sizes.spacingExtraSmall)
      .padding(.horizontal, Sizes.spacingExtraSmall)
      .background(
        RoundedRectangle(cornerRadius: 20)
          .stroke(.gray, lineWidth: 1))
      .padding(.horizontal, Sizes.spacingMedium)
      textAreaSendButton
    }
    .padding(.horizontal)
    .disabled(isLoading)
  }

  var textField: some View {
    TextField(
      "How can I help you today?",
      text: $prompt,
      axis: .vertical)
  }

  var textAreaSendButton: some View {
    Button {
      Task {
        /// Loading UI
        isLoading = true
        defer { isLoading = false }
        // Clears text field.
        let userPrompt = prompt
        prompt = ""
        try await chatProvider.chat(prompt: userPrompt)
      }
    } label: {
      Image(systemName: "paperplane")
    }
    .buttonStyle(.bordered)
    .tint(ThemeColor.tintColor)
    .disabled(prompt.isEmpty)
  }

  @State private var isLoading = false
  @State private var prompt = ""
  @State private var chatProvider: ChatFunctionsCallStreamProvider
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStreamFluidConversationDemo/ChatFluidConversationProvider.swift
================================================
//
//  ChatFluidConversationProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/4/23.
//

import SwiftOpenAI
import SwiftUI

@Observable
class ChatFluidConversationProvider {
  init(service: OpenAIService, customModel: String? = nil) {
    self.service = service
    self.customModel = customModel
  }

  // MARK: - Initializer

  let customModel: String?

  // MARK: - Public Properties

  /// A collection of messages for display in the UI, representing the conversation.
  var chatMessages = [ChatDisplayMessage]()

  // MARK: - Public Methods

  func startStreamedChat(
    parameters: ChatCompletionParameters,
    prompt: String)
    async throws
  {
    // Displays the user message in the UI
    await startNewUserDisplayMessage(prompt)
    // Start a new assistant message that is initially empty.
    await startNewAssistantEmptyDisplayMessage()

    // Copy the provided parameters and update the messages for the chat stream.
    var localParameters = parameters
    localParameters.messages = parameterMessages

    do {
      // Begin the chat stream with the updated parameters.
      let stream = try await service.startStreamedChat(parameters: localParameters)
      for try await result in stream {
        // Extract the first choice from the stream result; if none exists, stop processing.
        guard let choice = result.choices?.first else { return }

        // Store initial `role` and `functionCall` data from the first `choice.delta` for UI display.
        // This information is essential for maintaining context in the conversation and for updating
        // the chat UI with proper role attributions for each message.
        var newDelta = ChatDisplayMessage.Delta(role: "", content: "")
        if let firstDelta = firstChatMessageResponseDelta[result.id ?? ""] {
          // If we have already stored the first delta for this result ID, reuse its role.
          newDelta.role = firstDelta.role!
        } else {
          // Otherwise, store the first delta received for future reference.
          firstChatMessageResponseDelta[result.id ?? ""] = choice.delta
        }
        // Assign the content received in the current message to the newDelta.
        newDelta.content = temporalReceivedMessageContent
        // Update the UI with the latest assistant message and the corresponding delta.
        updateLastAssistantMessage(content: choice.delta?.content ?? "", delta: newDelta)

        // Evaluate the `finishReason` to determine if the conversation has reached a logical end.
        // If so, package the accumulated data into a new message parameter that will be used
        // to enrich context in subsequent API calls, leading to better conversation continuity.
        if let finishReason = choice.finishReason {
          debugPrint("FINISH_REASON \(finishReason)")
          // Construct a new message parameter with the role and content derived from the delta.
          // Intentionally force unwrapped; a failure here indicates a programming error.
          let newMessage = ChatCompletionParameters.Message(
            role: .init(rawValue: newDelta.role)!,
            content: .text(newDelta.content))
          // Append the new message parameter to the collection for future requests.
          updateParameterMessagesArray(newMessage)
        }
      }
    } catch {
      // If an error occurs, update the UI to display the error message.
      updateLastDisplayedMessage(.init(content: .error("\(error)"), type: .received, delta: nil))
    }
  }

  /// Defines the maximum number of parameter messages to retain for context. A larger history can enrich
  /// the language model's responses, but be mindful that it also increases the number of tokens sent with each
  /// request, thus affecting API consumption. A balance is required; a count of 5 is a reasonable starting point.
  private static var parameterMessagesMaxStorageCount = 5

  // MARK: - Private Properties

  private let service: OpenAIService

  /// Accumulates the streamed message content for real-time display updates in the UI.
  private var temporalReceivedMessageContent = ""
  /// Tracks the identifier of the last message displayed, enabling updates from the streaming API response.
  private var lastDisplayedMessageID: UUID?
  /// Stores the initial chat message's delta, which uniquely includes metadata like `role`.
  private var firstChatMessageResponseDelta = [String: ChatCompletionChunkObject.ChatChoice.Delta]()
  /// Builds a history of messages sent and received, enhancing the chat's context for future requests.
  private var parameterMessages = [ChatCompletionParameters.Message]()

  // MARK: - Private Methods

  @MainActor
  private func startNewUserDisplayMessage(_ prompt: String) {
    // Updates the UI with the user's message.
    let startingMessage = ChatDisplayMessage(
      content: .text(prompt),
      type: .sent, delta: nil)
    addMessage(startingMessage)
    // Stores a new user message in the parameter messages for future requests.
    let newParameterMessage = ChatCompletionParameters.Message(role: .user, content: .text(prompt))
    updateParameterMessagesArray(newParameterMessage)
  }

  @MainActor
  private func startNewAssistantEmptyDisplayMessage() {
    firstChatMessageResponseDelta = [:]
    temporalReceivedMessageContent = ""
    let newMessage = ChatDisplayMessage(content: .text(temporalReceivedMessageContent), type: .received, delta: nil)
    let newMessageId = newMessage.id
    lastDisplayedMessageID = newMessageId
    addMessage(newMessage)
  }

  private func updateLastAssistantMessage(
    content: String,
    delta: ChatDisplayMessage.Delta)
  {
    temporalReceivedMessageContent += content
    guard let id = lastDisplayedMessageID, let index = chatMessages.firstIndex(where: { $0.id == id }) else { return }
    chatMessages[index] = ChatDisplayMessage(
      id: id,
      content: .text(temporalReceivedMessageContent),
      type: .received,
      delta: delta)
  }

  @MainActor
  private func addMessage(_ message: ChatDisplayMessage) {
    withAnimation {
      chatMessages.append(message)
    }
  }

  private func updateParameterMessagesArray(_ message: ChatCompletionParameters.Message) {
    parameterMessages.append(message)
    debugPrint("NEWCOUNT \(parameterMessages.count) message \(message)")
    if parameterMessages.count > Self.parameterMessagesMaxStorageCount {
      debugPrint("NEWCOUNT \(parameterMessages.count) removed message \(parameterMessages[0])")
      parameterMessages.removeFirst()
    }
  }

  private func updateLastDisplayedMessage(_ message: ChatDisplayMessage) {
    chatMessages[chatMessages.count - 1] = message
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStreamFluidConversationDemo/ChatStreamFluidConversationDemoView.swift
================================================
//
//  ChatStreamFluidConversationDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/4/23.
//

import SwiftOpenAI
import SwiftUI

struct ChatStreamFluidConversationDemoView: View {
  init(service: OpenAIService, customModel: String? = nil) {
    self.customModel = customModel
    _chatProvider = State(initialValue: ChatFluidConversationProvider(service: service, customModel: customModel))
  }

  enum GPTModel: String, CaseIterable {
    case gpt3dot5 = "GPT-3.5"
    case gpt4 = "GPT-4"
  }

  let customModel: String?

  var body: some View {
    ScrollViewReader { proxy in
      VStack {
        picker
        List(chatProvider.chatMessages) { message in
          ChatDisplayMessageView(message: message)
            .listRowSeparator(.hidden)
        }
        .listStyle(.plain)
        .onChange(of: chatProvider.chatMessages.last?.content) {
          let lastMessage = chatProvider.chatMessages.last
          if let id = lastMessage?.id {
            proxy.scrollTo(id, anchor: .bottom)
          }
        }
        textArea
      }
    }
  }

  var picker: some View {
    Picker("", selection: $selectedModel) {
      ForEach(GPTModel.allCases, id: \.self) { model in
        Text(model.rawValue)
          .font(.title)
          .tag(model)
      }
    }
    .pickerStyle(.segmented)
    .padding()
  }

  var textArea: some View {
    HStack(spacing: 0) {
      TextField(
        "How can I help you today?",
        text: $prompt,
        axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .padding()
      textAreButton
    }
    .padding(.horizontal)
    .disabled(isLoading)
  }

  var textAreButton: some View {
    Button {
      Task {
        isLoading = true
        defer {
          // ensure isLoading is set to false after the function executes.
          isLoading = false
          prompt = ""
        }
        /// Make the request
        let model: Model =
          if let customModel, !customModel.isEmpty {
            .custom(customModel)
          } else {
            selectedModel == .gpt3dot5 ? .gpt35Turbo : .gpt4
          }

        try await chatProvider.startStreamedChat(parameters: .init(
          messages: [.init(role: .user, content: .text(prompt))],
          model: model), prompt: prompt)
      }
    } label: {
      Image(systemName: "paperplane")
    }
    .buttonStyle(.bordered)
  }

  @State private var chatProvider: ChatFluidConversationProvider
  @State private var isLoading = false
  @State private var prompt = ""
  @State private var selectedModel = GPTModel.gpt3dot5
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStructureOutputTool/ChatStructureOutputToolDemoView.swift
================================================
//
//  ChatStructureOutputToolDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 8/11/24.
//

import Foundation
import SwiftOpenAI
import SwiftUI

struct ChatStructureOutputToolDemoView: View {
  init(service: OpenAIService, customModel: String? = nil) {
    self.customModel = customModel
    chatProvider = .init(service: service, customModel: customModel)
  }

  let customModel: String?

  var body: some View {
    ScrollViewReader { proxy in
      VStack {
        List(chatProvider.chatDisplayMessages) { message in
          ChatMessageView(message: message)
            .listRowSeparator(.hidden)
        }
        .listStyle(.plain)
        .onChange(of: chatProvider.chatDisplayMessages.last?.content) {
          let lastMessage = chatProvider.chatDisplayMessages.last
          if let id = lastMessage?.id {
            proxy.scrollTo(id, anchor: .bottom)
          }
        }
        textArea
      }
    }
  }

  var textArea: some View {
    HStack(spacing: 0) {
      VStack(alignment: .leading, spacing: 0) {
        textField
          .padding(.vertical, Sizes.spacingExtraSmall)
          .padding(.horizontal, Sizes.spacingSmall)
      }
      .padding(.vertical, Sizes.spacingExtraSmall)
      .padding(.horizontal, Sizes.spacingExtraSmall)
      .background(
        RoundedRectangle(cornerRadius: 20)
          .stroke(.gray, lineWidth: 1))
      .padding(.horizontal, Sizes.spacingMedium)
      textAreSendButton
    }
    .padding(.horizontal)
    .disabled(isLoading)
  }

  var textField: some View {
    TextField(
      "How can I help you today?",
      text: $prompt,
      axis: .vertical)
  }

  var textAreSendButton: some View {
    Button {
      Task {
        /// Loading UI
        isLoading = true
        defer { isLoading = false }
        // Clears text field.
        let userPrompt = prompt
        prompt = ""
        try await chatProvider.startChat(prompt: userPrompt)
      }
    } label: {
      Image(systemName: "paperplane")
    }
    .buttonStyle(.bordered)
    .tint(ThemeColor.tintColor)
    .disabled(prompt.isEmpty)
  }

  @State private var chatProvider: ChatStructuredOutputToolProvider
  @State private var isLoading = false
  @State private var prompt = ""
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStructureOutputTool/ChatStructuredOutputToolProvider.swift
================================================
//
//  ChatStructuredOutputToolProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 8/11/24.
//

import Foundation
import SwiftOpenAI
import SwiftUI

// MARK: - StructuredToolCall

enum StructuredToolCall: String, CaseIterable {
  case structureUI = "structured_ui"

  var functionTool: ChatCompletionParameters.Tool {
    switch self {
    case .structureUI:
      .init(
        function: .init(
          name: rawValue,
          strict: true,
          description: "Dynamically generated UI",
          parameters: structureUISchema))
    }
  }

  var structureUISchema: JSONSchema {
    JSONSchema(
      type: .object,
      properties: [
        "type": JSONSchema(
          type: .string,
          description: "The type of the UI component",
          additionalProperties: false,
          enum: ["div", "button", "header", "section", "field", "form"]),
        "label": JSONSchema(
          type: .string,
          description: "The label of the UI component, used for buttons or form fields",
          additionalProperties: false),
        "children": JSONSchema(
          type: .array,
          description: "Nested UI components",
          items: JSONSchema(ref: "#"),
          additionalProperties: false),
        "attributes": JSONSchema(
          type: .array,
          description: "Arbitrary attributes for the UI component, suitable for any element",
          items: JSONSchema(
            type: .object,
            properties: [
              "name": JSONSchema(
                type: .string,
                description: "The name of the attribute, for example onClick or className",
                additionalProperties: false),
              "value": JSONSchema(
                type: .string,
                description: "The value of the attribute",
                additionalProperties: false),
            ],
            required: ["name", "value"],
            additionalProperties: false),
          additionalProperties: false),
      ],
      required: ["type", "label", "children", "attributes"],
      additionalProperties: false)
  }
}

// MARK: - ChatStructuredOutputToolProvider

@Observable
final class ChatStructuredOutputToolProvider {
  init(service: OpenAIService, customModel: String? = nil) {
    self.service = service
    self.customModel = customModel
  }

  // MARK: - Init

  let customModel: String?

  var chatDisplayMessages = [ChatMessageDisplayModel]()
  let systemMessage = ChatCompletionParameters.Message(role: .system, content: .text("You are a math tutor"))

  func startChat(
    prompt: String)
    async throws
  {
    await startNewUserDisplayMessage(prompt)
    await startNewAssistantEmptyDisplayMessage()

    let userMessage = createUserMessage(prompt)
    chatMessageParameters.append(userMessage)

    let model: Model =
      if let customModel, !customModel.isEmpty {
        .custom(customModel)
      } else {
        .gpt4o20240806
      }

    let parameters = ChatCompletionParameters(
      messages: [systemMessage] + chatMessageParameters,
      model: model,
      tools: StructuredToolCall.allCases.map(\.functionTool))

    do {
      let chat = try await service.startChat(parameters: parameters)
      guard let assistantMessage = chat.choices?.first?.message else { return }
      let content = assistantMessage.content ?? ""
      await updateLastAssistantMessage(.init(content: .content(.init(text: content)), origin: .received(.gpt)))
      if let toolCalls = assistantMessage.toolCalls {
        availableFunctions = [.structureUI: getStructureOutput(arguments:)]
        // Append the `assistantMessage` into the `chatMessageParameters` to extend the conversation.
        let parameterAssistantMessage = ChatCompletionParameters.Message(
          role: .assistant,
          content: .text(content), toolCalls: assistantMessage.toolCalls)

        chatMessageParameters.append(parameterAssistantMessage)

        /// # Step 4: send the info for each function call and function response to the model
        for toolCall in toolCalls {
          let name = toolCall.function.name
          let id = toolCall.id
          let functionToCall = availableFunctions[StructuredToolCall(rawValue: name!)!]!
          let arguments = toolCall.function.arguments
          let content = functionToCall(arguments)
          let toolMessage = ChatCompletionParameters.Message(
            role: .tool,
            content: .text(content),
            name: name,
            toolCallID: id)
          chatMessageParameters.append(toolMessage)
        }

        /// # get a new response from the model where it can see the function response
        await continueChat()
      }

    } catch let error as APIError {
      // If an error occurs, update the UI to display the error message.
      await updateLastAssistantMessage(.init(content: .error("\(error.displayDescription)"), origin: .received(.gpt)))
    }
  }

  // MARK: Tool functions

  func getStructureOutput(arguments: String) -> String {
    arguments
  }

  private let service: OpenAIService
  private var lastDisplayedMessageID: UUID?
  private var chatMessageParameters = [ChatCompletionParameters.Message]()
  private var availableFunctions = [StructuredToolCall: (String) -> String]()
}

// MARK: UI related

extension ChatStructuredOutputToolProvider {
  func createUserMessage(
    _ prompt: String)
    -> ChatCompletionParameters.Message
  {
    .init(role: .user, content: .text(prompt))
  }

  func continueChat() async {
    let model: Model =
      if let customModel, !customModel.isEmpty {
        .custom(customModel)
      } else {
        .gpt4o
      }

    let paramsForChat = ChatCompletionParameters(
      messages: chatMessageParameters,
      model: model)
    do {
      let chat = try await service.startChat(parameters: paramsForChat)
      guard let assistantMessage = chat.choices?.first?.message else { return }
      await updateLastAssistantMessage(.init(content: .content(.init(text: assistantMessage.content)), origin: .received(.gpt)))
    } catch {
      // If an error occurs, update the UI to display the error message.
      await updateLastAssistantMessage(.init(content: .error("\(error)"), origin: .received(.gpt)))
    }
  }

  // MARK: - Private Methods

  @MainActor
  private func startNewUserDisplayMessage(_ prompt: String) {
    let startingMessage = ChatMessageDisplayModel(
      content: .content(.init(text: prompt)),
      origin: .sent)
    addMessage(startingMessage)
  }

  @MainActor
  private func startNewAssistantEmptyDisplayMessage() {
    let newMessage = ChatMessageDisplayModel(
      content: .content(.init(text: "")),
      origin: .received(.gpt))
    addMessage(newMessage)
  }

  @MainActor
  private func updateLastAssistantMessage(
    _ message: ChatMessageDisplayModel)
  {
    guard let id = lastDisplayedMessageID, let index = chatDisplayMessages.firstIndex(where: { $0.id == id }) else { return }

    var lastMessage = chatDisplayMessages[index]

    switch message.content {
    case .content(let newMedia):
      switch lastMessage.content {
      case .content(let lastMedia):
        var updatedMedia = lastMedia
        if
          let newText = newMedia.text,
          var lastMediaText = lastMedia.text
        {
          lastMediaText += newText
          updatedMedia.text = lastMediaText
        } else {
          updatedMedia.text = ""
        }
        if let urls = newMedia.urls {
          updatedMedia.urls = urls
        }
        lastMessage.content = .content(updatedMedia)

      case .error:
        break
      }

    case .error:
      lastMessage.content = message.content
    }

    chatDisplayMessages[index] = ChatMessageDisplayModel(
      id: id,
      content: lastMessage.content,
      origin: message.origin)
  }

  @MainActor
  private func addMessage(_ message: ChatMessageDisplayModel) {
    let newMessageId = message.id
    lastDisplayedMessageID = newMessageId
    withAnimation {
      chatDisplayMessages.append(message)
    }
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStructuredOutputs/ChatStructuredOutputDemoView.swift
================================================
//
//  ChatStructuredOutputDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 8/10/24.
//

import Foundation
import SwiftOpenAI
import SwiftUI

/// Schema demo
///
/// https://openai.com/index/introducing-structured-outputs-in-the-api/
///
/// "response_format": {
///   "type": "json_schema",
///   "json_schema": {
///     "name": "math_response",
///     "strict": true,
///     "schema": {
///       "type": "object",
///       "properties": {
///         "steps": {
///           "type": "array",
///           "items": {
///             "type": "object",
///             "properties": {
///               "explanation": { "type": "string" },
///               "output": { "type": "string" }
///             },
///             "required": ["explanation", "output"],
///             "additionalProperties": false
///           }
///         },
///         "final_answer": { "type": "string" }
///       },
///       "required": ["steps", "final_answer"],
///       "additionalProperties": false
///     }
///   }
/// }

// Steps to define the above Schema:

// 1: Define the Step schema object

let stepSchema = JSONSchema(
  type: .object,
  properties: [
    "explanation": JSONSchema(type: .string),
    "output": JSONSchema(
      type: .string),
  ],
  required: ["explanation", "output"],
  additionalProperties: false)

// 2. Define the steps Array schema.

let stepsArraySchema = JSONSchema(type: .array, items: stepSchema)

/// 3. Define the final Answer schema.
let finalAnswerSchema = JSONSchema(type: .string)

/// 4. Define the response format JSON schema.
let responseFormatSchema = JSONSchemaResponseFormat(
  name: "math_response",
  strict: true,
  schema: JSONSchema(
    type: .object,
    properties: [
      "steps": stepsArraySchema,
      "final_answer": finalAnswerSchema,
    ],
    required: ["steps", "final_answer"],
    additionalProperties: false))

// MARK: - ChatStructuredOutputDemoView

// We can also handle optional values.
//
// let weatherSchema = JSONSchemaResponseFormat(
//   name: "get_weather",
//   description: "Fetches the weather in the given location",
//   strict: true,
//   schema: JSONSchema(
//     type: .object,
//     properties: [
//       "location": JSONSchema(
//         type: .string,
//         description: "The location to get the weather for"),
//       "unit": JSONSchema(
//         type: .optional(.string),
//         description: "The unit to return the temperature in",
//         enum: ["F", "C"]),
//       "thinking": .init(
//         type: .object,
//         description: "your thinking",
//         properties: ["step": .init(type: .string)],
//         required: ["step"]),
//     ],
//     required: ["location", "unit", "thinking"]))

struct ChatStructuredOutputDemoView: View {
  init(service: OpenAIService, customModel: String? = nil) {
    self.customModel = customModel
    _chatProvider = State(initialValue: ChatStructuredOutputProvider(service: service, customModel: customModel))
  }

  enum ChatConfig {
    case chatCompletion
    case chatCompeltionStream
  }

  let customModel: String?

  var body: some View {
    ScrollView {
      VStack {
        picker
        textArea
        Text(chatProvider.errorMessage)
          .foregroundColor(.red)
        switch selectedSegment {
        case .chatCompeltionStream:
          streamedChatResultView
        case .chatCompletion:
          chatCompletionResultView
        }
      }
    }
    .overlay(
      Group {
        if isLoading {
          ProgressView()
        } else {
          EmptyView()
        }
      })
  }

  var picker: some View {
    Picker("Options", selection: $selectedSegment) {
      Text("Chat Completion").tag(ChatConfig.chatCompletion)
      Text("Chat Completion stream").tag(ChatConfig.chatCompeltionStream)
    }
    .pickerStyle(SegmentedPickerStyle())
    .padding()
  }

  var textArea: some View {
    HStack(spacing: 4) {
      TextField("Enter prompt", text: $prompt, axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .padding()
      Button {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is reset when the task completes

          let content = ChatCompletionParameters.Message.ContentType.text(prompt)
          prompt = ""
          let parameters = ChatCompletionParameters(
            messages: [
              .init(role: .system, content: .text("You are a helpful math tutor.")),
              .init(
                role: .user,
                content: content),
            ],
            model: .gpt4o20240806,
            responseFormat: .jsonSchema(responseFormatSchema))
          switch selectedSegment {
          case .chatCompletion:
            try await chatProvider.startChat(parameters: parameters)
          case .chatCompeltionStream:
            try await chatProvider.startStreamedChat(parameters: parameters)
          }
        }
      } label: {
        Image(systemName: "paperplane")
      }
      .buttonStyle(.bordered)
    }
    .padding()
  }

  /// stream = `false`
  var chatCompletionResultView: some View {
    ForEach(Array(chatProvider.messages.enumerated()), id: \.offset) { _, val in
      VStack(spacing: 0) {
        Text("\(val)")
      }
    }
  }

  /// stream = `true`
  var streamedChatResultView: some View {
    VStack {
      Button("Cancel stream") {
        chatProvider.cancelStream()
      }
      Text(chatProvider.message)
    }
  }

  @State private var chatProvider: ChatStructuredOutputProvider
  @State private var isLoading = false
  @State private var prompt = ""
  @State private var selectedSegment = ChatConfig.chatCompeltionStream
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStructuredOutputs/ChatStructuredOutputProvider.swift
================================================
//
//  ChatStructuredOutputProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 8/10/24.
//

import Foundation
import SwiftOpenAI

// MARK: - ChatStructuredOutputProvider

@Observable
final class ChatStructuredOutputProvider {
  init(service: OpenAIService, customModel: String? = nil) {
    self.service = service
    self.customModel = customModel
  }

  // MARK: - Initializer

  let customModel: String?

  var message = ""
  var messages = [String]()
  var errorMessage = ""

  // MARK: - Public Methods

  func startChat(
    parameters: ChatCompletionParameters)
    async throws
  {
    do {
      let choices = try await service.startChat(parameters: parameters).choices ?? []
      messages = choices.compactMap(\.message?.content).map { $0.asJsonFormatted() }
      assert(messages.count == 1)
      errorMessage = choices.first?.message?.refusal ?? ""
    } catch APIError.responseUnsuccessful(let description, let statusCode) {
      self.errorMessage = "Network error with status code: \(statusCode) and description: \(description)"
    } catch {
      errorMessage = error.localizedDescription
    }
  }

  func startStreamedChat(
    parameters: ChatCompletionParameters)
    async throws
  {
    streamTask = Task {
      do {
        let stream = try await service.startStreamedChat(parameters: parameters)
        for try await result in stream {
          let firstChoiceDelta = result.choices?.first?.delta
          let content = firstChoiceDelta?.refusal ?? firstChoiceDelta?.content ?? ""
          self.message += content
          if result.choices?.first?.finishReason != nil {
            self.message = self.message.asJsonFormatted()
          }
        }
      } catch APIError.responseUnsuccessful(let description, let statusCode) {
        self.errorMessage = "Network error with status code: \(statusCode) and description: \(description)"
      } catch {
        self.errorMessage = error.localizedDescription
      }
    }
  }

  func cancelStream() {
    streamTask?.cancel()
  }

  private let service: OpenAIService
  private var streamTask: Task<Void, Never>?
}

/// Helper that pretty-prints a JSON string for display.
extension String {
  func asJsonFormatted() -> String {
    guard let data = data(using: .utf8) else { return self }
    do {
      // Parse JSON string to Any object
      if let jsonObject = try JSONSerialization.jsonObject(with: data, options: []) as? [String: Any] {
        // Convert back to data with pretty-printing
        let prettyPrintedData = try JSONSerialization.data(withJSONObject: jsonObject, options: [.prettyPrinted, .sortedKeys])

        // Convert formatted data back to string
        return String(data: prettyPrintedData, encoding: .utf8) ?? self
      }
    } catch {
      print("Error formatting JSON: \(error)")
    }
    return self
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/EmbeddingsDemo/EmbeddingsDemoView.swift
================================================
//
//  EmbeddingsDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/23/23.
//

import SwiftOpenAI
import SwiftUI

struct EmbeddingsDemoView: View {
  init(service: OpenAIService) {
    _embeddingsProvider = State(initialValue: EmbeddingsProvider(service: service))
  }

  var textArea: some View {
    HStack(spacing: 4) {
      TextField("Enter prompt", text: $prompt, axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .padding()
      Button {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is reset when the task completes
          do {
            try await embeddingsProvider.createEmbeddings(parameters: .init(
              input: prompt,
              model: .textEmbedding3Large,
              encodingFormat: nil,
              dimensions: nil))
          } catch {
            errorMessage = "\(error)"
          }
        }
      } label: {
        Image(systemName: "paperplane")
      }
      .buttonStyle(.bordered)
    }
    .padding()
  }

  var list: some View {
    List {
      ForEach(Array(embeddingsProvider.embeddings.enumerated()), id: \.offset) { _, embeddingObject in
        Section(header: Text("Section \(embeddingObject.index) \(embeddingObject.object)")) {
          ForEach(embeddingObject.embedding, id: \.self) { embedding in
            Text("Embedding Value \(embedding)")
          }
        }
      }
    }
  }

  var body: some View {
    VStack {
      textArea
      if !errorMessage.isEmpty {
        Text("Error \(errorMessage)")
          .bold()
      }
      list
    }
    .overlay(
      Group {
        if isLoading {
          ProgressView()
        } else {
          EmptyView()
        }
      })
  }

  @State private var embeddingsProvider: EmbeddingsProvider
  @State private var isLoading = false
  @State private var prompt = ""
  @State private var errorMessage = ""
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/EmbeddingsDemo/Embeddingsprovider.swift
================================================
//
//  Embeddingsprovider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/23/23.
//

import SwiftOpenAI
import SwiftUI

@Observable
class EmbeddingsProvider {
  init(service: OpenAIService) {
    self.service = service
  }

  var embeddings = [EmbeddingObject]()

  func createEmbeddings(
    parameters: EmbeddingParameter)
    async throws
  {
    embeddings = try await service.createEmbeddings(parameters: parameters).data
  }

  private let service: OpenAIService
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Files/AttachmentView.swift
================================================
//
//  AttachmentView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 5/29/24.
//

import SwiftUI

struct AttachmentView: View {
  let fileName: String
  @Binding var actionTrigger: Bool

  let isLoading: Bool

  var body: some View {
    HStack(spacing: Sizes.spacingExtraSmall) {
      HStack {
        if isLoading == true {
          ProgressView()
            .frame(width: 10, height: 10)
            .padding(.horizontal, Sizes.spacingExtraSmall)
        } else {
          Image(systemName: "doc")
            .resizable()
            .aspectRatio(contentMode: .fit)
            .frame(width: 10)
            .foregroundColor(.secondary)
        }
        Text(fileName)
          .font(.caption2)
      }
      Button {
        actionTrigger = true

      } label: {
        Image(systemName: "xmark.circle.fill")
      }
      .disabled(isLoading)
    }
    .padding(.leading, Sizes.spacingMedium)
    .background(
      RoundedRectangle(cornerRadius: 8)
        .stroke(.gray.opacity(0.5), lineWidth: 0.5))
  }
}

#Preview {
  AttachmentView(fileName: "Mydocument.pdf", actionTrigger: .constant(true), isLoading: true)
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Files/FileAttachmentView.swift
================================================
//
//  FileAttachmentView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 5/29/24.
//

import SwiftOpenAI
import SwiftUI

// MARK: - FileAttachmentView

struct FileAttachmentView: View {
  init(
    service: OpenAIService,
    action: FilePickerAction,
    fileUploadedCompletion: @escaping (_ file: FileObject) -> Void,
    fileDeletedCompletion: @escaping (_ parameters: FilePickerAction, _ id: String) -> Void)
  {
    fileProvider = FilesPickerProvider(service: service)
    self.action = action
    self.fileUploadedCompletion = fileUploadedCompletion
    self.fileDeletedCompletion = fileDeletedCompletion
  }

  var body: some View {
    Group {
      switch action {
      case .request(let parameters):
        newUploadedFileView(parameters: parameters)
      case .retrieveAndDisplay(let id):
        previousUploadedFileView(id: id)
      }
    }
    .onChange(of: deleted) { oldValue, newValue in
      if oldValue != newValue, newValue {
        Task {
          if let fileObject {
            fileDeleteStatus = try await fileProvider.deleteFileWith(id: fileObject.id)
          }
        }
      }
    }
    .onChange(of: fileDeleteStatus) { oldValue, newValue in
      if oldValue != newValue, let newValue, newValue.deleted {
        fileDeletedCompletion(action, newValue.id)
      }
    }
  }

  func newUploadedFileView(
    parameters: FileParameters)
    -> some View
  {
    AttachmentView(
      fileName: (fileObject?.filename ?? parameters.fileName) ?? "",
      actionTrigger: $deleted,
      isLoading: fileObject == nil || deleted)
      .disabled(fileObject == nil)
      .opacity(fileObject == nil ? 0.3 : 1)
      .onFirstAppear {
        Task {
          fileObject = try await fileProvider.uploadFile(parameters: parameters)
        }
      }
      .onChange(of: fileObject) { oldValue, newValue in
        if oldValue != newValue, let newValue {
          fileUploadedCompletion(newValue)
        }
      }
  }

  func previousUploadedFileView(
    id: String)
    -> some View
  {
    AttachmentView(fileName: fileObject?.filename ?? "Document", actionTrigger: $deleted, isLoading: fileObject == nil || deleted)
      .onFirstAppear {
        Task {
          fileObject = try await fileProvider.retrieveFileWith(id: id)
        }
      }
  }

  @State private var fileObject: FileObject?
  @State private var fileDeleteStatus: DeletionStatus?
  @State private var deleted = false

  private let fileProvider: FilesPickerProvider
  private let fileUploadedCompletion: (_ file: FileObject) -> Void
  private let fileDeletedCompletion: (_ action: FilePickerAction, _ id: String) -> Void
  private let action: FilePickerAction
}

// MARK: - OnFirstAppear

private struct OnFirstAppear: ViewModifier {
  let perform: () -> Void

  @State private var firstTime = true

  func body(content: Content) -> some View {
    content.onAppear {
      if firstTime {
        firstTime = false
        perform()
      }
    }
  }
}

extension View {
  func onFirstAppear(perform: @escaping () -> Void) -> some View {
    modifier(OnFirstAppear(perform: perform))
  }
}

// MARK: - DeletionStatus + @retroactive Equatable

extension DeletionStatus: @retroactive Equatable {
  public static func ==(lhs: DeletionStatus, rhs: DeletionStatus) -> Bool {
    lhs.id == rhs.id
  }
}
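The `OnFirstAppear` modifier above is what lets `FileAttachmentView` start its one-shot upload or retrieval without re-firing when the view re-appears (a plain `.onAppear` would run again after navigating back). A minimal standalone sketch of the same pattern; the `DocumentBadge` view here is hypothetical and not part of this repository:

```swift
import SwiftUI

// Hypothetical view using the onFirstAppear helper defined above.
// The async work is kicked off exactly once, even if the view
// disappears and re-appears later in a navigation stack.
struct DocumentBadge: View {
  @State private var title = "Loading…"

  var body: some View {
    Text(title)
      .onFirstAppear {
        Task {
          // Simulated one-time async load.
          try? await Task.sleep(nanoseconds: 200_000_000)
          title = "Ready"
        }
      }
  }
}
```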


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Files/FilesPicker.swift
================================================
//
//  FilesPicker.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 5/29/24.
//

import SwiftOpenAI
import SwiftUI

// MARK: - FileObject + @retroactive Equatable

extension FileObject: @retroactive Equatable {
  public static func ==(lhs: FileObject, rhs: FileObject) -> Bool {
    lhs.id == rhs.id
  }
}

// MARK: - FileParameters + @retroactive Equatable, @retroactive Identifiable

extension FileParameters: @retroactive Equatable, @retroactive Identifiable {
  public var id: String {
    fileName ?? ""
  }

  public static func ==(lhs: FileParameters, rhs: FileParameters) -> Bool {
    lhs.file == rhs.file &&
      lhs.fileName == rhs.fileName &&
      lhs.purpose == rhs.purpose
  }
}

// MARK: - FilePickerAction

enum FilePickerAction: Identifiable, Equatable {
  case request(FileParameters)
  case retrieveAndDisplay(id: String)

  var id: String {
    switch self {
    case .request(let fileParameters): fileParameters.id
    case .retrieveAndDisplay(let id): id
    }
  }
}

// MARK: - FilesPicker

struct FilesPicker: View {
  init(
    service: OpenAIService,
    sectionTitle: String? = nil,
    actionTitle: String,
    fileIDS: Binding<[String]>,
    actions: Binding<[FilePickerAction]>)
  {
    self.service = service
    self.sectionTitle = sectionTitle
    self.actionTitle = actionTitle
    _fileIDS = fileIDS
    _actions = actions
  }

  var body: some View {
    VStack(alignment: .leading) {
      Group {
        if let sectionTitle {
          VStack {
            Text(sectionTitle)
            Button {
              presentImporter = true
            } label: {
              Text(actionTitle)
            }
          }
        } else {
          Button {
            presentImporter = true
          } label: {
            Text(actionTitle)
          }
        }
      }
      .fileImporter(
        isPresented: $presentImporter,
        allowedContentTypes: [.pdf, .text, .mp3, .mpeg],
        allowsMultipleSelection: true)
      { result in
        switch result {
        case .success(let files):
          for file in files {
            // gain access to the directory
            let gotAccess = file.startAccessingSecurityScopedResource()
            guard gotAccess else { continue }
            if let data = try? Data(contentsOf: file.absoluteURL) {
              let parameter = FileParameters(fileName: file.lastPathComponent, file: data, purpose: "assistants")
              actions.append(.request(parameter))
            }
            file.stopAccessingSecurityScopedResource()
          }

        case .failure(let error):
          print(error)
        }
      }
      ForEach(actions, id: \.id) { action in
        FileAttachmentView(
          service: service,
          action: action)
        { fileResponse in
          fileIDS.append(fileResponse.id)
        } fileDeletedCompletion: { actionToDelete, deletedFileID in
          // Remove file IDs from the network request.
          fileIDS.removeAll(where: { id in
            id == deletedFileID
          })
          // Update the UI.
          actions.removeAll { action in
            actionToDelete.id == action.id
          }
        }
      }
    }
    .frame(maxWidth: .infinity, alignment: .leading)
  }

  @State private var presentImporter = false
  @Binding private var actions: [FilePickerAction]
  @Binding private var fileIDS: [String]

  private let service: OpenAIService
  private let sectionTitle: String?
  private let actionTitle: String
}

#Preview {
  FilesPicker(
    service: OpenAIServiceFactory.service(apiKey: ""),
    sectionTitle: "Knowledge",
    actionTitle: "Upload File",
    fileIDS: .constant(["s"]),
    actions: .constant(
      [
        .retrieveAndDisplay(id: "id1"),
        .retrieveAndDisplay(id: "id2"),
        .retrieveAndDisplay(id: "id3"),
        .retrieveAndDisplay(id: "id4"),
        .retrieveAndDisplay(id: "id5"),
        .retrieveAndDisplay(id: "id6"),
      ]))
      .padding()
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Files/FilesPickerProvider.swift
================================================
//
//  FilesPickerProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 5/29/24.
//

import SwiftOpenAI
import SwiftUI

final class FilesPickerProvider {
  init(service: OpenAIService) {
    self.service = service
  }

  var files = [FileObject]()
  var uploadedFile: FileObject?
  var deletedStatus: DeletionStatus?
  var retrievedFile: FileObject?
  var fileContent = [[String: Any]]()

  func listFiles() async throws {
    files = try await service.listFiles().data
  }

  func uploadFile(
    parameters: FileParameters)
    async throws -> FileObject?
  {
    try await service.uploadFile(parameters: parameters)
  }

  func deleteFileWith(
    id: String)
    async throws -> DeletionStatus?
  {
    try await service.deleteFileWith(id: id)
  }

  func retrieveFileWith(
    id: String)
    async throws -> FileObject?
  {
    try await service.retrieveFileWith(id: id)
  }

  func retrieveContentForFileWith(
    id: String)
    async throws
  {
    fileContent = try await service.retrieveContentForFileWith(id: id)
  }

  private let service: OpenAIService
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/FilesDemo/FilesDemoView.swift
================================================
//
//  FilesDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/23/23.
//

import SwiftOpenAI
import SwiftUI

// MARK: - FilesDemoView

struct FilesDemoView: View {
  init(service: OpenAIService) {
    _filesProvider = State(initialValue: FilesProvider(service: service))
  }

  enum Config {
    case list
    case moreOptions
  }

  var body: some View {
    VStack {
      picker
      if !errorMessage.isEmpty {
        Text("Error \(errorMessage)")
          .bold()
      }
      switch selectedSegment {
      case .list:
        listView
      case .moreOptions:
        moreOptionsView
      }
    }
    .overlay(
      Group {
        if isLoading {
          ProgressView()
        } else {
          EmptyView()
        }
      })
  }

  var picker: some View {
    Picker("Options", selection: $selectedSegment) {
      Text("Shows List").tag(Config.list)
      Text("Show More options").tag(Config.moreOptions)
    }
    .pickerStyle(SegmentedPickerStyle())
    .padding()
  }

  var moreOptionsView: some View {
    ScrollView {
      VStack {
        uploadFileButton
        Text("This button will load a file that has been added to this app bundle.")
        if let uploadedFile = filesProvider.uploadedFile {
          FileObjectView(file: uploadedFile)
        }
      }
    }
  }

  var listView: some View {
    VStack(spacing: 0) {
      listFilesButton
      list
    }
  }

  var listFilesButton: some View {
    Button("List Files") {
      Task {
        isLoading = true
        defer { isLoading = false } // ensure isLoading is reset when the task completes
        do {
          try await filesProvider.listFiles()
        } catch {
          errorMessage = "\(error)"
        }
      }
    }
    .buttonStyle(.borderedProminent)
  }

  var uploadFileButton: some View {
    Button("Upload File") {
      Task {
        isLoading = true
        defer { isLoading = false } // ensure isLoading is reset when the task completes
        do {
          let fileData = try contentLoader.loadBundledContent(fromFileNamed: "WorldCupData", ext: "jsonl")
          try await filesProvider.uploadFile(parameters: .init(fileName: "WorldCupData", file: fileData, purpose: "fine-tune"))
        } catch {
          errorMessage = "\(error)"
        }
      }
    }
    .buttonStyle(.borderedProminent)
  }

  var list: some View {
    List {
      ForEach(Array(filesProvider.files.enumerated()), id: \.offset) { _, file in
        FileObjectView(file: file)
      }
    }
  }

  @State private var filesProvider: FilesProvider
  @State private var isLoading = false
  @State private var errorMessage = ""
  @State private var selectedSegment = Config.list

  private let contentLoader = ContentLoader()
}

// MARK: - FileObjectView

struct FileObjectView: View {
  init(file: FileObject) {
    self.file = file
  }

  var body: some View {
    VStack(alignment: .leading, spacing: 4) {
      Text("File name = \(file.filename)")
        .font(.title2)
      VStack(alignment: .leading, spacing: 2) {
        Text("ID = \(file.id)")
        Text("Created = \(file.createdAt)")
        Text("Object = \(file.object)")
        Text("Purpose = \(file.purpose)")
        Text("Status = \(file.status ?? "NO STATUS")")
        Text("Status Details = \(file.statusDetails ?? "NO DETAILS")")
      }
      .font(.callout)
    }
    .foregroundColor(.primary)
    .padding()
    .background(
      RoundedRectangle(cornerSize: .init(width: 20, height: 20))
        .foregroundColor(.mint))
  }

  private let file: FileObject
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/FilesDemo/FilesProvider.swift
================================================
//
//  FilesProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/23/23.
//

import SwiftOpenAI
import SwiftUI

@Observable
class FilesProvider {
  init(service: OpenAIService) {
    self.service = service
  }

  var files = [FileObject]()
  var uploadedFile: FileObject?
  var deletedStatus: DeletionStatus?
  var retrievedFile: FileObject?
  var fileContent = [[String: Any]]()

  func listFiles() async throws {
    files = try await service.listFiles().data
  }

  func uploadFile(
    parameters: FileParameters)
    async throws
  {
    uploadedFile = try await service.uploadFile(parameters: parameters)
  }

  func deleteFileWith(
    id: String)
    async throws
  {
    deletedStatus = try await service.deleteFileWith(id: id)
  }

  func retrieveFileWith(
    id: String)
    async throws
  {
    retrievedFile = try await service.retrieveFileWith(id: id)
  }

  func retrieveContentForFileWith(
    id: String)
    async throws
  {
    fileContent = try await service.retrieveContentForFileWith(id: id)
  }

  private let service: OpenAIService
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/FineTuningDemo/FineTuningJobDemoView.swift
================================================
//
//  FineTuningJobDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/23/23.
//

import SwiftOpenAI
import SwiftUI

// MARK: - FineTuningJobDemoView

struct FineTuningJobDemoView: View {
  init(service: OpenAIService) {
    _fineTuningJobProvider = State(initialValue: FineTuningJobProvider(service: service))
  }

  var body: some View {
    VStack {
      Button("List Fine tuning jobs") {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is reset when the task completes
          try await fineTuningJobProvider.listFineTuningJobs()
        }
      }
      .buttonStyle(.borderedProminent)
      List {
        ForEach(Array(fineTuningJobProvider.fineTunedJobs.enumerated()), id: \.offset) { _, job in
          FineTuningObjectView(job: job)
        }
      }
    }
    .overlay(
      Group {
        if isLoading {
          ProgressView()
        } else {
          EmptyView()
        }
      })
  }

  @State private var fineTuningJobProvider: FineTuningJobProvider
  @State private var isLoading = false
}

// MARK: - FineTuningObjectView

struct FineTuningObjectView: View {
  init(job: FineTuningJobObject) {
    self.job = job
  }

  var body: some View {
    VStack(alignment: .leading, spacing: 4) {
      Text("Fine Tuned Model - \(job.fineTunedModel ?? "NO MODEL")")
        .font(.title2)
      VStack(alignment: .leading, spacing: 2) {
        Text("Model = \(job.model)")
        Text("Object = \(job.object)")
        Text("ID = \(job.id)")
        Text("Created = \(job.createdAt)")
        Text("Organization ID = \(job.organizationId)")
        Text("Training file = \(job.trainingFile)")
        Text("Status = \(job.status)")
          .bold()
      }
      .font(.callout)
    }
    .foregroundColor(.primary)
    .padding()
    .background(
      RoundedRectangle(cornerSize: .init(width: 20, height: 20))
        .foregroundColor(.mint))
  }

  private let job: FineTuningJobObject
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/FineTuningDemo/FineTuningJobProvider.swift
================================================
//
//  FineTuningJobProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/23/23.
//

import SwiftOpenAI
import SwiftUI

@Observable
class FineTuningJobProvider {
  init(service: OpenAIService) {
    self.service = service
  }

  var createdFineTuningJob: FineTuningJobObject?
  var canceledFineTuningJob: FineTuningJobObject?
  var retrievedFineTuningJob: FineTuningJobObject?
  var fineTunedJobs = [FineTuningJobObject]()
  var fineTuningEventObjects = [FineTuningJobEventObject]()

  func createFineTuningJob(
    parameters: FineTuningJobParameters)
    async throws
  {
    createdFineTuningJob = try await service.createFineTuningJob(parameters: parameters)
  }

  func listFineTuningJobs()
    async throws
  {
    fineTunedJobs = try await service.listFineTuningJobs(after: nil, limit: nil).data
  }

  func retrieveFineTuningJob(
    id: String)
    async throws
  {
    retrievedFineTuningJob = try await service.retrieveFineTuningJob(id: id)
  }

  func cancelFineTuningJob(
    id: String)
    async throws
  {
    canceledFineTuningJob = try await service.cancelFineTuningJobWith(id: id)
  }

  func listFineTuningEventsForJobWith(
    id: String)
    async throws
  {
    fineTuningEventObjects = try await service.listFineTuningEventsForJobWith(id: id, after: nil, limit: nil).data
  }

  private let service: OpenAIService
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ImagesDemo/ImagesDemoView.swift
================================================
//
//  ImagesDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/24/23.
//

import SwiftOpenAI
import SwiftUI

struct ImagesDemoView: View {
  init(service: OpenAIService) {
    _imagesProvider = State(initialValue: ImagesProvider(service: service))
  }

  var body: some View {
    ScrollView {
      textArea
      if !errorMessage.isEmpty {
        Text("Error \(errorMessage)")
          .bold()
      }
      ForEach(Array(imagesProvider.images.enumerated()), id: \.offset) { _, url in
        AsyncImage(url: url, scale: 1) { image in
          image
            .resizable()
            .aspectRatio(contentMode: .fill)
            .clipped()
        } placeholder: {
          EmptyView()
        }
      }
    }
    .overlay(
      Group {
        if isLoading {
          ProgressView()
        } else {
          EmptyView()
        }
      })
  }

  var textArea: some View {
    HStack(spacing: 4) {
      TextField("Enter prompt", text: $prompt, axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .padding()
      Button {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is reset when the task completes
          do {
            try await imagesProvider.createImages(parameters: .init(prompt: prompt, model: .dalle3(.largeSquare)))
          } catch {
            errorMessage = "\(error)"
          }
        }
      } label: {
        Image(systemName: "paperplane")
      }
      .buttonStyle(.bordered)
    }
    .padding()
  }

  @State private var imagesProvider: ImagesProvider
  @State private var isLoading = false
  @State private var prompt = ""
  @State private var errorMessage = ""
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ImagesDemo/ImagesProvider.swift
================================================
//
//  ImagesProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/23/23.
//

import SwiftOpenAI
import SwiftUI

@Observable
class ImagesProvider {
  init(service: OpenAIService) {
    self.service = service
  }

  var images = [URL]()

  func createImages(
    parameters: ImageCreateParameters)
    async throws
  {
    let urls = try await service.legacyCreateImages(
      parameters: parameters).data.map(\.url)
    images = urls.compactMap(\.self)
  }

  func editImages(
    parameters: ImageEditParameters)
    async throws
  {
    let urls = try await service.legacyEditImage(
      parameters: parameters).data.map(\.url)
    images = urls.compactMap(\.self)
  }

  func createImageVariations(
    parameters: ImageVariationParameters)
    async throws
  {
    let urls = try await service.legacyCreateImageVariations(parameters: parameters).data.map(\.url)
    images = urls.compactMap(\.self)
  }

  private let service: OpenAIService
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/LocalChatDemo/LocalChatDemoView.swift
================================================
//
//  LocalChatDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 6/24/24.
//

import SwiftOpenAI
import SwiftUI

/// For more visit https://github.com/ollama/ollama/blob/main/docs/openai.md

/// Important:
/// Before using a model, pull it locally with `ollama pull`:

/// `ollama pull llama3`
/// Default model names:
/// For tooling that relies on default OpenAI model names such as gpt-3.5-turbo, use `ollama cp` to copy an existing model name to a temporary name:

/// `ollama cp llama3 gpt-3.5-turbo`
/// Afterwards, this new model name can be specified in the `model` field:

/// ```curl http://localhost:11434/v1/chat/completions \
///    -H "Content-Type: application/json" \
///    -d '{
///        "model": "gpt-3.5-turbo",
///        "messages": [
///           {
///               "role": "user",
///                "content": "Hello!"
///            }
///        ]
///    }'```
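The same setup can be exercised from Swift with the factory this example app already uses for local hosts; a sketch assuming Ollama is serving on its default port 11434 and `llama3` has already been pulled:

```swift
import SwiftOpenAI

// Sketch: point SwiftOpenAI at a local Ollama server (default port 11434).
let service = OpenAIServiceFactory.service(baseURL: "http://localhost:11434")

// Any locally pulled model can be addressed via `.custom`.
let parameters = ChatCompletionParameters(
  messages: [.init(role: .user, content: .text("Hello!"))],
  model: .custom("llama3"))
```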

struct LocalChatDemoView: View {
  init(service: OpenAIService, customModel: String? = nil) {
    self.customModel = customModel
    _chatProvider = State(initialValue: ChatProvider(service: service))
  }

  enum ChatConfig {
    case chatCompletion
    case chatCompeltionStream
  }

  let customModel: String?

  var body: some View {
    ScrollView {
      VStack {
        picker
        textArea
        Text(chatProvider.errorMessage)
          .foregroundColor(.red)
        switch selectedSegment {
        case .chatCompeltionStream:
          streamedChatResultView
        case .chatCompletion:
          chatCompletionResultView
        }
      }
    }
    .overlay(
      Group {
        if isLoading {
          ProgressView()
        } else {
          EmptyView()
        }
      })
  }

  var picker: some View {
    Picker("Options", selection: $selectedSegment) {
      Text("Chat Completion").tag(ChatConfig.chatCompletion)
      Text("Chat Completion stream").tag(ChatConfig.chatCompeltionStream)
    }
    .pickerStyle(SegmentedPickerStyle())
    .padding()
  }

  var textArea: some View {
    HStack(spacing: 4) {
      TextField("Enter prompt", text: $prompt, axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .padding()
      Button {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is reset when the task completes

          let content = ChatCompletionParameters.Message.ContentType.text(prompt)
          prompt = ""
          let model: Model =
            if let customModel, !customModel.isEmpty {
              .custom(customModel)
            } else {
              // Make sure you run `ollama pull llama3` in your terminal to download this model.
              .custom("llama3")
            }

          let parameters = ChatCompletionParameters(
            messages: [.init(
              role: .user,
              content: content)],
            model: model)
          switch selectedSegment {
          case .chatCompletion:
            try await chatProvider.startChat(parameters: parameters)
          case .chatCompeltionStream:
            try await chatProvider.startStreamedChat(parameters: parameters)
          }
        }
      } label: {
        Image(systemName: "paperplane")
      }
      .buttonStyle(.bordered)
    }
    .padding()
  }

  /// stream = `false`
  var chatCompletionResultView: some View {
    ForEach(Array(chatProvider.messages.enumerated()), id: \.offset) { _, val in
      VStack(spacing: 0) {
        Text("\(val)")
      }
    }
  }

  /// stream = `true`
  var streamedChatResultView: some View {
    VStack {
      Button("Cancel stream") {
        chatProvider.cancelStream()
      }
      Text(chatProvider.message)
    }
  }

  @State private var chatProvider: ChatProvider
  @State private var isLoading = false
  @State private var prompt = ""
  @State private var selectedSegment = ChatConfig.chatCompeltionStream
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/LocalHostEntryView.swift
================================================
//
//  LocalHostEntryView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 6/24/24.
//

import SwiftOpenAI
import SwiftUI

struct LocalHostEntryView: View {
  var body: some View {
    NavigationStack {
      VStack {
        Spacer()
        TextField("Enter URL", text: $url)
          .padding()
          .textFieldStyle(.roundedBorder)
        NavigationLink(destination: OptionsListView(
          openAIService: OpenAIServiceFactory.service(baseURL: url),
          options: [.localChat]))
        {
          Text("Continue")
            .padding()
            .padding(.horizontal, 48)
            .foregroundColor(.white)
            .background(
              Capsule()
                .foregroundColor(url.isEmpty ? .gray.opacity(0.2) : Color(red: 64 / 255, green: 195 / 255, blue: 125 / 255)))
        }
        .disabled(url.isEmpty)
        Spacer()
      }
      .padding()
      .navigationTitle("Enter URL")
    }
  }

  @State private var url = ""
}

#Preview {
  LocalHostEntryView()
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ModelsDemo/ModelsDemoView.swift
================================================
//
//  ModelsDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/24/23.
//

import SwiftOpenAI
import SwiftUI

struct ModelsDemoView: View {
  init(service: OpenAIService) {
    _modelsProvider = State(initialValue: ModelsProvider(service: service))
  }

  var body: some View {
    VStack {
      showModelsButton
      list
    }
  }

  var list: some View {
    List {
      ForEach(Array(modelsProvider.models.enumerated()), id: \.offset) { _, model in
        Text("\(model.id)")
      }
    }
  }

  var showModelsButton: some View {
    Button("List models") {
      Task {
        isLoading = true
        defer { isLoading = false } // ensure isLoading is reset when the task completes
        do {
          try await modelsProvider.listModels()
        } catch {
          errorMessage = "\(error)"
        }
      }
    }
    .buttonStyle(.bordered)
  }

  @State private var modelsProvider: ModelsProvider
  @State private var isLoading = false
  @State private var errorMessage = ""
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ModelsDemo/ModelsProvider.swift
================================================
//
//  ModelsProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/24/23.
//

import SwiftOpenAI
import SwiftUI

@Observable
class ModelsProvider {
  init(service: OpenAIService) {
    self.service = service
  }

  var models = [ModelObject]()
  var retrievedModel: ModelObject?
  var deletionStatus: DeletionStatus?

  func listModels() async throws {
    models = try await service.listModels().data
  }

  func retrieveModelWith(
    id: String)
    async throws
  {
    retrievedModel = try await service.retrieveModelWith(id: id)
  }

  func deleteFineTuneModelWith(
    id: String)
    async throws
  {
    deletionStatus = try await service.deleteFineTuneModelWith(id: id)
  }

  private let service: OpenAIService
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ModerationsDemo/ModerationDemoView.swift
================================================
//
//  ModerationDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/24/23.
//

import SwiftOpenAI
import SwiftUI

struct ModerationDemoView: View {
  init(service: OpenAIService) {
    _moderationProvider = State(initialValue: ModerationProvider(service: service))
  }

  var body: some View {
    VStack {
      textArea
      if moderationProvider.isFlagged {
        Text("That is not a nice thing to say.")
      }
      if !errorMessage.isEmpty {
        Text("Error \(errorMessage)")
          .bold()
      }
    }
    .overlay(
      Group {
        if isLoading {
          ProgressView()
        } else {
          EmptyView()
        }
      })
  }

  var textArea: some View {
    HStack(spacing: 4) {
      TextField("Enter prompt", text: $prompt, axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .padding()
      Button {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is reset when the task completes
          do {
            try await moderationProvider.createModerationFromText(parameters: .init(input: prompt))
          } catch {
            errorMessage = "\(error)"
          }
        }
      } label: {
        Image(systemName: "paperplane")
      }
      .buttonStyle(.bordered)
    }
    .padding()
  }

  @State private var moderationProvider: ModerationProvider
  @State private var isLoading = false
  @State private var prompt = ""
  @State private var errorMessage = ""
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ModerationsDemo/ModerationProvider.swift
================================================
//
//  ModerationProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/24/23.
//

import SwiftOpenAI
import SwiftUI

@Observable
class ModerationProvider {
  init(service: OpenAIService) {
    self.service = service
  }

  var isFlagged = false

  func createModerationFromText(
    parameters: ModerationParameter<String>)
    async throws
  {
    isFlagged = try await service.createModerationFromText(parameters: parameters).isFlagged
  }

  func createModerationFromTexts(
    parameters: ModerationParameter<[String]>)
    async throws
  {
    isFlagged = try await service.createModerationFromTexts(parameters: parameters).isFlagged
  }

  private let service: OpenAIService
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/OptionsListView.swift
================================================
//
//  OptionsListView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/19/23.
//

import SwiftOpenAI
import SwiftUI

struct OptionsListView: View {
  /// https://platform.openai.com/docs/api-reference
  enum APIOption: String, CaseIterable, Identifiable {
    case audio = "Audio"
    case chat = "Chat"
    case chatPredictedOutput = "Chat Predicted Output"
    case localChat = "Local Chat" // Ollama
    case vision = "Vision"
    case embeddings = "Embeddings"
    case fineTuning = "Fine Tuning"
    case files = "Files"
    case images = "Images"
    case models = "Models"
    case moderations = "Moderations"
    case chatHistoryConversation = "Chat History Conversation"
    case chatFunctionCall = "Chat Functions call"
    case chatFunctionsCallStream = "Chat Functions call (Stream)"
    case chatStructuredOutput = "Chat Structured Output"
    case chatStructuredOutputTool = "Chat Structured Output Tools"
    case configureAssistant = "Configure Assistant"
    case realTimeAPI = "Real time API"
    case responseStream = "Response Stream Demo"

    var id: String { rawValue }
  }

  var openAIService: OpenAIService

  var options: [APIOption]

  var body: some View {
    VStack {
      // Custom model input field
      VStack(alignment: .leading, spacing: 8) {
        Text("Custom Model (Optional)")
          .font(.caption)
          .foregroundColor(.secondary)
        TextField("e.g., grok-beta, claude-3-opus, etc.", text: $customModel)
          .textFieldStyle(.roundedBorder)
          .autocapitalization(.none)
          .disableAutocorrection(true)
      }
      .padding()
      List(options, id: \.self, selection: $selection) { option in
        Text(option.rawValue)
      }
    }
    .sheet(item: $selection) { selection in
      VStack {
        Text(selection.rawValue)
          .font(.largeTitle)
          .padding()
        switch selection {
        case .audio:
          AudioDemoView(service: openAIService)
        case .chat:
          ChatDemoView(service: openAIService, customModel: customModel)
        case .chatPredictedOutput:
          ChatPredictedOutputDemoView(service: openAIService, customModel: customModel)
        case .vision:
          ChatVisionDemoView(service: openAIService, customModel: customModel)
        case .embeddings:
          EmbeddingsDemoView(service: openAIService)
        case .fineTuning:
          FineTuningJobDemoView(service: openAIService)
        case .files:
          FilesDemoView(service: openAIService)
        case .images:
          ImagesDemoView(service: openAIService)
        case .localChat:
          LocalChatDemoView(service: openAIService, customModel: customModel)
        case .models:
          ModelsDemoView(service: openAIService)
        case .moderations:
          ModerationDemoView(service: openAIService)
        case .chatHistoryConversation:
          ChatStreamFluidConversationDemoView(service: openAIService, customModel: customModel)
        case .chatFunctionCall:
          ChatFunctionCallDemoView(service: openAIService)
        case .chatFunctionsCallStream:
          ChatFunctionsCalllStreamDemoView(service: openAIService, customModel: customModel)
        case .chatStructuredOutput:
          ChatStructuredOutputDemoView(service: openAIService, customModel: customModel)
        case .chatStructuredOutputTool:
          ChatStructureOutputToolDemoView(service: openAIService, customModel: customModel)
        case .configureAssistant:
          AssistantConfigurationDemoView(service: openAIService)
        case .realTimeAPI:
          Text("WIP")
        case .responseStream:
          ResponseStreamDemoView(service: openAIService)
        }
      }
    }
  }

  @State private var selection: APIOption? = nil
  @State private var customModel = ""
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/PredictedOutputsDemo/ChatPredictedOutputDemoView.swift
================================================
//
//  ChatPredictedOutputDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 1/3/25.
//

import Foundation
import SwiftOpenAI
import SwiftUI

// MARK: - ChatPredictedOutputDemoView

/// https://platform.openai.com/docs/guides/predicted-outputs
struct ChatPredictedOutputDemoView: View {
  init(service: OpenAIService, customModel: String? = nil) {
    self.customModel = customModel
    chatProvider = ChatProvider(service: service)
  }

  let customModel: String?

  var body: some View {
    ScrollView {
      VStack {
        textArea
        Text(chatProvider.errorMessage)
          .foregroundColor(.red)
        chatCompletionResultView
      }
    }
    .overlay(
      Group {
        if isLoading {
          ProgressView()
        } else {
          EmptyView()
        }
      })
  }

  var textArea: some View {
    HStack(spacing: 4) {
      TextField("Enter prompt", text: $prompt, axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .padding()
      Button {
        Task {
          isLoading = true
          defer { isLoading = false } // ensure isLoading is reset when the task completes

          let content = ChatCompletionParameters.Message.ContentType.text(prompt)
          prompt = ""
          let model: Model =
            if let customModel, !customModel.isEmpty {
              .custom(customModel)
            } else {
              .gpt4o
            }

          let parameters = ChatCompletionParameters(
            messages: [
              .init(role: .system, content: .text(systemMessage)),
              .init(role: .user, content: content),
              .init(role: .user, content: .text(predictedCode)),
            ], // Sending the predicted code as another user message.
            model: model,
            prediction: .init(content: .text(predictedCode)))
          try await chatProvider.startChat(parameters: parameters)
        }
      } label: {
        Image(systemName: "paperplane")
      }
      .buttonStyle(.bordered)
    }
    .padding()
  }

  /// stream = `false`
  var chatCompletionResultView: some View {
    ForEach(Array(chatProvider.messages.enumerated()), id: \.offset) { _, val in
      VStack(spacing: 0) {
        Text("\(val)")
      }
    }
  }

  @State private var chatProvider: ChatProvider
  @State private var isLoading = false
  @State private var prompt = ""
}

let systemMessage = """
  You are a code editor assistant. You only output code, without explanations, commentary, or additional text. You follow these rules:

  1. Respond with code only, never any text or explanations
  2. Use appropriate syntax highlighting/formatting 
  3. If the code needs to be modified/improved, output the complete updated code
  4. Do not include caveats, introductions, or commentary
  5. Do not ask questions or solicit feedback
  6. Do not explain what changes were made
  7. Assume the user knows what they want and will review the code themselves
  """

let predictedCode = """
  struct ChatPredictedOutputDemoView: View {

     @State private var chatProvider: ChatProvider
     @State private var isLoading = false
     @State private var prompt = ""

     init(service: OpenAIService) {
        chatProvider = ChatProvider(service: service)
     }

     var body: some View {
        ScrollView {
           VStack {
              textArea
              Text(chatProvider.errorMessage)
                 .foregroundColor(.red)
              streamedChatResultView
           }
        }
        .overlay(
           Group {
              if isLoading {
                 ProgressView()
              } else {
                 EmptyView()
              }
           }
        )
     }

     var textArea: some View {
        HStack(spacing: 4) {
           TextField("Enter prompt", text: $prompt, axis: .vertical)
              .textFieldStyle(.roundedBorder)
              .padding()
           Button {
              Task {
                 isLoading = true
                 defer { isLoading = false }  // ensure isLoading is reset when the task completes

                 let content: ChatCompletionParameters.Message.ContentType = .text(prompt)
                 prompt = ""
                 let parameters = ChatCompletionParameters(
                    messages: [.init(
                    role: .user,
                    content: content)],
                    model: .gpt4o)
              }
           } label: {
              Image(systemName: "paperplane")
           }
           .buttonStyle(.bordered)
        }
        .padding()
     }

     /// stream = `true`
     var streamedChatResultView: some View {
        VStack {
           Button("Cancel stream") {
              chatProvider.cancelStream()
           }
           Text(chatProvider.message)
        }
     }
  }
  """
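For reference, a minimal sketch of how the `prediction` parameter used in the demo above attaches to a non-streaming request. This is not part of the example app: the function name is hypothetical, and the call assumes the library's `startChat(parameters:)` API returning a `ChatCompletionObject` with `choices.first?.message.content`.

```swift
import SwiftOpenAI

/// Hypothetical helper: sends `existingCode` both as a user message and as
/// the predicted output, so the model can copy unchanged regions verbatim
/// and only regenerate the parts the prompt asks it to edit.
func runPredictedEdit(
  service: OpenAIService,
  prompt: String,
  existingCode: String) async throws
  -> String?
{
  let parameters = ChatCompletionParameters(
    messages: [
      .init(role: .user, content: .text(prompt)),
      .init(role: .user, content: .text(existingCode)),
    ],
    model: .gpt4o,
    prediction: .init(content: .text(existingCode)))
  // Non-streaming completion; the first choice carries the edited code.
  return try await service.startChat(parameters: parameters)
    .choices.first?.message.content
}
```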


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Preview Content/Preview Assets.xcassets/Contents.json
================================================
{
  "info" : {
    "author" : "xcode",
    "version" : 1
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Resources/WorldCupData.jsonl
================================================
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "Tell me about the World Cup winner in Spain 1982."}, {"role": "assistant", "content": "The winner of the 1982 FIFA World Cup held in Spain was Italy. They defeated West Germany 3-1 in the final match."}]}
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "What about Mexico 1986?"}, {"role": "assistant", "content": "The 1986 FIFA World Cup in Mexico was won by Argentina. They beat West Germany 3-2 in the final."}]}
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "Who won in Italy 1990?"}, {"role": "assistant", "content": "West Germany won the 1990 World Cup in Italy. They defeated Argentina 1-0 in the final."}]}
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "What about USA 1994?"}, {"role": "assistant", "content": "Brazil won the 1994 World Cup held in the USA. They defeated Italy in a penalty shootout after a 0-0 draw in the final match."}]}
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "Tell me about France 1998."}, {"role": "assistant", "content": "France won the 1998 World Cup held in their home country. They defeated Brazil 3-0 in the final."}]}
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "What happened in Korea-Japan 2002?"}, {"role": "assistant", "content": "Brazil won the 2002 World Cup held in South Korea and Japan. They defeated Germany 2-0 in the final match."}]}
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "Who won in Germany 2006?"}, {"role": "assistant", "content": "Italy won the 2006 World Cup held in Germany. They defeated France in a penalty shootout after a 1-1 draw in the final."}]}
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "Tell me about South Africa 2010."}, {"role": "assistant", "content": "Spain won the 2010 World Cup held in South Africa. They defeated the Netherlands 1-0 in the final."}]}
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "Who won in Brazil 2014?"}, {"role": "assistant", "content": "Germany won the 2014 World Cup held in Brazil. They defeated Argentina 1-0 in the final."}]}
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "What about Russia 2018?"}, {"role": "assistant", "content": "France won the 2018 World Cup held in Russia. They defeated Croatia 4-2 in the final match."}]}
{"messages": [{"role": "system", "content": "You are a helpful assistant specializing in World Cup history."}, {"role": "user", "content": "Tell me about Qatar 2022."}, {"role": "assistant", "content": "I don't know"}]}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ResponseAPIDemo/ResponseStreamDemoView.swift
================================================
//
//  ResponseStreamDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 6/7/25.
//

import SwiftOpenAI
import SwiftUI

// MARK: - ResponseStreamDemoView

struct ResponseStreamDemoView: View {
  init(service: OpenAIService) {
    _provider = State(initialValue: ResponseStreamProvider(service: service))
  }

  @Environment(\.colorScheme) var colorScheme

  var body: some View {
    VStack(spacing: 0) {
      // Header
      headerView

      // Messages
      ScrollViewReader { proxy in
        ScrollView {
          LazyVStack(spacing: 12) {
            ForEach(provider.messages) { message in
              MessageBubbleView(message: message)
                .id(message.id)
            }

            if provider.isStreaming {
              HStack {
                LoadingIndicatorView()
                  .frame(width: 30, height: 30)
                Spacer()
              }
              .padding(.horizontal)
            }
          }
          .padding()
        }
        .onChange(of: provider.messages.count) { _, _ in
          withAnimation {
            proxy.scrollTo(provider.messages.last?.id, anchor: .bottom)
          }
        }
      }

      // Error view
      if let error = provider.error {
        Text(error)
          .foregroundColor(.red)
          .font(.caption)
          .padding(.horizontal)
          .padding(.vertical, 8)
          .background(Color.red.opacity(0.1))
      }

      // Input area
      inputArea
    }
    .navigationTitle("Response Stream Demo")
    .navigationBarTitleDisplayMode(.inline)
    .toolbar {
      ToolbarItem(placement: .navigationBarTrailing) {
        Button("Clear") {
          provider.clearConversation()
        }
        .disabled(provider.isStreaming)
      }
    }
  }

  @State private var provider: ResponseStreamProvider
  @State private var inputText = ""
  @FocusState private var isInputFocused: Bool

  // MARK: - Subviews

  private var headerView: some View {
    VStack(alignment: .leading, spacing: 8) {
      Text("Streaming Responses with Conversation State")
        .font(.headline)

      Text("This demo uses the Responses API with streaming to maintain conversation context across multiple turns.")
        .font(.caption)
        .foregroundColor(.secondary)

      if provider.messages.isEmpty {
        Label("Start a conversation below", systemImage: "bubble.left.and.bubble.right")
          .font(.caption)
          .foregroundColor(.blue)
          .padding(.top, 4)
      }
    }
    .frame(maxWidth: .infinity, alignment: .leading)
    .padding()
    .background(Color(UIColor.secondarySystemBackground))
  }

  private var inputArea: some View {
    HStack(spacing: 12) {
      TextField("Type a message...", text: $inputText, axis: .vertical)
        .textFieldStyle(.roundedBorder)
        .lineLimit(1 ... 5)
        .focused($isInputFocused)
        .disabled(provider.isStreaming)
        .onSubmit {
          sendMessage()
        }

      Button(action: sendMessage) {
        Image(systemName: provider.isStreaming ? "stop.circle.fill" : "arrow.up.circle.fill")
          .font(.title2)
          .foregroundColor(provider.isStreaming ? .red : (inputText.isEmpty ? .gray : .blue))
      }
      .disabled(!provider.isStreaming && inputText.isEmpty)
    }
    .padding()
    .background(Color(UIColor.systemBackground))
    .overlay(
      Rectangle()
        .frame(height: 1)
        .foregroundColor(Color(UIColor.separator)),
      alignment: .top)
  }

  private func sendMessage() {
    guard !inputText.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty else { return }

    if provider.isStreaming {
      provider.stopStreaming()
    } else {
      let message = inputText
      inputText = ""
      provider.sendMessage(message)
    }
  }
}

// MARK: - MessageBubbleView

struct MessageBubbleView: View {
  let message: ResponseStreamProvider.ResponseMessage

  @Environment(\.colorScheme) var colorScheme

  var body: some View {
    HStack {
      if message.role == .assistant {
        messageContent
          .background(backgroundGradient)
          .cornerRadius(16)
          .overlay(
            RoundedRectangle(cornerRadius: 16)
              .stroke(borderColor, lineWidth: 1))
        Spacer(minLength: 60)
      } else {
        Spacer(minLength: 60)
        messageContent
          .background(Color.blue)
          .cornerRadius(16)
          .foregroundColor(.white)
      }
    }
  }

  private var messageContent: some View {
    VStack(alignment: .leading, spacing: 4) {
      if message.role == .assistant, message.isStreaming {
        HStack(spacing: 4) {
          Image(systemName: "dot.radiowaves.left.and.right")
            .font(.caption2)
            .foregroundColor(.blue)
          Text("Streaming...")
            .font(.caption2)
            .foregroundColor(.secondary)
        }
      }

      Text(message.content.isEmpty && message.isStreaming ? " " : message.content)
        .padding(.horizontal, 12)
        .padding(.vertical, 8)

      if message.role == .assistant, !message.isStreaming, message.responseId != nil {
        Text("Response ID: \(String(message.responseId?.prefix(8) ?? ""))")
          .font(.caption2)
          .foregroundColor(.secondary)
          .padding(.horizontal, 12)
          .padding(.bottom, 4)
      }
    }
  }

  private var backgroundGradient: some View {
    LinearGradient(
      gradient: Gradient(colors: [
        Color(UIColor.secondarySystemBackground),
        Color(UIColor.tertiarySystemBackground),
      ]),
      startPoint: .topLeading,
      endPoint: .bottomTrailing)
  }

  private var borderColor: Color {
    colorScheme == .dark ? Color.white.opacity(0.1) : Color.black.opacity(0.1)
  }
}

// MARK: - LoadingIndicatorView

struct LoadingIndicatorView: View {
  var body: some View {
    ZStack {
      ForEach(0 ..< 3) { index in
        Circle()
          .fill(Color.blue)
          .frame(width: 8, height: 8)
          .offset(x: CGFloat(index - 1) * 12)
          .opacity(0.8)
          .scaleEffect(animationScale(for: index))
      }
    }
    .onAppear {
      withAnimation(
        .easeInOut(duration: 0.8)
          .repeatForever(autoreverses: true))
      {
        animationAmount = 1
      }
    }
  }

  @State private var animationAmount = 0.0

  private func animationScale(for index: Int) -> Double {
    let delay = Double(index) * 0.1
    let progress = (animationAmount + delay).truncatingRemainder(dividingBy: 1.0)
    return 0.5 + (0.5 * sin(progress * .pi))
  }
}

// MARK: - Preview

#Preview {
  NavigationView {
    ResponseStreamDemoView(service: OpenAIServiceFactory.service(apiKey: "test"))
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ResponseAPIDemo/ResponseStreamProvider.swift
================================================
//
//  ResponseStreamProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 6/7/25.
//

import SwiftOpenAI
import SwiftUI

@MainActor
@Observable
class ResponseStreamProvider {
  // MARK: - Initialization

  init(service: OpenAIService) {
    self.service = service
  }

  // MARK: - Message Model

  struct ResponseMessage: Identifiable {
    let id = UUID()
    let role: MessageRole
    var content: String
    let timestamp: Date
    var isStreaming = false
    let responseId: String?

    enum MessageRole {
      case user
      case assistant
    }
  }

  var messages = [ResponseMessage]()
  var isStreaming = false
  var currentStreamingMessage: ResponseMessage?
  var error: String?

  // MARK: - Public Methods

  func sendMessage(_ text: String) {
    // Cancel any existing stream
    streamTask?.cancel()

    // Add user message
    let userMessage = ResponseMessage(
      role: .user,
      content: text,
      timestamp: Date(),
      responseId: nil)
    messages.append(userMessage)

    // Start streaming response
    streamTask = Task {
      await streamResponse(for: text)
    }
  }

  func stopStreaming() {
    streamTask?.cancel()
    streamTask = nil

    // Finalize current streaming message
    if var message = currentStreamingMessage {
      message.isStreaming = false
      if let index = messages.firstIndex(where: { $0.id == message.id }) {
        messages[index] = message
      }
    }

    currentStreamingMessage = nil
    isStreaming = false
  }

  func clearConversation() {
    stopStreaming()
    messages.removeAll()
    previousResponseId = nil
    error = nil
  }

  private let service: OpenAIService
  private var previousResponseId: String?
  private var streamTask: Task<Void, Never>?

  // MARK: - Private Methods

  private func streamResponse(for userInput: String) async {
    isStreaming = true
    error = nil

    // Create streaming message placeholder
    let streamingMessage = ResponseMessage(
      role: .assistant,
      content: "",
      timestamp: Date(),
      isStreaming: true,
      responseId: nil)
    messages.append(streamingMessage)
    currentStreamingMessage = streamingMessage

    do {
      // Build input array with conversation history
      var inputArray = [InputItem]()

      // Add conversation history
      for message in messages.dropLast(2) { // Exclude current user message and streaming placeholder
        let content = message.content
        switch message.role {
        case .user:
          inputArray.append(.message(InputMessage(role: "user", content: .text(content))))
        case .assistant:
          // Assistant messages in conversation history should be sent as simple text
          inputArray.append(.message(InputMessage(
            role: "assistant",
            content: .text(content))))
        }
      }

      // Add current user message
      inputArray.append(.message(InputMessage(role: "user", content: .text(userInput))))

      let parameters = ModelResponseParameter(
        input: .array(inputArray),
        model: .gpt5,
        instructions: "You are a helpful assistant. Use the conversation history to provide contextual responses.",
        maxOutputTokens: 1000,
        previousResponseId: previousResponseId,
        tools: [.imageGeneration(.init())])

      let stream = try await service.responseCreateStream(parameters)
      var accumulatedText = ""

      for try await event in stream {
        guard !Task.isCancelled else { break }

        switch event {
        case .responseCreated:
          // Response created event - we'll get the ID in responseCompleted
          break

        case .outputTextDelta(let delta):
          accumulatedText += delta.delta
          updateStreamingMessage(with: accumulatedText)

        case .responseCompleted(let completed):
          // Update previous response ID for conversation continuity
          previousResponseId = completed.response.id

          // Finalize the message
          finalizeStreamingMessage(
            with: accumulatedText,
            responseId: completed.response.id)

        case .responseFailed(let failed):
          throw APIError.requestFailed(
            description: failed.response.error?.message ?? "Stream failed")

        case .error(let errorEvent):
          throw APIError.requestFailed(
            description: errorEvent.message)

        default:
          // Handle other events as needed
          break
        }
      }

    } catch {
      self.error = error.localizedDescription

      // Remove streaming message on error
      if let streamingId = currentStreamingMessage?.id {
        messages.removeAll { $0.id == streamingId }
      }
    }

    currentStreamingMessage = nil
    isStreaming = false
  }

  private func updateStreamingMessage(with content: String) {
    guard
      let messageId = currentStreamingMessage?.id,
      let index = messages.firstIndex(where: { $0.id == messageId })
    else {
      return
    }

    messages[index].content = content
  }

  private func finalizeStreamingMessage(with content: String, responseId: String) {
    guard
      let messageId = currentStreamingMessage?.id,
      let index = messages.firstIndex(where: { $0.id == messageId })
    else {
      return
    }

    // `responseId` is immutable (`let`), so replace the element with a new
    // value rather than mutating in place; the original timestamp is kept.
    messages[index] = ResponseMessage(
      role: .assistant,
      content: content,
      timestamp: messages[index].timestamp,
      isStreaming: false,
      responseId: responseId)
  }
}
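A minimal sketch (not part of the repo) of wiring `ResponseStreamProvider` into a SwiftUI view; everything except the provider's own API (`messages`, `sendMessage(_:)`) is a hypothetical name.

```swift
import SwiftOpenAI
import SwiftUI

struct MinimalResponseChat: View {
  init(service: OpenAIService) {
    _provider = State(initialValue: ResponseStreamProvider(service: service))
  }

  var body: some View {
    VStack {
      // ResponseMessage is Identifiable, so it can back a List directly;
      // streamed deltas update `content` in place as tokens arrive.
      List(provider.messages) { message in
        Text(message.content)
      }
      TextField("Message", text: $draft)
        .onSubmit {
          provider.sendMessage(draft) // streams the reply into `messages`
          draft = ""
        }
    }
  }

  @State private var provider: ResponseStreamProvider
  @State private var draft = ""
}
```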


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/ServiceSelectionView.swift
================================================
//
//  ServiceSelectionView.swift
//  SwiftOpenAIExample
//
//  Created by Lou Zell on 3/27/24.
//

import SwiftUI

struct ServiceSelectionView: View {
  var body: some View {
    NavigationStack {
      List {
        Section("Select Service") {
          NavigationLink(destination: ApiKeyIntroView()) {
            VStack(alignment: .leading) {
              Text("Default OpenAI Service")
                .padding(.bottom, 10)
              Group {
                Text("Use this service to test SwiftOpenAI functionality by providing your own OpenAI key.")
              }
              .font(.caption)
              .fontWeight(.light)
            }
          }

          NavigationLink(destination: AIProxyIntroView()) {
            VStack(alignment: .leading) {
              Text("AIProxy Service")
                .padding(.bottom, 10)
              Group {
                Text(
                  "Use this service to test SwiftOpenAI functionality with requests proxied through AIProxy for key protection.")
              }
              .font(.caption)
              .fontWeight(.light)
            }
          }

          NavigationLink(destination: LocalHostEntryView()) {
            VStack(alignment: .leading) {
              Text("Ollama")
                .padding(.bottom, 10)
              Group {
                Text("Use this service to test SwiftOpenAI functionality by providing your own local host.")
              }
              .font(.caption)
              .fontWeight(.light)
            }
          }
        }
      }
    }
  }
}

#Preview {
  ServiceSelectionView()
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedModels/ChatDisplayMessage.swift
================================================
//
//  ChatDisplayMessage.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/4/23.
//

import Foundation
import SwiftOpenAI

struct ChatDisplayMessage: Identifiable {
  init(
    id: UUID = UUID(),
    content: DisplayContent,
    type: DisplayMessageType,
    delta: ChatDisplayMessage.Delta?)
  {
    self.id = id
    self.content = content
    self.type = type
    self.delta = delta
  }

  struct Delta {
    var role: String
    var content: String
    var functionCallName: String?
    var functionCallArguments: String?
  }

  enum DisplayContent: Equatable {
    case text(String)
    case images([URL])
    case content([ChatCompletionParameters.Message.ContentType.MessageContent])
    case error(String)

    static func ==(lhs: DisplayContent, rhs: DisplayContent) -> Bool {
      switch (lhs, rhs) {
      case (.text(let a), .text(let b)):
        a == b
      case (.images(let a), .images(let b)):
        a == b
      case (.content(let a), .content(let b)):
        a == b
      case (.error(let a), .error(let b)):
        a == b
      default:
        false
      }
    }
  }

  enum DisplayMessageType {
    case received, sent
  }

  let id: UUID
  let content: DisplayContent
  let type: DisplayMessageType
  let delta: Delta?
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/ChatDisplayMessageView.swift
================================================
//
//  ChatDisplayMessageView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/4/23.
//

import SwiftUI

struct ChatDisplayMessageView: View {
  let message: ChatDisplayMessage

  var body: some View {
    VStack(alignment: .leading, spacing: 8) {
      headerFor(message: message)
      Group {
        switch message.content {
        case .content(let content):
          let text = content.compactMap { contentItem -> String? in
            if case .text(let text) = contentItem {
              return text
            } else {
              return nil
            }
          }.first ?? ""

          let urls = content.compactMap { contentItem -> URL? in
            if case .imageUrl(let imageDetail) = contentItem {
              return imageDetail.url
            } else {
              return nil
            }
          }
          VStack(alignment: .leading, spacing: 8) {
            chatImagesViewFrom(urls: urls)
            chatMessageViewWith(text)
          }

        case .error(let error):
          Text(error)
            .padding()
            .font(.callout)
            .background(
              RoundedRectangle(cornerRadius: 20)
                .foregroundColor(.red.opacity(0.7)))

        case .text(let text):
          chatMessageViewWith(text)

        case .images(let urls):
          chatImagesViewFrom(urls: urls)
        }
      }
      .padding(.leading, 23)
    }
  }

  @ViewBuilder
  func chatMessageViewWith(
    _ text: String)
    -> some View
  {
    if text.isEmpty {
      ChatMessageLoadingView(animationDuration: 0.5)
        .frame(width: 10, height: 10)
    } else {
      Text(text)
        .font(.body)
    }
  }

  func headerFor(
    message: ChatDisplayMessage)
    -> some View
  {
    HStack {
      Image(systemName: message.type == .sent ? "person.circle" : "wand.and.stars")
        .resizable()
        .frame(width: 15, height: 15)
      Text(message.type == .sent ? "USER" : "CHATGPT")
        .font(.caption2)
    }
    .foregroundColor(.gray.opacity(0.9))
  }

  func chatImagesViewFrom(
    urls: [URL])
    -> some View
  {
    ScrollView(.horizontal, showsIndicators: false) {
      HStack(spacing: 8) {
        ForEach(urls, id: \.self) { url in
          URLImageView(url: url)
        }
      }
    }
  }

  @State private var urls = [URL]()
}

#Preview {
  VStack(alignment: .leading) {
    ChatDisplayMessageView(message: .init(content: .text("How are you?"), type: .sent, delta: nil))
    ChatDisplayMessageView(message: .init(content: .text("I am ok"), type: .received, delta: nil))
    ChatDisplayMessageView(message: .init(content: .images([]), type: .received, delta: nil))
  }
  .padding()
  .frame(maxWidth: .infinity, alignment: .leading)
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/ChatMessageLoadingView.swift
================================================
//
//  ChatMessageLoadingView.swift
//
//
//  Created by James Rochabrun on 3/28/24.
//

import Foundation
import SwiftUI

struct ChatMessageLoadingView: View {
  var animationDuration: Double
  @State private var isScaledUp = false

  var body: some View {
    Circle()
      .scaleEffect(isScaledUp ? 1.5 : 1) // 1.5 is 150% size, 1 is 100% size
      .onAppear {
        withAnimation(Animation.easeInOut(duration: animationDuration).repeatForever(autoreverses: true)) {
          isScaledUp.toggle()
        }
      }
  }
}

#Preview {
  ChatMessageLoadingView(animationDuration: 0.2)
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/LoadingView.swift
================================================
//
//  LoadingView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/4/23.
//

import SwiftUI

struct LoadingView: View {
  let timer = Timer.publish(every: 0.5, on: .main, in: .common).autoconnect()

  var body: some View {
    HStack {
      Text("\(getDots())")
        .font(.title)
        .onReceive(timer) { _ in
          withAnimation {
            dotsCount = (dotsCount + 1) % 4
          }
        }
    }
    .frame(minHeight: 40)
  }

  func getDots() -> String {
    String(repeating: ".", count: dotsCount)
  }

  @State private var dotsCount = 0

}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/Theme/Sizes.swift
================================================
//
//  Sizes.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/13/23.
//

import Foundation

// MARK: - Sizes

enum Sizes { }

extension Sizes {
  static let spacingExtraSmall: CGFloat = 4.0
  static let spacingSmall: CGFloat = 6
  static let spacingMedium: CGFloat = 8
  static let spacingLarge: CGFloat = 12
  static let spacingExtraLarge: CGFloat = 16
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/Theme/ThemeColor.swift
================================================
//
//  ThemeColor.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/13/23.
//

import SwiftUI

// MARK: - ThemeColor

enum ThemeColor { }

extension ThemeColor {
  static let tintColor = Color.purple
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/URLImageView.swift
================================================
//
//  URLImageView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/4/23.
//

import SwiftUI

// MARK: - URLImageView

struct URLImageView: View {
  let url: URL

  var body: some View {
    AsyncImage(
      url: url,
      transaction: Transaction(animation: .easeInOut))
    { phase in
      switch phase {
      case .empty:
        ProgressView()

      case .success(let image):
        image
          .resizable()
          .frame(width: 100, height: 100)
          .transition(.opacity)

      case .failure:
        Image(systemName: "wifi.slash")

      @unknown default:
        EmptyView()
      }
    }
    .frame(width: 100, height: 100)
    .background(Color.gray)
    .clipShape(RoundedRectangle(cornerRadius: 10))
  }
}

let urlImageViewMockURL =
  URL(
    string: "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg")!

#Preview {
  ScrollView {
    VStack(spacing: 40) {
      URLImageView(url: urlImageViewMockURL)
      URLImageView(url: urlImageViewMockURL)
        .clipShape(Circle())
        .overlay(Circle().stroke(Color.white, lineWidth: 4))
        .shadow(radius: 10)
      URLImageView(url: urlImageViewMockURL)
        .clipShape(Circle())
        .overlay(Circle().stroke(Color.white, lineWidth: 4))
    }
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/SwiftOpenAIExample.entitlements
================================================
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <key>com.apple.security.files.user-selected.read-only</key>
    <true/>
</dict>
</plist>


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/SwiftOpenAIExampleApp.swift
================================================
//
//  SwiftOpenAIExampleApp.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/19/23.
//

import SwiftUI

@main
struct SwiftOpenAIExampleApp: App {
  var body: some Scene {
    WindowGroup {
      ServiceSelectionView()
    }
  }
}


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Utilities/ContentLoader.swift
================================================
//
//  ContentLoader.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 10/19/23.
//

import Foundation

struct ContentLoader {
  enum Error: Swift.Error {
    case fileNotFound(name: String)
    case fileDecodingFailed(name: String, Swift.Error)
  }

  func urlFromAsset(fromFileNamed name: String, ext: String) -> URL? {
    guard
      let url = Bundle.main.url(
        forResource: name,
        withExtension: ext)
    else {
      return nil
    }
    return url
  }

  func loadBundledContent(fromFileNamed name: String, ext: String) throws -> Data {
    guard let url = urlFromAsset(fromFileNamed: name, ext: ext) else {
      throw Error.fileNotFound(name: name)
    }

    do {
      return try Data(contentsOf: url)
    } catch {
      throw Error.fileDecodingFailed(name: name, error)
    }
  }
}
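`ContentLoader` above wraps both failure modes (file lookup and file read) in its own `Error` cases so callers get the file name alongside the underlying error. A minimal sketch of the same error-wrapping pattern, using a temporary file in place of `Bundle.main` so it can run outside an app bundle (the `loadContent(at:name:)` helper is hypothetical, for illustration only):

```swift
import Foundation

enum LoaderError: Swift.Error {
  case fileNotFound(name: String)
  case fileDecodingFailed(name: String, Swift.Error)
}

/// Hypothetical helper mirroring `ContentLoader.loadBundledContent`,
/// but reading from an arbitrary URL rather than the app bundle.
func loadContent(at url: URL?, name: String) throws -> Data {
  guard let url else { throw LoaderError.fileNotFound(name: name) }
  do {
    return try Data(contentsOf: url)
  } catch {
    // Preserve the underlying error so callers can inspect the root cause.
    throw LoaderError.fileDecodingFailed(name: name, error)
  }
}

// Demo: write a temp file, then read it back through the helper.
let tmp = FileManager.default.temporaryDirectory
  .appendingPathComponent("demo.jsonl")
try! "hello".data(using: .utf8)!.write(to: tmp)
let data = try! loadContent(at: tmp, name: "demo.jsonl")
```

Keeping the underlying `Swift.Error` as an associated value, as the real `ContentLoader` does, means no diagnostic detail is lost when the read fails.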


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Vision/ChatVisionDemoView.swift
================================================
//
//  ChatVisionDemoView.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/8/23.
//

import PhotosUI
import SwiftOpenAI
import SwiftUI

struct ChatVisionDemoView: View {
  init(service: OpenAIService, customModel: String? = nil) {
    self.customModel = customModel
    _chatProvider = State(initialValue: ChatVisionProvider(service: service, customModel: customModel))
  }

  let customModel: String?

  var body: some View {
    ScrollViewReader { proxy in
      VStack {
        List(chatProvider.chatMessages) { message in
          ChatDisplayMessageView(message: message)
            .listRowSeparator(.hidden)
        }
        .listStyle(.plain)
        .onChange(of: chatProvider.chatMessages.last?.content) {
          let lastMessage = chatProvider.chatMessages.last
          if let id = lastMessage?.id {
            proxy.scrollTo(id, anchor: .bottom)
          }
        }
        textArea
      }
    }
  }

  var textArea: some View {
    HStack(spacing: 0) {
      photoPicker
      VStack(alignment: .leading, spacing: 0) {
        if !selectedImages.isEmpty {
          selectedImagesView
          Divider()
            .foregroundColor(.gray)
        }
        textField
          .padding(6)
      }
      .padding(.vertical, 2)
      .padding(.horizontal, 2)
      .animation(.bouncy, value: selectedImages.isEmpty)
      .background(
        RoundedRectangle(cornerRadius: 16)
          .stroke(.gray, lineWidth: 1))
      .padding(.horizontal, 8)
      textAreaSendButton
    }
    .padding(.horizontal)
    .disabled(isLoading)
  }

  var textField: some View {
    TextField(
      "How can I help you today?",
      text: $prompt,
      axis: .vertical)
  }

  var textAreaSendButton: some View {
    Button {
      Task {
        isLoading = true
        defer {
          // Ensure isLoading is reset once the task completes.
          isLoading = false
        }
        // Make the request.
        let content: [ChatCompletionParameters.Message.ContentType.MessageContent] = [
          .text(prompt),
        ] + selectedImageURLS.map { .imageUrl(.init(url: $0)) }
        resetInput()
        let model: Model =
          if let customModel, !customModel.isEmpty {
            .custom(customModel)
          } else {
            .gpt4o
          }

        try await chatProvider.startStreamedChat(parameters: .init(
          messages: [.init(role: .user, content: .contentArray(content))],
          model: model, maxTokens: 300), content: content)
      }
    } label: {
      Image(systemName: "paperplane")
    }
    .buttonStyle(.bordered)
    .disabled(prompt.isEmpty)
  }

  var photoPicker: some View {
    PhotosPicker(selection: $selectedItems, matching: .images) {
      Image(systemName: "photo")
    }
    .onChange(of: selectedItems) {
      Task {
        selectedImages.removeAll()
        for item in selectedItems {
          if let data = try? await item.loadTransferable(type: Data.self) {
            let base64String = data.base64EncodedString()
            let url = URL(string: "data:image/jpeg;base64,\(base64String)")!
            selectedImageURLS.append(url)
            if let uiImage = UIImage(data: data) {
              let image = Image(uiImage: uiImage)
              selectedImages.append(image)
            }
          }
        }
      }
    }
  }

  var selectedImagesView: some View {
    HStack(spacing: 0) {
      ForEach(0 ..< selectedImages.count, id: \.self) { i in
        selectedImages[i]
          .resizable()
          .frame(width: 60, height: 60)
          .clipShape(RoundedRectangle(cornerRadius: 12))
          .padding(4)
      }
    }
  }

  @State private var chatProvider: ChatVisionProvider
  @State private var isLoading = false
  @State private var prompt = ""
  @State private var selectedItems = [PhotosPickerItem]()
  @State private var selectedImages = [Image]()
  @State private var selectedImageURLS = [URL]()

  /// Called when the user taps on the send button. Clears the selected images and prompt.
  private func resetInput() {
    prompt = ""
    selectedImages = []
    selectedItems = []
    selectedImageURLS = []
  }
}
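The `photoPicker` handler above base64-encodes each picked image and embeds it in a `data:` URL, which is how the vision request carries inline image bytes. That conversion can be sketched on its own, assuming JPEG data (`makeDataURL` is a hypothetical helper name, not part of the library):

```swift
import Foundation

/// Hypothetical helper: embeds raw image bytes in a `data:` URL,
/// mirroring what `photoPicker`'s `onChange` handler does inline.
func makeDataURL(from data: Data, mimeType: String = "image/jpeg") -> URL? {
  let base64 = data.base64EncodedString()
  return URL(string: "data:\(mimeType);base64,\(base64)")
}

// Demo bytes: the three-byte JPEG magic-number prefix.
let bytes = Data([0xFF, 0xD8, 0xFF])
let url = makeDataURL(from: bytes)
// → data:image/jpeg;base64,/9j/
```

Returning an optional (rather than force-unwrapping as the view code does) is the safer shape if the MIME type ever comes from user input.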


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample/Vision/ChatVisionProvider.swift
================================================
//
//  ChatVisionProvider.swift
//  SwiftOpenAIExample
//
//  Created by James Rochabrun on 11/9/23.
//

import SwiftOpenAI
import SwiftUI

@Observable
class ChatVisionProvider {
  init(service: OpenAIService, customModel: String? = nil) {
    self.service = service
    self.customModel = customModel
  }

  // MARK: - Initializer

  let customModel: String?

  // MARK: - Public Properties

  /// A collection of messages for display in the UI, representing the conversation.
  var chatMessages = [ChatDisplayMessage]()

  // MARK: - Public Methods

  func startStreamedChat(
    parameters: ChatCompletionParameters,
    content: [ChatCompletionParameters.Message.ContentType.MessageContent])
    async throws
  {
    // Displays the user message in the UI
    await startNewUserDisplayMessage(content)
    // Start a new assistant message that is initially empty.
    await startNewAssistantEmptyDisplayMessage()

    do {
      // Begin the chat stream with the updated parameters.
      let stream = try await service.startStreamedChat(parameters: parameters)
      for try await result in stream {
        // Extract the first choice from the stream result; if none exists, stop processing.
        guard let choice = result.choices?.first else { return }

        // Store initial `role` and `functionCall` data from the first `choice.delta` for UI display.
        // This information is essential for maintaining context in the conversation and for updating
        // the chat UI with proper role attributions for each message.
        var newDelta = ChatDisplayMessage.Delta(role: "", content: "")
        if let firstDelta = firstChatMessageResponseDelta[result.id ?? ""] {
          // If we have already stored the first delta for this result ID, reuse its role.
          newDelta.role = firstDelta.role!
        } else {
          // Otherwise, store the first delta received for future reference.
          firstChatMessageResponseDelta[result.id ?? ""] = choice.delta
        }
        // Assign the content received in the current message to the newDelta.
        newDelta.content = temporalReceivedMessageContent
        // Update the UI with the latest assistant message and the corresponding delta.
        await updateLastAssistantMessage(content: choice.delta?.content ?? "", delta: newDelta)
      }
    } catch {
      // If an error occurs, update the UI to display the error message.
      updateLastDisplayedMessage(.init(content: .error("\(error)"), type: .received, delta: nil))
    }
  }

  // MARK: - Private Properties

  private let service: OpenAIService

  /// Accumulates the streamed message content for real-time display updates in the UI.
  private var temporalReceivedMessageContent = ""
  /// Tracks the identifier of the last displayed message, enabling updates from the streaming API response.
  private var lastDisplayedMessageID: UUID?
  /// Stores the initial chat message's delta, which uniquely includes metadata like `role`.
  private var firstChatMessageResponseDelta = [String: ChatCompletionChunkObject.ChatChoice.Delta]()

  // MARK: - Private Methods

  @MainActor
  private func startNewUserDisplayMessage(_ content: [ChatCompletionParameters.Message.ContentType.MessageContent]) {
    // Updates the UI with the user's message.
    let startingMessage = ChatDisplayMessage(
      content: .content(content),
      type: .sent, delta: nil)
    addMessage(startingMessage)
  }

  @MainActor
  private func startNewAssistantEmptyDisplayMessage() {
    firstChatMessageResponseDelta = [:]
    temporalReceivedMessageContent = ""
    let newMessage = ChatDisplayMessage(content: .text(temporalReceivedMessageContent), type: .received, delta: nil)
    let newMessageId = newMessage.id
    lastDisplayedMessageID = newMessageId
    addMessage(newMessage)
  }

  @MainActor
  private func updateLastAssistantMessage(
    content: String,
    delta: ChatDisplayMessage.Delta)
  {
    temporalReceivedMessageContent += content
    guard let id = lastDisplayedMessageID, let index = chatMessages.firstIndex(where: { $0.id == id }) else { return }
    chatMessages[index] = ChatDisplayMessage(
      id: id,
      content: .text(temporalReceivedMessageContent),
      type: .received,
      delta: delta)
  }

  @MainActor
  private func addMessage(_ message: ChatDisplayMessage) {
    withAnimation {
      chatMessages.append(message)
    }
  }

  private func updateLastDisplayedMessage(_ message: ChatDisplayMessage) {
    chatMessages[chatMessages.count - 1] = message
  }
}
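`updateLastAssistantMessage` above grows `temporalReceivedMessageContent` by appending each streamed delta, then re-renders the last message with the full accumulated text. Stripped of SwiftUI, the accumulation step reduces to the following sketch (the `StreamAccumulator` type is hypothetical):

```swift
import Foundation

/// Hypothetical reduction of the provider's streaming state:
/// each delta is appended, and the full text so far is what the UI shows.
struct StreamAccumulator {
  private(set) var text = ""

  mutating func receive(_ delta: String?) -> String {
    // Nil deltas (e.g. role-only chunks) append nothing,
    // matching the `choice.delta?.content ?? ""` fallback above.
    text += delta ?? ""
    return text
  }
}

var accumulator = StreamAccumulator()
let chunks: [String?] = [nil, "Hel", "lo,", " world"]
var latest = ""
for chunk in chunks {
  latest = accumulator.receive(chunk)
}
// latest == "Hello, world"
```

Replacing the whole message with the accumulated text on every chunk, rather than appending to the rendered view, is what lets the provider re-render by identifier (`lastDisplayedMessageID`) without tracking partial view state.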


================================================
FILE: Examples/SwiftOpenAIExample/SwiftOpenAIExample.xcodeproj/project.pbxproj
================================================
// !$*UTF8*$!
{
	archiveVersion = 1;
	classes = {
	};
	objectVersion = 60;
	objects = {

/* Begin PBXBuildFile section */
		0DEE5DC12BB40643004894AD /* SwiftOpenAI in Frameworks */ = {isa = PBXBuildFile; productRef = 0DEE5DC02BB40643004894AD /* SwiftOpenAI */; };
		0DF957842BB53BEF00DD2013 /* ServiceSelectionView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0DF957832BB53BEF00DD2013 /* ServiceSelectionView.swift */; };
		0DF957862BB543F100DD2013 /* AIProxyIntroView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0DF957852BB543F100DD2013 /* AIProxyIntroView.swift */; };
		7B029E372C6893FD0025681A /* ChatStructuredOutputProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B029E362C6893FD0025681A /* ChatStructuredOutputProvider.swift */; };
		7B029E392C68940D0025681A /* ChatStructuredOutputDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B029E382C68940D0025681A /* ChatStructuredOutputDemoView.swift */; };
		7B029E3C2C69BE990025681A /* ChatStructuredOutputToolProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B029E3B2C69BE990025681A /* ChatStructuredOutputToolProvider.swift */; };
		7B029E3E2C69BEA70025681A /* ChatStructureOutputToolDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B029E3D2C69BEA70025681A /* ChatStructureOutputToolDemoView.swift */; };
		7B1268052B08246400400694 /* AssistantConfigurationDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B1268042B08246400400694 /* AssistantConfigurationDemoView.swift */; };
		7B1268072B08247C00400694 /* AssistantConfigurationProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B1268062B08247C00400694 /* AssistantConfigurationProvider.swift */; };
		7B2B6D562DF434670059B4BB /* ResponseStreamDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B2B6D552DF434670059B4BB /* ResponseStreamDemoView.swift */; };
		7B2B6D582DF4347E0059B4BB /* ResponseStreamProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B2B6D572DF4347E0059B4BB /* ResponseStreamProvider.swift */; };
		7B3DDCC52BAAA722004B5C96 /* AssistantsListDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B3DDCC42BAAA722004B5C96 /* AssistantsListDemoView.swift */; };
		7B3DDCC72BAAAD34004B5C96 /* AssistantThreadConfigurationProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B3DDCC62BAAAD34004B5C96 /* AssistantThreadConfigurationProvider.swift */; };
		7B3DDCC92BAAAF96004B5C96 /* AssistantStreamDemoScreen.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B3DDCC82BAAAF96004B5C96 /* AssistantStreamDemoScreen.swift */; };
		7B413DFE2BB6846D009684CD /* ChatMessageLoadingView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B413DFD2BB6846D009684CD /* ChatMessageLoadingView.swift */; };
		7B436B962AE24A04003CE281 /* OptionsListView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436B952AE24A04003CE281 /* OptionsListView.swift */; };
		7B436B992AE25052003CE281 /* ContentLoader.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436B982AE25052003CE281 /* ContentLoader.swift */; };
		7B436B9B2AE25094003CE281 /* narcos.m4a in Resources */ = {isa = PBXBuildFile; fileRef = 7B436B9A2AE25093003CE281 /* narcos.m4a */; };
		7B436B9E2AE256BC003CE281 /* german.m4a in Resources */ = {isa = PBXBuildFile; fileRef = 7B436B9D2AE256BC003CE281 /* german.m4a */; };
		7B436BA12AE25958003CE281 /* ChatProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BA02AE25958003CE281 /* ChatProvider.swift */; };
		7B436BA32AE25962003CE281 /* ChatDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BA22AE25962003CE281 /* ChatDemoView.swift */; };
		7B436BA62AE77F37003CE281 /* Embeddingsprovider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BA52AE77F37003CE281 /* Embeddingsprovider.swift */; };
		7B436BA82AE77F41003CE281 /* EmbeddingsDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BA72AE77F41003CE281 /* EmbeddingsDemoView.swift */; };
		7B436BAB2AE788F1003CE281 /* FineTuningJobProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BAA2AE788F1003CE281 /* FineTuningJobProvider.swift */; };
		7B436BAD2AE788FB003CE281 /* FineTuningJobDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BAC2AE788FB003CE281 /* FineTuningJobDemoView.swift */; };
		7B436BB02AE79369003CE281 /* FilesDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BAF2AE79369003CE281 /* FilesDemoView.swift */; };
		7B436BB22AE79370003CE281 /* FilesProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BB12AE79370003CE281 /* FilesProvider.swift */; };
		7B436BB42AE79481003CE281 /* WorldCupData.jsonl in Resources */ = {isa = PBXBuildFile; fileRef = 7B436BB32AE79481003CE281 /* WorldCupData.jsonl */; };
		7B436BB72AE7A2EA003CE281 /* ImagesProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BB62AE7A2EA003CE281 /* ImagesProvider.swift */; };
		7B436BB92AE7A2F2003CE281 /* ImagesDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BB82AE7A2F2003CE281 /* ImagesDemoView.swift */; };
		7B436BBC2AE7ABD3003CE281 /* ModelsProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BBB2AE7ABD3003CE281 /* ModelsProvider.swift */; };
		7B436BBE2AE7ABDA003CE281 /* ModelsDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BBD2AE7ABDA003CE281 /* ModelsDemoView.swift */; };
		7B436BC12AE7B01F003CE281 /* ModerationProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BC02AE7B01F003CE281 /* ModerationProvider.swift */; };
		7B436BC32AE7B027003CE281 /* ModerationDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B436BC22AE7B027003CE281 /* ModerationDemoView.swift */; };
		7B50DD282C2A9A390070A64D /* LocalHostEntryView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B50DD272C2A9A390070A64D /* LocalHostEntryView.swift */; };
		7B50DD2B2C2A9D2F0070A64D /* LocalChatDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B50DD2A2C2A9D2F0070A64D /* LocalChatDemoView.swift */; };
		7B7239A02AF625F200646679 /* ChatFluidConversationProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B72399F2AF625F200646679 /* ChatFluidConversationProvider.swift */; };
		7B7239A22AF6260D00646679 /* ChatDisplayMessage.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B7239A12AF6260D00646679 /* ChatDisplayMessage.swift */; };
		7B7239A42AF6289900646679 /* ChatStreamFluidConversationDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B7239A32AF6289900646679 /* ChatStreamFluidConversationDemoView.swift */; };
		7B7239A62AF628F800646679 /* ChatDisplayMessageView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B7239A52AF628F800646679 /* ChatDisplayMessageView.swift */; };
		7B7239A82AF6292100646679 /* LoadingView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B7239A72AF6292100646679 /* LoadingView.swift */; };
		7B7239AB2AF6294C00646679 /* URLImageView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B7239AA2AF6294C00646679 /* URLImageView.swift */; };
		7B7239AE2AF9FF0000646679 /* ChatFunctionsCallStreamProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B7239AD2AF9FF0000646679 /* ChatFunctionsCallStreamProvider.swift */; };
		7B7239B12AF9FF3C00646679 /* ChatFunctionsCalllStreamDemoView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B7239B02AF9FF3C00646679 /* ChatFunctionsCalllStreamDemoView.swift */; };
		7B99C2E72C0718DE00E701B3 /* FilesPicker.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B99C2E62C0718DE00E701B3 /* FilesPicker.swift */; };
		7B99C2E92C0718FF00E701B3 /* FileAttachmentView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B99C2E82C0718FF00E701B3 /* FileAttachmentView.swift */; };
		7B99C2EB2C07191200E701B3 /* AttachmentView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B99C2EA2C07191200E701B3 /* AttachmentView.swift */; };
		7B99C2ED2C071B1600E701B3 /* FilesPickerProvider.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7B99C2EC2C071B1600E701B3 /* FilesPickerProvider.swift */; };
		7BA788CD2AE23A48008825D5 /* SwiftOpenAIExampleApp.swift in Sources */ = {isa = PBXBuildFile; fileRef = 7BA788CC2AE23A48008825D5 /* SwiftOpenAIExampleApp.swift */; };
		7BA788CF2AE23A48
gitextract_oinuzyhk/

├── .github/
│   └── workflows/
│       └── ci.yml
├── .gitignore
├── CONTRIBUTING.md
├── Examples/
│   ├── RealtimeExample/
│   │   ├── README.md
│   │   └── RealtimeExample.swift
│   └── SwiftOpenAIExample/
│       ├── SwiftOpenAIExample/
│       │   ├── AIProxyIntroView.swift
│       │   ├── ApiKeyIntroView.swift
│       │   ├── Assets.xcassets/
│       │   │   ├── AccentColor.colorset/
│       │   │   │   └── Contents.json
│       │   │   ├── AppIcon.appiconset/
│       │   │   │   └── Contents.json
│       │   │   └── Contents.json
│       │   ├── Assistants/
│       │   │   ├── AssistantConfigurationDemoView.swift
│       │   │   ├── AssistantConfigurationProvider.swift
│       │   │   ├── AssistantStreamDemoScreen.swift
│       │   │   ├── AssistantThreadConfigurationProvider.swift
│       │   │   └── AssistantsListDemoView.swift
│       │   ├── AudioDemo/
│       │   │   ├── AudioDemoView.swift
│       │   │   └── AudioProvider.swift
│       │   ├── ChatDemo/
│       │   │   ├── ChatDemoView.swift
│       │   │   └── ChatProvider.swift
│       │   ├── ChatFunctionsCall/
│       │   │   ├── ChatMessageDisplayModel.swift
│       │   │   ├── ChatMessageView.swift
│       │   │   ├── Completion/
│       │   │   │   ├── ChatFunctionCallDemoView.swift
│       │   │   │   └── ChatFunctionCallProvider.swift
│       │   │   └── Stream/
│       │   │       ├── ChatFunctionsCallStreamProvider.swift
│       │   │       └── ChatFunctionsCalllStreamDemoView.swift
│       │   ├── ChatStreamFluidConversationDemo/
│       │   │   ├── ChatFluidConversationProvider.swift
│       │   │   └── ChatStreamFluidConversationDemoView.swift
│       │   ├── ChatStructureOutputTool/
│       │   │   ├── ChatStructureOutputToolDemoView.swift
│       │   │   └── ChatStructuredOutputToolProvider.swift
│       │   ├── ChatStructuredOutputs/
│       │   │   ├── ChatStructuredOutputDemoView.swift
│       │   │   └── ChatStructuredOutputProvider.swift
│       │   ├── EmbeddingsDemo/
│       │   │   ├── EmbeddingsDemoView.swift
│       │   │   └── Embeddingsprovider.swift
│       │   ├── Files/
│       │   │   ├── AttachmentView.swift
│       │   │   ├── FileAttachmentView.swift
│       │   │   ├── FilesPicker.swift
│       │   │   └── FilesPickerProvider.swift
│       │   ├── FilesDemo/
│       │   │   ├── FilesDemoView.swift
│       │   │   └── FilesProvider.swift
│       │   ├── FineTuningDemo/
│       │   │   ├── FineTuningJobDemoView.swift
│       │   │   └── FineTuningJobProvider.swift
│       │   ├── ImagesDemo/
│       │   │   ├── ImagesDemoView.swift
│       │   │   └── ImagesProvider.swift
│       │   ├── LocalChatDemo/
│       │   │   └── LocalChatDemoView.swift
│       │   ├── LocalHostEntryView.swift
│       │   ├── ModelsDemo/
│       │   │   ├── ModelsDemoView.swift
│       │   │   └── ModelsProvider.swift
│       │   ├── ModerationsDemo/
│       │   │   ├── ModerationDemoView.swift
│       │   │   └── ModerationProvider.swift
│       │   ├── OptionsListView.swift
│       │   ├── PredictedOutputsDemo/
│       │   │   └── ChatPredictedOutputDemoView.swift
│       │   ├── Preview Content/
│       │   │   └── Preview Assets.xcassets/
│       │   │       └── Contents.json
│       │   ├── Resources/
│       │   │   ├── WorldCupData.jsonl
│       │   │   ├── german.m4a
│       │   │   └── narcos.m4a
│       │   ├── ResponseAPIDemo/
│       │   │   ├── ResponseStreamDemoView.swift
│       │   │   └── ResponseStreamProvider.swift
│       │   ├── ServiceSelectionView.swift
│       │   ├── SharedModels/
│       │   │   └── ChatDisplayMessage.swift
│       │   ├── SharedUI/
│       │   │   ├── ChatDisplayMessageView.swift
│       │   │   ├── ChatMessageLoadingView.swift
│       │   │   ├── LoadingView.swift
│       │   │   ├── Theme/
│       │   │   │   ├── Sizes.swift
│       │   │   │   └── ThemeColor.swift
│       │   │   └── URLImageView.swift
│       │   ├── SwiftOpenAIExample.entitlements
│       │   ├── SwiftOpenAIExampleApp.swift
│       │   ├── Utilities/
│       │   │   └── ContentLoader.swift
│       │   └── Vision/
│       │       ├── ChatVisionDemoView.swift
│       │       └── ChatVisionProvider.swift
│       ├── SwiftOpenAIExample.xcodeproj/
│       │   ├── project.pbxproj
│       │   ├── project.xcworkspace/
│       │   │   └── contents.xcworkspacedata
│       │   └── xcshareddata/
│       │       └── xcschemes/
│       │           └── SwiftOpenAIExample.xcscheme
│       ├── SwiftOpenAIExampleTests/
│       │   └── SwiftOpenAIExampleTests.swift
│       └── SwiftOpenAIExampleUITests/
│           ├── SwiftOpenAIExampleUITests.swift
│           └── SwiftOpenAIExampleUITestsLaunchTests.swift
├── LICENSE
├── Package.swift
├── README.md
├── Sources/
│   └── OpenAI/
│       ├── AIProxy/
│       │   ├── AIProxyCertificatePinning.swift
│       │   ├── AIProxyService.swift
│       │   └── Endpoint+AIProxy.swift
│       ├── Azure/
│       │   ├── AzureOpenAIAPI.swift
│       │   ├── AzureOpenAIConfiguration.swift
│       │   └── DefaultOpenAIAzureService.swift
│       ├── LocalModelService/
│       │   ├── LocalModelAPI.swift
│       │   └── LocalModelService.swift
│       ├── Private/
│       │   ├── Audio/
│       │   │   ├── AudioPCMPlayer.swift
│       │   │   ├── AudioPCMPlayerError.swift
│       │   │   ├── AudioUtils.swift
│       │   │   ├── MicrophonePCMSampleVendor.swift
│       │   │   ├── MicrophonePCMSampleVendorAE.swift
│       │   │   ├── MicrophonePCMSampleVendorAT.swift
│       │   │   ├── MicrophonePCMSampleVendorCommon.swift
│       │   │   └── MicrophonePCMSampleVendorError.swift
│       │   ├── Networking/
│       │   │   ├── AsyncHTTPClientAdapter.swift
│       │   │   ├── Endpoint.swift
│       │   │   ├── HTTPClient.swift
│       │   │   ├── MultipartFormDataBuilder.swift
│       │   │   ├── OpenAIAPI.swift
│       │   │   └── URLSessionHTTPClientAdapter.swift
│       │   └── Realtime/
│       │       ├── OpenAIRealtimeSession.swift
│       │       └── RealtimeActor.swift
│       └── Public/
│           ├── Parameters/
│           │   ├── Assistant/
│           │   │   └── AssistantParameters.swift
│           │   ├── Audio/
│           │   │   ├── AudioSpeechParameters.swift
│           │   │   ├── AudioTranscriptionParameters.swift
│           │   │   └── AudioTranslationParameters.swift
│           │   ├── Batch/
│           │   │   └── BatchParameter.swift
│           │   ├── Chat/
│           │   │   └── ChatCompletionParameters.swift
│           │   ├── Conversations/
│           │   │   ├── CreateConversationItemsParameter.swift
│           │   │   ├── CreateConversationParameter.swift
│           │   │   ├── GetConversationItemParameter.swift
│           │   │   ├── GetConversationItemsParameter.swift
│           │   │   └── UpdateConversationParameter.swift
│           │   ├── Embedding/
│           │   │   └── EmbeddingParameter.swift
│           │   ├── File/
│           │   │   └── FileParameter.swift
│           │   ├── FineTuning/
│           │   │   └── FineTuningJobParameters.swift
│           │   ├── Image/
│           │   │   ├── Dalle.swift
│           │   │   ├── ImageCreateParameters.swift
│           │   │   ├── ImageEditParameters.swift
│           │   │   └── ImageVariationParameters.swift
│           │   ├── ImageGen/
│           │   │   ├── CreateImageEditParameters.swift
│           │   │   ├── CreateImageParameters.swift
│           │   │   └── CreateImageVariationParameters.swift
│           │   ├── Message/
│           │   │   ├── MessageParameter.swift
│           │   │   └── ModifyMessageParameters.swift
│           │   ├── Model.swift
│           │   ├── Moderation/
│           │   │   └── ModerationParameter.swift
│           │   ├── MultipartFormDataParameters.swift
│           │   ├── Realtime/
│           │   │   ├── OpenAIRealtimeConversationItemCreate.swift
│           │   │   ├── OpenAIRealtimeInputAudioBufferAppend.swift
│           │   │   ├── OpenAIRealtimeResponseCreate.swift
│           │   │   ├── OpenAIRealtimeSessionConfiguration.swift
│           │   │   └── OpenAIRealtimeSessionUpdate.swift
│           │   ├── Response/
│           │   │   ├── Conversation.swift
│           │   │   ├── GetInputItemsParameter.swift
│           │   │   ├── GetResponseParameter.swift
│           │   │   ├── InputType.swift
│           │   │   ├── ModelResponseParameter.swift
│           │   │   ├── Prompt.swift
│           │   │   ├── ResponseInclude.swift
│           │   │   └── StreamOptions.swift
│           │   ├── Runs/
│           │   │   ├── CreateThreadAndRunParameter.swift
│           │   │   ├── ModifyRunParameters.swift
│           │   │   ├── RunParameter.swift
│           │   │   └── RunToolsOutputParameter.swift
│           │   ├── Threads/
│           │   │   ├── CreateThreadParameters.swift
│           │   │   └── ModifyThreadParameters.swift
│           │   ├── VectorStore/
│           │   │   └── VectorStoreParameter.swift
│           │   ├── VectorStoreFileBatch/
│           │   │   └── VectorStoreFileBatchParameter.swift
│           │   └── VectorStoreFiles/
│           │       └── VectorStoreFileParameter.swift
│           ├── ResponseModels/
│           │   ├── Assistants/
│           │   │   ├── AssistantObject.swift
│           │   │   ├── AssistantStreamEvent.swift
│           │   │   └── AssistantStreamEventObject.swift
│           │   ├── Audio/
│           │   │   ├── AudioObject.swift
│           │   │   └── AudioSpeechObject.swift
│           │   ├── Batch/
│           │   │   └── BatchObject.swift
│           │   ├── Chat/
│           │   │   ├── ChatCompletionChunkObject.swift
│           │   │   ├── ChatCompletionObject.swift
│           │   │   └── ChatUsage.swift
│           │   ├── Conversations/
│           │   │   └── ConversationModel.swift
│           │   ├── Delta.swift
│           │   ├── Embedding/
│           │   │   └── EmbeddingObject.swift
│           │   ├── File/
│           │   │   └── FileObject.swift
│           │   ├── FineTuning/
│           │   │   ├── FineTuningJobEventObject.swift
│           │   │   └── FineTuningJobObject.swift
│           │   ├── Image/
│           │   │   └── ImageObject.swift
│           │   ├── ImageGen/
│           │   │   └── CreateImageResponse.swift
│           │   ├── Messages/
│           │   │   ├── MessageContent.swift
│           │   │   ├── MessageDeltaObject.swift
│           │   │   └── MessageObject.swift
│           │   ├── Model/
│           │   │   └── ModelObject.swift
│           │   ├── Moderation/
│           │   │   └── ModerationObject.swift
│           │   ├── OpenAIErrorResponse.swift
│           │   ├── OpenAIResponse.swift
│           │   ├── Realtime/
│           │   │   ├── OpenAIRealtimeInputAudioBufferSpeechStarted.swift
│           │   │   ├── OpenAIRealtimeMessage.swift
│           │   │   └── OpenAIRealtimeResponseFunctionCallArgumentsDone.swift
│           │   ├── Response/
│           │   │   ├── OutputItem.swift
│           │   │   ├── ResponseModel.swift
│           │   │   └── ResponseStreamEvent.swift
│           │   ├── Runs/
│           │   │   ├── RunObject.swift
│           │   │   ├── RunStepDeltaObject.swift
│           │   │   ├── RunStepDetails.swift
│           │   │   └── RunStepObject.swift
│           │   ├── Threads/
│           │   │   └── ThreadObject.swift
│           │   ├── VectorStore/
│           │   │   └── VectorStoreObject.swift
│           │   ├── VectorStoreFile/
│           │   │   └── VectorStoreFileObject.swift
│           │   └── VectorStoreFileBatch/
│           │       └── VectorStoreFileBatchObject.swift
│           ├── Service/
│           │   ├── DefaultOpenAIService.swift
│           │   ├── OpenAIService.swift
│           │   └── OpenAIServiceFactory.swift
│           └── Shared/
│               ├── AudioController.swift
│               ├── DeletionStatus.swift
│               ├── ExpirationPolicy.swift
│               ├── FileCount.swift
│               ├── IncompleteDetails.swift
│               ├── JSONSchema.swift
│               ├── LastError.swift
│               ├── MessageAttachment.swift
│               ├── OpenAIError.swift
│               ├── OpenAIJSONValue.swift
│               ├── Reasoning.swift
│               ├── ResponseFormat.swift
│               ├── TextConfiguration.swift
│               ├── Tool.swift
│               ├── ToolCall.swift
│               ├── ToolChoice.swift
│               ├── ToolChoiceMode.swift
│               ├── ToolResources.swift
│               ├── TruncationStrategy.swift
│               └── Usage.swift
├── Tests/
│   └── OpenAITests/
│       ├── InputTypeTests.swift
│       ├── ModelResponseParameterTests.swift
│       ├── OpenAITests.swift
│       ├── OutputItemTests.swift
│       ├── ResponseModelValidationTests.swift
│       └── ResponseStreamEventTests.swift
└── rules.swiftformat
Condensed preview — 219 files, each showing path, character count, and a content snippet (1,427K chars of full structured content).
[
  {
    "path": ".github/workflows/ci.yml",
    "chars": 1236,
    "preview": "# This workflow will build a Swift project\n# For more information see: https://docs.github.com/en/actions/automating-bui"
  },
  {
    "path": ".gitignore",
    "chars": 318,
    "preview": ".DS_Store\n/.build\n/Packages\nxcuserdata/\nDerivedData/\n.swiftpm/configuration/registries.json\n.swiftpm/xcode/package.xcwor"
  },
  {
    "path": "CONTRIBUTING.md",
    "chars": 91,
    "preview": "\n### Linting:\n```bash\nbrew install swiftformat\nswiftformat --config rules.swiftformat .\n```"
  },
  {
    "path": "Examples/RealtimeExample/README.md",
    "chars": 7087,
    "preview": "# OpenAI Realtime API Example\n\nThis example demonstrates how to use SwiftOpenAI's Realtime API for bidirectional voice c"
  },
  {
    "path": "Examples/RealtimeExample/RealtimeExample.swift",
    "chars": 6167,
    "preview": "//\n//  RealtimeExample.swift\n//  SwiftOpenAI\n//\n//  Example implementation of OpenAI Realtime API for bidirectional voic"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/AIProxyIntroView.swift",
    "chars": 1901,
    "preview": "//\n//  AIProxyIntroView.swift\n//  SwiftOpenAIExample\n//\n//  Created by Lou Zell on 3/27/24.\n//\n\nimport SwiftOpenAI\nimpor"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ApiKeyIntroView.swift",
    "chars": 1791,
    "preview": "//\n//  ApiKeyIntroView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/19/23.\n//\n\nimport SwiftOpenA"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assets.xcassets/AccentColor.colorset/Contents.json",
    "chars": 123,
    "preview": "{\n  \"colors\" : [\n    {\n      \"idiom\" : \"universal\"\n    }\n  ],\n  \"info\" : {\n    \"author\" : \"xcode\",\n    \"version\" : 1\n  }"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assets.xcassets/AppIcon.appiconset/Contents.json",
    "chars": 999,
    "preview": "{\n  \"images\" : [\n    {\n      \"idiom\" : \"universal\",\n      \"platform\" : \"ios\",\n      \"size\" : \"1024x1024\"\n    },\n    {\n  "
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assets.xcassets/Contents.json",
    "chars": 63,
    "preview": "{\n  \"info\" : {\n    \"author\" : \"xcode\",\n    \"version\" : 1\n  }\n}\n"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assistants/AssistantConfigurationDemoView.swift",
    "chars": 8320,
    "preview": "//\n//  AssistantConfigurationDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/17/23.\n//\n\nim"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assistants/AssistantConfigurationProvider.swift",
    "chars": 1655,
    "preview": "//\n//  AssistantConfigurationProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/17/23.\n//\n\nim"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assistants/AssistantStreamDemoScreen.swift",
    "chars": 5858,
    "preview": "//\n//  AssistantStreamDemoScreen.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 3/19/24.\n//\n\nimport S"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assistants/AssistantThreadConfigurationProvider.swift",
    "chars": 2355,
    "preview": "//\n//  AssistantThreadConfigurationProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 3/19/24.\n/"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Assistants/AssistantsListDemoView.swift",
    "chars": 1146,
    "preview": "//\n//  AssistantsListDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 3/19/24.\n//\n\nimport Swif"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/AudioDemo/AudioDemoView.swift",
    "chars": 3065,
    "preview": "//\n//  AudioDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/19/23.\n//\n\nimport SwiftOpenAI\n"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/AudioDemo/AudioProvider.swift",
    "chars": 1557,
    "preview": "//\n//  AudioProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/19/23.\n//\n\nimport AVFoundation"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatDemo/ChatDemoView.swift",
    "chars": 2983,
    "preview": "//\n//  ChatDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/19/23.\n//\n\nimport SwiftOpenAI\ni"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatDemo/ChatProvider.swift",
    "chars": 1778,
    "preview": "//\n//  ChatProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/19/23.\n//\n\nimport SwiftOpenAI\ni"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/ChatMessageDisplayModel.swift",
    "chars": 1041,
    "preview": "//\n//  ChatMessageDisplayModel.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/13/23.\n//\n\nimport Fo"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/ChatMessageView.swift",
    "chars": 2803,
    "preview": "//\n//  ChatMessageView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/13/23.\n//\n\nimport Foundation"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/Completion/ChatFunctionCallDemoView.swift",
    "chars": 2250,
    "preview": "//\n//  ChatFunctionCallDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/14/23.\n//\n\nimport S"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/Completion/ChatFunctionCallProvider.swift",
    "chars": 8764,
    "preview": "//\n//  ChatFunctionCallProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/14/23.\n//\n\nimport S"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/Stream/ChatFunctionsCallStreamProvider.swift",
    "chars": 10897,
    "preview": "//\n//  ChatFunctionsCallStreamProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/6/23.\n//\n\nim"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/Stream/ChatFunctionsCalllStreamDemoView.swift",
    "chars": 2274,
    "preview": "//\n//  ChatFunctionsCalllStreamDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/6/23.\n//\n\ni"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStreamFluidConversationDemo/ChatFluidConversationProvider.swift",
    "chars": 6604,
    "preview": "//\n//  ChatFluidConversationProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/4/23.\n//\n\nimpo"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStreamFluidConversationDemo/ChatStreamFluidConversationDemoView.swift",
    "chars": 2586,
    "preview": "//\n//  ChatStreamFluidConversationDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/4/23.\n//"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStructureOutputTool/ChatStructureOutputToolDemoView.swift",
    "chars": 2248,
    "preview": "//\n//  ChatStructureOutputToolDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 8/11/24.\n//\n\nim"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStructureOutputTool/ChatStructuredOutputToolProvider.swift",
    "chars": 8044,
    "preview": "//\n//  ChatStructuredOutputToolProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 8/11/24.\n//\n\ni"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStructuredOutputs/ChatStructuredOutputDemoView.swift",
    "chars": 5412,
    "preview": "//\n//  ChatStructuredOutputDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 8/10/24.\n//\n\nimpor"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStructuredOutputs/ChatStructuredOutputProvider.swift",
    "chars": 2844,
    "preview": "//\n//  ChatStructuredOutputProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 8/10/24.\n//\n\nimpor"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/EmbeddingsDemo/EmbeddingsDemoView.swift",
    "chars": 1917,
    "preview": "//\n//  EmbeddingsDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/23/23.\n//\n\nimport SwiftOp"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/EmbeddingsDemo/Embeddingsprovider.swift",
    "chars": 492,
    "preview": "//\n//  Embeddingsprovider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/23/23.\n//\n\nimport SwiftOp"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Files/AttachmentView.swift",
    "chars": 1150,
    "preview": "//\n//  AttachmentView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 5/29/24.\n//\n\nimport SwiftUI\n\nstr"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Files/FileAttachmentView.swift",
    "chars": 3317,
    "preview": "//\n//  FileAttachmentView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 5/29/24.\n//\n\nimport SwiftOpe"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Files/FilesPicker.swift",
    "chars": 3994,
    "preview": "//\n//  FilesPicker.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 5/29/24.\n//\n\nimport SwiftOpenAI\nimp"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Files/FilesPickerProvider.swift",
    "chars": 1090,
    "preview": "//\n//  FilesPickerProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 5/29/24.\n//\n\nimport SwiftOp"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/FilesDemo/FilesDemoView.swift",
    "chars": 3549,
    "preview": "//\n//  FilesDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/23/23.\n//\n\nimport SwiftOpenAI\n"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/FilesDemo/FilesProvider.swift",
    "chars": 1083,
    "preview": "//\n//  FilesProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/23/23.\n//\n\nimport SwiftOpenAI\n"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/FineTuningDemo/FineTuningJobDemoView.swift",
    "chars": 1985,
    "preview": "//\n//  FineTuningJobDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/23/23.\n//\n\nimport Swif"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/FineTuningDemo/FineTuningJobProvider.swift",
    "chars": 1366,
    "preview": "//\n//  FineTuningJobProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/23/23.\n//\n\nimport Swif"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ImagesDemo/ImagesDemoView.swift",
    "chars": 1698,
    "preview": "//\n//  ImagesDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/24/23.\n//\n\nimport SwiftOpenAI"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ImagesDemo/ImagesProvider.swift",
    "chars": 973,
    "preview": "//\n//  ImagesProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/23/23.\n//\n\nimport SwiftOpenAI"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/LocalChatDemo/LocalChatDemoView.swift",
    "chars": 3864,
    "preview": "//\n//  LocalChatDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 6/24/24.\n//\n\nimport SwiftOpen"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/LocalHostEntryView.swift",
    "chars": 1020,
    "preview": "//\n//  LocalHostEntryView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 6/24/24.\n//\n\nimport SwiftOpe"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ModelsDemo/ModelsDemoView.swift",
    "chars": 1017,
    "preview": "//\n//  ModelsDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/24/23.\n//\n\nimport SwiftOpenAI"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ModelsDemo/ModelsProvider.swift",
    "chars": 748,
    "preview": "//\n//  ModelsProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/24/23.\n//\n\nimport SwiftOpenAI"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ModerationsDemo/ModerationDemoView.swift",
    "chars": 1493,
    "preview": "//\n//  ModerationDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/24/23.\n//\n\nimport SwiftOp"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ModerationsDemo/ModerationProvider.swift",
    "chars": 707,
    "preview": "//\n//  ModerationProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/24/23.\n//\n\nimport SwiftOp"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/OptionsListView.swift",
    "chars": 3817,
    "preview": "//\n//  OptionsListView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/19/23.\n//\n\nimport SwiftOpenA"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/PredictedOutputsDemo/ChatPredictedOutputDemoView.swift",
    "chars": 4799,
    "preview": "//\n//  ChatPredictedOutputDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 1/3/25.\n//\n\nimport "
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Preview Content/Preview Assets.xcassets/Contents.json",
    "chars": 63,
    "preview": "{\n  \"info\" : {\n    \"author\" : \"xcode\",\n    \"version\" : 1\n  }\n}\n"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Resources/WorldCupData.jsonl",
    "chars": 3345,
    "preview": "{\"messages\": [{\"role\": \"system\", \"content\": \"You are a helpful assistant specializing in World Cup history.\"}, {\"role\": "
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ResponseAPIDemo/ResponseStreamDemoView.swift",
    "chars": 6694,
    "preview": "//\n//  ResponseStreamDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 6/7/25.\n//\n\nimport Swift"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ResponseAPIDemo/ResponseStreamProvider.swift",
    "chars": 5566,
    "preview": "//\n//  ResponseStreamProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 6/7/25.\n//\n\nimport Swift"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/ServiceSelectionView.swift",
    "chars": 1605,
    "preview": "//\n//  ServiceSelectionView.swift\n//  SwiftOpenAIExample\n//\n//  Created by Lou Zell on 3/27/24.\n//\n\nimport SwiftUI\n\nstru"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedModels/ChatDisplayMessage.swift",
    "chars": 1214,
    "preview": "//\n//  ChatDisplayMessage.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/4/23.\n//\n\nimport Foundati"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/ChatDisplayMessageView.swift",
    "chars": 2745,
    "preview": "//\n//  ChatDisplayMessageView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/4/23.\n//\n\nimport Swif"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/ChatMessageLoadingView.swift",
    "chars": 591,
    "preview": "//\n//  ChatMessageLoadingView.swift\n//\n//\n//  Created by James Rochabrun on 3/28/24.\n//\n\nimport Foundation\nimport SwiftU"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/LoadingView.swift",
    "chars": 585,
    "preview": "//\n//  LoadingView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/4/23.\n//\n\nimport SwiftUI\n\nstruct"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/Theme/Sizes.swift",
    "chars": 376,
    "preview": "//\n//  Sizes.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/13/23.\n//\n\nimport Foundation\n\n// MARK:"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/Theme/ThemeColor.swift",
    "chars": 221,
    "preview": "//\n//  ThemeColor.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/13/23.\n//\n\nimport SwiftUI\n\n// MAR"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/SharedUI/URLImageView.swift",
    "chars": 1376,
    "preview": "//\n//  URLImageView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/4/23.\n//\n\nimport SwiftUI\n\n// MA"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/SwiftOpenAIExample.entitlements",
    "chars": 322,
    "preview": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<!DOCTYPE plist PUBLIC \"-//Apple//DTD PLIST 1.0//EN\" \"http://www.apple.com/DTDs/P"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/SwiftOpenAIExampleApp.swift",
    "chars": 251,
    "preview": "//\n//  SwiftOpenAIExampleApp.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/19/23.\n//\n\nimport Swif"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Utilities/ContentLoader.swift",
    "chars": 825,
    "preview": "//\n//  ContentLoader.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 10/19/23.\n//\n\nimport Foundation\n\n"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Vision/ChatVisionDemoView.swift",
    "chars": 4162,
    "preview": "//\n//  ChatVisionDemoView.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/8/23.\n//\n\nimport PhotosUI"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample/Vision/ChatVisionProvider.swift",
    "chars": 4512,
    "preview": "//\n//  ChatVisionProvider.swift\n//  SwiftOpenAIExample\n//\n//  Created by James Rochabrun on 11/9/23.\n//\n\nimport SwiftOpe"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample.xcodeproj/project.pbxproj",
    "chars": 59989,
    "preview": "// !$*UTF8*$!\n{\n\tarchiveVersion = 1;\n\tclasses = {\n\t};\n\tobjectVersion = 60;\n\tobjects = {\n\n/* Begin PBXBuildFile section *"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample.xcodeproj/project.xcworkspace/contents.xcworkspacedata",
    "chars": 135,
    "preview": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Workspace\n   version = \"1.0\">\n   <FileRef\n      location = \"self:\">\n   </FileRef"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExample.xcodeproj/xcshareddata/xcschemes/SwiftOpenAIExample.xcscheme",
    "chars": 4308,
    "preview": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Scheme\n   LastUpgradeVersion = \"1500\"\n   version = \"2.2\">\n   <BuildAction\n      "
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExampleTests/SwiftOpenAIExampleTests.swift",
    "chars": 1163,
    "preview": "//\n//  SwiftOpenAIExampleTests.swift\n//  SwiftOpenAIExampleTests\n//\n//  Created by James Rochabrun on 10/19/23.\n//\n\nimpo"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExampleUITests/SwiftOpenAIExampleUITests.swift",
    "chars": 1322,
    "preview": "//\n//  SwiftOpenAIExampleUITests.swift\n//  SwiftOpenAIExampleUITests\n//\n//  Created by James Rochabrun on 10/19/23.\n//\n\n"
  },
  {
    "path": "Examples/SwiftOpenAIExample/SwiftOpenAIExampleUITests/SwiftOpenAIExampleUITestsLaunchTests.swift",
    "chars": 786,
    "preview": "//\n//  SwiftOpenAIExampleUITestsLaunchTests.swift\n//  SwiftOpenAIExampleUITests\n//\n//  Created by James Rochabrun on 10/"
  },
  {
    "path": "LICENSE",
    "chars": 1072,
    "preview": "MIT License\n\nCopyright (c) 2023 James Rochabrun\n\nPermission is hereby granted, free of charge, to any person obtaining a"
  },
  {
    "path": "Package.swift",
    "chars": 1066,
    "preview": "// swift-tools-version: 5.9\n// The swift-tools-version declares the minimum version of Swift required to build this pack"
  },
  {
    "path": "README.md",
    "chars": 200884,
    "preview": "# SwiftOpenAI\n<img width=\"1090\" alt=\"repoOpenAI\" src=\"https://github.com/jamesrochabrun/SwiftOpenAI/assets/5378604/51bc5"
  },
  {
    "path": "Sources/OpenAI/AIProxy/AIProxyCertificatePinning.swift",
    "chars": 7655,
    "preview": "//\n//  AIProxyCertificatePinning.swift\n//\n//\n//  Created by Lou Zell on 6/23/24.\n//\n#if !os(Linux)\nimport Foundation\nimp"
  },
  {
    "path": "Sources/OpenAI/AIProxy/AIProxyService.swift",
    "chars": 48172,
    "preview": "//\n//  AIProxyService.swift\n//\n//\n//  Created by Lou Zell on 3/27/24.\n//\n#if !os(Linux)\nimport Foundation\n\nprivate let a"
  },
  {
    "path": "Sources/OpenAI/AIProxy/Endpoint+AIProxy.swift",
    "chars": 7644,
    "preview": "//\n//  Endpoint+AIProxy.swift\n//\n//\n//  Created by Lou Zell on 3/26/24.\n//\n#if !os(Linux)\nimport DeviceCheck\nimport Foun"
  },
  {
    "path": "Sources/OpenAI/Azure/AzureOpenAIAPI.swift",
    "chars": 6245,
    "preview": "//\n//  AzureOpenAIAPI.swift\n//\n//\n//  Created by James Rochabrun on 1/23/24.\n//\n\nimport Foundation\n\n// MARK: - AzureOpen"
  },
  {
    "path": "Sources/OpenAI/Azure/AzureOpenAIConfiguration.swift",
    "chars": 871,
    "preview": "//\n//  AzureOpenAIConfiguration.swift\n//\n//\n//  Created by James Rochabrun on 1/23/24.\n//\n\nimport Foundation\n\n/// [Refer"
  },
  {
    "path": "Sources/OpenAI/Azure/DefaultOpenAIAzureService.swift",
    "chars": 40815,
    "preview": "//\n//  DefaultOpenAIAzureService.swift\n//\n//\n//  Created by James Rochabrun on 1/23/24.\n//\n\nimport Foundation\n#if os(Lin"
  },
  {
    "path": "Sources/OpenAI/LocalModelService/LocalModelAPI.swift",
    "chars": 1164,
    "preview": "//\n//  LocalModelAPI.swift\n//\n//\n//  Created by James Rochabrun on 6/30/24.\n//\n\nimport Foundation\n\n// MARK: - LocalModel"
  },
  {
    "path": "Sources/OpenAI/LocalModelService/LocalModelService.swift",
    "chars": 30993,
    "preview": "//\n//  LocalModelService.swift\n//\n//\n//  Created by James Rochabrun on 6/30/24.\n//\n\nimport Foundation\n\nstruct LocalModel"
  },
  {
    "path": "Sources/OpenAI/Private/Audio/AudioPCMPlayer.swift",
    "chars": 3769,
    "preview": "//\n//  AudioPCMPlayer.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.com/lzell/AIPr"
  },
  {
    "path": "Sources/OpenAI/Private/Audio/AudioPCMPlayerError.swift",
    "chars": 400,
    "preview": "//\n//  AudioPCMPlayerError.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.com/lzell"
  },
  {
    "path": "Sources/OpenAI/Private/Audio/AudioUtils.swift",
    "chars": 6783,
    "preview": "//\n//  AudioUtils.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.com/lzell/AIProxyS"
  },
  {
    "path": "Sources/OpenAI/Private/Audio/MicrophonePCMSampleVendor.swift",
    "chars": 334,
    "preview": "//\n//  MicrophonePCMSampleVendor.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.com"
  },
  {
    "path": "Sources/OpenAI/Private/Audio/MicrophonePCMSampleVendorAE.swift",
    "chars": 4297,
    "preview": "//\n//  MicrophonePCMSampleVendorAE.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.c"
  },
  {
    "path": "Sources/OpenAI/Private/Audio/MicrophonePCMSampleVendorAT.swift",
    "chars": 9170,
    "preview": "//\n//  MicrophonePCMSampleVendorAT.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.c"
  },
  {
    "path": "Sources/OpenAI/Private/Audio/MicrophonePCMSampleVendorCommon.swift",
    "chars": 4707,
    "preview": "//\n//  MicrophonePCMSampleVendorCommon.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://gith"
  },
  {
    "path": "Sources/OpenAI/Private/Audio/MicrophonePCMSampleVendorError.swift",
    "chars": 418,
    "preview": "//\n//  MicrophonePCMSampleVendorError.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://githu"
  },
  {
    "path": "Sources/OpenAI/Private/Networking/AsyncHTTPClientAdapter.swift",
    "chars": 4390,
    "preview": "//\n//  AsyncHTTPClientAdapter.swift\n//  SwiftOpenAI\n//\n//  Created by Joe Fabisevich on 5/18/25.\n//\n\n#if os(Linux)\nimpor"
  },
  {
    "path": "Sources/OpenAI/Private/Networking/Endpoint.swift",
    "chars": 3041,
    "preview": "//\n//  Endpoint.swift\n//\n//\n//  Created by James Rochabrun on 10/11/23.\n//\n\nimport Foundation\n#if os(Linux)\nimport Found"
  },
  {
    "path": "Sources/OpenAI/Private/Networking/HTTPClient.swift",
    "chars": 3013,
    "preview": "import Foundation\n\n#if canImport(FoundationNetworking)\nimport FoundationNetworking\n#endif\n\n// MARK: - HTTPClient\n\n/// Pr"
  },
  {
    "path": "Sources/OpenAI/Private/Networking/MultipartFormDataBuilder.swift",
    "chars": 1852,
    "preview": "//\n//  MultipartFormDataBuilder.swift\n//\n//\n//  Created by James Rochabrun on 10/11/23.\n//\n\nimport Foundation\n\n// MARK: "
  },
  {
    "path": "Sources/OpenAI/Private/Networking/OpenAIAPI.swift",
    "chars": 12217,
    "preview": "//\n//  OpenAIAPI.swift\n//\n//\n//  Created by James Rochabrun on 10/10/23.\n//\n\nimport Foundation\n\n// MARK: - OpenAIAPI\n\nen"
  },
  {
    "path": "Sources/OpenAI/Private/Networking/URLSessionHTTPClientAdapter.swift",
    "chars": 3283,
    "preview": "import Foundation\n\n#if canImport(FoundationNetworking)\nimport FoundationNetworking\n#endif\n\n#if !os(Linux)\n/// Adapter th"
  },
  {
    "path": "Sources/OpenAI/Private/Realtime/OpenAIRealtimeSession.swift",
    "chars": 11559,
    "preview": "//\n//  OpenAIRealtimeSession.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.com/lze"
  },
  {
    "path": "Sources/OpenAI/Private/Realtime/RealtimeActor.swift",
    "chars": 240,
    "preview": "//\n//  RealtimeActor.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.com/lzell/AIPro"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Assistant/AssistantParameters.swift",
    "chars": 5666,
    "preview": "//\n//  AssistantParameters.swift\n//\n//\n//  Created by James Rochabrun on 11/15/23.\n//\n\nimport Foundation\n\n/// Create an "
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Audio/AudioSpeechParameters.swift",
    "chars": 2014,
    "preview": "//\n//  AudioSpeechParameters.swift\n//\n//\n//  Created by James Rochabrun on 11/14/23.\n//\n\nimport Foundation\n\n/// [Generat"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Audio/AudioTranscriptionParameters.swift",
    "chars": 3964,
    "preview": "//\n//  AudioTranscriptionParameters.swift\n//\n//\n//  Created by James Rochabrun on 10/10/23.\n//\n\nimport Foundation\n\n// MA"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Audio/AudioTranslationParameters.swift",
    "chars": 2942,
    "preview": "//\n//  AudioTranslationParameters.swift\n//\n//\n//  Created by James Rochabrun on 10/10/23.\n//\n\nimport Foundation\n\n// MARK"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Batch/BatchParameter.swift",
    "chars": 1240,
    "preview": "//\n//  BatchParameter.swift\n//\n//\n//  Created by James Rochabrun on 4/27/24.\n//\n\nimport Foundation\n\n/// [Create large ba"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Chat/ChatCompletionParameters.swift",
    "chars": 25728,
    "preview": "//\n//  ChatCompletionParameters.swift\n//\n//\n//  Created by James Rochabrun on 10/10/23.\n//\n\nimport Foundation\n\n/// [Crea"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Conversations/CreateConversationItemsParameter.swift",
    "chars": 897,
    "preview": "//\n//  CreateConversationItemsParameter.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 10/05/25.\n//\n\nimport "
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Conversations/CreateConversationParameter.swift",
    "chars": 889,
    "preview": "//\n//  CreateConversationParameter.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 10/05/25.\n//\n\nimport Found"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Conversations/GetConversationItemParameter.swift",
    "chars": 899,
    "preview": "//\n//  GetConversationItemParameter.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 10/05/25.\n//\n\nimport Foun"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Conversations/GetConversationItemsParameter.swift",
    "chars": 1437,
    "preview": "//\n//  GetConversationItemsParameter.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 10/05/25.\n//\n\nimport Fou"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Conversations/UpdateConversationParameter.swift",
    "chars": 680,
    "preview": "//\n//  UpdateConversationParameter.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 10/05/25.\n//\n\nimport Found"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Embedding/EmbeddingParameter.swift",
    "chars": 2137,
    "preview": "//\n//  EmbeddingParameter.swift\n//\n//\n//  Created by James Rochabrun on 10/12/23.\n//\n\nimport Foundation\n\n/// [Creates](h"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/File/FileParameter.swift",
    "chars": 1604,
    "preview": "//\n//  FileParameter.swift\n//\n//\n//  Created by James Rochabrun on 10/16/23.\n//\n\nimport Foundation\n\n// MARK: - FileParam"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/FineTuning/FineTuningJobParameters.swift",
    "chars": 4839,
    "preview": "//\n//  FineTuningJobParameters.swift\n//\n//\n//  Created by James Rochabrun on 10/17/23.\n//\n\nimport Foundation\n\n/// [Creat"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Image/Dalle.swift",
    "chars": 1838,
    "preview": "//\n//  Dalle.swift\n//\n//\n//  Created by James Rochabrun on 11/15/23.\n//\n\nimport Foundation\n\n/// [DALL·E](https://platfor"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Image/ImageCreateParameters.swift",
    "chars": 2558,
    "preview": "//\n//  ImageCreateParameters.swift\n//\n//\n//  Created by James Rochabrun on 10/12/23.\n//\n\nimport Foundation\n\n/// [Creates"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Image/ImageEditParameters.swift",
    "chars": 4571,
    "preview": "//\n//  ImageEditParameters.swift\n//\n//\n//  Created by James Rochabrun on 10/12/23.\n//\n\nimport Foundation\n#if canImport(U"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Image/ImageVariationParameters.swift",
    "chars": 3673,
    "preview": "//\n//  ImageVariationParameters.swift\n//\n//\n//  Created by James Rochabrun on 10/12/23.\n//\n\nimport Foundation\n#if canImp"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/ImageGen/CreateImageEditParameters.swift",
    "chars": 8472,
    "preview": "//\n//  CreateImageEditParameters.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 4/24/25.\n//\nimport Foundatio"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/ImageGen/CreateImageParameters.swift",
    "chars": 6596,
    "preview": "//\n//  CreateImageParameters.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 4/24/25.\n//\n\nimport Foundation\n\n"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/ImageGen/CreateImageVariationParameters.swift",
    "chars": 4403,
    "preview": "//\n//  CreateImageVariationParameters.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 4/24/25.\n//\n\nimport Fou"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Message/MessageParameter.swift",
    "chars": 4477,
    "preview": "//\n//  MessageParameter.swift\n//\n//\n//  Created by James Rochabrun on 11/16/23.\n//\n\nimport Foundation\n\n/// [Create a mes"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Message/ModifyMessageParameters.swift",
    "chars": 680,
    "preview": "//\n//  ModifyMessageParameters.swift\n//\n//\n//  Created by James Rochabrun on 11/25/23.\n//\n\nimport Foundation\n\n/// Modifi"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Model.swift",
    "chars": 4791,
    "preview": "//\n//  Model.swift\n//\n//\n//  Created by James Rochabrun on 11/15/23.\n//\n\nimport Foundation\n\n/// [Models](https://platfor"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Moderation/ModerationParameter.swift",
    "chars": 1072,
    "preview": "//\n//  ModerationParameter.swift\n//\n//\n//  Created by James Rochabrun on 10/13/23.\n//\n\nimport Foundation\n\n/// [Classifie"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/MultipartFormDataParameters.swift",
    "chars": 202,
    "preview": "//\n//  MultipartFormDataParameters.swift\n//\n//\n//  Created by James Rochabrun on 10/11/23.\n//\n\nimport Foundation\n\npublic"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Realtime/OpenAIRealtimeConversationItemCreate.swift",
    "chars": 1789,
    "preview": "//\n//  OpenAIRealtimeConversationItemCreate.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https:/"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Realtime/OpenAIRealtimeInputAudioBufferAppend.swift",
    "chars": 407,
    "preview": "//\n//  OpenAIRealtimeInputAudioBufferAppend.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https:/"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Realtime/OpenAIRealtimeResponseCreate.swift",
    "chars": 1450,
    "preview": "//\n//  OpenAIRealtimeResponseCreate.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github."
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Realtime/OpenAIRealtimeSessionConfiguration.swift",
    "chars": 11163,
    "preview": "// MARK: - OpenAIRealtimeSessionConfiguration\n\n//\n//  OpenAIRealtimeSessionConfiguration.swift\n//  SwiftOpenAI\n//\n//  Cr"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Realtime/OpenAIRealtimeSessionUpdate.swift",
    "chars": 929,
    "preview": "//\n//  OpenAIRealtimeSessionUpdate.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.c"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Response/Conversation.swift",
    "chars": 2987,
    "preview": "//\n//  Conversation.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 3/15/25.\n//\n\nimport Foundation\n\n// MARK: "
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Response/GetInputItemsParameter.swift",
    "chars": 1140,
    "preview": "//\n//  GetInputItemsParameter.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 3/15/25.\n//\n\nimport Foundation\n"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Response/GetResponseParameter.swift",
    "chars": 1415,
    "preview": "//\n//  GetResponseParameter.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 3/15/25.\n//\n\nimport Foundation\n\n/"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Response/InputType.swift",
    "chars": 18510,
    "preview": "//\n//  InputType.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 3/15/25.\n//\n\nimport Foundation\n\n// MARK: - I"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Response/ModelResponseParameter.swift",
    "chars": 12263,
    "preview": "//\n//  ModelResponseParameter.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 3/15/25.\n//\n\nimport Foundation\n"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Response/Prompt.swift",
    "chars": 1972,
    "preview": "//\n//  Prompt.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 3/15/25.\n//\n\nimport Foundation\n\n// MARK: - Prom"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Response/ResponseInclude.swift",
    "chars": 1397,
    "preview": "//\n//  ResponseInclude.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 3/15/25.\n//\n\nimport Foundation\n\n// MAR"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Response/StreamOptions.swift",
    "chars": 983,
    "preview": "//\n//  StreamOptions.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 3/15/25.\n//\n\nimport Foundation\n\n// MARK:"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Runs/CreateThreadAndRunParameter.swift",
    "chars": 5667,
    "preview": "//\n//  CreateThreadAndRunParameter.swift\n//\n//\n//  Created by James Rochabrun on 11/17/23.\n//\n\nimport Foundation\n\n/// [C"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Runs/ModifyRunParameters.swift",
    "chars": 660,
    "preview": "//\n//  ModifyRunParameters.swift\n//\n//\n//  Created by James Rochabrun on 11/29/23.\n//\n\nimport Foundation\n\n/// Modifies a"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Runs/RunParameter.swift",
    "chars": 5802,
    "preview": "//\n//  RunParameter.swift\n//\n//\n//  Created by James Rochabrun on 11/16/23.\n//\n\nimport Foundation\n\n/// [Create a run.](h"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Runs/RunToolsOutputParameter.swift",
    "chars": 1523,
    "preview": "//\n//  RunToolsOutputParameter.swift\n//\n//\n//  Created by James Rochabrun on 11/16/23.\n//\n\nimport Foundation\n\n/// When a"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Threads/CreateThreadParameters.swift",
    "chars": 1349,
    "preview": "//\n//  CreateThreadParameters.swift\n//\n//\n//  Created by James Rochabrun on 11/16/23.\n//\n\nimport Foundation\n\n/// Create "
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/Threads/ModifyThreadParameters.swift",
    "chars": 675,
    "preview": "//\n//  ModifyThreadParameters.swift\n//\n//\n//  Created by James Rochabrun on 11/25/23.\n//\n\nimport Foundation\n\n/// Modifie"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/VectorStore/VectorStoreParameter.swift",
    "chars": 2066,
    "preview": "//\n//  VectorStoreParameter.swift\n//\n//\n//  Created by James Rochabrun on 4/27/24.\n//\n\nimport Foundation\n\n/// Vector sto"
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/VectorStoreFileBatch/VectorStoreFileBatchParameter.swift",
    "chars": 646,
    "preview": "//\n//  VectorStoreFileBatchParameter.swift\n//\n//\n//  Created by James Rochabrun on 4/29/24.\n//\n\nimport Foundation\n\n/// ["
  },
  {
    "path": "Sources/OpenAI/Public/Parameters/VectorStoreFiles/VectorStoreFileParameter.swift",
    "chars": 582,
    "preview": "//\n//  VectorStoreFileParameter.swift\n//\n//\n//  Created by James Rochabrun on 4/28/24.\n//\n\nimport Foundation\n\n/// [Vecto"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Assistants/AssistantObject.swift",
    "chars": 5043,
    "preview": "//\n//  AssistantObject.swift\n//\n//\n//  Created by James Rochabrun on 11/15/23.\n//\n\nimport Foundation\n\n/// BETA.\n/// Repr"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Assistants/AssistantStreamEvent.swift",
    "chars": 2722,
    "preview": "//\n//  AssistantStreamEvent.swift\n//\n//\n//  Created by James Rochabrun on 3/22/24.\n//\n\nimport Foundation\n\n/// A model th"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Assistants/AssistantStreamEventObject.swift",
    "chars": 5051,
    "preview": "//\n//  AssistantStreamEventObject.swift\n//\n//\n//  Created by James Rochabrun on 3/22/24.\n//\n\nimport Foundation\n\n/// Repr"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Audio/AudioObject.swift",
    "chars": 2285,
    "preview": "//\n//  AudioObject.swift\n//\n//\n//  Created by James Rochabrun on 10/10/23.\n//\n\nimport Foundation\n\n/// The [audio](https:"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Audio/AudioSpeechObject.swift",
    "chars": 307,
    "preview": "//\n//  AudioSpeechObject.swift\n//\n//\n//  Created by James Rochabrun on 11/14/23.\n//\n\nimport Foundation\n\n/// The [audio s"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Batch/BatchObject.swift",
    "chars": 3444,
    "preview": "//\n//  BatchObject.swift\n//\n//\n//  Created by James Rochabrun on 4/27/24.\n//\n\nimport Foundation\n\npublic struct BatchObje"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Chat/ChatCompletionChunkObject.swift",
    "chars": 5428,
    "preview": "//\n//  ChatCompletionChunkObject.swift\n//\n//\n//  Created by James Rochabrun on 10/10/23.\n//\n\nimport Foundation\n\n/// Repr"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Chat/ChatCompletionObject.swift",
    "chars": 6138,
    "preview": "//\n//  ChatCompletionObject.swift\n//\n//\n//  Created by James Rochabrun on 10/10/23.\n//\n\nimport Foundation\n\n/// Represent"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Chat/ChatUsage.swift",
    "chars": 1961,
    "preview": "//\n//  ChatUsage.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 1/3/25.\n//\n\nimport Foundation\n\n// MARK: - Ch"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Conversations/ConversationModel.swift",
    "chars": 785,
    "preview": "//\n//  ConversationModel.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 10/5/25.\n//\n\nimport Foundation\n\n/// "
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Delta.swift",
    "chars": 416,
    "preview": "//\n//  Delta.swift\n//\n//\n//  Created by James Rochabrun on 3/22/24.\n//\n\nimport Foundation\n\n/// Protocol for Assistant St"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Embedding/EmbeddingObject.swift",
    "chars": 661,
    "preview": "//\n//  EmbeddingObject.swift\n//\n//\n//  Created by James Rochabrun on 10/12/23.\n//\n\nimport Foundation\n\n/// [Represents an"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/File/FileObject.swift",
    "chars": 1657,
    "preview": "//\n//  FileObject.swift\n//\n//\n//  Created by James Rochabrun on 10/16/23.\n//\n\nimport Foundation\n\n/// The [File object](h"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/FineTuning/FineTuningJobEventObject.swift",
    "chars": 940,
    "preview": "//\n//  FineTuningJobEventObject.swift\n//\n//\n//  Created by James Rochabrun on 10/17/23.\n//\n\nimport Foundation\n\n/// [Fine"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/FineTuning/FineTuningJobObject.swift",
    "chars": 4121,
    "preview": "//\n//  FineTuningJobObject.swift\n//\n//\n//  Created by James Rochabrun on 10/17/23.\n//\n\nimport Foundation\n\n// MARK: - Fin"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Image/ImageObject.swift",
    "chars": 756,
    "preview": "//\n//  ImageObject.swift\n//\n//\n//  Created by James Rochabrun on 10/12/23.\n//\n\nimport Foundation\n\n/// [Represents the ur"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/ImageGen/CreateImageResponse.swift",
    "chars": 2005,
    "preview": "//\n//  CreateImageResponse.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 4/24/25.\n//\n\nimport Foundation\n\n//"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Messages/MessageContent.swift",
    "chars": 6985,
    "preview": "//\n//  MessageContent.swift\n//\n//\n//  Created by James Rochabrun on 3/17/24.\n//\n\nimport Foundation\n\n// MARK: - Assistant"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Messages/MessageDeltaObject.swift",
    "chars": 1052,
    "preview": "//\n//  MessageDeltaObject.swift\n//\n//\n//  Created by James Rochabrun on 3/17/24.\n//\n\nimport Foundation\n\n/// [MessageDelt"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Messages/MessageObject.swift",
    "chars": 3292,
    "preview": "//\n//  MessageObject.swift\n//\n//\n//  Created by James Rochabrun on 11/15/23.\n//\n\nimport Foundation\n\n/// BETA.\n/// Repres"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Model/ModelObject.swift",
    "chars": 1924,
    "preview": "//\n//  ModelObject.swift\n//\n//\n//  Created by James Rochabrun on 10/13/23.\n//\n\nimport Foundation\n\n/// Describes an OpenA"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Moderation/ModerationObject.swift",
    "chars": 3582,
    "preview": "//\n//  ModerationObject.swift\n//\n//\n//  Created by James Rochabrun on 10/13/23.\n//\n\nimport Foundation\n\n/// The [moderati"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/OpenAIErrorResponse.swift",
    "chars": 594,
    "preview": "//\n//  OpenAIErrorResponse.swift\n//\n//\n//  Created by James Rochabrun on 11/13/23.\n//\n\nimport Foundation\n\n// {\n//  \"erro"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/OpenAIResponse.swift",
    "chars": 1268,
    "preview": "//\n//  OpenAIResponse.swift\n//\n//\n//  Created by James Rochabrun on 10/13/23.\n//\n\nimport Foundation\n\n/// A generic struc"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Realtime/OpenAIRealtimeInputAudioBufferSpeechStarted.swift",
    "chars": 598,
    "preview": "//\n//  OpenAIRealtimeInputAudioBufferSpeechStarted.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: "
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Realtime/OpenAIRealtimeMessage.swift",
    "chars": 2337,
    "preview": "//\n//  OpenAIRealtimeMessage.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.com/lze"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Realtime/OpenAIRealtimeResponseFunctionCallArgumentsDone.swift",
    "chars": 756,
    "preview": "//\n//  OpenAIRealtimeResponseFunctionCallArgumentsDone.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Origin"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Response/OutputItem.swift",
    "chars": 22843,
    "preview": "//\n//  OutputItem.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 3/15/25.\n\nimport Foundation\n\n/// An output "
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Response/ResponseModel.swift",
    "chars": 10677,
    "preview": "//\n//  ResponseModel.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 3/15/25.\n//\n\nimport Foundation\n\n/// The "
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Response/ResponseStreamEvent.swift",
    "chars": 44898,
    "preview": "//\n//  ResponseStreamEvent.swift\n//  SwiftOpenAI\n//\n//  Created by James Rochabrun on 6/7/25.\n//\n\nimport Foundation\n\n// "
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Runs/RunObject.swift",
    "chars": 7097,
    "preview": "//\n//  RunObject.swift\n//\n//\n//  Created by James Rochabrun on 11/15/23.\n//\n\nimport Foundation\n\n/// BETA.\n/// A [run](ht"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Runs/RunStepDeltaObject.swift",
    "chars": 838,
    "preview": "//\n//  RunStepDeltaObject.swift\n//\n//\n//  Created by James Rochabrun on 3/17/24.\n//\n\nimport Foundation\n\n/// Represents a"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Runs/RunStepDetails.swift",
    "chars": 8746,
    "preview": "//\n//  RunStepDetails.swift\n//\n//\n//  Created by James Rochabrun on 3/17/24.\n//\n\nimport Foundation\n\n// MARK: - RunStepDe"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Runs/RunStepObject.swift",
    "chars": 4987,
    "preview": "//\n//  RunStepObject.swift\n//\n//\n//  Created by James Rochabrun on 11/17/23.\n//\n\nimport Foundation\n\n/// Represents a [st"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/Threads/ThreadObject.swift",
    "chars": 1558,
    "preview": "//\n//  ThreadObject.swift\n//\n//\n//  Created by James Rochabrun on 11/15/23.\n//\n\nimport Foundation\n\n/// BETA\n/// A [threa"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/VectorStore/VectorStoreObject.swift",
    "chars": 1766,
    "preview": "//\n//  VectorStoreObject.swift\n//\n//\n//  Created by James Rochabrun on 4/27/24.\n//\n\nimport Foundation\n\npublic struct Vec"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/VectorStoreFile/VectorStoreFileObject.swift",
    "chars": 1263,
    "preview": "//\n//  VectorStoreFileObject.swift\n//\n//\n//  Created by James Rochabrun on 4/28/24.\n//\n\nimport Foundation\n\n/// [The Vect"
  },
  {
    "path": "Sources/OpenAI/Public/ResponseModels/VectorStoreFileBatch/VectorStoreFileBatchObject.swift",
    "chars": 1204,
    "preview": "//\n//  VectorStoreFileBatchObject.swift\n//\n//\n//  Created by James Rochabrun on 4/29/24.\n//\n\nimport Foundation\n\n/// [The"
  },
  {
    "path": "Sources/OpenAI/Public/Service/DefaultOpenAIService.swift",
    "chars": 54053,
    "preview": "//\n//  DefaultOpenAIService.swift\n//\n//\n//  Created by James Rochabrun on 10/17/23.\n//\n\nimport Foundation\n\nstruct Defaul"
  },
  {
    "path": "Sources/OpenAI/Public/Service/OpenAIService.swift",
    "chars": 74688,
    "preview": "//\n//  OpenAIService.swift\n//\n//\n//  Created by James Rochabrun on 10/17/23.\n//\n\nimport Foundation\n#if os(Linux)\nimport "
  },
  {
    "path": "Sources/OpenAI/Public/Service/OpenAIServiceFactory.swift",
    "chars": 6669,
    "preview": "//\n//  OpenAIServiceFactory.swift\n//\n//\n//  Created by James Rochabrun on 10/18/23.\n//\n\nimport Foundation\n\npublic class "
  },
  {
    "path": "Sources/OpenAI/Public/Shared/AudioController.swift",
    "chars": 4033,
    "preview": "//\n//  AudioController.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.com/lzell/AIP"
  },
  {
    "path": "Sources/OpenAI/Public/Shared/DeletionStatus.swift",
    "chars": 223,
    "preview": "//\n//  DeletionStatus.swift\n//\n//\n//  Created by James Rochabrun on 4/27/24.\n//\n\nimport Foundation\n\npublic struct Deleti"
  },
  {
    "path": "Sources/OpenAI/Public/Shared/ExpirationPolicy.swift",
    "chars": 366,
    "preview": "//\n//  ExpirationPolicy.swift\n//\n//\n//  Created by James Rochabrun on 4/27/24.\n//\n\nimport Foundation\n\npublic struct Expi"
  },
  {
    "path": "Sources/OpenAI/Public/Shared/FileCount.swift",
    "chars": 644,
    "preview": "//\n//  FileCount.swift\n//\n//\n//  Created by James Rochabrun on 4/29/24.\n//\n\nimport Foundation\n\npublic struct FileCount: "
  },
  {
    "path": "Sources/OpenAI/Public/Shared/IncompleteDetails.swift",
    "chars": 531,
    "preview": "//\n//  IncompleteDetails.swift\n//\n//\n//  Created by James Rochabrun on 4/25/24.\n//\n\nimport Foundation\n\n/// Message: On a"
  },
  {
    "path": "Sources/OpenAI/Public/Shared/JSONSchema.swift",
    "chars": 6535,
    "preview": "//\n//  JSONSchema.swift\n//\n//\n//  Created by James Rochabrun on 8/10/24.\n//\n\nimport Foundation\n\n// MARK: - JSONSchemaTyp"
  },
  {
    "path": "Sources/OpenAI/Public/Shared/LastError.swift",
    "chars": 272,
    "preview": "//\n//  LastError.swift\n//\n//\n//  Created by James Rochabrun on 4/28/24.\n//\n\nimport Foundation\n\npublic struct LastError: "
  },
  {
    "path": "Sources/OpenAI/Public/Shared/MessageAttachment.swift",
    "chars": 611,
    "preview": "//\n//  MessageAttachment.swift\n//\n//\n//  Created by James Rochabrun on 4/25/24.\n//\n\nimport Foundation\n\n/// Messages have"
  },
  {
    "path": "Sources/OpenAI/Public/Shared/OpenAIError.swift",
    "chars": 439,
    "preview": "//\n//  OpenAIError.swift\n//  SwiftOpenAI\n//\n\nimport Foundation\n\npublic enum OpenAIError: LocalizedError {\n  case audioCo"
  },
  {
    "path": "Sources/OpenAI/Public/Shared/OpenAIJSONValue.swift",
    "chars": 5249,
    "preview": "//\n//  OpenAIJSONValue.swift\n//  SwiftOpenAI\n//\n//  Created from AIProxySwift\n//  Original: https://github.com/lzell/AIP"
  }
]

// ... and 19 more files (download for full content)

About this extraction

This page contains the full source code of the jamesrochabrun/SwiftOpenAI GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 219 files (1.3 MB, approximately 331.1k tokens). Use it with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.

Extracted by GitExtract, a free GitHub repo to text converter for AI. Built by Nikandr Surkov.
