[
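  {
    "path": ".env.example",
    "content": "# Example environment file: copy to .env and fill in real values.\n# Variable names mirror the settings in backend/app/core/config.py.\nAPP_ENV=development\nAPP_HOST=0.0.0.0\nAPP_PORT=8000\nLOG_LEVEL=INFO\n\n# Required for LLM calls\nOPENAI_API_KEY=\nOPENAI_MODEL=gpt-4.1\n\nVECTOR_STORE_PROVIDER=chroma\nCHROMA_PERSIST_DIRECTORY=./data/chroma\n\nDATABASE_URL=sqlite:///./workflow_engine.db\n"
  },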
  {
    "path": ".github/workflows/deploy-pages.yml",
    "content": "name: Deploy Next.js static site to Pages\n\non:\n  push:\n    branches: [\"main\"]\n  workflow_dispatch:\n\npermissions:\n  contents: read\n  pages: write\n  id-token: write\n\nconcurrency:\n  group: \"pages\"\n  cancel-in-progress: true\n\njobs:\n  build:\n    runs-on: ubuntu-latest\n    steps:\n      - name: Checkout\n        uses: actions/checkout@v4\n\n      - name: Setup Node\n        uses: actions/setup-node@v4\n        with:\n          node-version: 22\n\n      - name: Install dependencies\n        run: npm install\n        working-directory: frontend\n\n      - name: Build static site\n        run: npm run build\n        working-directory: frontend\n\n      - name: Upload Pages artifact\n        uses: actions/upload-pages-artifact@v3\n        with:\n          path: frontend/out\n\n  deploy:\n    environment:\n      name: github-pages\n      url: ${{ steps.deployment.outputs.page_url }}\n    runs-on: ubuntu-latest\n    needs: build\n    steps:\n      - name: Deploy to GitHub Pages\n        id: deployment\n        uses: actions/deploy-pages@v4\n"
  },
  {
    "path": ".gitignore",
    "content": "# Python\n__pycache__/\n*.py[cod]\n*.so\n.venv/\n.pytest_cache/\n.mypy_cache/\n.coverage\nhtmlcov/\n\n# Node\nnode_modules/\n.next/\nout/\ndist/\ncoverage/\n\n# Environment\n.env\n.env.local\n\n# IDE\n.vscode/\n.idea/\n.DS_Store\n\n# Build\nbuild/\n*.log\n"
  },
  {
    "path": "README.md",
    "content": "# Agentic AI Workflow Engine\n\nA production-style, full-stack orchestration platform for autonomous market analysis workflows using **LangGraph**, **OpenAI**, and **vector retrieval patterns**.\n\n## Architecture Overview\n\n- **Backend**: FastAPI service exposing workflow execution APIs.\n- **Orchestration**: LangGraph state-machine graph with explicit node transitions (`research -> analysis -> recommendations`).\n- **LLM Layer**: OpenAI chat model abstraction with configurable model selection.\n- **Retrieval Layer**: Chroma-based vector service abstraction for extensible RAG workflows.\n- **Frontend**: Next.js dashboard to trigger workflows and inspect results.\n- **Runtime**: Docker Compose for local full-stack bootstrapping.\n\n## Repository Structure\n\n```text\nbackend/\n  app/\n    api/           # HTTP routers\n    core/          # config + logging\n    graph/         # LangGraph workflow definitions\n    schemas/       # API contracts\n    services/      # LLM and vector abstractions\n  tests/\nfrontend/\n  src/\n    app/           # Next.js pages\n    components/    # UI components\n    lib/           # API clients\n    types/         # shared types\ninfra/             # reserved for IaC modules\nscripts/           # tooling scripts\ndocs/              # architecture and ADRs\n```\n\n## Local Development\n\n### 1) Prerequisites\n- Python 3.11+\n- Node 22+\n- Docker + Docker Compose\n\n### 2) Configure environment\n\n```bash\ncp .env.example .env\n```\n\nPopulate at least:\n- `OPENAI_API_KEY`\n- `OPENAI_MODEL`\n\n### 3) Run backend\n\n```bash\ncd backend\npython -m venv .venv && source .venv/bin/activate\npip install -e \".[dev]\"\nuvicorn app.main:app --reload --port 8000\n```\n\n### 4) Run frontend\n\n```bash\ncd frontend\nnpm install\nnpm run dev\n```\n\n### 5) Run full stack with Docker\n\n```bash\ndocker compose up --build\n```\n\n## API Contract\n\n### POST `/workflows/market-analysis`\n\nRequest:\n```json\n{\n  \"company\": \"NVIDIA\",\n  \"objectives\": [\"Competitive positioning\", \"Pricing strategy\"],\n  \"context\": {}\n}\n```\n\nResponse:\n```json\n{\n  \"workflow_id\": \"uuid\",\n  \"status\": \"completed\",\n  \"summary\": \"...\",\n  \"artifacts\": {\n    \"research_notes\": \"...\",\n    \"recommendations\": \"...\"\n  }\n}\n```\n\n## Engineering Characteristics\n\n- Clear separation of concerns (API / orchestration / services / schema boundaries).\n- Strong typing across Python and TypeScript.\n- Environment-driven configuration with explicit defaults.\n- Dockerized runtime and clean onboarding docs.\n- Test scaffold for API health and easy extension for integration tests.\n\n## Scalability Roadmap\n\n1. Add a task queue + worker runtime (Celery/Arq) for asynchronous workflow runs.\n2. Persist workflow state transitions in a SQLModel-backed store.\n3. Add an observability stack (OpenTelemetry traces, structured logs, metrics).\n4. Add multi-tenant authn/authz and policy-enforced tool access.\n5. Expand the graph with conditional branches and evaluator nodes.\n\n## Quality Gates (recommended)\n\nBackend:\n```bash\npytest\nruff check .\nmypy app\n```\n\nFrontend:\n```bash\nnpm run lint\nnpm run build\n```\n\n## GitHub Pages Demo Deployment\n\nThis repository includes a ready-to-use workflow at `.github/workflows/deploy-pages.yml`.\n\n1. Push your code to `main`.\n2. In GitHub: **Settings → Pages → Source = GitHub Actions**.\n3. Run the workflow (push to `main` or trigger a manual dispatch).\n\nThe build publishes `frontend/out/` to Pages.\n\n### Why the previous workflow failed\n\nIf `actions/setup-node` enables npm caching but the resolved lockfile path does not exist, the step fails with:\n`Some specified paths were not resolved, unable to cache dependencies.`\n\nThe workflow in this repository avoids the issue by skipping npm caching entirely, so no `cache-dependency-path` is needed.\n"
  },
  {
    "path": "backend/Dockerfile",
    "content": "FROM python:3.11-slim\nWORKDIR /app\nCOPY pyproject.toml ./\nRUN pip install --no-cache-dir -U pip && pip install --no-cache-dir .\nCOPY app ./app\nCMD [\"uvicorn\", \"app.main:app\", \"--host\", \"0.0.0.0\", \"--port\", \"8000\"]\n"
  },
  {
    "path": "backend/app/__init__.py",
    "content": ""
  },
  {
    "path": "backend/app/api/workflows.py",
    "content": "from fastapi import APIRouter\n\nfrom app.graph.workflow import run_workflow\nfrom app.schemas.workflow import WorkflowRequest, WorkflowResponse\n\nrouter = APIRouter(prefix=\"/workflows\", tags=[\"workflows\"])\n\n\n@router.post(\"/market-analysis\", response_model=WorkflowResponse)\ndef execute_market_analysis(payload: WorkflowRequest) -> WorkflowResponse:\n    result = run_workflow(payload.company, payload.objectives)\n    return WorkflowResponse(**result)\n"
  },
  {
    "path": "backend/app/core/config.py",
    "content": "from functools import lru_cache\n\nfrom pydantic_settings import BaseSettings, SettingsConfigDict\n\n\nclass Settings(BaseSettings):\n    model_config = SettingsConfigDict(env_file=\".env\", env_file_encoding=\"utf-8\", extra=\"ignore\")\n\n    app_env: str = \"development\"\n    app_host: str = \"0.0.0.0\"\n    app_port: int = 8000\n    log_level: str = \"INFO\"\n\n    openai_api_key: str = \"\"\n    openai_model: str = \"gpt-4.1\"\n\n    vector_store_provider: str = \"chroma\"\n    chroma_persist_directory: str = \"./data/chroma\"\n\n    database_url: str = \"sqlite:///./workflow_engine.db\"\n\n\n@lru_cache(maxsize=1)\ndef get_settings() -> Settings:\n    return Settings()\n"
  },
  {
    "path": "backend/app/core/logging.py",
    "content": "import logging\n\n\ndef configure_logging(level: str = \"INFO\") -> None:\n    logging.basicConfig(\n        level=getattr(logging, level.upper(), logging.INFO),\n        format=\"%(asctime)s | %(levelname)s | %(name)s | %(message)s\",\n    )\n"
  },
  {
    "path": "backend/app/graph/workflow.py",
    "content": "from functools import lru_cache\nfrom typing import TypedDict\nfrom uuid import uuid4\n\nfrom langgraph.graph import END, StateGraph\n\nfrom app.services.llm_service import LLMService\n\n\nclass MarketState(TypedDict):\n    company: str\n    objectives: list[str]\n    research_notes: str\n    analysis: str\n    recommendations: str\n\n\n_llm: LLMService | None = None\n\n\ndef _get_llm() -> LLMService:\n    # Lazy construction: importing this module (e.g. in tests) should not\n    # require OpenAI credentials.\n    global _llm\n    if _llm is None:\n        _llm = LLMService()\n    return _llm\n\n\ndef research_node(state: MarketState) -> MarketState:\n    objectives = \", \".join(state[\"objectives\"]) or \"general market positioning\"\n    prompt = f\"Research top market dynamics for {state['company']}, focusing on: {objectives}.\"\n    state[\"research_notes\"] = _get_llm().summarize(prompt)\n    return state\n\n\ndef analysis_node(state: MarketState) -> MarketState:\n    prompt = (\n        f\"Use notes to produce strategic analysis for {state['company']}: {state['research_notes']}\"\n    )\n    state[\"analysis\"] = _get_llm().summarize(prompt)\n    return state\n\n\ndef recommendations_node(state: MarketState) -> MarketState:\n    prompt = (\n        f\"Provide 5 executive recommendations for {state['company']} based on {state['analysis']}\"\n    )\n    state[\"recommendations\"] = _get_llm().summarize(prompt)\n    return state\n\n\n@lru_cache(maxsize=1)\ndef build_market_workflow():\n    # Compile the graph once and reuse it across requests.\n    graph = StateGraph(MarketState)\n    graph.add_node(\"research\", research_node)\n    graph.add_node(\"analysis\", analysis_node)\n    graph.add_node(\"recommendations\", recommendations_node)\n\n    graph.set_entry_point(\"research\")\n    graph.add_edge(\"research\", \"analysis\")\n    graph.add_edge(\"analysis\", \"recommendations\")\n    graph.add_edge(\"recommendations\", END)\n\n    return graph.compile()\n\n\ndef run_workflow(company: str, objectives: list[str]) -> dict:\n    workflow = build_market_workflow()\n    workflow_id = str(uuid4())\n\n    state: MarketState = {\n        \"company\": company,\n        \"objectives\": objectives,\n        \"research_notes\": \"\",\n        \"analysis\": \"\",\n        \"recommendations\": \"\",\n    }\n    result = workflow.invoke(state)\n    return {\n        \"workflow_id\": workflow_id,\n        \"status\": \"completed\",\n        \"summary\": result[\"analysis\"],\n        \"artifacts\": {\n            \"research_notes\": result[\"research_notes\"],\n            \"recommendations\": result[\"recommendations\"],\n        },\n    }\n"
  },
  {
    "path": "backend/app/main.py",
    "content": "from fastapi import FastAPI\nfrom fastapi.middleware.cors import CORSMiddleware\n\nfrom app.api.workflows import router as workflow_router\nfrom app.core.config import get_settings\nfrom app.core.logging import configure_logging\n\nsettings = get_settings()\nconfigure_logging(settings.log_level)\n\napp = FastAPI(\n    title=\"Agentic AI Workflow Engine\",\n    description=\"Enterprise orchestration engine powered by LangGraph\",\n    version=\"0.1.0\",\n)\n\n# The dashboard runs on a different origin (e.g. http://localhost:3000), so\n# browser requests need CORS headers; tighten allow_origins for production.\napp.add_middleware(\n    CORSMiddleware,\n    allow_origins=[\"*\"],\n    allow_methods=[\"*\"],\n    allow_headers=[\"*\"],\n)\n\napp.include_router(workflow_router)\n\n\n@app.get(\"/health\")\ndef health_check() -> dict[str, str]:\n    return {\"status\": \"ok\"}\n"
  },
  {
    "path": "backend/app/schemas/workflow.py",
    "content": "from typing import Any\n\nfrom pydantic import BaseModel, Field\n\n\nclass WorkflowRequest(BaseModel):\n    company: str = Field(..., description=\"Company to analyze\")\n    objectives: list[str] = Field(default_factory=list)\n    context: dict[str, Any] = Field(default_factory=dict)\n\n\nclass WorkflowResponse(BaseModel):\n    workflow_id: str\n    status: str\n    summary: str\n    artifacts: dict[str, Any]\n"
  },
  {
    "path": "backend/app/services/llm_service.py",
    "content": "from langchain_openai import ChatOpenAI\n\nfrom app.core.config import get_settings\n\n\nclass LLMService:\n    def __init__(self) -> None:\n        settings = get_settings()\n        self.client = ChatOpenAI(model=settings.openai_model, api_key=settings.openai_api_key)\n\n    def summarize(self, prompt: str) -> str:\n        response = self.client.invoke(prompt)\n        # response.content may be a string or a list of content blocks;\n        # normalize so the declared return type holds.\n        content = response.content\n        return content if isinstance(content, str) else str(content)\n"
  },
  {
    "path": "backend/app/services/vector_store.py",
    "content": "from langchain_community.vectorstores import Chroma\nfrom langchain_openai import OpenAIEmbeddings\n\nfrom app.core.config import get_settings\n\n\nclass VectorStoreService:\n    def __init__(self) -> None:\n        settings = get_settings()\n        self._embeddings = OpenAIEmbeddings(api_key=settings.openai_api_key)\n        self._store = Chroma(\n            persist_directory=settings.chroma_persist_directory,\n            embedding_function=self._embeddings,\n        )\n\n    def similarity_search(self, query: str, k: int = 4):\n        return self._store.similarity_search(query, k=k)\n"
  },
  {
    "path": "backend/pyproject.toml",
    "content": "[build-system]\nrequires = [\"setuptools>=69\"]\nbuild-backend = \"setuptools.build_meta\"\n\n[project]\nname = \"agentic-ai-workflow-engine\"\nversion = \"0.1.0\"\ndescription = \"LangGraph-based AI orchestration system for autonomous market analysis\"\nrequires-python = \">=3.11\"\ndependencies = [\n  \"fastapi>=0.115.0\",\n  \"uvicorn[standard]>=0.30.0\",\n  \"pydantic-settings>=2.6.0\",\n  \"langgraph>=0.2.0\",\n  \"langchain>=0.3.0\",\n  \"langchain-openai>=0.2.0\",\n  \"langchain-community>=0.3.0\",\n  \"chromadb>=0.5.0\",\n  \"sqlmodel>=0.0.22\",\n  \"httpx>=0.27.0\",\n]\n\n[project.optional-dependencies]\ndev = [\"pytest>=8.3.0\", \"ruff>=0.7.0\", \"mypy>=1.12.0\"]\n\n[tool.ruff]\nline-length = 100\ntarget-version = \"py311\"\n\n[tool.pytest.ini_options]\npythonpath = [\".\"]\n"
  },
  {
    "path": "backend/tests/test_health.py",
    "content": "from fastapi.testclient import TestClient\n\nfrom app.main import app\n\n\ndef test_health_check() -> None:\n    client = TestClient(app)\n    response = client.get(\"/health\")\n    assert response.status_code == 200\n    assert response.json() == {\"status\": \"ok\"}\n"
  },
  {
    "path": "docker-compose.yml",
    "content": "services:\n  api:\n    build:\n      context: ./backend\n    env_file:\n      - .env\n    ports:\n      - \"8000:8000\"\n    volumes:\n      - ./backend:/app\n      - ./data:/app/data\n  web:\n    build:\n      context: ./frontend\n    ports:\n      - \"3000:3000\"\n    environment:\n      # NEXT_PUBLIC_* values are read by the browser, which cannot resolve the\n      # compose-internal \"api\" hostname; point at the host-mapped port instead.\n      - NEXT_PUBLIC_API_BASE_URL=http://localhost:8000\n    depends_on:\n      - api\n"
  },
  {
    "path": "frontend/Dockerfile",
    "content": "FROM node:22-alpine\nWORKDIR /app\nCOPY package.json ./\nRUN npm install\nCOPY . .\nCMD [\"npm\", \"run\", \"dev\", \"--\", \"-H\", \"0.0.0.0\", \"-p\", \"3000\"]\n"
  },
  {
    "path": "frontend/next.config.mjs",
    "content": "const isGithubActions = process.env.GITHUB_ACTIONS === \"true\";\nconst repoName = \"AI-Workflow-Engine\";\n\nconst nextConfig = {\n  reactStrictMode: true,\n  output: \"export\",\n  images: {\n    unoptimized: true,\n  },\n  basePath: isGithubActions ? `/${repoName}` : \"\",\n  assetPrefix: isGithubActions ? `/${repoName}/` : \"\",\n};\n\nexport default nextConfig;\n"
  },
  {
    "path": "frontend/package.json",
    "content": "{\n  \"name\": \"workflow-engine-web\",\n  \"private\": true,\n  \"version\": \"0.1.0\",\n  \"scripts\": {\n    \"dev\": \"next dev\",\n    \"build\": \"next build\",\n    \"start\": \"next start\",\n    \"lint\": \"next lint\"\n  },\n  \"dependencies\": {\n    \"next\": \"15.0.0\",\n    \"react\": \"18.2.0\",\n    \"react-dom\": \"18.2.0\"\n  },\n  \"devDependencies\": {\n    \"typescript\": \"5.6.3\",\n    \"@types/react\": \"18.3.12\",\n    \"@types/node\": \"22.8.1\",\n    \"eslint\": \"9.13.0\",\n    \"eslint-config-next\": \"15.0.0\"\n  }\n}\n"
  },
  {
    "path": "frontend/src/app/layout.tsx",
    "content": "import type { Metadata } from \"next\";\nimport type { ReactNode } from \"react\";\n\nexport const metadata: Metadata = {\n  title: \"Agentic AI Workflow Engine Demo\",\n  description: \"Senior-level orchestration demo UI for LangGraph-powered workflows\",\n};\n\nexport default function RootLayout({ children }: { children: ReactNode }) {\n  return (\n    <html lang=\"en\">\n      <body>{children}</body>\n    </html>\n  );\n}\n"
  },
  {
    "path": "frontend/src/app/page.tsx",
    "content": "import WorkflowForm from \"../components/WorkflowForm\";\nimport \"./styles.css\";\n\nexport default function HomePage() {\n  return (\n    <main className=\"page-shell\">\n      <div className=\"aurora aurora-a\" />\n      <div className=\"aurora aurora-b\" />\n      <div className=\"noise-overlay\" />\n\n      <header className=\"hero\">\n        <p className=\"eyebrow\">LangGraph • OpenAI • Retrieval • Orchestration</p>\n        <h1>Agentic AI Workflow Engine</h1>\n        <p className=\"hero-copy\">\n          Autonomous market analysis with stateful graph execution, retriever-augmented context,\n          and executive-grade recommendation synthesis.\n        </p>\n        <div className=\"hero-metrics\">\n          <article>\n            <span>3</span>\n            <p>Orchestration Nodes</p>\n          </article>\n          <article>\n            <span>1 Click</span>\n            <p>Run Strategic Analysis</p>\n          </article>\n          <article>\n            <span>Typed</span>\n            <p>End-to-End Contracts</p>\n          </article>\n        </div>\n      </header>\n\n      <section className=\"content-grid\">\n        <article className=\"panel elevated\">\n          <h2>Workflow Studio</h2>\n          <p className=\"panel-copy\">\n            Trigger a complete research → analysis → recommendation cycle. Designed for portfolio\n            strategy, product intelligence, and competitive planning.\n          </p>\n          <WorkflowForm />\n        </article>\n\n        <aside className=\"panel glass\">\n          <h2>System Highlights</h2>\n          <ul>\n            <li>Deterministic LangGraph state machine orchestration.</li>\n            <li>OpenAI model abstraction for configurable inference strategy.</li>\n            <li>Vector retrieval-ready service layer for contextual augmentation.</li>\n            <li>Containerized full-stack runtime for local and cloud demos.</li>\n          </ul>\n          <div className=\"pulse-card\">\n            <h3>Demo Intent</h3>\n            <p>\n              This interface is optimized for executive demos and architectural storytelling,\n              showcasing senior-level engineering quality and product thinking.\n            </p>\n          </div>\n        </aside>\n      </section>\n\n      <footer className=\"site-footer\">\n        <p>© 2023 Bertrand Amobi.</p>\n        <p>For demonstration purposes only. <a href=\"https://github.com/bertrandamobi/AI-Workflow-Engine\" target=\"_blank\" rel=\"noreferrer\">View Source Code.</a></p>\n      </footer>\n    </main>\n  );\n}\n"
  },
  {
    "path": "frontend/src/app/styles.css",
    "content": ":root {\n  color-scheme: dark;\n}\n\n* {\n  box-sizing: border-box;\n}\n\nbody {\n  margin: 0;\n  min-height: 100vh;\n  background: radial-gradient(circle at 20% 20%, #341426, #0f0f13 45%),\n    radial-gradient(circle at 80% 70%, #20403c, transparent 30%),\n    linear-gradient(140deg, #1e0f2b, #0f1119 70%);\n  color: #f8f4f0;\n  font-family: Inter, Segoe UI, system-ui, -apple-system, sans-serif;\n}\n\n.page-shell {\n  position: relative;\n  overflow: hidden;\n  padding: 2.5rem 1.25rem 1.5rem;\n  max-width: 1180px;\n  margin: 0 auto;\n}\n\n.aurora {\n  position: absolute;\n  filter: blur(60px);\n  border-radius: 999px;\n  opacity: 0.5;\n  animation: drift 15s ease-in-out infinite alternate;\n  z-index: -2;\n}\n\n.aurora-a {\n  width: 300px;\n  height: 300px;\n  background: #f25f4c;\n  top: -120px;\n  left: -100px;\n}\n\n.aurora-b {\n  width: 340px;\n  height: 340px;\n  background: #59c3c3;\n  right: -120px;\n  bottom: 80px;\n  animation-delay: 1.5s;\n}\n\n.noise-overlay {\n  position: absolute;\n  inset: 0;\n  pointer-events: none;\n  z-index: -1;\n  opacity: 0.12;\n  background-image: radial-gradient(circle, #f6bd6022 1px, transparent 1px);\n  background-size: 3px 3px;\n}\n\n.hero h1 {\n  font-size: clamp(2rem, 4vw, 3.5rem);\n  margin: 0.4rem 0;\n}\n\n.eyebrow {\n  color: #f6bd60;\n  letter-spacing: 0.1em;\n  text-transform: uppercase;\n  font-size: 0.78rem;\n}\n\n.hero-copy {\n  max-width: 68ch;\n  color: #e6ddd5;\n}\n\n.hero-metrics {\n  display: grid;\n  grid-template-columns: repeat(auto-fit, minmax(130px, 1fr));\n  gap: 1rem;\n  margin-top: 1.2rem;\n}\n\n.hero-metrics article {\n  background: #ffffff0f;\n  border: 1px solid #ffffff24;\n  border-radius: 14px;\n  padding: 0.9rem;\n  backdrop-filter: blur(6px);\n}\n\n.hero-metrics span {\n  font-weight: 700;\n  color: #ffdca8;\n}\n\n.content-grid {\n  margin-top: 2rem;\n  display: grid;\n  grid-template-columns: 1.6fr 1fr;\n  gap: 1.2rem;\n}\n\n.panel {\n  border-radius: 18px;\n  padding: 1.25rem;\n}\n\n.elevated {\n  background: linear-gradient(160deg, #31142f, #182227);\n  border: 1px solid #ffffff1a;\n  box-shadow: 0 30px 80px #0000004f;\n}\n\n.glass {\n  background: #ffffff0f;\n  border: 1px solid #ffffff2b;\n  backdrop-filter: blur(8px);\n}\n\n.panel-copy {\n  color: #eadfd7;\n}\n\nul {\n  margin: 0;\n  padding-left: 1.2rem;\n  display: grid;\n  gap: 0.65rem;\n}\n\n.pulse-card {\n  margin-top: 1rem;\n  padding: 1rem;\n  border-radius: 14px;\n  background: linear-gradient(120deg, #f25f4c2d, #59c3c32e);\n  border: 1px solid #ffffff24;\n  animation: pulse 4s ease-in-out infinite;\n}\n\nform {\n  display: grid;\n  gap: 0.8rem;\n}\n\ninput,\ntextarea,\nbutton {\n  border-radius: 12px;\n  border: 1px solid #ffffff2d;\n  font: inherit;\n  padding: 0.75rem 0.8rem;\n}\n\ninput,\ntextarea {\n  background: #0e1118a8;\n  color: #f5f2ee;\n}\n\nbutton {\n  background: linear-gradient(110deg, #f25f4c, #f6bd60);\n  color: #141414;\n  font-weight: 700;\n  cursor: pointer;\n  transition: transform 160ms ease, filter 160ms ease;\n}\n\nbutton:hover {\n  transform: translateY(-1px);\n  filter: brightness(1.07);\n}\n\n.site-footer {\n  margin-top: 2rem;\n  border-top: 1px solid #ffffff2a;\n  padding-top: 1rem;\n  color: #e0d8d0;\n  text-align: center;\n}\n\n.site-footer p {\n  margin: 0.3rem 0;\n}\n\n.site-footer a {\n  color: #ffdca8;\n}\n\n@media (max-width: 900px) {\n  .content-grid {\n    grid-template-columns: 1fr;\n  }\n}\n\n@keyframes pulse {\n  0%, 100% { transform: scale(1); }\n  50% { transform: scale(1.01); }\n}\n\n@keyframes drift {\n  from { transform: translateY(0) translateX(0); }\n  to { transform: translateY(30px) translateX(18px); }\n}\n"
  },
  {
    "path": "frontend/src/components/WorkflowForm.tsx",
    "content": "\"use client\";\n\nimport { FormEvent, useState } from \"react\";\nimport { runMarketAnalysis } from \"../lib/api\";\nimport { WorkflowResponse } from \"../types/workflow\";\n\nexport default function WorkflowForm() {\n  const [company, setCompany] = useState(\"NVIDIA\");\n  const [objectives, setObjectives] = useState(\"Competitive positioning, product strategy\");\n  const [result, setResult] = useState<WorkflowResponse | null>(null);\n  const [error, setError] = useState<string | null>(null);\n  const [loading, setLoading] = useState(false);\n\n  const onSubmit = async (event: FormEvent) => {\n    event.preventDefault();\n    setLoading(true);\n    setResult(null);\n    setError(null);\n    try {\n      const response = await runMarketAnalysis({\n        company,\n        objectives: objectives\n          .split(\",\")\n          .map((item) => item.trim())\n          .filter(Boolean),\n      });\n      setResult(response);\n    } catch (err) {\n      // Surface failures instead of letting the rejection go unhandled.\n      setError(err instanceof Error ? err.message : \"Workflow request failed\");\n    } finally {\n      setLoading(false);\n    }\n  };\n\n  return (\n    <div>\n      <form onSubmit={onSubmit}>\n        <input value={company} onChange={(event) => setCompany(event.target.value)} placeholder=\"Company\" />\n        <textarea\n          value={objectives}\n          onChange={(event) => setObjectives(event.target.value)}\n          placeholder=\"Comma-separated objectives\"\n        />\n        <button type=\"submit\" disabled={loading}>{loading ? \"Running...\" : \"Run Workflow\"}</button>\n      </form>\n\n      {error && <p role=\"alert\">{error}</p>}\n\n      {result && (\n        <section>\n          <h2>Workflow Result</h2>\n          <p><strong>ID:</strong> {result.workflow_id}</p>\n          <p><strong>Status:</strong> {result.status}</p>\n          <h3>Summary</h3>\n          <p>{result.summary}</p>\n          <h3>Recommendations</h3>\n          <p>{result.artifacts.recommendations}</p>\n        </section>\n      )}\n    </div>\n  );\n}\n"
  },
  {
    "path": "frontend/src/lib/api.ts",
    "content": "import { WorkflowRequest, WorkflowResponse } from \"../types/workflow\";\n\nconst API_BASE_URL = process.env.NEXT_PUBLIC_API_BASE_URL ?? \"http://localhost:8000\";\n\nexport async function runMarketAnalysis(payload: WorkflowRequest): Promise<WorkflowResponse> {\n  const response = await fetch(`${API_BASE_URL}/workflows/market-analysis`, {\n    method: \"POST\",\n    headers: { \"Content-Type\": \"application/json\" },\n    body: JSON.stringify(payload),\n  });\n\n  if (!response.ok) {\n    throw new Error(`Request failed: ${response.status}`);\n  }\n  return response.json();\n}\n"
  },
  {
    "path": "frontend/src/types/workflow.ts",
    "content": "export interface WorkflowRequest {\n  company: string;\n  objectives: string[];\n}\n\nexport interface WorkflowResponse {\n  workflow_id: string;\n  status: string;\n  summary: string;\n  artifacts: {\n    research_notes: string;\n    recommendations: string;\n  };\n}\n"
  },
  {
    "path": "frontend/tsconfig.json",
    "content": "{\n  \"compilerOptions\": {\n    \"target\": \"ES2022\",\n    \"lib\": [\"DOM\", \"DOM.Iterable\", \"ES2022\"],\n    \"allowJs\": false,\n    \"skipLibCheck\": true,\n    \"strict\": true,\n    \"noEmit\": true,\n    \"module\": \"ESNext\",\n    \"moduleResolution\": \"Bundler\",\n    \"resolveJsonModule\": true,\n    \"isolatedModules\": true,\n    \"jsx\": \"preserve\",\n    \"incremental\": true\n  },\n  \"include\": [\"src\"]\n}\n"
  }
]