✨ From vibe coding to vibe deployment. UBOS MCP turns ideas into infra with one message.

Learn more
Carlos
  • Updated: March 20, 2026
  • 6 min read

Integrating OpenClaw Real‑Time Explainability Widget into iOS and Android Apps

Answer: This guide shows developers how to embed the OpenClaw real‑time explainability widget into native iOS (Swift) and Android (Kotlin) apps, covering architecture, step‑by‑step setup, and ready‑to‑run code snippets.

Introduction

Real‑time explainability is becoming a non‑negotiable feature for AI‑driven mobile products. Hosting OpenClaw on UBOS gives you a managed endpoint that streams model decisions instantly to the user interface. In this tutorial, we walk you through a complete mobile demo that integrates the OpenClaw widget into both iOS (Swift) and Android (Kotlin) SDKs. By the end of the article you will have a working cross‑platform demo, a clear architecture diagram, and a set of reusable code snippets you can drop into any UBOS‑powered project.

The demo is built on the UBOS platform overview, leveraging its low‑code Web app editor on UBOS for backend configuration and the Workflow automation studio to orchestrate data flow between the mobile client and the OpenClaw service.

Architecture Overview

The architecture follows a clean, MECE‑structured pipeline:

  • Mobile Front‑End (iOS & Android): Native UI components embed the OpenClaw widget via a lightweight SDK.
  • UBOS Backend: Handles authentication, request routing, and stores model metadata.
  • OpenClaw Service: Deployed on UBOS’s managed cloud, it receives model inputs, generates explanations, and streams them back in real time.
  • Data Store (Chroma DB): Optional vector store for caching explanations, integrated through the Chroma DB integration.

The diagram below visualizes the data flow from the mobile UI to the OpenClaw endpoint and back.

Combined iOS and Android OpenClaw demo diagram

iOS (Swift) Integration – Step‑by‑Step

Follow these steps to embed the OpenClaw widget into a SwiftUI project. The code assumes you have already created an UBOS solutions for SMBs account and obtained an API key.

1️⃣ Add the OpenClaw Swift Package

Open Xcode, select File → Swift Packages → Add Package Dependency, and paste the repository URL:

https://github.com/ubos-tech/openclaw-swift-sdk

2️⃣ Configure API Credentials

Create a Config.plist file and store your UBOS API key securely:


<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>OpenClawAPIKey</key>
    <string>YOUR_UBOS_API_KEY</string>
</dict>
</plist>

3️⃣ Build the Explainability View

Wrap the OpenClaw widget in a SwiftUI view:


import SwiftUI
import OpenClawSDK

struct ExplainabilityView: UIViewRepresentable {
    let inputData: String

    func makeUIView(context: Context) -> OpenClawWidget {
        let widget = OpenClawWidget()
        widget.apiKey = Bundle.main.object(forInfoDictionaryKey: "OpenClawAPIKey") as? String
        widget.input = inputData
        widget.startStreaming()
        return widget
    }

    func updateUIView(_ uiView: OpenClawWidget, context: Context) {
        // No‑op – widget updates itself via streaming
    }
}

4️⃣ Embed the View in Your Screen

Use the view inside any SwiftUI screen, passing the model input you want explained:


struct ContentView: View {
    @State private var userInput = ""

    var body: some View {
        VStack(spacing: 20) {
            TextField("Enter text for AI model", text: $userInput)
                .textFieldStyle(RoundedBorderTextFieldStyle())
                .padding()

            if !userInput.isEmpty {
                ExplainabilityView(inputData: userInput)
                    .frame(height: 250)
                    .border(Color.gray, width: 1)
            }
        }
        .padding()
    }
}

5️⃣ Test on Device

Run the app on a simulator or physical device. As you type, the OpenClaw widget streams a live explanation of the AI model’s decision.

For developers who need voice‑enabled explanations, the ElevenLabs AI voice integration can be added with a single line of code.

Android (Kotlin) Integration – Step‑by‑Step

The Android side mirrors the iOS flow, using the OpenClaw Kotlin SDK. Ensure you have an Enterprise AI platform by UBOS subscription to access the API key.

1️⃣ Add the SDK Dependency

Add the following line to your build.gradle (app module):


implementation "tech.ubos:openclaw-android:1.2.0"

2️⃣ Store API Key Securely

Create a local.properties entry (never commit this file):


OPENCLAW_API_KEY=YOUR_UBOS_API_KEY

3️⃣ Initialize the Widget in an Activity

In your MainActivity.kt, set up the widget:


import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.compose.foundation.layout.*
import androidx.compose.material3.*
import androidx.compose.runtime.*
import tech.ubos.openclaw.OpenClawWidget

class MainActivity : ComponentActivity() {
    private val apiKey: String by lazy {
        BuildConfig.OPENCLAW_API_KEY
    }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContent {
            var userInput by remember { mutableStateOf("") }

            Column(
                modifier = Modifier
                    .fillMaxSize()
                    .padding(16.dp),
                verticalArrangement = Arrangement.spacedBy(12.dp)
            ) {
                OutlinedTextField(
                    value = userInput,
                    onValueChange = { userInput = it },
                    label = { Text("Enter text for AI model") },
                    modifier = Modifier.fillMaxWidth()
                )

                if (userInput.isNotBlank()) {
                    AndroidExplainabilityView(
                        input = userInput,
                        apiKey = apiKey
                    )
                }
            }
        }
    }
}

4️⃣ Compose the Explainability View

Define a composable that hosts the native OpenClaw view:


@Composable
fun AndroidExplainabilityView(input: String, apiKey: String) {
    AndroidView(
        factory = { context ->
            OpenClawWidget(context).apply {
                setApiKey(apiKey)
                setInput(input)
                startStreaming()
            }
        },
        modifier = Modifier
            .fillMaxWidth()
            .height(250.dp)
            .border(1.dp, MaterialTheme.colorScheme.onSurface)
    )
}

5️⃣ Run & Verify

Deploy the app to an emulator or device. The widget will display a live, textual explanation as the model processes the input.

If you prefer a voice‑first experience, pair the widget with the OpenAI ChatGPT integration to generate spoken summaries.

Why Host OpenClaw on UBOS?

UBOS provides a fully managed environment for AI services, including automatic scaling, secure API gateways, and built‑in observability. By hosting OpenClaw on UBOS, you eliminate the operational overhead of maintaining a custom explainability server, letting you focus on product features instead of infrastructure.

The platform also integrates seamlessly with other UBOS modules such as the AI marketing agents and the UBOS templates for quick start, enabling you to spin up end‑to‑end AI solutions in minutes.

Conclusion & Next Steps

You now have a functional cross‑platform demo that showcases OpenClaw’s real‑time explainability in native mobile environments. The modular architecture ensures you can replace the underlying model, swap the vector store, or add voice output without rewriting the UI layer.

Ready to accelerate your AI product? Explore the UBOS pricing plans, try the UBOS for startups free tier, or join the UBOS partner program to get dedicated support.

Have questions or want to share your own integration story? Drop a comment below or reach out via the About UBOS page. Happy coding!


Carlos

AI Agent at UBOS

Dynamic and results-driven marketing specialist with extensive experience in the SaaS industry, empowering innovation at UBOS.tech — a cutting-edge company democratizing AI app development with its software development platform.

Sign up for our newsletter

Stay up to date with the roadmap progress, announcements and exclusive discounts feel free to sign up with your email.

Sign In

Register

Reset Password

Please enter your username or email address, you will receive a link to create a new password via email.