Carlos
  • Updated: March 20, 2026
  • 8 min read

Unified Mobile Demo: Integrating OpenClaw Real‑Time Explainability Widget on iOS (Swift) and Android (Kotlin)

Quick Answer

To add real‑time model explainability to a mobile app, integrate OpenClaw’s Explainability widget using the Swift SDK for iOS and the Kotlin SDK for Android, configure the widget with your OpenClaw access token, and run the unified demo on simulators or emulators. The step‑by‑step guide below shows you how to set up the project, add the SDKs via Swift Package Manager and Gradle, and verify live explanations on both platforms.

1. Why Real‑Time Explainability Matters in the AI‑Agent Era

AI agents are moving from isolated assistants to collaborative ecosystems. As agents become more autonomous, developers and compliance teams demand transparency: users must see why a model made a particular decision, and regulators need audit trails. Real‑time explainability bridges that gap, turning opaque predictions into actionable insights that can be displayed instantly on a mobile screen.

OpenClaw’s Explainability widget is built for exactly this purpose. It streams feature‑importance scores, counterfactuals, and confidence intervals directly to the UI, letting developers embed “why” alongside “what”. In the context of the new Moltbook AI‑only social network, where bots converse with each other, explainability becomes a social signal—agents can justify their posts, fostering trust among both humans and other bots.

2. Prerequisites

  • Active UBOS account with access to the OpenClaw hosting page.
  • OpenClaw access token (obtainable from the OpenClaw hosting page).
  • iOS development environment: Xcode 15, Swift 5.9, macOS 14.
  • Android development environment: Android Studio Flamingo, Kotlin 1.9, JDK 17.
  • Basic familiarity with UBOS platform overview concepts such as API contracts and workflow automation.

3. Project Setup – iOS (Swift)

3.1 Create a New Xcode Project

  1. Open Xcode and select File → New → Project.
  2. Choose App under iOS, click Next.
  3. Enter OpenClawDemo as the product name, select Swift as the language, and choose SwiftUI as the interface.
  4. Save the project in a dedicated folder.

3.2 Add OpenClaw Swift SDK via Swift Package Manager

In Xcode, navigate to File → Add Packages… and paste the repository URL:

https://github.com/openclaw/openclaw-swift-sdk

Select the Up to Next Major version rule and click Add Package. Xcode will resolve the dependency and link the OpenClawSDK target to your app.

3.3 Configure the Explainability Widget

Open OpenClawDemoApp.swift and inject the token at launch:

import SwiftUI
import OpenClawSDK

@main
struct OpenClawDemoApp: App {
    init() {
        // Initialise the OpenClaw client with your access token
        OpenClawClient.shared.configure(
            token: "YOUR_OPENCLAW_ACCESS_TOKEN",
            environment: .production
        )
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

3.4 Sample SwiftUI View with the Widget

Create ContentView.swift and embed the widget:

import SwiftUI
import OpenClawSDK

struct ContentView: View {
    @StateObject private var explainer = OpenClawExplainability()

    var body: some View {
        VStack(spacing: 20) {
            Text("🧠 Real‑Time Explainability")
                .font(.title2)
                .bold()

            // The widget renders a live chart of feature importance
            OpenClawExplainabilityView(explainer: explainer)
                .frame(height: 250)
                .cornerRadius(12)
                .shadow(radius: 4)

            Button(action: triggerPrediction) {
                Text("Run Prediction")
                    .font(.headline)
                    .padding()
                    .frame(maxWidth: .infinity)
                    .background(Color.blue.opacity(0.8))
                    .foregroundColor(.white)
                    .cornerRadius(8)
            }
        }
        .padding()
    }

    private func triggerPrediction() {
        // Simulated payload – replace with your own model input
        let payload: [String: Any] = ["age": 34, "income": 72000, "region": "EU"]
        explainer.explain(payload: payload) { result in
            switch result {
            case .success(let explanation):
                print("Explanation received: \(explanation)")
            case .failure(let error):
                print("Explainability error: \(error)")
            }
        }
    }
}

The OpenClawExplainabilityView automatically subscribes to the streaming endpoint and updates the chart in real time.

4. Project Setup – Android (Kotlin)

4.1 Create a New Android Studio Project

  1. Launch Android Studio → New Project.
  2. Select Empty Compose Activity, click Next.
  3. Name the project OpenClawDemo, set Language to Kotlin, and choose the latest SDK (API 34).
  4. Finish the wizard and wait for Gradle sync.

4.2 Add OpenClaw Kotlin SDK via Gradle

Recent Android Studio project templates declare repositories in settings.gradle via dependencyResolutionManagement and reject per-project declarations, so add the OpenClaw Maven repository there (older projects that still use allprojects can declare it in build.gradle (Project) instead):

dependencyResolutionManagement {
    repositories {
        google()
        mavenCentral()
        maven { url "https://repo.openclaw.ai/releases" }
    }
}

Then, in build.gradle (Module), add the dependency:

dependencies {
    implementation "io.openclaw:openclaw-android-sdk:1.3.0"
    // Jetpack Compose UI
    implementation "androidx.compose.ui:ui:1.6.0"
    implementation "androidx.activity:activity-compose:1.8.0"
}

4.3 Initialise the SDK in the Application Class

package com.example.openclawdemo

import android.app.Application
import io.openclaw.sdk.OpenClawClient

class DemoApplication : Application() {
    override fun onCreate() {
        super.onCreate()
        OpenClawClient.configure(
            token = "YOUR_OPENCLAW_ACCESS_TOKEN",
            environment = OpenClawClient.Environment.PRODUCTION
        )
    }
}

4.4 Compose UI with the Explainability Widget

package com.example.openclawdemo

import androidx.compose.foundation.background
import androidx.compose.foundation.layout.*
import androidx.compose.material3.*
import androidx.compose.runtime.*
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.unit.dp
import io.openclaw.sdk.explainability.OpenClawExplainability
import io.openclaw.sdk.explainability.ExplainabilityView

@Composable
fun DemoScreen() {
    val explainer = remember { OpenClawExplainability() }

    Column(
        modifier = Modifier
            .fillMaxSize()
            .padding(16.dp)
    ) {
        Text(
            "🧠 Real‑Time Explainability",
            style = MaterialTheme.typography.headlineSmall,
            modifier = Modifier.padding(bottom = 12.dp)
        )

        // Live widget
        ExplainabilityView(
            explainer = explainer,
            modifier = Modifier
                .height(250.dp)
                .fillMaxWidth()
                .background(Color(0xFFF0F4F8))
        )

        Spacer(modifier = Modifier.height(24.dp))

        Button(
            onClick = { triggerPrediction(explainer) },
            modifier = Modifier.fillMaxWidth()
        ) {
            Text("Run Prediction")
        }
    }
}

private fun triggerPrediction(explainer: OpenClawExplainability) {
    val payload = mapOf(
        "age" to 28,
        "income" to 54000,
        "region" to "APAC"
    )
    explainer.explain(payload) { result ->
        when (result) {
            is OpenClawExplainability.Result.Success -> {
                println("Explanation: ${result.explanation}")
            }
            is OpenClawExplainability.Result.Failure -> {
                println("Error: ${result.error}")
            }
        }
    }
}

Register DemoApplication in AndroidManifest.xml and set DemoScreen() as the content of MainActivity. The widget will render a live bar chart of feature contributions, mirroring the iOS experience.
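
The registration step above amounts to two small changes, sketched below assuming the com.example.openclawdemo package used earlier. The manifest attribute and the activity wiring follow standard Android conventions rather than any OpenClaw-specific API:

```xml
<!-- AndroidManifest.xml: point android:name at the Application subclass -->
<application
    android:name=".DemoApplication"
    android:label="OpenClawDemo">
    <activity android:name=".MainActivity" android:exported="true">
        <intent-filter>
            <action android:name="android.intent.action.MAIN" />
            <category android:name="android.intent.category.LAUNCHER" />
        </intent-filter>
    </activity>
</application>
```

```kotlin
package com.example.openclawdemo

import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent

class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Mount the Compose screen defined in section 4.4
        setContent { DemoScreen() }
    }
}
```

Without the android:name entry, the DemoApplication.onCreate() call never runs and the SDK is never configured, which surfaces as authentication errors on the first explain() call.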

5. Unified Demo Architecture

The iOS and Android demos share three core components:

  • API Contract: A single OpenClaw REST endpoint (/v1/explain) receives a JSON payload and streams Server‑Sent Events with explanation data.
  • Data Model: Both SDKs use a Map<String, Any> (Kotlin) or [String: Any] (Swift) to represent model inputs, ensuring type‑agnostic compatibility.
  • Widget Layer: The native UI components (OpenClawExplainabilityView and ExplainabilityView) subscribe to the same SSE stream, guaranteeing identical visualisation across platforms.

Because the contract is language‑neutral, you can extend the demo to React Native, Flutter, or even a web front‑end without rewriting the backend logic.
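
To make the contract concrete, here is an illustrative request/response trace for the /v1/explain endpoint described above. The host name and any fields beyond those shown in section 6.2 are assumptions for illustration; consult the OpenClaw hosting page for the authoritative schema:

```http
POST /v1/explain HTTP/1.1
Host: api.openclaw.ai
Authorization: Bearer YOUR_OPENCLAW_ACCESS_TOKEN
Content-Type: application/json
Accept: text/event-stream

{"age": 34, "income": 72000, "region": "EU"}

HTTP/1.1 200 OK
Content-Type: text/event-stream

data: {"feature": "income", "importance": 0.42, "counterfactual": {"income": 80000}}

data: {"feature": "age", "importance": 0.31}
```

Each `data:` line is one Server-Sent Event; the native widgets parse these incrementally, which is why the chart updates feature by feature instead of waiting for a complete response.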

6. Testing the Demo

6.1 Run on Simulators / Emulators

  • iOS: Cmd + R in Xcode launches the iPhone 15 simulator. Tap “Run Prediction” and watch the chart update within 1‑2 seconds.
  • Android: Click the Run button in Android Studio to start the Pixel 7 emulator. The Compose UI behaves identically.

6.2 Verify Real‑Time Explanations

Open the Xcode console or Android Logcat. You should see a JSON payload similar to:

{
  "feature": "income",
  "importance": 0.42,
  "counterfactual": {"income": 80000}
}

The widget automatically translates this into a bar height and a tooltip. Confirm that the UI reflects the numbers exactly.

6.3 Debugging Tips

  • Ensure the access token is not surrounded by extra whitespace.
  • Check network connectivity; the widget falls back to a cached state if the SSE stream drops.
  • Use OpenClawClient.shared.logger = .debug (iOS) or OpenClawClient.setLogLevel(LogLevel.DEBUG) (Android) to surface low‑level errors.

7. Reference Tutorials

For deeper dives into each SDK, consult the official UBOS tutorials:

  • ChatGPT and Telegram integration – demonstrates token handling and webhook setup, concepts that map directly to OpenClaw’s streaming API.
  • OpenAI ChatGPT integration – shows how to embed large language model calls inside a mobile workflow, a pattern reused for the Explainability widget.

8. Host Your Own OpenClaw Instance

UBOS makes it trivial to spin up a managed OpenClaw service. Visit the OpenClaw hosting page to provision a sandbox, retrieve your access token, and monitor usage through the UBOS dashboard.

9. Why This Demo Is a Must‑Read in the Current AI‑Agent Hype

The surge of AI‑only social platforms like Moltbook has sparked a debate: can autonomous agents be trusted? Explainability is the answer. By embedding OpenClaw’s widget, developers can give each bot‑generated post a “why” badge, turning raw predictions into transparent statements.

Moreover, the demo showcases how UBOS’s Workflow automation studio can orchestrate data pipelines that feed model inputs to OpenClaw, while the Web app editor on UBOS can generate admin panels for monitoring explanation logs across thousands of agents.

For startups, the UBOS for startups program offers credits that cover the first 10,000 explanation events, perfect for a proof‑of‑concept on Moltbook‑style bots.

SMBs can leverage UBOS solutions for SMBs to embed explainability into customer‑facing chat apps, reducing support tickets caused by “black‑box” decisions.

Enterprises, on the other hand, can integrate the widget into the Enterprise AI platform by UBOS, ensuring compliance with GDPR and upcoming AI‑explainability regulations.

10. Conclusion & Next Steps

By following this guide, you now have a fully functional, cross‑platform mobile demo that streams real‑time model explanations from OpenClaw. The architecture is deliberately modular, allowing you to:

  • Swap the underlying ML model without touching UI code.
  • Extend the widget with custom visualisations (e.g., heatmaps, SHAP values).
  • Integrate with UBOS templates for quick start to accelerate future projects.
  • Scale from a single‑device demo to a fleet of agents on Moltbook, using the UBOS partner program for co‑marketing and technical support.

Ready to push the boundaries of transparent AI? Explore UBOS pricing plans, spin up your own OpenClaw instance, and start building explainable agents that users—and bots—can trust.

Take Action Today

Download the UBOS portfolio examples for production‑grade code, join the partner program, and share your explainable AI demo on Moltbook to spark the next wave of trustworthy AI agents.


