ARVOS Documentation

Sensor streaming platform for iPhone and Apple Watch

Turn your iPhone (and optional Apple Watch) into a sensor platform with LiDAR, cameras, IMU, ARKit pose tracking, and wearable motion sensing.


🎯 What is ARVOS?

ARVOS turns your iPhone into a sensor streaming platform, great for:

  • Research - SLAM, computer vision, sensor fusion
  • Robotics - Real-time perception, ROS 2 integration
  • Education - Learn AR, computer vision, sensor systems
  • Development - Prototype algorithms with real sensor data


🚀 Quick Start

Get started in 30 seconds with the Web Viewer (no installation required):

cd arvos-sdk/web-viewer
./start-viewer.sh
# Scan QR code with iPhone → Done!

Or use the Python SDK for custom applications:

pip install arvos-sdk
python examples/01_quickstart.py

📱 What ARVOS Streams

ARVOS captures and streams sensor data from your iPhone and Apple Watch:

iPhone Sensors

  • 📷 Camera: 30 FPS @ 1920x1080 RGB video
  • 🔍 LiDAR/Depth: 5 FPS 3D point clouds with confidence maps
  • 📊 IMU: 100-200 Hz accelerometer + gyroscope + gravity
  • 🧭 ARKit Pose: 30-60 Hz 6DOF camera tracking with quality flags
  • 📍 GPS: 1 Hz location data (outdoor)
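To illustrate how the depth stream relates to 3D point clouds, here is a minimal pinhole back-projection sketch. The intrinsics (fx, fy, cx, cy) and the function name are illustrative placeholders, not ARVOS API names; real intrinsics come with the camera stream.

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into a 3D point
    in the camera frame, using the standard pinhole model."""
    X = (u - cx) * depth / fx
    Y = (v - cy) * depth / fy
    return (X, Y, depth)

# Example: the principal point at 2 m depth lands on the optical axis.
point = backproject(960.0, 540.0, 2.0, 1500.0, 1500.0, 960.0, 540.0)
```

Applied per pixel of a depth map, this is the core of turning a LiDAR depth frame into the point clouds ARVOS streams.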

Apple Watch Sensors (Optional)

  • ⌚ IMU: 50-100 Hz wearable accelerometer + gyroscope + gravity
  • 🧭 Attitude: Quaternion + pitch/roll/yaw angles
  • 🚶 Motion Activity: Classification (walking, running, cycling, vehicle, stationary)
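The attitude quaternion and the pitch/roll/yaw angles carry the same information; the conversion is standard. A minimal sketch, assuming a (w, x, y, z) quaternion layout and the common ZYX (yaw-pitch-roll) convention — verify both against the actual stream before relying on signs:

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in radians, ZYX convention."""
    # roll: rotation about the x-axis
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # pitch: rotation about the y-axis, clamped to avoid domain errors near the poles
    s = max(-1.0, min(1.0, 2 * (w * y - z * x)))
    pitch = math.asin(s)
    # yaw: rotation about the z-axis
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw
```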

All sensors are nanosecond-synchronized for precise data collection.
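With shared nanosecond timestamps, pairing samples from two streams reduces to a nearest-neighbour search. A minimal sketch (the 5 ms tolerance is illustrative, not an ARVOS default):

```python
import bisect

def align_streams(ts_a, ts_b, tol_ns=5_000_000):
    """Pair each timestamp in ts_a with its nearest neighbour in ts_b,
    keeping only pairs within tol_ns. Both lists must be sorted ascending."""
    pairs = []
    for t in ts_a:
        i = bisect.bisect_left(ts_b, t)
        candidates = ts_b[max(i - 1, 0):i + 1]  # neighbours straddling t
        if not candidates:
            continue
        best = min(candidates, key=lambda u: abs(u - t))
        if abs(best - t) <= tol_ns:
            pairs.append((t, best))
    return pairs
```

For example, aligning a 30 FPS camera stream against 100 Hz IMU timestamps this way yields, for each frame, the IMU sample closest in time.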


🌐 7 Streaming Protocols

ARVOS supports multiple streaming protocols to fit different use cases:

Protocol       Best For                    Port    Status
WebSocket      General purpose             9090    ✅ Complete
gRPC           High performance, research  50051   ✅ Complete
MQTT           IoT, multi-subscriber      1883    ✅ Complete
HTTP/REST      Simple integration          8080    ✅ Complete
Bluetooth LE   Low bandwidth, cable-free   N/A     ✅ Complete
MCAP Stream    Robotics research           17500   ✅ Complete
QUIC/HTTP3     Ultra-low latency           4433    ✅ Complete

→ Protocol Comparison Guide
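The table above can be collapsed into a simple chooser. This is an illustrative sketch of the decision logic, not an SDK API; the requirement flags are hypothetical names, while the ports come from the table:

```python
# Ports from the protocol table above (BLE omitted: no TCP/UDP port).
PROTOCOL_PORTS = {
    "websocket": 9090,
    "grpc": 50051,
    "mqtt": 1883,
    "http": 8080,
    "mcap": 17500,
    "quic": 4433,
}

def pick_protocol(latency_critical=False, multi_subscriber=False,
                  robotics_logging=False):
    """Map coarse requirements to a protocol, following the table above."""
    if latency_critical:
        return "quic"        # ultra-low latency
    if robotics_logging:
        return "mcap"        # robotics research workflows
    if multi_subscriber:
        return "mqtt"        # IoT-style fan-out
    return "websocket"       # general-purpose default
```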


🎯 Use Cases

For Researchers

  • SLAM algorithm development with ARKit ground truth
  • Sensor fusion experiments
  • ML dataset collection
  • Real-time 3D reconstruction

For Robotics Engineers

  • ROS 2 perception testing
  • Mobile sensor platform development
  • Algorithm prototyping
  • Live demos and presentations

For Students

  • Computer vision learning
  • AR experiments
  • Sensor data visualization
  • Course projects

📚 Documentation Structure

Getting Started

iOS App

Python SDK

Protocols

  • Overview - All supported protocols
  • Individual protocol guides for each streaming method
  • Comparison - Choose the right protocol

API Reference

Examples

Guides

Web Viewer


💻 Requirements

iPhone

  • iPhone 12 Pro or newer (for LiDAR)
  • iOS 16.0+ (iOS 18+ for gRPC)
  • Same Wi-Fi network as computer (or Bluetooth for BLE)

Computer

  • Python 3.8+ (for the SDK) or a modern browser (for the Web Viewer)
  • Same Wi-Fi network as iPhone (for Wi-Fi protocols)
  • Firewall configured to allow the selected protocol's port

Apple Watch (Optional)

  • Apple Watch Series 6 or newer
  • watchOS 9.0+
  • Paired with streaming iPhone

🎓 Learning Resources


🤝 Contributing

We welcome contributions! See our Contributing Guide for details.


📜 License

MIT License - Use freely in your research and projects



Made for the robotics and AR research community ❤️