EIO Developer Kit
Complete documentation for developing applications on the EIO Developer Kit platform. Built on Android 9 AOSP, providing direct access to hardware APIs including camera, microphone, speaker, and physical controls.
System Overview
The EIO Developer Kit is built on Android 9 AOSP. This documentation covers the essential hardware APIs for building immersive AR applications:
⌨️Event Key Listening
Handle physical button inputs (scroll, confirm, back, power)
📸Camera APIs
Capture and process visual data using CameraX or Camera2
🎤Microphone APIs
Record audio using MediaRecorder
🔊Speaker APIs
Audio output and text-to-speech functionality
Quick Start Guide
Follow these steps to set up your development environment and create your first EIO Developer Kit application.
Prerequisites
Before you begin, ensure you have the following installed on your development machine:
| Requirement | Version | Notes |
|---|---|---|
| Android Studio | Hedgehog (2023.1.1) or later | Official IDE for Android development |
| JDK | 17 | Required by Android Gradle Plugin 8.x |
| Android SDK | API Level 28+ (Android 9.0) | Target platform for Developer Kit |
| Gradle | 8.0+ | Build automation tool |
| USB Driver | Latest | For device debugging via ADB |
Step 1: Install Android Studio
Download and install Android Studio from the official website. During installation, make sure to include:
- Android SDK
- Android SDK Platform-Tools
- Android Virtual Device (optional, for emulator testing)
Step 2: Configure SDK
Open Android Studio and navigate to Settings → Languages & Frameworks → Android SDK. Install the following components:
- Android 9.0 (Pie) SDK Platform (API Level 28)
- Android SDK Platform-Tools
- Android SDK Build-Tools
Step 3: Create a New Project
Create a new Android project with the following configuration:
- Open Android Studio and select New Project
- Choose Empty Views Activity template
- Set the following options:
  - Name: MyEIOApp
  - Package name: com.example.myeioapp
  - Language: Kotlin or Java
  - Minimum SDK: API 28 (Android 9.0 Pie)
- Click Finish to create the project
Step 4: Configure build.gradle
Open your app-level build.gradle and configure it for the Developer Kit:
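A minimal configuration might look like the following sketch. The namespace, application ID, and version values are illustrative; adjust compileSdk to match your installed SDK:

```groovy
plugins {
    id 'com.android.application'
    id 'org.jetbrains.kotlin.android'
}

android {
    namespace 'com.example.myeioapp'
    compileSdk 34

    defaultConfig {
        applicationId "com.example.myeioapp"
        minSdk 28       // Android 9.0, the Developer Kit platform version
        targetSdk 28    // target the device's own API level
        versionCode 1
        versionName "1.0"
    }

    compileOptions {
        sourceCompatibility JavaVersion.VERSION_17
        targetCompatibility JavaVersion.VERSION_17
    }
}
```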
Step 5: Connect Your Device
Connect the EIO Developer Kit to your computer via USB:
- Enable Developer Options on the device:
  - Go to Settings → About Device
  - Tap Build Number 7 times
- Enable USB Debugging:
  - Go to Settings → Developer Options
  - Enable USB Debugging
- Connect via USB and accept the debugging prompt on the device
- Verify connection in terminal:
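For example, assuming adb is on your PATH:

```shell
adb devices
# Expected output lists the connected Developer Kit, e.g.:
# List of devices attached
# <serial-number>    device
```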
Step 6: Run Your App
With your device connected:
- Select your device from the device dropdown in Android Studio
- Click the Run button (▶) or press Shift + F10
- Wait for the app to build and install on the device
- Your app should launch automatically on the Developer Kit
Tip: If your device is not detected, run adb kill-server followed by adb start-server to reset the ADB connection.
Next Steps
Now that your development environment is set up, you can start integrating the hardware APIs. We recommend following this learning path:
Key Events
Start with physical button handling - it's the simplest API and essential for navigation.
Camera APIs
Add camera preview for AR applications - the core visual input for smart glasses.
Microphone APIs
Enable voice input and audio recording for voice commands and dictation.
Speaker APIs
Add audio feedback and text-to-speech for a complete interactive experience.
Build Your First AR Application
Let's build a simple but complete AR application in about 15 minutes. This hands-on tutorial will create a "Smart Viewer" app that displays a camera preview and responds to button presses with audio feedback.
What we'll build: a live camera preview with scroll-wheel zoom, OK-button photo capture with voice confirmation, and spoken feedback on exit.
Step 1: Create the Project
Create a new Android project following the Quick Start Guide above, then add CameraX dependencies to your app/build.gradle:
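A dependency block along these lines should work; the version shown is one recent stable CameraX release, not a platform requirement:

```groovy
dependencies {
    def camerax_version = "1.3.1"   // any recent stable CameraX release
    implementation "androidx.camera:camera-core:$camerax_version"
    implementation "androidx.camera:camera-camera2:$camerax_version"
    implementation "androidx.camera:camera-lifecycle:$camerax_version"
    implementation "androidx.camera:camera-view:$camerax_version"
}
```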
Step 2: Add Permissions
Add camera permission to AndroidManifest.xml:
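Inside the manifest element, before the application element:

```xml
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera.any" android:required="true" />
```

Remember that CAMERA is a dangerous permission on Android 9 and must also be granted at runtime.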
Step 3: Create the Layout
Replace the content of activity_main.xml:
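A minimal layout sketch for this tutorial. The view IDs (previewView, statusText) are assumptions used by the MainActivity step that follows:

```xml
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <androidx.camera.view.PreviewView
        android:id="@+id/previewView"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <TextView
        android:id="@+id/statusText"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="bottom|center_horizontal"
        android:textColor="#FFFFFF" />
</FrameLayout>
```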
Step 4: Write the MainActivity
Replace the content of MainActivity.java:
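The step names MainActivity.java; the sketch below uses Kotlin (MainActivity.kt), which maps one-to-one to the Java version. The view IDs, key codes handled, and zoom step are assumptions for this tutorial, not platform API:

```kotlin
package com.example.myeioapp

import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import android.speech.tts.TextToSpeech
import android.view.KeyEvent
import android.widget.TextView
import androidx.appcompat.app.AppCompatActivity
import androidx.camera.core.Camera
import androidx.camera.core.CameraSelector
import androidx.camera.core.Preview
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.camera.view.PreviewView
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat
import java.util.Locale

class MainActivity : AppCompatActivity(), TextToSpeech.OnInitListener {

    private lateinit var previewView: PreviewView
    private lateinit var statusText: TextView
    private var tts: TextToSpeech? = null
    private var camera: Camera? = null
    private var zoomLevel = 1.0f

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        previewView = findViewById(R.id.previewView)
        statusText = findViewById(R.id.statusText)
        tts = TextToSpeech(this, this)

        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            == PackageManager.PERMISSION_GRANTED) {
            startCamera()
        } else {
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.CAMERA), 10)
        }
    }

    private fun startCamera() {
        val future = ProcessCameraProvider.getInstance(this)
        future.addListener({
            val provider = future.get()
            val preview = Preview.Builder().build().also {
                it.setSurfaceProvider(previewView.surfaceProvider)
            }
            provider.unbindAll()
            camera = provider.bindToLifecycle(
                this, CameraSelector.DEFAULT_BACK_CAMERA, preview)
        }, ContextCompat.getMainExecutor(this))
    }

    // The physical controls arrive as standard Android key events.
    override fun dispatchKeyEvent(event: KeyEvent): Boolean {
        if (event.action == KeyEvent.ACTION_DOWN) {
            when (event.keyCode) {
                KeyEvent.KEYCODE_DPAD_RIGHT -> changeZoom(+0.5f)
                KeyEvent.KEYCODE_DPAD_LEFT -> changeZoom(-0.5f)
                KeyEvent.KEYCODE_DPAD_CENTER -> speak("Photo captured")
                KeyEvent.KEYCODE_BACK -> { speak("Goodbye"); finish() }
                else -> return super.dispatchKeyEvent(event)
            }
            return true
        }
        return super.dispatchKeyEvent(event)
    }

    private fun changeZoom(delta: Float) {
        zoomLevel = (zoomLevel + delta).coerceIn(1.0f, 4.0f)
        camera?.cameraControl?.setZoomRatio(zoomLevel)
        speak("Zoom $zoomLevel")
    }

    private fun speak(text: String) {
        statusText.text = text
        tts?.speak(text, TextToSpeech.QUEUE_FLUSH, null, "eio-utterance")
    }

    override fun onInit(status: Int) {
        if (status == TextToSpeech.SUCCESS) tts?.language = Locale.US
    }

    override fun onDestroy() {
        tts?.shutdown()
        super.onDestroy()
    }
}
```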
Step 5: Build and Run
- Sync Gradle - Click "Sync Now" when prompted after adding dependencies
- Connect Device - Plug in your EIO Developer Kit via USB
- Run - Press the Run button (▶) in Android Studio
- Test - Try the controls:
- Scroll wheel left/right → Zoom level changes with voice feedback
- Press OK button → "Photo captured" message with voice
- Press Back → App exits with "Goodbye"
Congratulations!
You've built your first AR application! This simple app demonstrates the core interaction pattern for AR glasses: visual input (camera), physical controls (buttons), and audio output (TTS). Explore the API sections below to add more features.
What's Next?
Now that you have a working app, you can extend it with:
1. Event Key Listening
The EIO Developer Kit features physical controls that map to standard Android key events. These can be captured in your Activity to provide intuitive navigation and interaction.
What you'll learn:
- Map physical buttons to Android key codes
- Override dispatchKeyEvent() to capture button presses
- Provide visual feedback with animations
- Support multiple device types and key mappings
Key Definitions
The following table shows the primary hardware key mappings:
| Physical Event | Map Event | Hex Code | Decimal Code |
|---|---|---|---|
| Scroll Forward | KEY_RIGHT | 0x6a | 106 |
| Scroll Backward | KEY_LEFT | 0x69 | 105 |
| Confirm | KEY_OK | 0x160 | 352 |
| Return | KEY_BACK | 0x9e | 158 |
| Power | POWER | 0x74 | 116 |
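The hex and decimal values above are Linux input event codes. A small helper that translates the decimal codes into app-level actions might look like this (the action names are illustrative, not part of the platform API):

```kotlin
// Maps the raw decimal key codes from the table above to app-level actions.
fun actionForKeyCode(code: Int): String = when (code) {
    106 -> "SCROLL_FORWARD"   // KEY_RIGHT (0x6a)
    105 -> "SCROLL_BACKWARD"  // KEY_LEFT  (0x69)
    352 -> "CONFIRM"          // KEY_OK    (0x160)
    158 -> "RETURN"           // KEY_BACK  (0x9e)
    116 -> "POWER"            // 0x74
    else -> "UNKNOWN"
}

fun main() {
    println(actionForKeyCode(106)) // SCROLL_FORWARD
    println(actionForKeyCode(352)) // CONFIRM
}
```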
Extended Key Events
Additional keys available on compatible AR glasses (e.g., C2000 series):
| Key Code | Physical Button | Description |
|---|---|---|
| KEYCODE_F1 | SOS Key | Emergency or quick action button |
| KEYCODE_F2 | Record Key | Red dot / recording toggle |
| KEYCODE_DPAD_LEFT | Wheel CCW | Scroll wheel counterclockwise |
| KEYCODE_DPAD_RIGHT | Wheel CW | Scroll wheel clockwise |
| KEYCODE_DPAD_CENTER | Wheel Press | Scroll wheel pressed |
| KEYCODE_BACK | Back Key | Return / navigate back |
Implementation Example
Override dispatchKeyEvent() in your Activity to capture and handle key events:
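A minimal sketch, inside your Activity; the handler names (onScrollForward and friends) are placeholders for your own logic:

```kotlin
// Inside your Activity. Handles the initial press only, ignoring
// auto-repeat events and key-up.
override fun dispatchKeyEvent(event: KeyEvent): Boolean {
    if (event.action == KeyEvent.ACTION_DOWN && event.repeatCount == 0) {
        when (event.keyCode) {
            KeyEvent.KEYCODE_DPAD_RIGHT  -> { onScrollForward(); return true }
            KeyEvent.KEYCODE_DPAD_LEFT   -> { onScrollBackward(); return true }
            KeyEvent.KEYCODE_DPAD_CENTER -> { onConfirm(); return true }
            KeyEvent.KEYCODE_BACK        -> { onBack(); return true }
        }
    }
    // Pass anything we didn't handle up the chain.
    return super.dispatchKeyEvent(event)
}
```

Returning true consumes the event; returning super.dispatchKeyEvent(event) lets the system handle keys you don't care about.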
UI Animation Feedback
Enhance user experience with visual feedback animations when physical buttons are pressed:
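One simple approach is a short scale "pulse" on the focused view when a button fires; this helper is a sketch, not a platform API:

```kotlin
// Briefly scales a view up and back to acknowledge a button press.
private fun pulse(view: View) {
    view.animate()
        .scaleX(1.15f).scaleY(1.15f)
        .setDuration(80)
        .withEndAction {
            view.animate().scaleX(1f).scaleY(1f).setDuration(80).start()
        }
        .start()
}
```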
2. Camera APIs
With button handling in place, let's add visual capabilities. The camera is the primary sensor for AR applications, enabling scene understanding, object detection, and visual assistance features.
Access the device camera using the CameraX API for preview, capture, and image analysis. CameraX provides a consistent API across Android devices with lifecycle-aware components. For advanced control, Camera2 API is also available.
What you'll learn:
- Configure CameraX dependencies and permissions
- Display live camera preview in your app
- Understand Camera2 core components for advanced use cases
- Handle lifecycle and resolution switching
- Troubleshoot common camera issues
Camera2 Core Components
When using the Camera2 API for low-level camera control, you'll work with these key components:
| Component | Type | Description |
|---|---|---|
| TextureView | UI Component | Displays camera preview, receives image data via SurfaceTexture |
| CameraDevice | Camera Handle | Represents the physical camera, used for creating capture requests |
| CameraCaptureSession | Session Manager | Manages capture request execution, routes output to target surfaces |
| CaptureRequest.Builder | Request Builder | Configures preview settings (autofocus, exposure, etc.) |
| CameraManager | System Service | Lists available cameras, queries characteristics, opens camera |
| HandlerThread | Background Thread | Executes time-consuming camera operations off the UI thread |
Camera2 Initialization Flow
The Camera2 initialization follows this sequence:
- UI Setup - Initialize TextureView and set SurfaceTextureListener
- Permission Check - Verify CAMERA permission before opening
- Get Camera Info - Use CameraManager to get camera ID and characteristics
- Get Supported Resolutions - Query StreamConfigurationMap for output sizes
- Open Camera - Call cameraManager.openCamera with state callback
- Start Preview - Create capture session and start repeating request
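The six steps above can be condensed into the following Camera2 sketch. These are members of an Activity; imports, error handling, and permission requests are omitted, and the thread name is arbitrary:

```kotlin
private lateinit var cameraManager: CameraManager
private var cameraDevice: CameraDevice? = null
private lateinit var backgroundHandler: Handler

@SuppressLint("MissingPermission") // CAMERA permission verified earlier (step 2)
private fun openCamera(texture: SurfaceTexture) {
    cameraManager = getSystemService(Context.CAMERA_SERVICE) as CameraManager
    val cameraId = cameraManager.cameraIdList.first()                    // step 3
    val map = cameraManager.getCameraCharacteristics(cameraId)
        .get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)
    val previewSize =
        map!!.getOutputSizes(SurfaceTexture::class.java).first()         // step 4
    texture.setDefaultBufferSize(previewSize.width, previewSize.height)

    val thread = HandlerThread("CameraBackground").apply { start() }
    backgroundHandler = Handler(thread.looper)

    cameraManager.openCamera(cameraId,
        object : CameraDevice.StateCallback() {                          // step 5
            override fun onOpened(device: CameraDevice) {
                cameraDevice = device
                startPreview(device, Surface(texture))                   // step 6
            }
            override fun onDisconnected(device: CameraDevice) = device.close()
            override fun onError(device: CameraDevice, error: Int) = device.close()
        }, backgroundHandler)
}

private fun startPreview(device: CameraDevice, surface: Surface) {
    val request = device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW).apply {
        addTarget(surface)
        set(CaptureRequest.CONTROL_AF_MODE,
            CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE)
    }
    device.createCaptureSession(listOf(surface),
        object : CameraCaptureSession.StateCallback() {
            override fun onConfigured(session: CameraCaptureSession) {
                session.setRepeatingRequest(request.build(), null, backgroundHandler)
            }
            override fun onConfigureFailed(session: CameraCaptureSession) {}
        }, backgroundHandler)
}
```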
Dependencies
Add the following CameraX dependencies to your build.gradle:
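For example (the version shown is one recent stable release):

```groovy
dependencies {
    implementation "androidx.camera:camera-core:1.3.1"
    implementation "androidx.camera:camera-camera2:1.3.1"
    implementation "androidx.camera:camera-lifecycle:1.3.1"
    implementation "androidx.camera:camera-view:1.3.1"
}
```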
Permissions
Declare camera permission in your AndroidManifest.xml:
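```xml
<uses-permission android:name="android.permission.CAMERA" />
```

On Android 9 this is a dangerous permission and must also be requested at runtime.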
Layout
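A layout sketch with a PreviewView and a toggle button; the view IDs are assumptions used by the implementation below:

```xml
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <androidx.camera.view.PreviewView
        android:id="@+id/previewView"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <Button
        android:id="@+id/toggleButton"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="bottom|center_horizontal"
        android:text="Stop" />
</FrameLayout>
```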
Camera Implementation
The following Kotlin implementation demonstrates camera preview with toggle functionality:
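A sketch of such an implementation, assuming the previewView and toggleButton IDs from the layout (imports omitted):

```kotlin
class CameraActivity : AppCompatActivity() {

    private lateinit var previewView: PreviewView
    private lateinit var toggleButton: Button
    private var cameraProvider: ProcessCameraProvider? = null
    private var previewRunning = false

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_camera)
        previewView = findViewById(R.id.previewView)
        toggleButton = findViewById(R.id.toggleButton)
        toggleButton.setOnClickListener {
            if (previewRunning) stopPreview() else startPreview()
        }
        startPreview()
    }

    private fun startPreview() {
        val future = ProcessCameraProvider.getInstance(this)
        future.addListener({
            cameraProvider = future.get()
            val preview = Preview.Builder().build().also {
                it.setSurfaceProvider(previewView.surfaceProvider)
            }
            cameraProvider?.unbindAll()
            cameraProvider?.bindToLifecycle(
                this, CameraSelector.DEFAULT_BACK_CAMERA, preview)
            previewRunning = true
            toggleButton.text = "Stop"
        }, ContextCompat.getMainExecutor(this))
    }

    private fun stopPreview() {
        cameraProvider?.unbindAll()   // CameraX releases the camera for us
        previewRunning = false
        toggleButton.text = "Start"
    }
}
```

Because bindToLifecycle ties the camera to the Activity lifecycle, CameraX closes and reopens the camera automatically on pause and resume.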
Tip: To capture still photos or run frame analysis, add ImageCapture and ImageAnalysis use cases to your lifecycle binding.
Camera Lifecycle Management
Proper lifecycle management is critical for camera applications:
| Lifecycle Method | Camera Actions |
|---|---|
| onCreate | Initialize UI, set listeners, configure camera manager |
| onResume | Start background thread, check TextureView availability, open camera |
| onPause | Close camera and session, stop background thread, release resources |
| onDestroy | Final cleanup of any remaining resources |
Resolution Switching
To switch camera resolution dynamically:
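A sketch of the restart path in a Camera2 setup; captureSession, cameraDevice, textureView, and openCamera are assumed members of your Activity:

```kotlin
// Closes the session and device, then re-runs the initialization flow
// with the newly selected size (chosen from supportedSizes).
private fun switchResolution(newSize: Size) {
    captureSession?.close()
    captureSession = null
    cameraDevice?.close()
    cameraDevice = null
    textureView.surfaceTexture?.setDefaultBufferSize(newSize.width, newSize.height)
    openCamera(textureView.surfaceTexture!!)   // hypothetical restart helper
}
```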
Common Camera Troubleshooting
| Issue | Solution |
|---|---|
| Camera won't open | Check permission declaration and runtime grant; look for CameraAccessException |
| Preview issues | Verify resolution and buffer size; ensure Surface is added to capture request |
| Resolution switch fails | Ensure supportedSizes is populated; call restartCamera properly |
| App crashes | Avoid heavy operations on UI thread; release resources during lifecycle transitions |
3. Microphone APIs
Now that you have visual input from the camera, let's add audio input. Voice commands and audio recording enable hands-free interaction, which is essential for AR glasses where users may not have access to a touchscreen.
Record audio using Android's MediaRecorder API. This allows you to capture voice input, ambient audio, or any sound for processing and playback.
What you'll learn:
- Request and handle audio recording permissions
- Configure MediaRecorder for different quality levels
- Implement recording, playback, and pause/resume controls
- Build reusable audio module interfaces
- Integrate with physical button controls
Permissions
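Declare the recording permission in AndroidManifest.xml and request it at runtime before touching MediaRecorder:

```xml
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```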
Layout
Recording Implementation
The following implementation provides complete recording, stopping, and playback functionality:
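A sketch of such a helper; the file name and wiring to UI or buttons are assumptions, and RECORD_AUDIO must be granted before startRecording() is called:

```kotlin
class AudioRecorderHelper(private val context: Context) {

    private var recorder: MediaRecorder? = null
    private var player: MediaPlayer? = null
    private val outputFile: File
        get() = File(context.getExternalFilesDir(null), "recording.m4a")

    fun startRecording() {
        recorder = MediaRecorder().apply {
            setAudioSource(MediaRecorder.AudioSource.MIC)
            setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
            setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
            setOutputFile(outputFile.absolutePath)
            prepare()   // must be called before start()
            start()
        }
    }

    fun stopRecording() {
        recorder?.apply { stop(); release() }
        recorder = null
    }

    fun startPlayback() {
        if (!outputFile.exists()) return   // avoid IllegalStateException
        player = MediaPlayer().apply {
            setDataSource(outputFile.absolutePath)
            prepare()
            start()
            setOnCompletionListener { it.release() }
        }
    }
}
```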
Tip: As alternatives to MediaRecorder, use AudioRecord with PCM encoding for raw audio access, or the AAC encoder for higher quality compressed audio.
Advanced Audio Configuration
For higher quality recording, use these MediaRecorder settings:
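For example, 44.1 kHz stereo AAC at 128 kbps:

```kotlin
// Apply before prepare(). Lower these values on low-end devices
// that cannot sustain high bitrates.
recorder.setAudioSamplingRate(44100)
recorder.setAudioChannels(2)
recorder.setAudioEncodingBitRate(128000)
```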
Secondary Development Interfaces
Recommended interfaces for building reusable audio modules:
| Feature | Suggested Interface |
|---|---|
| Recording control | startRecording(int durationMs) - configurable duration |
| Playback control | startPlayback(), pausePlayback(), resumePlayback() |
| State listener | OnAudioStateListener - notifies recording/playback state changes |
| File management | getLastRecordingFilePath(), deleteRecordingFile() |
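The table above can be expressed as a Kotlin interface; the exact shape is a suggestion, not a platform contract:

```kotlin
// Suggested contract for a reusable audio module, following the table above.
interface AudioModule {
    fun startRecording(durationMs: Int)
    fun startPlayback()
    fun pausePlayback()
    fun resumePlayback()
    fun getLastRecordingFilePath(): String?
    fun deleteRecordingFile(): Boolean
    fun setOnAudioStateListener(listener: OnAudioStateListener)
}

// Notifies callers of recording and playback state changes.
interface OnAudioStateListener {
    fun onRecordingStateChanged(recording: Boolean)
    fun onPlaybackStateChanged(playing: Boolean)
}
```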
Remote Control Integration
Support for AR glasses physical controls and remote devices:
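One way to route those controls is via onKeyDown, mapping the extended keys (see the Extended Key Events table) to recording actions; toggleRecording and startPlayback are placeholders for your own module:

```kotlin
// Inside your Activity: route glasses / remote keys to audio controls.
override fun onKeyDown(keyCode: Int, event: KeyEvent): Boolean {
    return when (keyCode) {
        KeyEvent.KEYCODE_F2 -> { toggleRecording(); true }          // Record key
        KeyEvent.KEYCODE_DPAD_CENTER -> { startPlayback(); true }   // Wheel press
        else -> super.onKeyDown(keyCode, event)
    }
}
```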
Important Notes
- Always check recording permission before initializing MediaRecorder
- Ensure recording file exists before playback to avoid exceptions
- Test compatibility when using external microphones (USB audio devices)
- Lower encoding parameters for low-end devices that don't support high bitrates
- Recording files saved to getExternalFilesDir() are private and deleted on uninstall
4. Speaker APIs
To complete the audio loop, let's add output capabilities. Audio feedback is crucial for AR applications where users need confirmation of actions, navigation guidance, or text reading without looking at a screen.
Output audio through the device speaker using Android's AudioManager and TextToSpeech APIs. This enables voice feedback, notifications, and text-to-speech functionality.
What you'll learn:
- Initialize and configure TextToSpeech engine
- Control audio routing and speaker settings
- Customize speech parameters (pitch, speed, language)
- Handle TTS lifecycle and resource cleanup
Layout
Text-to-Speech Implementation
This implementation provides text-to-speech with speaker control:
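A sketch of a small helper combining the two APIs; the class name and utterance ID are arbitrary:

```kotlin
class SpeechHelper(context: Context) : TextToSpeech.OnInitListener {

    private val audioManager =
        context.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    private val tts = TextToSpeech(context, this)
    private var ready = false

    override fun onInit(status: Int) {
        if (status == TextToSpeech.SUCCESS) {
            tts.language = Locale.US
            ready = true
        }
    }

    fun speak(text: String) {
        if (!ready) return
        audioManager.mode = AudioManager.MODE_NORMAL   // media playback mode
        audioManager.isSpeakerphoneOn = true           // route to loudspeaker
        tts.speak(text, TextToSpeech.QUEUE_FLUSH, null, "utterance-1")
    }

    fun shutdown() {
        tts.stop()
        tts.shutdown()   // release the engine in onDestroy
    }
}
```

Note that the TextToSpeech engine initializes asynchronously, so calls to speak() made before onInit fires are silently dropped here.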
Tip: Customize the voice with setPitch() (range: 0.5 - 2.0) and setSpeechRate() (range: 0.5 - 2.0). For multi-language support, check available locales with textToSpeech.getAvailableLanguages().
Audio Output Configuration
Key AudioManager methods for speaker control:
| Method | Description |
|---|---|
| setMode(MODE_NORMAL) | Normal audio mode for media playback |
| setMode(MODE_IN_COMMUNICATION) | Optimized for voice communication |
| setSpeakerphoneOn(true) | Route audio to the loudspeaker |
| setStreamVolume() | Adjust volume for specific audio streams |
| adjustVolume() | Raise or lower volume incrementally |
5. Putting It All Together
Now that you understand each API individually, let's combine them into a complete application. This example demonstrates a simple AR assistant that uses all four hardware APIs.
Application Features:
- Camera preview for visual context
- Physical button navigation between modes
- Voice recording for commands
- Text-to-speech feedback
Project Structure
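One possible layout, assuming the package name from the Quick Start Guide:

```text
app/
└── src/main/
    ├── AndroidManifest.xml
    ├── java/com/example/myeioapp/
    │   └── MainActivity.kt        (or MainActivity.java)
    └── res/layout/
        └── activity_main.xml
```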
AndroidManifest.xml
First, declare all required permissions:
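```xml
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-feature android:name="android.hardware.camera.any" android:required="true" />
```

Both CAMERA and RECORD_AUDIO are dangerous permissions on Android 9 and must also be granted at runtime before use.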
Main Activity Integration
Here's how to integrate all APIs in a single Activity:
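The glue logic is mostly key routing. The mode-switching core can be kept free of Android dependencies, which makes it easy to test; this sketch consumes the raw decimal scroll codes from the Key Definitions table, and the mode names and callback are illustrative:

```kotlin
// Pure-logic sketch of scroll-wheel mode switching for the combined app.
enum class Mode { CAMERA, RECORD, SPEAK }

class ModeController(private val onModeChanged: (Mode) -> Unit) {

    var mode = Mode.CAMERA
        private set

    // 106 = Scroll Forward, 105 = Scroll Backward (see Key Definitions).
    // Call this from dispatchKeyEvent(); returns true if the key was consumed.
    fun onKey(code: Int): Boolean = when (code) {
        106 -> { switchMode(+1); true }
        105 -> { switchMode(-1); true }
        else -> false
    }

    private fun switchMode(step: Int) {
        val modes = Mode.values()
        mode = modes[(mode.ordinal + step + modes.size) % modes.size]
        onModeChanged(mode)   // e.g. announce the new mode via TTS
    }
}
```

Your Activity then owns the Android pieces (camera preview, recorder, TTS) and reacts to onModeChanged, keeping the navigation logic in one small, testable class.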
Summary Checklist
Before deploying your app to the EIO Developer Kit, verify these items:
Additional Resources
Explore more resources to help you build amazing applications on the EIO Developer Kit platform.
