These topics come up when the role involves maps, fitness tracking, camera-based scanning, or Bluetooth connectivity. You need to understand how Android interacts with hardware and how to handle lifecycle and permission complexities.
Camera2 is the low-level camera API from Android 5.0 (API 21). It gives full control: manual exposure, focus, ISO, RAW capture, frame-by-frame processing. But it requires managing CameraDevice, CameraCaptureSession, and CaptureRequest, and handling device-specific quirks. A lot of boilerplate.
CameraX is a Jetpack library built on top of Camera2. It uses a use-case-based API: Preview, ImageCapture, ImageAnalysis, and VideoCapture. It's lifecycle-aware, handles device compatibility internally, and works consistently across devices. For most apps, CameraX is the right choice. Camera2 is only needed for low-level control like manual focus or custom capture pipelines.
There are two approaches: CameraController for simplicity and CameraProvider for flexibility. The simplest setup:
val previewView: PreviewView = findViewById(R.id.previewView)
val cameraController = LifecycleCameraController(this)
cameraController.bindToLifecycle(this)
cameraController.cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA
previewView.controller = cameraController
With CameraProvider, you bind use cases explicitly:
val cameraProviderFuture = ProcessCameraProvider.getInstance(this)
cameraProviderFuture.addListener({
    val cameraProvider = cameraProviderFuture.get()
    val preview = Preview.Builder().build().also {
        it.setSurfaceProvider(previewView.surfaceProvider)
    }
    cameraProvider.bindToLifecycle(
        this, CameraSelector.DEFAULT_BACK_CAMERA, preview
    )
}, ContextCompat.getMainExecutor(this))
CameraX opens, closes, and releases camera resources automatically based on the lifecycle.
Three permissions:

ACCESS_COARSE_LOCATION: approximate location from Wi-Fi/cell, roughly city-block level.
ACCESS_FINE_LOCATION: precise GPS location.
ACCESS_BACKGROUND_LOCATION: location access when the app is in the background. Required separately from Android 10 (API 29).

Android 12 lets users grant only approximate location even when you request fine. Background location needs a separate runtime request; you can't combine it with the foreground dialog. Google Play also requires justification for background location during review.
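On Android 12+, request fine and coarse together so the user can pick approximate from the same dialog. A sketch inside an Activity using the Activity Result API (the handler names are hypothetical):

val locationPermissionLauncher = registerForActivityResult(
    ActivityResultContracts.RequestMultiplePermissions()
) { grants ->
    when {
        grants[Manifest.permission.ACCESS_FINE_LOCATION] == true -> startPreciseUpdates()       // hypothetical
        grants[Manifest.permission.ACCESS_COARSE_LOCATION] == true -> startApproximateUpdates() // hypothetical
        else -> showLocationRationale()                                                         // hypothetical
    }
}

locationPermissionLauncher.launch(arrayOf(
    Manifest.permission.ACCESS_FINE_LOCATION,
    Manifest.permission.ACCESS_COARSE_LOCATION
))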
The Fused Location Provider is part of Google Play Services. It combines GPS, Wi-Fi, cell towers, and device sensors to determine location. It picks the best source based on your accuracy and battery needs automatically.
Raw GPS via LocationManager gives high accuracy outdoors but drains battery, takes time to get a fix, and fails indoors. The Fused Location Provider handles all of this: it returns a cached last-known location almost instantly and switches between providers transparently.
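A sketch of a typical fused-location request (play-services-location 21+ builder API; the interval values are examples and updateMap is a hypothetical handler):

val fusedClient = LocationServices.getFusedLocationProviderClient(this)

val request = LocationRequest.Builder(Priority.PRIORITY_BALANCED_POWER_ACCURACY, 10_000L)
    .setMinUpdateIntervalMillis(5_000L)
    .build()

val locationCallback = object : LocationCallback() {
    override fun onLocationResult(result: LocationResult) {
        result.lastLocation?.let { location -> updateMap(location) } // hypothetical handler
    }
}

// Requires a granted location permission at this point
fusedClient.requestLocationUpdates(request, locationCallback, Looper.getMainLooper())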
BiometricPrompt from androidx.biometric provides a standard UI for fingerprint, face, and iris authentication. You set the allowed authenticator types:
BIOMETRIC_STRONG (Class 3): hardware-backed biometrics. Required for crypto operations.
BIOMETRIC_WEAK (Class 2): basic biometrics, not strong enough for crypto.
DEVICE_CREDENTIAL: PIN, pattern, or password.

val promptInfo = BiometricPrompt.PromptInfo.Builder()
    .setTitle("Authenticate")
    .setSubtitle("Verify your identity")
    .setAllowedAuthenticators(BIOMETRIC_STRONG or DEVICE_CREDENTIAL)
    .build()
val biometricPrompt = BiometricPrompt(this, executor,
    object : BiometricPrompt.AuthenticationCallback() {
        override fun onAuthenticationSucceeded(result: BiometricPrompt.AuthenticationResult) {
            val cipher = result.cryptoObject?.cipher // non-null only when a CryptoObject was passed to authenticate()
        }
        override fun onAuthenticationFailed() {}
    }
)
biometricPrompt.authenticate(promptInfo)
Always check BiometricManager.canAuthenticate(BIOMETRIC_STRONG) first to see if the device supports it.
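A sketch of that availability check (the fallback handlers are hypothetical):

when (BiometricManager.from(this).canAuthenticate(BIOMETRIC_STRONG)) {
    BiometricManager.BIOMETRIC_SUCCESS ->
        biometricPrompt.authenticate(promptInfo)
    BiometricManager.BIOMETRIC_ERROR_NONE_ENROLLED ->
        promptBiometricEnrollment() // hypothetical: route the user to enrollment settings
    else ->
        useDeviceCredentialFallback() // hypothetical: fall back to PIN/pattern/password
}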
Android groups sensors into three categories:

Motion sensors: accelerometer, gyroscope, gravity, rotation vector.
Environmental sensors: light, pressure, ambient temperature, humidity.
Position sensors: magnetometer, proximity.

Some sensors are hardware-based (physical chips) and some are software-based (derived from one or more hardware sensors). The linear acceleration sensor and gravity sensor are software-based.
The sensor framework is in the android.hardware package. You get a SensorManager from system services, find the sensor you need, and register a SensorEventListener.
class MotionActivity : AppCompatActivity(), SensorEventListener {
    private lateinit var sensorManager: SensorManager
    private var accelerometer: Sensor? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        sensorManager = getSystemService(SENSOR_SERVICE) as SensorManager
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
    }

    override fun onResume() {
        super.onResume()
        accelerometer?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    override fun onPause() {
        super.onPause()
        sensorManager.unregisterListener(this)
    }

    override fun onSensorChanged(event: SensorEvent) {
        val x = event.values[0] // acceleration along each axis in m/s², including gravity
        val y = event.values[1]
        val z = event.values[2]
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}
Always unregister in onPause(). Leaving it registered keeps the sensor hardware active and drains battery.
The accelerometer measures acceleration along three axes (x, y, z) in m/s², including gravity. It tells you tilt and shake. The gyroscope measures rate of rotation around each axis in rad/s: how fast the device is spinning.
Accelerometer is used for tilt detection, step counting, and shake gestures. Gyroscope is used for rotation tracking in games, AR, and image stabilization.
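A minimal shake-detector sketch built on the accelerometer (the 2.7 g threshold and the handler are arbitrary examples):

override fun onSensorChanged(event: SensorEvent) {
    val (x, y, z) = event.values
    // Magnitude relative to gravity: ~1.0 when the device is at rest
    val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
    if (gForce > 2.7f) {
        onShakeDetected() // hypothetical handler
    }
}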
Bluetooth Classic handles continuous, high-throughput data: audio streaming, file transfer, serial communication. It uses more power and maintains a persistent connection.
BLE (Bluetooth Low Energy) is for short bursts of small data with low power consumption. Used for IoT sensors, fitness devices, beacons, and proximity detection. In Android, Classic Bluetooth APIs are under android.bluetooth, while BLE uses BluetoothLeScanner for scanning and BluetoothGatt for communication. From Android 12, BLE requires BLUETOOTH_SCAN and BLUETOOTH_CONNECT runtime permissions instead of location permission.
Android has progressively restricted background location:
ACCESS_BACKGROUND_LOCATION became a separate permission (Android 10).

For continuous background location like navigation or fitness tracking, use a foreground service with foregroundServiceType="location". For periodic checks, use WorkManager. The Fused Location Provider's requestLocationUpdates() with a PendingIntent works for background updates, but the system throttles delivery.
Geofencing defines virtual boundaries around geographic areas. You get notified when the user enters, exits, or dwells in that area. It uses the GeofencingClient API built on top of the Fused Location Provider.
You create a Geofence with a center point, radius, and transition types (GEOFENCE_TRANSITION_ENTER, EXIT, DWELL). The system monitors these efficiently β it uses cell and Wi-Fi when the user is far away and switches to GPS as they get closer. Limit is 100 active geofences per app.
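A sketch of registering one geofence (the id, coordinates, radius, and geofencePendingIntent are example values):

val geofence = Geofence.Builder()
    .setRequestId("office") // example id
    .setCircularRegion(37.4220, -122.0841, 200f) // lat, lng, radius in meters (example values)
    .setExpirationDuration(Geofence.NEVER_EXPIRE)
    .setTransitionTypes(Geofence.GEOFENCE_TRANSITION_ENTER or Geofence.GEOFENCE_TRANSITION_EXIT)
    .build()

val request = GeofencingRequest.Builder()
    .setInitialTrigger(GeofencingRequest.INITIAL_TRIGGER_ENTER)
    .addGeofence(geofence)
    .build()

// geofencePendingIntent should target a BroadcastReceiver that parses GeofencingEvent
LocationServices.getGeofencingClient(this).addGeofences(request, geofencePendingIntent)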
ImageAnalysis gives CPU-accessible image buffers for frame-by-frame processing: barcode scanning, face detection, object recognition.
val imageAnalysis = ImageAnalysis.Builder()
    .setTargetResolution(Size(1280, 720))
    .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
    .build()

imageAnalysis.setAnalyzer(executor) { imageProxy ->
    val rotationDegrees = imageProxy.imageInfo.rotationDegrees
    processFrame(imageProxy)
    imageProxy.close() // must close to receive the next frame
}

cameraProvider.bindToLifecycle(this, cameraSelector, preview, imageAnalysis)
STRATEGY_KEEP_ONLY_LATEST drops frames if the analyzer is slow, which is usually what you want for real-time processing. You must call imageProxy.close() when done, otherwise the pipeline stalls.
MediaPlayer is the built-in framework class. Simple to use, but limited format support, poor customization, and inconsistent behavior across devices.
ExoPlayer (now androidx.media3) is Google's open-source media player. It supports DASH, HLS, SmoothStreaming, and progressive downloads. You can swap components like the renderer, extractor, and track selector. It also supports DRM, ad insertion, and background audio with MediaSession integration. For any production app, Media3 is the standard choice. MediaPlayer is only for the simplest cases like notification sounds.
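A minimal Media3 playback sketch (the URL is a placeholder; playerView is assumed to be an androidx.media3.ui.PlayerView from the layout):

val player = ExoPlayer.Builder(this).build()
playerView.player = player

player.setMediaItem(MediaItem.fromUri("https://example.com/stream.m3u8"))
player.prepare()
player.play()

// Release the player (e.g. in onStop) to free codecs and memory
// player.release()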
Android provides two hardware step sensors:
TYPE_STEP_COUNTER: total steps since last reboot. Hardware-backed, runs even when your app isn't active. You save a baseline and calculate the difference.
TYPE_STEP_DETECTOR: fires an event for each individual step.

val stepCounter = sensorManager.getDefaultSensor(Sensor.TYPE_STEP_COUNTER)
sensorManager.registerListener(object : SensorEventListener {
    override fun onSensorChanged(event: SensorEvent) {
        val totalStepsSinceBoot = event.values[0].toLong()
        val stepsInSession = totalStepsSinceBoot - baselineSteps // baseline saved on the first event
        updateUI(stepsInSession)
    }
    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}, stepCounter, SensorManager.SENSOR_DELAY_NORMAL)
TYPE_STEP_COUNTER is preferred for accuracy. Both require ACTIVITY_RECOGNITION permission on Android 10+.
When registering a listener, you pass a delay constant:
SENSOR_DELAY_NORMAL: ~200 ms, for screen orientation.
SENSOR_DELAY_UI: ~60 ms, for UI updates.
SENSOR_DELAY_GAME: ~20 ms, for games.
SENSOR_DELAY_FASTEST: as fast as the hardware allows.

These are hints, not guarantees. Higher rates give more responsive data but increase CPU and battery usage. From Android 12 (API 31), most sensors are rate-limited to 200 Hz unless you declare the HIGH_SAMPLING_RATE_SENSORS permission.
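You can also pass an explicit sampling period in microseconds instead of a constant; like the constants, it is only a hint:

// Third argument is samplingPeriodUs: 10_000 µs requests ~100 Hz
sensorManager.registerListener(listener, accelerometer, 10_000)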
NFC enables short-range wireless communication (about 4 cm). Android supports three modes:

Reader/writer mode: the device reads or writes NFC tags.
Peer-to-peer mode: two devices exchange data (Android Beam, deprecated since Android 10).
Card emulation mode: the device acts as an NFC card, e.g. for payments, via host-based card emulation (HCE).
Android routes discovered NFC tags to apps using the tag dispatch system. You define intent filters in the manifest for NDEF_DISCOVERED, TECH_DISCOVERED, or TAG_DISCOVERED and handle the tag data in your activity.
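A sketch of the receiving side, assuming a matching NDEF_DISCOVERED intent filter is declared in the manifest (handleRecord is a hypothetical helper):

override fun onNewIntent(intent: Intent) {
    super.onNewIntent(intent)
    if (intent.action == NfcAdapter.ACTION_NDEF_DISCOVERED) {
        intent.getParcelableArrayExtra(NfcAdapter.EXTRA_NDEF_MESSAGES)
            ?.filterIsInstance<NdefMessage>()
            ?.flatMap { it.records.toList() }
            ?.forEach { record -> handleRecord(record) } // hypothetical handler
    }
}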
BLE follows a specific flow:
1. Scan: BluetoothLeScanner.startScan() with ScanFilter and ScanSettings. Always set filters and stop scanning once you find your device.
2. Connect: device.connectGatt() returns a BluetoothGatt object. All callbacks come through BluetoothGattCallback.
3. Discover services: gatt.discoverServices(). When onServicesDiscovered fires, enumerate services and characteristics.
4. Read/write: gatt.readCharacteristic() or gatt.writeCharacteristic(). For continuous data, enable notifications with gatt.setCharacteristicNotification().

All GATT operations are async and must be serialized: only one pending operation at a time. A second read before the first callback silently fails. Most production BLE code uses a command queue to handle this.
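A sketch of the first two steps, assuming bluetoothAdapter, context, and gattCallback are in scope and the scan/connect permissions are granted (the device name is an example):

val scanner = bluetoothAdapter.bluetoothLeScanner
val filters = listOf(ScanFilter.Builder().setDeviceName("HR-Monitor").build()) // example name
val settings = ScanSettings.Builder().setScanMode(ScanSettings.SCAN_MODE_LOW_LATENCY).build()

val scanCallback = object : ScanCallback() {
    override fun onScanResult(callbackType: Int, result: ScanResult) {
        scanner.stopScan(this) // stop as soon as the target device is found
        result.device.connectGatt(context, false, gattCallback) // gattCallback: your BluetoothGattCallback
    }
}

scanner.startScan(filters, settings, scanCallback)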
The proximity sensor measures how close an object is to the screen, usually in centimeters. Most devices report a binary value β near or far. Android uses it during phone calls to turn off the screen when you hold the phone to your ear.
Apps can use TYPE_PROXIMITY to detect when the device is covered or in a pocket. The PROXIMITY_SCREEN_OFF_WAKE_LOCK lets you turn the screen off when the sensor detects an object and back on when it clears.
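A sketch of acquiring that wake lock (the tag string and timeout are arbitrary examples):

val powerManager = getSystemService(Context.POWER_SERVICE) as PowerManager
val proximityWakeLock = powerManager.newWakeLock(
    PowerManager.PROXIMITY_SCREEN_OFF_WAKE_LOCK, "app:proximityCall"
)
proximityWakeLock.acquire(10 * 60 * 1000L) // timeout avoids leaking the lock
// ... screen stays off while the sensor reports "near" ...
proximityWakeLock.release()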
Raw sensor data is noisy. Two common approaches:
Low-pass filter: filtered = alpha * previous + (1 - alpha) * current. A larger alpha means more smoothing.
High-pass filter: highPass = current - lowPassFiltered, which isolates rapid changes.

The rotation vector sensor (TYPE_ROTATION_VECTOR) fuses accelerometer, gyroscope, and magnetometer data internally. It gives a stable orientation without manual filtering. In practice, prefer composite sensors over manually fusing raw data; the platform implementation is optimized across devices.
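A minimal sketch of the low-pass/high-pass split above on accelerometer data (alpha = 0.8f is an example value):

private val gravity = FloatArray(3)

override fun onSensorChanged(event: SensorEvent) {
    val alpha = 0.8f // larger alpha = heavier smoothing
    for (i in 0..2) {
        // Low-pass: slowly track the gravity component
        gravity[i] = alpha * gravity[i] + (1 - alpha) * event.values[i]
    }
    // High-pass: what remains is linear acceleration
    val linearX = event.values[0] - gravity[0]
    val linearY = event.values[1] - gravity[1]
    val linearZ = event.values[2] - gravity[2]
}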
Camera2 revolves around three objects:

CameraDevice: the physical camera. Opened with CameraManager.openCamera().
CameraCaptureSession: the session where you submit capture requests. Created with a list of output surfaces.
CaptureRequest: defines what to capture and how. You set auto-focus mode, exposure, flash, and target surfaces.

For preview, you create a repeating request with session.setRepeatingRequest(). For a photo, use session.capture(). You need to handle CameraDevice.StateCallback, CameraCaptureSession.StateCallback, and CaptureCallback, all async. This is exactly the boilerplate CameraX eliminates.
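A compressed sketch of that flow, assuming the CAMERA permission is granted and previewSurface is an already-configured Surface (error handling omitted):

val manager = getSystemService(Context.CAMERA_SERVICE) as CameraManager
val cameraId = manager.cameraIdList.first()

manager.openCamera(cameraId, object : CameraDevice.StateCallback() {
    override fun onOpened(camera: CameraDevice) {
        camera.createCaptureSession(listOf(previewSurface),
            object : CameraCaptureSession.StateCallback() {
                override fun onConfigured(session: CameraCaptureSession) {
                    val request = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
                        .apply { addTarget(previewSurface) }
                        .build()
                    session.setRepeatingRequest(request, null, null) // repeating request drives the preview
                }
                override fun onConfigureFailed(session: CameraCaptureSession) {}
            }, null)
    }
    override fun onDisconnected(camera: CameraDevice) = camera.close()
    override fun onError(camera: CameraDevice, error: Int) = camera.close()
}, null)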
Call sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER). If it returns null, the device doesn't have that sensor. You should always null-check before registering a listener.
To list all available sensors on the device, use sensorManager.getSensorList(Sensor.TYPE_ALL). This is useful for debugging or building a sensor dashboard.
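For example (the log tag is arbitrary):

sensorManager.getSensorList(Sensor.TYPE_ALL).forEach { sensor ->
    Log.d("SensorDump", "${sensor.name} (vendor=${sensor.vendor}, type=${sensor.type})")
}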
Before Android 12, Bluetooth scanning required ACCESS_FINE_LOCATION because BLE scans can reveal user location through beacon proximity. From Android 12 (API 31), Google introduced granular Bluetooth permissions:
BLUETOOTH_SCAN: for discovering nearby devices.
BLUETOOTH_CONNECT: for connecting to paired devices.
BLUETOOTH_ADVERTISE: for making the device discoverable.

If you set android:usesPermissionFlags="neverForLocation" on BLUETOOTH_SCAN, you don't need location permission at all. These are runtime permissions; request them before scanning or connecting.
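A sketch of the runtime request (startBleScan is a hypothetical handler):

val bluetoothPermissionLauncher = registerForActivityResult(
    ActivityResultContracts.RequestMultiplePermissions()
) { grants ->
    if (grants.values.all { it }) startBleScan() // hypothetical
}

bluetoothPermissionLauncher.launch(arrayOf(
    Manifest.permission.BLUETOOTH_SCAN,
    Manifest.permission.BLUETOOTH_CONNECT
))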
From Android 10 (API 29), the ACTIVITY_RECOGNITION permission is required to use step counter sensors (TYPE_STEP_COUNTER, TYPE_STEP_DETECTOR) and the Activity Recognition API. It's a runtime permission; you must request it before registering sensor listeners for step data.
Before Android 10, these sensors worked without any special permission. The change was a privacy measure since step data reveals physical activity patterns.
PRIORITY_HIGH_ACCURACY requests the most precise location available and will use GPS, at the highest battery cost. PRIORITY_BALANCED_POWER_ACCURACY targets roughly block-level (~100 m) accuracy from Wi-Fi and cell positioning, using far less power. Request high accuracy only while the user actively needs precision, like turn-by-turn navigation, and fall back to balanced accuracy otherwise.