How to Add Vital Signs to Your Telehealth Platform
A technical guide for CTOs and engineering leaders on integrating camera-based rPPG vital signs capture into existing telehealth platforms using an on-device SDK.
Telehealth has solved the access problem. Patients can see their providers from anywhere, on any device, at virtually any time. But it has not solved the clinical data problem. Every day, millions of virtual visits happen where providers make decisions without the most basic physiological data: heart rate, respiratory rate, blood oxygen saturation, and stress indicators.
This is not a minor gap. It is a structural limitation that affects clinical outcomes, provider confidence, and ultimately your platform's value proposition. The good news is that it is now solvable without requiring patients to buy hardware. Remote photoplethysmography (rPPG) technology can extract vital signs from a standard camera feed, the same camera already streaming the video visit.
This guide walks through how to integrate rPPG-based vital signs capture into your existing telehealth platform, from architecture decisions to implementation timeline.
The Problem: Telehealth Without Vitals
When a patient walks into a clinic, the first thing that happens is a vitals check. Blood pressure, heart rate, temperature, and oxygen saturation are recorded before the provider even enters the room. These numbers form the baseline for every clinical decision that follows.
In telehealth, that entire step is missing. Providers are left to rely on patient self-reporting, visual assessment through a camera, and their clinical intuition. For routine follow-ups, this may be adequate. For chronic disease management, medication titration, post-surgical monitoring, or any visit where physiological state matters, it is a significant handicap.
Health systems know this. In surveys of telehealth program directors, the inability to capture vital signs consistently ranks as the top clinical limitation of virtual care. It is also the most common reason providers give for preferring in-person visits over telehealth for certain encounter types.
For telehealth platforms, this means lost revenue. Every visit that gets routed back to in-person because the provider needs vitals data is a visit your platform did not capture. Every enterprise buyer who hesitates because your platform cannot support clinical workflows is a contract you did not close.
The Solution: Camera-Based rPPG
Remote photoplethysmography works by detecting subtle color changes in facial skin caused by blood flow. Each heartbeat pushes oxygenated blood through capillaries near the skin surface, creating micro-fluctuations in reflected light that are invisible to the human eye but detectable by a camera sensor and signal processing algorithms.
From this signal, rPPG can extract several clinically relevant measurements:
- Heart rate (HR): Beat-to-beat pulse rate with clinical-grade accuracy
- Heart rate variability (HRV): Time-domain and frequency-domain HRV metrics that indicate autonomic nervous system function
- Blood oxygen saturation (SpO2): Estimated peripheral oxygen saturation derived from multi-wavelength analysis
- Stress index: A composite measure derived from HRV patterns that reflects sympathetic-parasympathetic balance
- Respiratory rate: Breathing rate extracted from the amplitude modulation of the PPG signal
The critical advantage for telehealth is that this requires nothing from the patient except the camera they are already using. No wearable, no clip-on sensor, no additional hardware. The measurement happens passively during the video visit, or actively during a brief dedicated capture window.
Integration Architecture
Adding rPPG vitals to your telehealth platform involves a client-side SDK, a data transport layer, and provider-side presentation. Here is how each component fits into your existing architecture.
Client-Side Capture
Circadify's rPPG SDK runs entirely in the browser via WebAssembly. It hooks into the same camera feed used for the video visit, processing frames locally on the patient's device. This on-device processing model means raw facial video never leaves the patient's device for vitals analysis, which simplifies your HIPAA compliance posture considerably.
The SDK integration at the client level involves three steps:
1. Camera access coordination. Your platform already requests camera permissions for the video visit. The SDK shares the same MediaStream, so there is no second permission prompt. You pass the active video track to the SDK's initialization function.
2. Capture session management. The SDK exposes a simple lifecycle: initialize, start capture, stop capture, get results. A typical measurement window is 30 to 60 seconds. You can trigger this automatically when the visit begins, on provider request, or as a patient-initiated action.
3. Result delivery. When a capture session completes, the SDK returns a structured JSON payload with all measured vitals, confidence scores, and signal quality indicators. This payload is what you transmit to your backend.
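The lifecycle above can be modeled as a small state machine in your integration layer. This is a minimal sketch, not Circadify's actual API: the class and method names are illustrative, and the result is supplied by the caller so the logic stays self-contained.

```typescript
// Illustrative capture-session wrapper enforcing the
// initialize -> start -> stop -> results lifecycle.
type SessionState = "idle" | "initialized" | "capturing" | "complete";

interface VitalsResult {
  heartRateBpm: number;
  confidence: number; // 0..1 signal-quality score
}

class CaptureSession {
  private state: SessionState = "idle";
  private result: VitalsResult | null = null;

  initialize(): void {
    if (this.state !== "idle") throw new Error(`cannot initialize from ${this.state}`);
    this.state = "initialized";
  }

  start(): void {
    if (this.state !== "initialized") throw new Error(`cannot start from ${this.state}`);
    this.state = "capturing";
  }

  // In a real integration the SDK produces the result; the caller
  // supplies it here so the sketch runs without the SDK.
  stop(result: VitalsResult): VitalsResult {
    if (this.state !== "capturing") throw new Error(`cannot stop from ${this.state}`);
    this.result = result;
    this.state = "complete";
    return result;
  }

  getState(): SessionState {
    return this.state;
  }
}
```

Guarding transitions this way makes it harder for UI race conditions (for example, a provider-triggered capture arriving mid-measurement) to corrupt a session.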
WebRTC Considerations
If your platform uses WebRTC for video calls, you need to manage the interaction between the SDK and your WebRTC pipeline carefully. The SDK needs access to uncompressed video frames at a reasonable resolution (minimum 640x480) and frame rate (minimum 15 fps). WebRTC encoding can interfere with this if the SDK operates on the encoded stream rather than the raw camera feed.
The recommended pattern is to create a separate processing branch from the raw MediaStream. The camera feed goes to both your WebRTC peer connection and the rPPG SDK independently. This avoids any quality degradation from video compression and ensures the SDK gets consistent frame quality regardless of network-adaptive bitrate changes in your WebRTC stream.
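The branching pattern can be sketched as follows. The `CloneableStream` interface mirrors the small subset of the browser's `MediaStream` API that the pattern relies on, so the logic also runs outside a browser; in production, `raw` would be the stream from `getUserMedia`.

```typescript
// Structural subset of MediaStream used by the branching logic.
interface CloneableStream {
  clone(): CloneableStream;
  getVideoTracks(): unknown[];
}

// Clone the raw camera stream so WebRTC and the rPPG SDK each get an
// independent feed: the peer connection may adapt its copy (resolution,
// bitrate), while the SDK copy stays uncompressed and untouched.
function branchCameraStream(raw: CloneableStream): {
  rtcStream: CloneableStream;
  vitalsStream: CloneableStream;
} {
  const vitalsStream = raw.clone();
  return { rtcStream: raw, vitalsStream };
}
```

In the browser, you would pass `rtcStream`'s tracks to `RTCPeerConnection.addTrack` and hand `vitalsStream`'s video track to the SDK's initialization function.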
Data Flow to Your Backend
The vitals payload from the SDK is a compact JSON object, typically under 2 KB. You can transmit it through your existing signaling channel, a dedicated REST endpoint, or a WebSocket connection. The payload includes timestamps, measurement duration, and per-vital confidence scores so your backend can validate data quality before persisting or displaying results.
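A backend receiving that payload should validate it before persisting. The field names below are assumptions for illustration, not the SDK's actual schema:

```typescript
// Illustrative shape for the vitals payload described above.
interface VitalReading {
  value: number;
  confidence: number; // 0..1 per-vital signal-quality score
}

interface VitalsPayload {
  visitId: string;
  capturedAt: string; // ISO 8601 timestamp
  durationSeconds: number;
  vitals: Record<string, VitalReading>;
}

// Minimal server-side sanity check before storing or displaying.
function isValidPayload(p: VitalsPayload): boolean {
  return (
    p.visitId.length > 0 &&
    !Number.isNaN(Date.parse(p.capturedAt)) &&
    p.durationSeconds > 0 &&
    Object.keys(p.vitals).length > 0 &&
    Object.values(p.vitals).every(
      (r) => Number.isFinite(r.value) && r.confidence >= 0 && r.confidence <= 1,
    )
  );
}
```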
On the server side, you store the vitals data associated with the visit record. If your platform integrates with EHR systems, the vitals map cleanly to standard observation resources in FHIR R4 format, using the same LOINC codes used for in-clinic vitals.
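As a sketch of that mapping, here is one vital rendered as a FHIR R4 Observation. 8867-4 is the standard LOINC code for heart rate; the same pattern applies to the others (for example, 9279-1 for respiratory rate and 59408-5 for SpO2 by pulse oximetry). The function name and signature are illustrative.

```typescript
// Map a measured heart rate to a FHIR R4 Observation resource,
// using the same LOINC code an in-clinic reading would use.
function toFhirHeartRate(bpm: number, effective: string) {
  return {
    resourceType: "Observation",
    status: "final",
    category: [{
      coding: [{
        system: "http://terminology.hl7.org/CodeSystem/observation-category",
        code: "vital-signs",
      }],
    }],
    code: {
      coding: [{ system: "http://loinc.org", code: "8867-4", display: "Heart rate" }],
    },
    effectiveDateTime: effective,
    valueQuantity: {
      value: bpm,
      unit: "beats/minute",
      system: "http://unitsofmeasure.org",
      code: "/min",
    },
  };
}
```

A real resource would also carry a `subject` reference to the patient and, ideally, a provenance note indicating the reading came from camera-based measurement rather than a cuff or probe.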
Provider Dashboard
On the provider side, vitals should appear in context within the visit interface. The most effective pattern is a vitals panel adjacent to the video feed that updates when a measurement completes. Providers should see the measured values, the confidence level, and a timestamp. Trend data from prior visits adds clinical value by showing changes over time.
Consider providing a mechanism for the provider to request a re-measurement during the visit. Network issues, poor lighting, or patient movement can affect signal quality, and providers should have the ability to get a fresh reading.
Key Considerations
HIPAA and Privacy
On-device processing is the most significant architectural decision for privacy. Because the rPPG signal processing happens in the patient's browser, the raw facial video used for analysis never traverses your servers. Only the derived vitals data (numeric values and metadata) is transmitted. This is a fundamentally different privacy model than cloud-based processing, where facial video would need to be uploaded, processed, and then deleted.
That said, the derived vitals data itself is protected health information (PHI) under HIPAA. It must be encrypted in transit and at rest, associated with the patient's record through your existing identity management, and subject to the same access controls and audit logging as any other clinical data on your platform.
Latency and Performance
The SDK's WebAssembly implementation is optimized for modern browsers and runs efficiently on mid-range hardware. A typical measurement adds minimal CPU overhead beyond what the video call itself requires. On mobile devices, battery impact is comparable to running a video call with the screen on, which the patient is already doing.
Measurement latency depends on the capture window duration. A 30-second capture yields results within 1 to 2 seconds of completion. For real-time heart rate display during the call, the SDK can provide streaming estimates with slightly lower accuracy that update every few seconds.
Browser Compatibility
The SDK supports all major modern browsers: Chrome, Firefox, Safari, and Edge on desktop, and Chrome and Safari on mobile. WebAssembly support is the primary requirement, which covers over 95 percent of browsers in active use. For the small percentage of patients on older browsers, your platform should degrade gracefully by simply not offering the vitals capture option rather than breaking the visit experience.
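Graceful degradation starts with a feature check before the capture option is ever rendered. A minimal version gates on WebAssembly support:

```typescript
// Only offer the vitals capture UI when the runtime can run the SDK.
// WebAssembly is the hard requirement; a browser-side check would also
// verify camera availability via navigator.mediaDevices.getUserMedia.
function supportsVitalsCapture(): boolean {
  return (
    typeof WebAssembly === "object" &&
    typeof WebAssembly.instantiate === "function"
  );
}
```

When this returns false, the visit proceeds as a normal video call with the vitals panel hidden, rather than surfacing an error.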
Signal Quality and Edge Cases
Not every measurement will produce clinical-grade results. Low lighting, excessive patient movement, certain skin conditions, and heavy facial coverings can reduce signal quality. The SDK returns confidence scores with every measurement, and your platform should define a threshold below which results are flagged or excluded. This is analogous to how a pulse oximeter will fail to read if placed improperly. The technology is robust, but it is not infallible, and your UX should account for that honestly.
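One way to express that threshold policy is a three-tier classifier: accept high-confidence readings, flag borderline ones for provider judgment, and reject the rest with a prompt to re-measure. The threshold values here are illustrative; tune them against your own validation data.

```typescript
type QualityVerdict = "accepted" | "flagged" | "rejected";

// Turn a per-vital confidence score into a UX decision.
function classifyReading(
  confidence: number,
  acceptAt = 0.85,
  flagAt = 0.6,
): QualityVerdict {
  if (confidence >= acceptAt) return "accepted";
  if (confidence >= flagAt) return "flagged";
  return "rejected";
}
```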
Implementation Timeline
For a typical telehealth platform with an existing WebRTC-based video infrastructure, the integration follows a predictable timeline:
Weeks 1-2: SDK Integration and Camera Pipeline. Connect the SDK to your existing camera feed, implement the capture lifecycle, and verify measurements in your development environment. This is primarily front-end engineering work.
Weeks 3-4: Backend and Data Flow. Build the endpoint or channel for receiving vitals data, wire it into your visit records, and implement storage and retrieval. If you have an EHR integration, map the vitals to FHIR observations during this phase.
Weeks 5-6: Provider UX. Build the vitals display into your provider dashboard, implement re-measurement controls, and add trend visualization for longitudinal data. Conduct provider usability testing.
Weeks 7-8: QA and Edge Cases. Test across browsers, devices, and lighting conditions. Validate signal quality thresholds. Ensure graceful degradation when the SDK cannot produce a reliable measurement. Load test the backend under concurrent visit volumes.
Weeks 9-10: Staged Rollout. Deploy to a subset of providers or visits, monitor data quality and user feedback, and iterate before broad availability.
A focused engineering team of two to three developers can complete this integration in a single quarter. The SDK handles the complex signal processing, so your team's effort is concentrated on integration plumbing and UX rather than algorithm development.
What This Unlocks
Adding vital signs to your telehealth platform changes the conversation with enterprise buyers. Health systems evaluating telehealth solutions consistently ask about clinical data capture. Being able to demonstrate contactless vitals during a live demo moves your platform from "video visit tool" to "clinical care delivery platform" in the buyer's mind.
It also opens new pricing dimensions. Vitals-enabled visits carry more clinical value than video-only visits, supporting premium per-visit pricing or a tiered platform model. For platforms pursuing remote patient monitoring (RPM) use cases, rPPG vitals create a bridge between synchronous telehealth visits and asynchronous monitoring without requiring patients to adopt new hardware.
The technology is mature, the integration effort is bounded, and the market timing is right. Telehealth platforms that add vitals capture now will define the next standard of care for virtual visits. Those that wait will be playing catch-up when enterprise buyers start requiring it.
