feat: refine architecture and improve performance

- Reworked the audio/video pipeline, improving RKMPP MJPEG→H264 transcoding performance and fixing the mosaic artifacts caused by dropped frames
- Removed the multi-user logic in favor of a single user, with a new setting to restrict the web UI to a single session
- Fixed the clunky rollback logic around deletions and fine-tuned the menu placement on the front-end pages
- Added dynamic reconfiguration of the OTG USB device
- Fixed an mDNS issue; WebRTC video switching is now smoother
mofeng
2026-01-25 16:04:29 +08:00
parent 01e01430da
commit 1786b7689d
66 changed files with 4225 additions and 2936 deletions

View File

@@ -129,6 +129,7 @@ tempfile = "3"
 [build-dependencies]
 protobuf-codegen = "3.7"
 toml = "0.9"
+cc = "1"
 [profile.release]
 opt-level = 3

README.md
View File

@@ -1,81 +1,294 @@
-# One-KVM
-<p align="center">
-<strong>An open, lightweight IP-KVM solution for BIOS-level remote management</strong>
-</p>
-<p align="center">
-<a href="#功能特性">Features</a>
-<a href="#快速开始">Quick Start</a>
-</p>
----
-## Introduction
-One-KVM is an open, lightweight IP-KVM (keyboard, video, and mouse over IP) solution written in Rust that lets you remotely control a computer over the network, including BIOS-level operations.
-**This software is at an early stage of development; many features and details still need polish. Feel free to try it, but do not use it in production.**
-## Features
-### Core Features
-| Feature | Description |
-|------|------|
-| Video capture | USB HDMI capture card support, with MJPEG/H264/H265/VP8/VP9 streams |
-| Keyboard & mouse | USB OTG HID or CH340 + CH39329 HID, with absolute/relative mouse modes |
-| Virtual USB drive | USB Mass Storage, with ISO/IMG image mounting and a Ventoy virtual-drive mode |
-| ATX power control | Power/reset buttons driven over GPIO |
-| Audio streaming | ALSA capture + Opus encoding (HTTP/WebRTC) |
-### Hardware Encoding
-Hardware acceleration is detected and selected automatically:
-- **VAAPI** - Intel/AMD GPU
-- **RKMPP** - Rockchip SoC (**not yet implemented**)
-- **V4L2 M2M** - generic hardware encoders (**not yet implemented**)
-- **Software encoding** - CPU encoding
-### Other Features
-- Single-binary deployment with lighter dependencies
-- Web UI configuration (no config-file editing), multi-language support (Chinese/English)
-- Built-in web terminal (ttyd), NAT traversal (gostc), P2P networking (EasyTier)
-## Quick Start
-### Running with Docker
-```bash
-docker run -d --privileged \
-  --name one-kvm \
-  -v /dev:/dev \
-  -v /sys/kernel/config:/sys/kernel/config \
-  --net=host \
-  silentwind0/one-kvm
-```
-Visit http://IP:8080
-### Environment Variables
-| Variable | Description | Default |
-|------|------|--------|
-| `ENABLE_HTTPS` | Enable HTTPS | `false` |
-| `HTTP_PORT` | HTTP port | `8080` |
-| `VERBOSE` | Log level (1/2/3) | - |
-## Acknowledgements
-Thanks to the following projects:
-- [PiKVM](https://github.com/pikvm/pikvm) - the original Python IP-KVM
-- [RustDesk](https://github.com/rustdesk/rustdesk) - the hwcodec hardware-encoding library
-- [ttyd](https://github.com/tsl0922/ttyd) - web terminal
-- [EasyTier](https://github.com/EasyTier/EasyTier) - P2P networking
-## License
-To be determined

<div align="center">
<img src="https://github.com/mofeng-git/Build-Armbian/assets/62919083/add9743a-0987-4e8a-b2cb-62121f236582" alt="One-KVM Logo" width="300">
<h1>One-KVM</h1>
<p><strong>An open, lightweight IP-KVM solution written in Rust for BIOS-level remote management</strong></p>
<p><a href="README.md">简体中文</a></p>

[![GitHub stars](https://img.shields.io/github/stars/mofeng-git/One-KVM?style=social)](https://github.com/mofeng-git/One-KVM/stargazers)
[![GitHub forks](https://img.shields.io/github/forks/mofeng-git/One-KVM?style=social)](https://github.com/mofeng-git/One-KVM/network/members)
[![GitHub issues](https://img.shields.io/github/issues/mofeng-git/One-KVM)](https://github.com/mofeng-git/One-KVM/issues)

<p>
<a href="docs/README.md">📖 Technical Docs</a>
<a href="#快速开始">⚡ Quick Start</a>
<a href="#功能介绍">📊 Features</a>
<a href="#迁移说明">🔁 Migration Notes</a>
</p>
</div>

---

## 📋 Table of Contents

- [Project Overview](#项目概述)
- [Migration Notes](#迁移说明)
- [Features](#功能介绍)
- [Quick Start](#快速开始)
- [Contributing & Feedback](#贡献与反馈)
- [Acknowledgements](#致谢)
- [License](#许可证)

## 📖 Project Overview

**One-KVM Rust** is a lightweight IP-KVM solution written in Rust for remotely managing servers and workstations over the network, down to BIOS-level control.

Project goals:
- **Open**: not bound to a specific hardware configuration; aims to support common Linux devices
- **Lightweight**: shipped as a single binary for a simpler deployment process
- **Easy to use**: devices and parameters are configured in the web UI, minimizing manual config-file edits

> **Note:** One-KVM Rust is still at an early stage of development, and its features and details iterate quickly. Trying it out and sending feedback are both welcome.

## 🔁 Migration Notes

The development focus is gradually shifting from **One-KVM Python** to **One-KVM Rust**:
- If you use **One-KVM Python (based on PiKVM)**, see the [One-KVM Python documentation](https://docs.one-kvm.cn/python/)
- Compared with One-KVM Python, One-KVM Rust **does not yet support CSI HDMI capture cards** or **VNC access**, and remains at an early stage

## 📊 Features

### Core Features

| Feature | Description |
|------|------|
| Video capture | USB HDMI capture card support, with MJPEG / WebRTC (H.264/H.265/VP8/VP9) streams |
| Keyboard & mouse | USB OTG HID or CH340 + CH9329 HID, with absolute/relative mouse modes |
| Virtual media | USB Mass Storage, with ISO/IMG image mounting and a Ventoy virtual-drive mode |
| ATX power control | Power/reset buttons driven over GPIO |
| Audio streaming | ALSA capture + Opus encoding (HTTP/WebRTC) |

### Hardware Encoding

Hardware acceleration is detected and selected automatically:

- **VAAPI**: Intel/AMD GPU
- **RKMPP**: Rockchip SoC
- **V4L2 M2M**: generic hardware encoders (not yet implemented)
- **Software encoding**: CPU encoding

### Extended Capabilities

- Web UI configuration, with multi-language support (Chinese/English)
- Built-in web terminal (ttyd), NAT traversal (gostc), P2P networking (EasyTier), and RustDesk protocol integration (extends cross-platform remote access)

## ⚡ Quick Start

Installation options: Docker / DEB package / fnOS NAS (FPK)

### Option 1: Docker (recommended)

Prerequisites:
- A Linux host with Docker installed
- A USB HDMI capture card attached
- USB OTG enabled, or a CH340+CH9329 HID cable attached (for HID emulation)

Start the container:

```bash
docker run --name one-kvm -itd --privileged=true \
  -v /dev:/dev -v /sys/:/sys \
  --net=host \
  silentwind0/one-kvm
```

Open the web UI at `http://<device-IP>:8080`; the first visit guides you through creating an admin account. Default ports: HTTP `8080`, or `8443` once HTTPS is enabled.

#### Common Environment Variables (Docker)

| Variable | Default | Description |
|------|------|------|
| `ENABLE_HTTPS` | `false` | Whether to enable HTTPS (`true/false`) |
| `HTTP_PORT` | `8080` | HTTP port (effective when `ENABLE_HTTPS=false`) |
| `HTTPS_PORT` | `8443` | HTTPS port (effective when `ENABLE_HTTPS=true`) |
| `BIND_ADDRESS` | - | Listen address (e.g. `0.0.0.0`) |
| `VERBOSE` | `0` | Log verbosity: `1`=-v, `2`=-vv, `3`=-vvv |
| `DATA_DIR` | `/etc/one-kvm` | Data directory (equivalent to `one-kvm -d <DIR>`; takes precedence over `ONE_KVM_DATA_DIR`) |

> Note: `--privileged=true` and the `/dev` and `/sys` mounts are required for hardware access and cannot be omitted in the current version.
>
> Compatibility: the legacy variable name `ONE_KVM_DATA_DIR` is still accepted.
>
> HTTPS: when no certificate is provided, a default self-signed certificate is generated automatically.
>
> Ventoy: if you change `DATA_DIR`, make sure the Ventoy resource files are located under `${DATA_DIR}/ventoy` (`boot.img`, `core.img`, `ventoy.disk.img`).
### Option 2: DEB package

Prerequisites:
- Debian 11+ / Ubuntu 22+
- A USB HDMI capture card and an HID link (OTG or CH340+CH9329) attached

Installation steps:
1. Download the `one-kvm_*.deb` for your architecture from GitHub Releases: [Releases](https://github.com/mofeng-git/One-KVM/releases)
2. Install it:
```bash
sudo apt update
sudo apt install ./one-kvm_*_*.deb
```

Open the web UI at `http://<device-IP>:8080`.

### Option 3: fnOS NAS (FPK)

Prerequisites:
- An fnOS (飞牛) NAS system (currently x86_64 only)
- A USB HDMI capture card and a CH340+CH9329 HID cable attached

Installation steps:
1. Download the `*.fpk` package from GitHub Releases: [Releases](https://github.com/mofeng-git/One-KVM/releases)
2. In the fnOS app store, choose "Manual install" and import the `*.fpk`

Open the web UI at `http://<device-IP>:8420`.

## Reporting Issues

If you find a problem, please:
1. Report it via [GitHub Issues](https://github.com/mofeng-git/One-KVM/issues)
2. Provide detailed error information and reproduction steps
3. Include your hardware configuration and system information

## Sponsorship

This project builds on several excellent open-source projects, and the author has invested a great deal of time in testing and maintenance. If you find it valuable, please consider supporting its development via **[Afdian (为爱发电)](https://afdian.com/a/silentwind)**.
### Supporters

<details>
<summary><strong>Click to view the list of supporters</strong></summary>
- 浩龙的电子嵌入式之路
- Tsuki
- H_xiaoming
- 0蓝蓝0
- fairybl
- Will
- 浩龙的电子嵌入式之路
- 自.知
- 观棋不语٩ ི۶
- 爱发电用户_a57a4
- 爱发电用户_2c769
- 霜序
- 远方(闲鱼用户名:小远技术店铺)
- 爱发电用户_399fc
- 斐斐の
- 爱发电用户_09451
- 超高校级的錆鱼
- 爱发电用户_08cff
- guoke
- mgt
- 姜沢掵
- ui_beam
- 爱发电用户_c0dd7
- 爱发电用户_dnjK
- 忍者胖猪
- 永遠の願い
- 爱发电用户_GBrF
- 爱发电用户_fd65c
- 爱发电用户_vhNa
- 爱发电用户_Xu6S
- moss
- woshididi
- 爱发电用户_a0fd1
- 爱发电用户_f6bH
- 码农
- 爱发电用户_6639f
- jeron
- 爱发电用户_CN7y
- 爱发电用户_Up6w
- 爱发电用户_e3202
- 一语念白
- 云边
- 爱发电用户_5a711
- 爱发电用户_9a706
- T0m9ir1SUKI
- 爱发电用户_56d52
- 爱发电用户_3N6F
- DUSK
- 飘零
- .
- 饭太稀
-
- ......
</details>
### Sponsors

This project is supported by the following sponsors:

**CDN acceleration and security protection:**
- **[Tencent EdgeOne](https://edgeone.ai/zh?from=github)** - provides CDN acceleration and security protection

![Tencent EdgeOne](https://edgeone.ai/media/34fe3a45-492d-4ea4-ae5d-ea1087ca7b4b.png)

**File hosting:**
- **[Huang1111 Public Welfare Project](https://pan.huang1111.cn/s/mxkx3T1)** - provides login-free downloads

**Cloud providers:**
- **[Linfeng Cloud (林枫云)](https://www.dkdun.cn)** - sponsors this project's high-bandwidth server in Ningbo

![林枫云](https://docs.one-kvm.cn/img/36076FEFF0898A80EBD5756D28F4076C.png)

Linfeng Cloud's main business covers premium-route servers in domestic and overseas regions, high-clock-speed game servers, and high-bandwidth servers.

View File

@@ -4,36 +4,44 @@
 set -e

-# Start one-kvm with default options
-# Additional options can be passed via environment variables
+# Start one-kvm with default options.
+# Additional options can be passed via environment variables.

-EXTRA_ARGS="-d /etc/one-kvm"
+# Data directory (prefer DATA_DIR, keep ONE_KVM_DATA_DIR for backward compatibility)
+DATA_DIR="${DATA_DIR:-${ONE_KVM_DATA_DIR:-/etc/one-kvm}}"
+ARGS=(-d "$DATA_DIR")

 # Enable HTTPS if requested
 if [ "${ENABLE_HTTPS:-false}" = "true" ]; then
-    EXTRA_ARGS="$EXTRA_ARGS --enable-https"
+    ARGS+=(--enable-https)
 fi

 # Custom bind address
 if [ -n "$BIND_ADDRESS" ]; then
-    EXTRA_ARGS="$EXTRA_ARGS -a $BIND_ADDRESS"
+    ARGS+=(-a "$BIND_ADDRESS")
 fi

 # Custom port
 if [ -n "$HTTP_PORT" ]; then
-    EXTRA_ARGS="$EXTRA_ARGS -p $HTTP_PORT"
+    ARGS+=(-p "$HTTP_PORT")
 fi
+
+# Custom HTTPS port
+if [ -n "$HTTPS_PORT" ]; then
+    ARGS+=(--https-port "$HTTPS_PORT")
+fi

 # Verbosity level
 if [ -n "$VERBOSE" ]; then
     case "$VERBOSE" in
-        1) EXTRA_ARGS="$EXTRA_ARGS -v" ;;
-        2) EXTRA_ARGS="$EXTRA_ARGS -vv" ;;
-        3) EXTRA_ARGS="$EXTRA_ARGS -vvv" ;;
+        1) ARGS+=(-v) ;;
+        2) ARGS+=(-vv) ;;
+        3) ARGS+=(-vvv) ;;
     esac
 fi

 echo "[INFO] Starting one-kvm..."
-echo "[INFO] Extra arguments: $EXTRA_ARGS"
+echo "[INFO] Arguments: ${ARGS[*]}"

 # Execute one-kvm
-exec /usr/bin/one-kvm $EXTRA_ARGS
+exec /usr/bin/one-kvm "${ARGS[@]}"
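Tracing the script by hand for one assumed environment shows how the argument array is assembled before the final exec:

```bash
# Assumed env: ENABLE_HTTPS=true, HTTPS_PORT=8443, VERBOSE=2, DATA_DIR unset.
# ARGS becomes: (-d /etc/one-kvm --enable-https --https-port 8443 -vv)
# so the container ends up running:
exec /usr/bin/one-kvm -d /etc/one-kvm --enable-https --https-port 8443 -vv
```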

View File

@@ -98,6 +98,7 @@ mod ffmpeg {
         link_os();
         build_ffmpeg_ram(builder);
+        build_ffmpeg_hw(builder);
     }

     /// Link system FFmpeg using pkg-config or custom path
@@ -373,4 +374,57 @@ mod ffmpeg {
             );
         }
     }
+
+    fn build_ffmpeg_hw(builder: &mut Build) {
+        let manifest_dir = PathBuf::from(env!("CARGO_MANIFEST_DIR"));
+        let ffmpeg_hw_dir = manifest_dir.join("cpp").join("ffmpeg_hw");
+        let ffi_header = ffmpeg_hw_dir
+            .join("ffmpeg_hw_ffi.h")
+            .to_string_lossy()
+            .to_string();
+        bindgen::builder()
+            .header(ffi_header)
+            .rustified_enum("*")
+            .generate()
+            .unwrap()
+            .write_to_file(Path::new(&env::var_os("OUT_DIR").unwrap()).join("ffmpeg_hw_ffi.rs"))
+            .unwrap();
+        let target_arch = std::env::var("CARGO_CFG_TARGET_ARCH").unwrap_or_default();
+        let enable_rkmpp = matches!(target_arch.as_str(), "aarch64" | "arm")
+            || std::env::var_os("CARGO_FEATURE_RKMPP").is_some();
+        if enable_rkmpp {
+            // Include RGA headers for NV16->NV12 conversion (RGA im2d API)
+            let rga_sys_dirs = [
+                Path::new("/usr/aarch64-linux-gnu/include/rga"),
+                Path::new("/usr/include/rga"),
+            ];
+            let mut added = false;
+            for dir in rga_sys_dirs.iter() {
+                if dir.exists() {
+                    builder.include(dir);
+                    added = true;
+                }
+            }
+            if !added {
+                // Fall back to repo-local rkrga headers if present
+                let repo_root = manifest_dir
+                    .parent()
+                    .and_then(|p| p.parent())
+                    .map(|p| p.to_path_buf())
+                    .unwrap_or_else(|| manifest_dir.clone());
+                let rkrga_dir = repo_root.join("ffmpeg").join("rkrga");
+                if rkrga_dir.exists() {
+                    builder.include(rkrga_dir.join("include"));
+                    builder.include(rkrga_dir.join("im2d_api"));
+                }
+            }
+            builder.file(ffmpeg_hw_dir.join("ffmpeg_hw_mjpeg_h264.cpp"));
+        } else {
+            println!(
+                "cargo:info=Skipping ffmpeg_hw_mjpeg_h264.cpp (RKMPP) for arch {}",
+                target_arch
+            );
+        }
+    }
 }

View File

@@ -0,0 +1,42 @@
#pragma once
#include <stdint.h>
#ifdef __cplusplus
extern "C" {
#endif
typedef struct FfmpegHwMjpegH264 FfmpegHwMjpegH264;
FfmpegHwMjpegH264* ffmpeg_hw_mjpeg_h264_new(const char* dec_name,
const char* enc_name,
int width,
int height,
int fps,
int bitrate_kbps,
int gop,
int thread_count);
int ffmpeg_hw_mjpeg_h264_encode(FfmpegHwMjpegH264* ctx,
const uint8_t* data,
int len,
int64_t pts_ms,
uint8_t** out_data,
int* out_len,
int* out_keyframe);
int ffmpeg_hw_mjpeg_h264_reconfigure(FfmpegHwMjpegH264* ctx,
int bitrate_kbps,
int gop);
int ffmpeg_hw_mjpeg_h264_request_keyframe(FfmpegHwMjpegH264* ctx);
void ffmpeg_hw_mjpeg_h264_free(FfmpegHwMjpegH264* ctx);
void ffmpeg_hw_packet_free(uint8_t* data);
const char* ffmpeg_hw_last_error(void);
#ifdef __cplusplus
}
#endif

View File

@@ -0,0 +1,444 @@
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/avutil.h>
#include <libavutil/error.h>
#include <libavutil/hwcontext.h>
#include <libavutil/hwcontext_drm.h>
#include <libavutil/pixdesc.h>
#include <libavutil/opt.h>
}
#include <string>
#include <string.h>
#include <stdlib.h>
#include <stdio.h>
#define LOG_MODULE "FFMPEG_HW"
#include "../common/log.h"
#include "ffmpeg_hw_ffi.h"
namespace {
thread_local std::string g_last_error;
static void set_last_error(const std::string &msg) {
g_last_error = msg;
LOG_ERROR(msg);
}
static std::string make_err(const std::string &ctx, int err) {
return ctx + " (ret=" + std::to_string(err) + "): " + av_err2str(err);
}
static const char* pix_fmt_name(AVPixelFormat fmt) {
const char *name = av_get_pix_fmt_name(fmt);
return name ? name : "unknown";
}
struct FfmpegHwMjpegH264Ctx {
AVCodecContext *dec_ctx = nullptr;
AVCodecContext *enc_ctx = nullptr;
AVPacket *dec_pkt = nullptr;
AVFrame *dec_frame = nullptr;
AVPacket *enc_pkt = nullptr;
AVBufferRef *hw_device_ctx = nullptr;
AVBufferRef *hw_frames_ctx = nullptr;
AVPixelFormat hw_pixfmt = AV_PIX_FMT_NONE;
std::string dec_name;
std::string enc_name;
int width = 0;
int height = 0;
int fps = 30;
int bitrate_kbps = 2000;
int gop = 60;
int thread_count = 1;
bool force_keyframe = false;
};
static enum AVPixelFormat get_hw_format(AVCodecContext *ctx,
const enum AVPixelFormat *pix_fmts) {
auto *self = reinterpret_cast<FfmpegHwMjpegH264Ctx *>(ctx->opaque);
if (self && self->hw_pixfmt != AV_PIX_FMT_NONE) {
const enum AVPixelFormat *p;
for (p = pix_fmts; *p != AV_PIX_FMT_NONE; p++) {
if (*p == self->hw_pixfmt) {
return *p;
}
}
}
return pix_fmts[0];
}
static int init_decoder(FfmpegHwMjpegH264Ctx *ctx) {
const AVCodec *dec = avcodec_find_decoder_by_name(ctx->dec_name.c_str());
if (!dec) {
set_last_error("Decoder not found: " + ctx->dec_name);
return -1;
}
ctx->dec_ctx = avcodec_alloc_context3(dec);
if (!ctx->dec_ctx) {
set_last_error("Failed to allocate decoder context");
return -1;
}
ctx->dec_ctx->width = ctx->width;
ctx->dec_ctx->height = ctx->height;
ctx->dec_ctx->thread_count = ctx->thread_count > 0 ? ctx->thread_count : 1;
ctx->dec_ctx->opaque = ctx;
// Pick HW pixfmt for RKMPP
const AVCodecHWConfig *cfg = nullptr;
for (int i = 0; (cfg = avcodec_get_hw_config(dec, i)); i++) {
if (cfg->device_type == AV_HWDEVICE_TYPE_RKMPP) {
ctx->hw_pixfmt = cfg->pix_fmt;
break;
}
}
if (ctx->hw_pixfmt == AV_PIX_FMT_NONE) {
set_last_error("No RKMPP hw pixfmt for decoder");
return -1;
}
int ret = av_hwdevice_ctx_create(&ctx->hw_device_ctx,
AV_HWDEVICE_TYPE_RKMPP, NULL, NULL, 0);
if (ret < 0) {
set_last_error(make_err("av_hwdevice_ctx_create failed", ret));
return -1;
}
ctx->dec_ctx->hw_device_ctx = av_buffer_ref(ctx->hw_device_ctx);
ctx->dec_ctx->get_format = get_hw_format;
ret = avcodec_open2(ctx->dec_ctx, dec, NULL);
if (ret < 0) {
set_last_error(make_err("avcodec_open2 decoder failed", ret));
return -1;
}
ctx->dec_pkt = av_packet_alloc();
ctx->dec_frame = av_frame_alloc();
ctx->enc_pkt = av_packet_alloc();
if (!ctx->dec_pkt || !ctx->dec_frame || !ctx->enc_pkt) {
set_last_error("Failed to allocate packet/frame");
return -1;
}
return 0;
}
static int init_encoder(FfmpegHwMjpegH264Ctx *ctx, AVBufferRef *frames_ctx) {
const AVCodec *enc = avcodec_find_encoder_by_name(ctx->enc_name.c_str());
if (!enc) {
set_last_error("Encoder not found: " + ctx->enc_name);
return -1;
}
ctx->enc_ctx = avcodec_alloc_context3(enc);
if (!ctx->enc_ctx) {
set_last_error("Failed to allocate encoder context");
return -1;
}
ctx->enc_ctx->width = ctx->width;
ctx->enc_ctx->height = ctx->height;
ctx->enc_ctx->time_base = AVRational{1, 1000};
ctx->enc_ctx->framerate = AVRational{ctx->fps, 1};
ctx->enc_ctx->bit_rate = (int64_t)ctx->bitrate_kbps * 1000;
ctx->enc_ctx->gop_size = ctx->gop > 0 ? ctx->gop : ctx->fps;
ctx->enc_ctx->max_b_frames = 0;
ctx->enc_ctx->pix_fmt = AV_PIX_FMT_DRM_PRIME;
ctx->enc_ctx->sw_pix_fmt = AV_PIX_FMT_NV12;
if (frames_ctx) {
AVHWFramesContext *hwfc = reinterpret_cast<AVHWFramesContext *>(frames_ctx->data);
if (hwfc) {
ctx->enc_ctx->pix_fmt = static_cast<AVPixelFormat>(hwfc->format);
ctx->enc_ctx->sw_pix_fmt = static_cast<AVPixelFormat>(hwfc->sw_format);
if (hwfc->width > 0) ctx->enc_ctx->width = hwfc->width;
if (hwfc->height > 0) ctx->enc_ctx->height = hwfc->height;
}
ctx->hw_frames_ctx = av_buffer_ref(frames_ctx);
ctx->enc_ctx->hw_frames_ctx = av_buffer_ref(frames_ctx);
}
if (ctx->hw_device_ctx) {
ctx->enc_ctx->hw_device_ctx = av_buffer_ref(ctx->hw_device_ctx);
}
AVDictionary *opts = nullptr;
av_dict_set(&opts, "rc_mode", "CBR", 0);
av_dict_set(&opts, "profile", "high", 0);
av_dict_set_int(&opts, "qp_init", 23, 0);
av_dict_set_int(&opts, "qp_max", 48, 0);
av_dict_set_int(&opts, "qp_min", 0, 0);
av_dict_set_int(&opts, "qp_max_i", 48, 0);
av_dict_set_int(&opts, "qp_min_i", 0, 0);
int ret = avcodec_open2(ctx->enc_ctx, enc, &opts);
av_dict_free(&opts);
if (ret < 0) {
std::string detail = "avcodec_open2 encoder failed: ";
detail += ctx->enc_name;
detail += " fmt=" + std::string(pix_fmt_name(ctx->enc_ctx->pix_fmt));
detail += " sw=" + std::string(pix_fmt_name(ctx->enc_ctx->sw_pix_fmt));
detail += " size=" + std::to_string(ctx->enc_ctx->width) + "x" + std::to_string(ctx->enc_ctx->height);
detail += " fps=" + std::to_string(ctx->fps);
set_last_error(make_err(detail, ret));
avcodec_free_context(&ctx->enc_ctx);
ctx->enc_ctx = nullptr;
if (ctx->hw_frames_ctx) {
av_buffer_unref(&ctx->hw_frames_ctx);
ctx->hw_frames_ctx = nullptr;
}
return -1;
}
return 0;
}
static void free_encoder(FfmpegHwMjpegH264Ctx *ctx) {
if (ctx->enc_ctx) {
avcodec_free_context(&ctx->enc_ctx);
ctx->enc_ctx = nullptr;
}
if (ctx->hw_frames_ctx) {
av_buffer_unref(&ctx->hw_frames_ctx);
ctx->hw_frames_ctx = nullptr;
}
}
} // namespace
extern "C" FfmpegHwMjpegH264* ffmpeg_hw_mjpeg_h264_new(const char* dec_name,
const char* enc_name,
int width,
int height,
int fps,
int bitrate_kbps,
int gop,
int thread_count) {
if (!dec_name || !enc_name || width <= 0 || height <= 0) {
set_last_error("Invalid parameters for ffmpeg_hw_mjpeg_h264_new");
return nullptr;
}
auto *ctx = new FfmpegHwMjpegH264Ctx();
ctx->dec_name = dec_name;
ctx->enc_name = enc_name;
ctx->width = width;
ctx->height = height;
ctx->fps = fps > 0 ? fps : 30;
ctx->bitrate_kbps = bitrate_kbps > 0 ? bitrate_kbps : 2000;
ctx->gop = gop > 0 ? gop : ctx->fps;
ctx->thread_count = thread_count > 0 ? thread_count : 1;
if (init_decoder(ctx) != 0) {
ffmpeg_hw_mjpeg_h264_free(reinterpret_cast<FfmpegHwMjpegH264*>(ctx));
return nullptr;
}
return reinterpret_cast<FfmpegHwMjpegH264*>(ctx);
}
extern "C" int ffmpeg_hw_mjpeg_h264_encode(FfmpegHwMjpegH264* handle,
const uint8_t* data,
int len,
int64_t pts_ms,
uint8_t** out_data,
int* out_len,
int* out_keyframe) {
if (!handle || !data || len <= 0 || !out_data || !out_len || !out_keyframe) {
set_last_error("Invalid parameters for encode");
return -1;
}
auto *ctx = reinterpret_cast<FfmpegHwMjpegH264Ctx*>(handle);
*out_data = nullptr;
*out_len = 0;
*out_keyframe = 0;
av_packet_unref(ctx->dec_pkt);
int ret = av_new_packet(ctx->dec_pkt, len);
if (ret < 0) {
set_last_error(make_err("av_new_packet failed", ret));
return -1;
}
memcpy(ctx->dec_pkt->data, data, len);
ctx->dec_pkt->size = len;
ret = avcodec_send_packet(ctx->dec_ctx, ctx->dec_pkt);
if (ret < 0) {
set_last_error(make_err("avcodec_send_packet failed", ret));
return -1;
}
while (true) {
ret = avcodec_receive_frame(ctx->dec_ctx, ctx->dec_frame);
if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) {
return 0;
}
if (ret < 0) {
set_last_error(make_err("avcodec_receive_frame failed", ret));
return -1;
}
if (ctx->dec_frame->format != AV_PIX_FMT_DRM_PRIME) {
set_last_error("Decoder output is not DRM_PRIME");
av_frame_unref(ctx->dec_frame);
return -1;
}
if (!ctx->enc_ctx) {
if (!ctx->dec_frame->hw_frames_ctx) {
set_last_error("Decoder returned frame without hw_frames_ctx");
av_frame_unref(ctx->dec_frame);
return -1;
}
if (init_encoder(ctx, ctx->dec_frame->hw_frames_ctx) != 0) {
av_frame_unref(ctx->dec_frame);
return -1;
}
}
AVFrame *send_frame = ctx->dec_frame;
AVFrame *tmp = nullptr;
if (ctx->force_keyframe) {
tmp = av_frame_clone(send_frame);
if (tmp) {
tmp->pict_type = AV_PICTURE_TYPE_I;
send_frame = tmp;
}
ctx->force_keyframe = false;
}
send_frame->pts = pts_ms; // time_base is ms
ret = avcodec_send_frame(ctx->enc_ctx, send_frame);
if (tmp) {
av_frame_free(&tmp);
}
if (ret < 0) {
std::string detail = "avcodec_send_frame failed";
if (send_frame) {
detail += " frame_fmt=";
detail += pix_fmt_name(static_cast<AVPixelFormat>(send_frame->format));
detail += " w=" + std::to_string(send_frame->width);
detail += " h=" + std::to_string(send_frame->height);
if (send_frame->format == AV_PIX_FMT_DRM_PRIME && send_frame->data[0]) {
const AVDRMFrameDescriptor *drm =
reinterpret_cast<const AVDRMFrameDescriptor *>(send_frame->data[0]);
if (drm && drm->layers[0].format) {
detail += " drm_fmt=0x";
char buf[9];
snprintf(buf, sizeof(buf), "%08x", drm->layers[0].format);
detail += buf;
}
if (drm && drm->objects[0].format_modifier) {
detail += " drm_mod=0x";
char buf[17];
snprintf(buf, sizeof(buf), "%016llx",
(unsigned long long)drm->objects[0].format_modifier);
detail += buf;
}
}
}
set_last_error(make_err(detail, ret));
av_frame_unref(ctx->dec_frame);
return -1;
}
av_packet_unref(ctx->enc_pkt);
ret = avcodec_receive_packet(ctx->enc_ctx, ctx->enc_pkt);
if (ret == AVERROR(EAGAIN)) {
av_frame_unref(ctx->dec_frame);
return 0;
}
if (ret < 0) {
set_last_error(make_err("avcodec_receive_packet failed", ret));
av_frame_unref(ctx->dec_frame);
return -1;
}
if (ctx->enc_pkt->size > 0) {
uint8_t *buf = (uint8_t*)malloc(ctx->enc_pkt->size);
if (!buf) {
set_last_error("malloc for output packet failed");
av_packet_unref(ctx->enc_pkt);
av_frame_unref(ctx->dec_frame);
return -1;
}
memcpy(buf, ctx->enc_pkt->data, ctx->enc_pkt->size);
*out_data = buf;
*out_len = ctx->enc_pkt->size;
*out_keyframe = (ctx->enc_pkt->flags & AV_PKT_FLAG_KEY) ? 1 : 0;
av_packet_unref(ctx->enc_pkt);
av_frame_unref(ctx->dec_frame);
return 1;
}
av_frame_unref(ctx->dec_frame);
}
}
extern "C" int ffmpeg_hw_mjpeg_h264_reconfigure(FfmpegHwMjpegH264* handle,
int bitrate_kbps,
int gop) {
if (!handle) {
set_last_error("Invalid handle for reconfigure");
return -1;
}
auto *ctx = reinterpret_cast<FfmpegHwMjpegH264Ctx*>(handle);
if (!ctx->enc_ctx || !ctx->hw_frames_ctx) {
set_last_error("Encoder not initialized for reconfigure");
return -1;
}
ctx->bitrate_kbps = bitrate_kbps > 0 ? bitrate_kbps : ctx->bitrate_kbps;
ctx->gop = gop > 0 ? gop : ctx->gop;
AVBufferRef *frames_ref = ctx->hw_frames_ctx ? av_buffer_ref(ctx->hw_frames_ctx) : nullptr;
free_encoder(ctx);
if (init_encoder(ctx, frames_ref) != 0) {
if (frames_ref) av_buffer_unref(&frames_ref);
return -1;
}
if (frames_ref) av_buffer_unref(&frames_ref);
return 0;
}
extern "C" int ffmpeg_hw_mjpeg_h264_request_keyframe(FfmpegHwMjpegH264* handle) {
if (!handle) {
set_last_error("Invalid handle for request_keyframe");
return -1;
}
auto *ctx = reinterpret_cast<FfmpegHwMjpegH264Ctx*>(handle);
ctx->force_keyframe = true;
return 0;
}
extern "C" void ffmpeg_hw_mjpeg_h264_free(FfmpegHwMjpegH264* handle) {
auto *ctx = reinterpret_cast<FfmpegHwMjpegH264Ctx*>(handle);
if (!ctx) return;
if (ctx->dec_pkt) av_packet_free(&ctx->dec_pkt);
if (ctx->dec_frame) av_frame_free(&ctx->dec_frame);
if (ctx->enc_pkt) av_packet_free(&ctx->enc_pkt);
if (ctx->dec_ctx) avcodec_free_context(&ctx->dec_ctx);
free_encoder(ctx);
if (ctx->hw_device_ctx) av_buffer_unref(&ctx->hw_device_ctx);
delete ctx;
}
extern "C" void ffmpeg_hw_packet_free(uint8_t* data) {
if (data) {
free(data);
}
}
extern "C" const char* ffmpeg_hw_last_error(void) {
return g_last_error.c_str();
}

View File

@@ -0,0 +1,118 @@
#![allow(non_upper_case_globals)]
#![allow(non_camel_case_types)]
#![allow(non_snake_case)]
use std::{
ffi::{CStr, CString},
os::raw::c_int,
};
include!(concat!(env!("OUT_DIR"), "/ffmpeg_hw_ffi.rs"));
#[derive(Debug, Clone)]
pub struct HwMjpegH264Config {
pub decoder: String,
pub encoder: String,
pub width: i32,
pub height: i32,
pub fps: i32,
pub bitrate_kbps: i32,
pub gop: i32,
pub thread_count: i32,
}
pub struct HwMjpegH264Pipeline {
ctx: *mut FfmpegHwMjpegH264,
config: HwMjpegH264Config,
}
unsafe impl Send for HwMjpegH264Pipeline {}
impl HwMjpegH264Pipeline {
pub fn new(config: HwMjpegH264Config) -> Result<Self, String> {
unsafe {
let dec = CString::new(config.decoder.as_str()).map_err(|_| "decoder name invalid".to_string())?;
let enc = CString::new(config.encoder.as_str()).map_err(|_| "encoder name invalid".to_string())?;
let ctx = ffmpeg_hw_mjpeg_h264_new(
dec.as_ptr(),
enc.as_ptr(),
config.width,
config.height,
config.fps,
config.bitrate_kbps,
config.gop,
config.thread_count,
);
if ctx.is_null() {
return Err(last_error_message());
}
Ok(Self { ctx, config })
}
}
pub fn encode(&mut self, data: &[u8], pts_ms: i64) -> Result<Option<(Vec<u8>, bool)>, String> {
unsafe {
let mut out_data: *mut u8 = std::ptr::null_mut();
let mut out_len: c_int = 0;
let mut out_key: c_int = 0;
let ret = ffmpeg_hw_mjpeg_h264_encode(
self.ctx,
data.as_ptr(),
data.len() as c_int,
pts_ms,
&mut out_data,
&mut out_len,
&mut out_key,
);
if ret < 0 {
return Err(last_error_message());
}
if out_data.is_null() || out_len == 0 {
return Ok(None);
}
let slice = std::slice::from_raw_parts(out_data, out_len as usize);
let mut vec = Vec::with_capacity(slice.len());
vec.extend_from_slice(slice);
ffmpeg_hw_packet_free(out_data);
Ok(Some((vec, out_key != 0)))
}
}
pub fn reconfigure(&mut self, bitrate_kbps: i32, gop: i32) -> Result<(), String> {
unsafe {
let ret = ffmpeg_hw_mjpeg_h264_reconfigure(self.ctx, bitrate_kbps, gop);
if ret != 0 {
return Err(last_error_message());
}
self.config.bitrate_kbps = bitrate_kbps;
self.config.gop = gop;
Ok(())
}
}
pub fn request_keyframe(&mut self) {
unsafe {
let _ = ffmpeg_hw_mjpeg_h264_request_keyframe(self.ctx);
}
}
}
impl Drop for HwMjpegH264Pipeline {
fn drop(&mut self) {
unsafe {
ffmpeg_hw_mjpeg_h264_free(self.ctx);
}
self.ctx = std::ptr::null_mut();
}
}
pub fn last_error_message() -> String {
unsafe {
let ptr = ffmpeg_hw_last_error();
if ptr.is_null() {
return String::new();
}
let cstr = CStr::from_ptr(ptr);
cstr.to_string_lossy().to_string()
}
}
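A minimal sketch of driving `HwMjpegH264Pipeline` from a capture loop. The codec names, resolution, bitrate, and frame source below are illustrative assumptions, not values taken from this commit:

```rust
// Sketch: feed MJPEG frames through the hardware transcode pipeline.
fn transcode(frames: &[Vec<u8>]) -> Result<(), String> {
    let mut pipeline = HwMjpegH264Pipeline::new(HwMjpegH264Config {
        decoder: "mjpeg".into(),      // decoder/encoder names are assumptions
        encoder: "h264_rkmpp".into(),
        width: 1920,
        height: 1080,
        fps: 30,
        bitrate_kbps: 4000,
        gop: 60,
        thread_count: 1,
    })?;
    let mut pts_ms: i64 = 0;
    for jpeg in frames {
        // Ok(None): the codec consumed the input but has no packet ready yet.
        if let Some((h264, keyframe)) = pipeline.encode(jpeg, pts_ms)? {
            println!("{} bytes, keyframe={}", h264.len(), keyframe);
        }
        pts_ms += 33; // ~30 fps in the encoder's millisecond time base
    }
    pipeline.request_keyframe(); // force the next output frame to be an IDR
    pipeline.reconfigure(2000, 60)?; // runtime bitrate change rebuilds the encoder
    Ok(())
}
```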

View File

@@ -1,5 +1,7 @@
 pub mod common;
 pub mod ffmpeg;
+#[cfg(any(target_arch = "aarch64", target_arch = "arm", feature = "rkmpp"))]
+pub mod ffmpeg_hw;
 pub mod ffmpeg_ram;

 #[no_mangle]
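The `ffmpeg_hw` module is compiled automatically on aarch64/arm targets; on other targets it would be enabled via the cargo feature implied by `CARGO_FEATURE_RKMPP` in the build script (the feature name `rkmpp` is inferred, assuming the crate declares it):

```bash
# Feature name inferred from CARGO_FEATURE_RKMPP in build.rs
cargo build --features rkmpp
```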

View File

@@ -117,21 +117,11 @@ pub enum CaptureState {
     Error,
 }

-/// Audio capture statistics
-#[derive(Debug, Clone, Default)]
-pub struct AudioStats {
-    pub frames_captured: u64,
-    pub frames_dropped: u64,
-    pub buffer_overruns: u64,
-    pub current_latency_ms: f32,
-}
-
 /// ALSA audio capturer
 pub struct AudioCapturer {
     config: AudioConfig,
     state: Arc<watch::Sender<CaptureState>>,
     state_rx: watch::Receiver<CaptureState>,
-    stats: Arc<Mutex<AudioStats>>,
     frame_tx: broadcast::Sender<AudioFrame>,
     stop_flag: Arc<AtomicBool>,
     sequence: Arc<AtomicU64>,
@@ -150,7 +140,6 @@ impl AudioCapturer {
             config,
             state: Arc::new(state_tx),
             state_rx,
-            stats: Arc::new(Mutex::new(AudioStats::default())),
             frame_tx,
             stop_flag: Arc::new(AtomicBool::new(false)),
             sequence: Arc::new(AtomicU64::new(0)),
@@ -174,11 +163,6 @@ impl AudioCapturer {
         self.frame_tx.subscribe()
     }

-    /// Get statistics
-    pub async fn stats(&self) -> AudioStats {
-        self.stats.lock().await.clone()
-    }
-
     /// Start capturing
     pub async fn start(&self) -> Result<()> {
         if self.state() == CaptureState::Running {
@@ -194,7 +178,6 @@ impl AudioCapturer {
         let config = self.config.clone();
         let state = self.state.clone();
-        let stats = self.stats.clone();
         let frame_tx = self.frame_tx.clone();
         let stop_flag = self.stop_flag.clone();
         let sequence = self.sequence.clone();
@@ -204,7 +187,6 @@ impl AudioCapturer {
             capture_loop(
                 config,
                 state,
-                stats,
                 frame_tx,
                 stop_flag,
                 sequence,
@@ -239,7 +221,6 @@ impl AudioCapturer {
 fn capture_loop(
     config: AudioConfig,
     state: Arc<watch::Sender<CaptureState>>,
-    stats: Arc<Mutex<AudioStats>>,
     frame_tx: broadcast::Sender<AudioFrame>,
     stop_flag: Arc<AtomicBool>,
     sequence: Arc<AtomicU64>,
@@ -248,7 +229,6 @@ fn capture_loop(
     let result = run_capture(
         &config,
         &state,
-        &stats,
         &frame_tx,
         &stop_flag,
         &sequence,
@@ -266,7 +246,6 @@ fn capture_loop(
 fn run_capture(
     config: &AudioConfig,
     state: &watch::Sender<CaptureState>,
-    stats: &Arc<Mutex<AudioStats>>,
     frame_tx: &broadcast::Sender<AudioFrame>,
     stop_flag: &AtomicBool,
     sequence: &AtomicU64,
@@ -334,9 +313,6 @@ fn run_capture(
         match pcm.state() {
             State::XRun => {
                 warn_throttled!(log_throttler, "xrun", "Audio buffer overrun, recovering");
-                if let Ok(mut s) = stats.try_lock() {
-                    s.buffer_overruns += 1;
-                }
                 let _ = pcm.prepare();
                 continue;
             }
@@ -377,11 +353,6 @@ fn run_capture(
                         debug!("No audio receivers: {}", e);
                     }
                 }
-
-                // Update stats
-                if let Ok(mut s) = stats.try_lock() {
-                    s.frames_captured += 1;
-                }
             }
             Err(e) => {
                 // Check for buffer overrun (EPIPE = 32 on Linux)
@@ -389,21 +360,12 @@ fn run_capture(
                 if desc.contains("EPIPE") || desc.contains("Broken pipe") {
                     // Buffer overrun
                     warn_throttled!(log_throttler, "buffer_overrun", "Audio buffer overrun");
-                    if let Ok(mut s) = stats.try_lock() {
-                        s.buffer_overruns += 1;
-                    }
                     let _ = pcm.prepare();
                 } else if desc.contains("No such device") || desc.contains("ENODEV") {
                     // Device disconnected - use longer throttle for this
                     error_throttled!(log_throttler, "no_device", "Audio read error: {}", e);
-                    if let Ok(mut s) = stats.try_lock() {
-                        s.frames_dropped += 1;
-                    }
                 } else {
                     error_throttled!(log_throttler, "read_error", "Audio read error: {}", e);
-                    if let Ok(mut s) = stats.try_lock() {
-                        s.frames_dropped += 1;
-                    }
                 }
             }
         }

View File

@@ -4,7 +4,7 @@
 use serde::{Deserialize, Serialize};
 use std::sync::Arc;
-use tokio::sync::{broadcast, RwLock};
+use tokio::sync::RwLock;
 use tracing::info;

 use super::capture::AudioConfig;
@@ -104,10 +104,6 @@ pub struct AudioStatus {
     pub quality: AudioQuality,
     /// Number of connected subscribers
     pub subscriber_count: usize,
-    /// Frames encoded
-    pub frames_encoded: u64,
-    /// Bytes output
-    pub bytes_output: u64,
     /// Error message if any
     pub error: Option<String>,
 }
@@ -352,16 +348,10 @@ impl AudioController {
         let streaming = self.is_streaming().await;
         let error = self.last_error.read().await.clone();

-        let (subscriber_count, frames_encoded, bytes_output) =
-            if let Some(ref streamer) = *self.streamer.read().await {
-                let stats = streamer.stats().await;
-                (
-                    stats.subscriber_count,
-                    stats.frames_encoded,
-                    stats.bytes_output,
-                )
-            } else {
-                (0, 0, 0)
-            };
+        let subscriber_count = if let Some(ref streamer) = *self.streamer.read().await {
+            streamer.stats().await.subscriber_count
+        } else {
+            0
+        };

         AudioStatus {
@@ -374,14 +364,12 @@ impl AudioController {
             },
             quality: config.quality,
             subscriber_count,
-            frames_encoded,
-            bytes_output,
             error,
         }
     }

     /// Subscribe to Opus frames (for WebSocket clients)
-    pub fn subscribe_opus(&self) -> Option<broadcast::Receiver<OpusFrame>> {
+    pub fn subscribe_opus(&self) -> Option<tokio::sync::watch::Receiver<Option<Arc<OpusFrame>>>> {
         // Use try_read to avoid blocking - this is called from sync context sometimes
         if let Ok(guard) = self.streamer.try_read() {
             guard.as_ref().map(|s| s.subscribe_opus())
@@ -391,7 +379,9 @@ impl AudioController {
     }

     /// Subscribe to Opus frames (async version)
-    pub async fn subscribe_opus_async(&self) -> Option<broadcast::Receiver<OpusFrame>> {
+    pub async fn subscribe_opus_async(
+        &self,
+    ) -> Option<tokio::sync::watch::Receiver<Option<Arc<OpusFrame>>>> {
         self.streamer
             .read()
             .await

View File

@@ -6,7 +6,6 @@
 //! - Audio device enumeration
 //! - Audio streaming pipeline
 //! - High-level audio controller
-//! - Shared audio pipeline for WebRTC multi-session support
 //! - Device health monitoring

 pub mod capture;
@@ -14,7 +13,6 @@ pub mod controller;
 pub mod device;
 pub mod encoder;
 pub mod monitor;
-pub mod shared_pipeline;
 pub mod streamer;

 pub use capture::{AudioCapturer, AudioConfig, AudioFrame};
@@ -22,7 +20,4 @@ pub use controller::{AudioController, AudioControllerConfig, AudioQuality, Audio
 pub use device::{enumerate_audio_devices, enumerate_audio_devices_with_current, AudioDeviceInfo};
 pub use encoder::{OpusConfig, OpusEncoder, OpusFrame};
 pub use monitor::{AudioHealthMonitor, AudioHealthStatus, AudioMonitorConfig};
-pub use shared_pipeline::{
-    SharedAudioPipeline, SharedAudioPipelineConfig, SharedAudioPipelineStats,
-};
 pub use streamer::{AudioStreamState, AudioStreamer, AudioStreamerConfig};

View File

@@ -1,450 +0,0 @@
//! Shared Audio Pipeline for WebRTC
//!
//! This module provides a shared audio encoding pipeline that can serve
//! multiple WebRTC sessions with a single encoder instance.
//!
//! # Architecture
//!
//! ```text
//! AudioCapturer (ALSA)
//! |
//! v (broadcast::Receiver<AudioFrame>)
//! SharedAudioPipeline (single Opus encoder)
//! |
//! v (broadcast::Sender<OpusFrame>)
//! ┌────┴────┬────────┬────────┐
//! v v v v
//! Session1 Session2 Session3 ...
//! (RTP) (RTP) (RTP) (RTP)
//! ```
//!
//! # Key Features
//!
//! - **Single encoder**: All sessions share one Opus encoder
//! - **Broadcast distribution**: Encoded frames are broadcast to all subscribers
//! - **Dynamic bitrate**: Bitrate can be changed at runtime
//! - **Statistics**: Tracks encoding performance metrics
use std::sync::atomic::{AtomicBool, AtomicU64, Ordering};
use std::sync::Arc;
use std::time::Instant;
use tokio::sync::{broadcast, Mutex, RwLock};
use tracing::{debug, error, info, trace, warn};
use super::capture::AudioFrame;
use super::encoder::{OpusConfig, OpusEncoder, OpusFrame};
use crate::error::{AppError, Result};
/// Shared audio pipeline configuration
#[derive(Debug, Clone)]
pub struct SharedAudioPipelineConfig {
/// Sample rate (must match audio capture)
pub sample_rate: u32,
/// Number of channels (1 or 2)
pub channels: u32,
/// Target bitrate in bps
pub bitrate: u32,
/// Opus application mode
pub application: OpusApplicationMode,
/// Enable forward error correction
pub fec: bool,
/// Broadcast channel capacity
pub channel_capacity: usize,
}
impl Default for SharedAudioPipelineConfig {
fn default() -> Self {
Self {
sample_rate: 48000,
channels: 2,
bitrate: 64000,
application: OpusApplicationMode::Audio,
fec: true,
channel_capacity: 16, // Reduced from 64 for lower latency
}
}
}
impl SharedAudioPipelineConfig {
/// Create config optimized for voice
pub fn voice() -> Self {
Self {
bitrate: 32000,
application: OpusApplicationMode::Voip,
..Default::default()
}
}
/// Create config optimized for music/high quality
pub fn high_quality() -> Self {
Self {
bitrate: 128000,
application: OpusApplicationMode::Audio,
..Default::default()
}
}
/// Convert to OpusConfig
pub fn to_opus_config(&self) -> OpusConfig {
OpusConfig {
sample_rate: self.sample_rate,
channels: self.channels,
bitrate: self.bitrate,
application: match self.application {
OpusApplicationMode::Voip => super::encoder::OpusApplication::Voip,
OpusApplicationMode::Audio => super::encoder::OpusApplication::Audio,
OpusApplicationMode::LowDelay => super::encoder::OpusApplication::LowDelay,
},
fec: self.fec,
}
}
}
/// Opus application mode
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum OpusApplicationMode {
/// Voice over IP - optimized for speech
Voip,
/// General audio - balanced quality
Audio,
/// Low delay mode - minimal latency
LowDelay,
}
/// Shared audio pipeline statistics
#[derive(Debug, Clone, Default)]
pub struct SharedAudioPipelineStats {
/// Frames received from audio capture
pub frames_received: u64,
/// Frames successfully encoded
pub frames_encoded: u64,
/// Frames dropped (encode errors)
pub frames_dropped: u64,
/// Total bytes encoded
pub bytes_encoded: u64,
/// Number of active subscribers
pub subscribers: u64,
/// Average encode time in milliseconds
pub avg_encode_time_ms: f32,
/// Current bitrate in bps
pub current_bitrate: u32,
/// Pipeline running time in seconds
pub running_time_secs: f64,
}
/// Shared Audio Pipeline
///
/// Provides a single Opus encoder that serves multiple WebRTC sessions.
/// All sessions receive the same encoded audio stream via broadcast channel.
pub struct SharedAudioPipeline {
/// Configuration
config: RwLock<SharedAudioPipelineConfig>,
/// Opus encoder (protected by mutex for encoding)
encoder: Mutex<Option<OpusEncoder>>,
/// Broadcast sender for encoded Opus frames
opus_tx: broadcast::Sender<OpusFrame>,
/// Running state
running: AtomicBool,
/// Statistics
stats: Mutex<SharedAudioPipelineStats>,
/// Start time for running time calculation
start_time: RwLock<Option<Instant>>,
/// Encode time accumulator for averaging
encode_time_sum_us: AtomicU64,
/// Encode count for averaging
encode_count: AtomicU64,
/// Stop signal (atomic for lock-free checking)
stop_flag: AtomicBool,
/// Encoding task handle
task_handle: Mutex<Option<tokio::task::JoinHandle<()>>>,
}
impl SharedAudioPipeline {
/// Create a new shared audio pipeline
pub fn new(config: SharedAudioPipelineConfig) -> Result<Arc<Self>> {
let (opus_tx, _) = broadcast::channel(config.channel_capacity);
Ok(Arc::new(Self {
config: RwLock::new(config),
encoder: Mutex::new(None),
opus_tx,
running: AtomicBool::new(false),
stats: Mutex::new(SharedAudioPipelineStats::default()),
start_time: RwLock::new(None),
encode_time_sum_us: AtomicU64::new(0),
encode_count: AtomicU64::new(0),
stop_flag: AtomicBool::new(false),
task_handle: Mutex::new(None),
}))
}
/// Create with default configuration
pub fn default_config() -> Result<Arc<Self>> {
Self::new(SharedAudioPipelineConfig::default())
}
/// Start the audio encoding pipeline
///
/// # Arguments
/// * `audio_rx` - Receiver for raw audio frames from AudioCapturer
pub async fn start(self: &Arc<Self>, audio_rx: broadcast::Receiver<AudioFrame>) -> Result<()> {
if self.running.load(Ordering::SeqCst) {
return Ok(());
}
let config = self.config.read().await.clone();
info!(
"Starting shared audio pipeline: {}Hz {}ch {}bps",
config.sample_rate, config.channels, config.bitrate
);
// Create encoder
let opus_config = config.to_opus_config();
let encoder = OpusEncoder::new(opus_config)?;
*self.encoder.lock().await = Some(encoder);
// Reset stats
{
let mut stats = self.stats.lock().await;
*stats = SharedAudioPipelineStats::default();
stats.current_bitrate = config.bitrate;
}
// Reset counters
self.encode_time_sum_us.store(0, Ordering::SeqCst);
self.encode_count.store(0, Ordering::SeqCst);
*self.start_time.write().await = Some(Instant::now());
self.stop_flag.store(false, Ordering::SeqCst);
self.running.store(true, Ordering::SeqCst);
// Start encoding task
let pipeline = self.clone();
let handle = tokio::spawn(async move {
pipeline.encoding_task(audio_rx).await;
});
*self.task_handle.lock().await = Some(handle);
info!("Shared audio pipeline started");
Ok(())
}
/// Stop the audio encoding pipeline
pub fn stop(&self) {
if !self.running.load(Ordering::SeqCst) {
return;
}
info!("Stopping shared audio pipeline");
// Signal stop (atomic, no lock needed)
self.stop_flag.store(true, Ordering::SeqCst);
self.running.store(false, Ordering::SeqCst);
}
/// Check if pipeline is running
pub fn is_running(&self) -> bool {
self.running.load(Ordering::SeqCst)
}
/// Subscribe to encoded Opus frames
pub fn subscribe(&self) -> broadcast::Receiver<OpusFrame> {
self.opus_tx.subscribe()
}
/// Get number of active subscribers
pub fn subscriber_count(&self) -> usize {
self.opus_tx.receiver_count()
}
/// Get current statistics
pub async fn stats(&self) -> SharedAudioPipelineStats {
let mut stats = self.stats.lock().await.clone();
stats.subscribers = self.subscriber_count() as u64;
// Calculate average encode time
let count = self.encode_count.load(Ordering::SeqCst);
if count > 0 {
let sum_us = self.encode_time_sum_us.load(Ordering::SeqCst);
stats.avg_encode_time_ms = (sum_us as f64 / count as f64 / 1000.0) as f32;
}
// Calculate running time
if let Some(start) = *self.start_time.read().await {
stats.running_time_secs = start.elapsed().as_secs_f64();
}
stats
}
/// Set bitrate dynamically
pub async fn set_bitrate(&self, bitrate: u32) -> Result<()> {
// Update config
self.config.write().await.bitrate = bitrate;
// Update encoder if running
if let Some(ref mut encoder) = *self.encoder.lock().await {
encoder.set_bitrate(bitrate)?;
}
// Update stats
self.stats.lock().await.current_bitrate = bitrate;
info!("Shared audio pipeline bitrate changed to {}bps", bitrate);
Ok(())
}
/// Update configuration (requires restart)
pub async fn update_config(&self, config: SharedAudioPipelineConfig) -> Result<()> {
if self.is_running() {
return Err(AppError::AudioError(
"Cannot update config while pipeline is running".to_string(),
));
}
*self.config.write().await = config;
Ok(())
}
/// Internal encoding task
async fn encoding_task(self: Arc<Self>, mut audio_rx: broadcast::Receiver<AudioFrame>) {
info!("Audio encoding task started");
loop {
// Check stop flag (atomic, no async lock needed)
if self.stop_flag.load(Ordering::Relaxed) {
break;
}
// Receive audio frame with timeout
let recv_result =
tokio::time::timeout(std::time::Duration::from_secs(2), audio_rx.recv()).await;
match recv_result {
Ok(Ok(audio_frame)) => {
// Update received count
{
let mut stats = self.stats.lock().await;
stats.frames_received += 1;
}
// Encode frame
let encode_start = Instant::now();
let encode_result = {
let mut encoder_guard = self.encoder.lock().await;
if let Some(ref mut encoder) = *encoder_guard {
Some(encoder.encode_frame(&audio_frame))
} else {
None
}
};
let encode_time = encode_start.elapsed();
// Update encode time stats
self.encode_time_sum_us
.fetch_add(encode_time.as_micros() as u64, Ordering::SeqCst);
self.encode_count.fetch_add(1, Ordering::SeqCst);
match encode_result {
Some(Ok(opus_frame)) => {
// Update stats
{
let mut stats = self.stats.lock().await;
stats.frames_encoded += 1;
stats.bytes_encoded += opus_frame.data.len() as u64;
}
// Broadcast to subscribers
if self.opus_tx.receiver_count() > 0 {
if let Err(e) = self.opus_tx.send(opus_frame) {
trace!("No audio subscribers: {}", e);
}
}
}
Some(Err(e)) => {
error!("Opus encode error: {}", e);
let mut stats = self.stats.lock().await;
stats.frames_dropped += 1;
}
None => {
warn!("Encoder not available");
break;
}
}
}
Ok(Err(broadcast::error::RecvError::Closed)) => {
info!("Audio source channel closed");
break;
}
Ok(Err(broadcast::error::RecvError::Lagged(n))) => {
warn!("Audio pipeline lagged by {} frames", n);
let mut stats = self.stats.lock().await;
stats.frames_dropped += n;
}
Err(_) => {
// Timeout - check if still running
if !self.running.load(Ordering::SeqCst) {
break;
}
debug!("Audio receive timeout, continuing...");
}
}
}
// Cleanup
self.running.store(false, Ordering::SeqCst);
*self.encoder.lock().await = None;
let stats = self.stats().await;
info!(
"Audio encoding task ended: {} frames encoded, {} dropped, {:.1}s runtime",
stats.frames_encoded, stats.frames_dropped, stats.running_time_secs
);
}
}
impl Drop for SharedAudioPipeline {
fn drop(&mut self) {
self.stop();
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_config_default() {
let config = SharedAudioPipelineConfig::default();
assert_eq!(config.sample_rate, 48000);
assert_eq!(config.channels, 2);
assert_eq!(config.bitrate, 64000);
}
#[test]
fn test_config_voice() {
let config = SharedAudioPipelineConfig::voice();
assert_eq!(config.bitrate, 32000);
assert_eq!(config.application, OpusApplicationMode::Voip);
}
#[test]
fn test_config_high_quality() {
let config = SharedAudioPipelineConfig::high_quality();
assert_eq!(config.bitrate, 128000);
}
#[tokio::test]
async fn test_pipeline_creation() {
let config = SharedAudioPipelineConfig::default();
let pipeline = SharedAudioPipeline::new(config);
assert!(pipeline.is_ok());
let pipeline = pipeline.unwrap();
assert!(!pipeline.is_running());
assert_eq!(pipeline.subscriber_count(), 0);
}
}

View File

@@ -7,7 +7,7 @@ use std::sync::atomic::{AtomicBool, AtomicU64, Ordering};
 use std::sync::Arc;
 use std::time::Instant;
 use tokio::sync::{broadcast, watch, Mutex, RwLock};
-use tracing::{error, info, trace, warn};
+use tracing::{error, info, warn};

 use super::capture::{AudioCapturer, AudioConfig, CaptureState};
 use super::encoder::{OpusConfig, OpusEncoder, OpusFrame};
@@ -72,18 +72,9 @@ impl AudioStreamerConfig {
 /// Audio stream statistics
 #[derive(Debug, Clone, Default)]
 pub struct AudioStreamStats {
-    /// Frames captured from ALSA
-    pub frames_captured: u64,
     /// Frames encoded to Opus
-    pub frames_encoded: u64,
-    /// Total bytes output (Opus)
-    pub bytes_output: u64,
-    /// Current encoding bitrate
-    pub current_bitrate: u32,
     /// Number of active subscribers
     pub subscriber_count: usize,
-    /// Buffer overruns
-    pub buffer_overruns: u64,
 }
@@ -95,7 +86,7 @@ pub struct AudioStreamer {
     state_rx: watch::Receiver<AudioStreamState>,
     capturer: RwLock<Option<Arc<AudioCapturer>>>,
     encoder: Arc<Mutex<Option<OpusEncoder>>>,
-    opus_tx: broadcast::Sender<OpusFrame>,
+    opus_tx: watch::Sender<Option<Arc<OpusFrame>>>,
     stats: Arc<Mutex<AudioStreamStats>>,
     sequence: AtomicU64,
     stream_start_time: RwLock<Option<Instant>>,
@@ -111,7 +102,7 @@ impl AudioStreamer {
     /// Create a new audio streamer with specified configuration
     pub fn with_config(config: AudioStreamerConfig) -> Self {
         let (state_tx, state_rx) = watch::channel(AudioStreamState::Stopped);
-        let (opus_tx, _) = broadcast::channel(16); // Buffer size 16 for low latency
+        let (opus_tx, _opus_rx) = watch::channel(None);

         Self {
             config: RwLock::new(config),
@@ -138,7 +129,7 @@ impl AudioStreamer {
     }

     /// Subscribe to Opus frames
-    pub fn subscribe_opus(&self) -> broadcast::Receiver<OpusFrame> {
+    pub fn subscribe_opus(&self) -> watch::Receiver<Option<Arc<OpusFrame>>> {
         self.opus_tx.subscribe()
     }
@@ -175,9 +166,6 @@ impl AudioStreamer {
             encoder.set_bitrate(bitrate)?;
         }

-        // Update stats
-        self.stats.lock().await.current_bitrate = bitrate;
-
         info!("Audio bitrate changed to {}bps", bitrate);
         Ok(())
     }
@@ -216,7 +204,6 @@ impl AudioStreamer {
         {
             let mut stats = self.stats.lock().await;
             *stats = AudioStreamStats::default();
-            stats.current_bitrate = config.opus.bitrate;
         }

         // Record start time
@@ -227,12 +214,11 @@ impl AudioStreamer {
         let capturer_for_task = capturer.clone();
         let encoder = self.encoder.clone();
         let opus_tx = self.opus_tx.clone();
-        let stats = self.stats.clone();
         let state = self.state.clone();
         let stop_flag = self.stop_flag.clone();

         tokio::spawn(async move {
-            Self::stream_task(capturer_for_task, encoder, opus_tx, stats, state, stop_flag).await;
+            Self::stream_task(capturer_for_task, encoder, opus_tx, state, stop_flag).await;
         });

         Ok(())
@@ -273,8 +259,7 @@ impl AudioStreamer {
     async fn stream_task(
         capturer: Arc<AudioCapturer>,
         encoder: Arc<Mutex<Option<OpusEncoder>>>,
-        opus_tx: broadcast::Sender<OpusFrame>,
-        stats: Arc<Mutex<AudioStreamStats>>,
+        opus_tx: watch::Sender<Option<Arc<OpusFrame>>>,
         state: watch::Sender<AudioStreamState>,
         stop_flag: Arc<AtomicBool>,
     ) {
@@ -302,12 +287,6 @@ impl AudioStreamer {
             match recv_result {
                 Ok(Ok(audio_frame)) => {
-                    // Update capture stats
-                    {
-                        let mut s = stats.lock().await;
-                        s.frames_captured += 1;
-                    }
-
                     // Encode to Opus
                     let opus_result = {
                         let mut enc_guard = encoder.lock().await;
@@ -320,18 +299,9 @@ impl AudioStreamer {
                     match opus_result {
                         Some(Ok(opus_frame)) => {
-                            // Update stats
-                            {
-                                let mut s = stats.lock().await;
-                                s.frames_encoded += 1;
-                                s.bytes_output += opus_frame.data.len() as u64;
-                            }
-
-                            // Broadcast to subscribers
+                            // Publish latest frame to subscribers
                             if opus_tx.receiver_count() > 0 {
-                                if let Err(e) = opus_tx.send(opus_frame) {
-                                    trace!("No audio subscribers: {}", e);
-                                }
+                                let _ = opus_tx.send(Some(Arc::new(opus_frame)));
                             }
                         }
                         Some(Err(e)) => {
@@ -349,8 +319,6 @@ impl AudioStreamer {
                 }
                 Ok(Err(broadcast::error::RecvError::Lagged(n))) => {
                     warn!("Audio receiver lagged by {} frames", n);
-                    let mut s = stats.lock().await;
-                    s.buffer_overruns += n;
                 }
                 Err(_) => {
                     // Timeout - check if still capturing
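A sketch of a subscriber under the new watch-based API: each receiver only ever observes the most recent Opus frame, so a slow consumer skips frames instead of building a queue. The forwarding body is a placeholder, and `OpusFrame` is assumed to be in scope:

```rust
use std::sync::Arc;
use tokio::sync::watch;

// Sketch: consume the latest-frame channel returned by subscribe_opus().
async fn forward_opus(mut rx: watch::Receiver<Option<Arc<OpusFrame>>>) {
    while rx.changed().await.is_ok() {
        // Clone the Arc out of the borrow guard before doing async work.
        if let Some(frame) = rx.borrow_and_update().clone() {
            // ship frame.data to a WebSocket client or RTP track here
            let _ = &frame.data;
        }
    }
}
```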

View File

@@ -2,20 +2,18 @@ use axum::{
     extract::{Request, State},
     http::StatusCode,
     middleware::Next,
-    response::Response,
+    response::{IntoResponse, Response},
+    Json,
 };
 use axum_extra::extract::CookieJar;
 use std::sync::Arc;

+use crate::error::ErrorResponse;
 use crate::state::AppState;

 /// Session cookie name
 pub const SESSION_COOKIE: &str = "one_kvm_session";

-/// Auth layer for extracting session from request
-#[derive(Clone)]
-pub struct AuthLayer;
-
 /// Extract session ID from request
 pub fn extract_session_id(cookies: &CookieJar, headers: &axum::http::HeaderMap) -> Option<String> {
     // First try cookie
@@ -69,9 +67,24 @@ pub async fn auth_middleware(
             request.extensions_mut().insert(session);
             return Ok(next.run(request).await);
         }
+
+        let message = if state.is_session_revoked(&session_id).await {
+            "Logged in elsewhere"
+        } else {
+            "Session expired"
+        };
+        return Ok(unauthorized_response(message));
     }

-    Err(StatusCode::UNAUTHORIZED)
+    Ok(unauthorized_response("Not authenticated"))
+}
+
+fn unauthorized_response(message: &str) -> Response {
+    let body = ErrorResponse {
+        success: false,
+        message: message.to_string(),
+    };
+    (StatusCode::UNAUTHORIZED, Json(body)).into_response()
 }

 /// Check if endpoint is public (no auth required)
@@ -99,47 +112,3 @@ fn is_public_endpoint(path: &str) -> bool {
         || path.ends_with(".png")
         || path.ends_with(".svg")
 }
-
-/// Require authentication - returns 401 if not authenticated
-pub async fn require_auth(
-    State(state): State<Arc<AppState>>,
-    cookies: CookieJar,
-    request: Request,
-    next: Next,
-) -> Result<Response, StatusCode> {
-    let session_id = extract_session_id(&cookies, request.headers());
-
-    if let Some(session_id) = session_id {
-        if let Ok(Some(_session)) = state.sessions.get(&session_id).await {
-            return Ok(next.run(request).await);
-        }
-    }
-
-    Err(StatusCode::UNAUTHORIZED)
-}
-
-/// Require admin privileges - returns 403 if not admin
-pub async fn require_admin(
-    State(state): State<Arc<AppState>>,
-    cookies: CookieJar,
-    request: Request,
-    next: Next,
-) -> Result<Response, StatusCode> {
-    let session_id = extract_session_id(&cookies, request.headers());
-
-    if let Some(session_id) = session_id {
-        if let Ok(Some(session)) = state.sessions.get(&session_id).await {
-            // Get user and check admin status
-            if let Ok(Some(user)) = state.users.get(&session.user_id).await {
-                if user.is_admin {
-                    return Ok(next.run(request).await);
-                }
-                // User is authenticated but not admin
-                return Err(StatusCode::FORBIDDEN);
-            }
-        }
-    }
-
-    // Not authenticated at all
-    Err(StatusCode::UNAUTHORIZED)
-}

View File

@@ -3,7 +3,7 @@ mod password;
 mod session;
 mod user;

-pub use middleware::{auth_middleware, require_admin, AuthLayer, SESSION_COOKIE};
+pub use middleware::{auth_middleware, SESSION_COOKIE};
 pub use password::{hash_password, verify_password};
 pub use session::{Session, SessionStore};
 pub use user::{User, UserStore};

View File

@@ -116,6 +116,22 @@ impl SessionStore {
         Ok(result.rows_affected())
     }

+    /// Delete all sessions
+    pub async fn delete_all(&self) -> Result<u64> {
+        let result = sqlx::query("DELETE FROM sessions")
+            .execute(&self.pool)
+            .await?;
+        Ok(result.rows_affected())
+    }
+
+    /// List all session IDs
+    pub async fn list_ids(&self) -> Result<Vec<String>> {
+        let rows: Vec<(String,)> = sqlx::query_as("SELECT id FROM sessions")
+            .fetch_all(&self.pool)
+            .await?;
+        Ok(rows.into_iter().map(|(id,)| id).collect())
+    }
+
     /// Extend session expiration
     pub async fn extend(&self, session_id: &str) -> Result<()> {
         let new_expires = Utc::now() + self.default_ttl;
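A hypothetical login-path sketch of how `delete_all` could back the new single-session behavior; the `SessionStore::create` call and the wiring to the config flag are assumptions for illustration, not code from this commit:

```rust
// Sketch only: revoke existing web sessions before issuing a new one when
// multiple concurrent sessions are disallowed (see AuthConfig below).
async fn login_session(
    sessions: &SessionStore,
    allow_multiple_sessions: bool,
    user_id: &str,
) -> Result<Session> {
    if !allow_multiple_sessions {
        let revoked = sessions.delete_all().await?;
        tracing::info!("revoked {} existing session(s)", revoked);
    }
    sessions.create(user_id).await // `create` is an assumed API
}
```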

View File

@@ -149,6 +149,33 @@ impl UserStore {
Ok(()) Ok(())
} }
/// Update username
pub async fn update_username(&self, user_id: &str, new_username: &str) -> Result<()> {
if let Some(existing) = self.get_by_username(new_username).await? {
if existing.id != user_id {
return Err(AppError::BadRequest(format!(
"Username '{}' already exists",
new_username
)));
}
}
let now = Utc::now();
let result =
sqlx::query("UPDATE users SET username = ?1, updated_at = ?2 WHERE id = ?3")
.bind(new_username)
.bind(now.to_rfc3339())
.bind(user_id)
.execute(&self.pool)
.await?;
if result.rows_affected() == 0 {
return Err(AppError::NotFound("User not found".to_string()));
}
Ok(())
}
    /// List all users
    pub async fn list(&self) -> Result<Vec<User>> {
        let rows: Vec<UserRow> = sqlx::query_as(
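A hypothetical caller sketch: the store signals a name collision as `AppError::BadRequest`, so a handler only needs to match on the error variants:

```rust
// Hypothetical handler-side usage of update_username.
match users.update_username(&user.id, "operator").await {
    Ok(()) => info!("username updated"),
    Err(AppError::BadRequest(msg)) => warn!("rename rejected: {}", msg), // name taken
    Err(AppError::NotFound(msg)) => warn!("{}", msg),                    // row vanished
    Err(e) => return Err(e),
}
```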

View File

@@ -61,6 +61,8 @@ impl Default for AppConfig {
pub struct AuthConfig {
    /// Session timeout in seconds
    pub session_timeout_secs: u32,
    /// Allow multiple concurrent web sessions (single-user mode)
    pub single_user_allow_multiple_sessions: bool,
    /// Enable 2FA
    pub totp_enabled: bool,
    /// TOTP secret (encrypted)
@@ -71,6 +73,7 @@ impl Default for AuthConfig {
    fn default() -> Self {
        Self {
            session_timeout_secs: 3600 * 24, // 24 hours
            single_user_allow_multiple_sessions: false,
            totp_enabled: false,
            totp_secret: None,
        }
@@ -156,6 +159,88 @@ impl Default for OtgDescriptorConfig {
    }
}
/// OTG HID function profile
#[typeshare]
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
#[serde(rename_all = "snake_case")]
pub enum OtgHidProfile {
/// Full HID device set (keyboard + relative mouse + absolute mouse + consumer control)
Full,
/// Legacy profile: only keyboard
LegacyKeyboard,
/// Legacy profile: only relative mouse
LegacyMouseRelative,
/// Custom function selection
Custom,
}
impl Default for OtgHidProfile {
fn default() -> Self {
Self::Full
}
}
/// OTG HID function selection (used when profile is Custom)
#[typeshare]
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
#[serde(default)]
pub struct OtgHidFunctions {
pub keyboard: bool,
pub mouse_relative: bool,
pub mouse_absolute: bool,
pub consumer: bool,
}
impl OtgHidFunctions {
pub fn full() -> Self {
Self {
keyboard: true,
mouse_relative: true,
mouse_absolute: true,
consumer: true,
}
}
pub fn legacy_keyboard() -> Self {
Self {
keyboard: true,
mouse_relative: false,
mouse_absolute: false,
consumer: false,
}
}
pub fn legacy_mouse_relative() -> Self {
Self {
keyboard: false,
mouse_relative: true,
mouse_absolute: false,
consumer: false,
}
}
pub fn is_empty(&self) -> bool {
!self.keyboard && !self.mouse_relative && !self.mouse_absolute && !self.consumer
}
}
impl Default for OtgHidFunctions {
fn default() -> Self {
Self::full()
}
}
impl OtgHidProfile {
pub fn resolve_functions(&self, custom: &OtgHidFunctions) -> OtgHidFunctions {
match self {
Self::Full => OtgHidFunctions::full(),
Self::LegacyKeyboard => OtgHidFunctions::legacy_keyboard(),
Self::LegacyMouseRelative => OtgHidFunctions::legacy_mouse_relative(),
Self::Custom => custom.clone(),
}
}
}
/// HID configuration
#[typeshare]
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
@@ -172,6 +257,12 @@ pub struct HidConfig {
    /// OTG USB device descriptor configuration
    #[serde(default)]
    pub otg_descriptor: OtgDescriptorConfig,
/// OTG HID function profile
#[serde(default)]
pub otg_profile: OtgHidProfile,
/// OTG HID function selection (used when profile is Custom)
#[serde(default)]
pub otg_functions: OtgHidFunctions,
    /// CH9329 serial port
    pub ch9329_port: String,
    /// CH9329 baud rate
@@ -188,6 +279,8 @@ impl Default for HidConfig {
            otg_mouse: "/dev/hidg1".to_string(),
            otg_udc: None,
            otg_descriptor: OtgDescriptorConfig::default(),
            otg_profile: OtgHidProfile::default(),
            otg_functions: OtgHidFunctions::default(),
            ch9329_port: "/dev/ttyUSB0".to_string(),
            ch9329_baudrate: 9600,
            mouse_absolute: true,
@@ -195,6 +288,12 @@ impl Default for HidConfig {
    }
}
impl HidConfig {
pub fn effective_otg_functions(&self) -> OtgHidFunctions {
self.otg_profile.resolve_functions(&self.otg_functions)
}
}
/// MSD configuration
#[typeshare]
#[derive(Debug, Clone, Serialize, Deserialize)]
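A small sketch of how the profile and the custom selection interact; only `Custom` consults `otg_functions` (all types here come from this diff):

```rust
// Profile resolution sketch: Full/Legacy* ignore otg_functions entirely.
let mut hid = HidConfig::default();
hid.otg_profile = OtgHidProfile::Custom;
hid.otg_functions = OtgHidFunctions {
    keyboard: true,
    mouse_relative: false,
    mouse_absolute: true,
    consumer: false,
};
let effective = hid.effective_otg_functions();
assert!(effective.keyboard && effective.mouse_absolute);
assert!(!effective.mouse_relative && !effective.consumer);

// With a non-custom profile the explicit selection is ignored:
hid.otg_profile = OtgHidProfile::LegacyKeyboard;
assert_eq!(hid.effective_otg_functions(), OtgHidFunctions::legacy_keyboard());
```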

View File

@@ -222,6 +222,22 @@ pub enum SystemEvent {
        hardware: bool,
    },
/// WebRTC ICE candidate (server -> client trickle)
#[serde(rename = "webrtc.ice_candidate")]
WebRTCIceCandidate {
/// WebRTC session ID
session_id: String,
/// ICE candidate data
candidate: crate::webrtc::signaling::IceCandidate,
},
/// WebRTC ICE gathering complete (server -> client)
#[serde(rename = "webrtc.ice_complete")]
WebRTCIceComplete {
/// WebRTC session ID
session_id: String,
},
    /// Stream statistics update (sent periodically for client stats)
    #[serde(rename = "stream.stats_update")]
    StreamStatsUpdate {
@@ -539,6 +555,8 @@ impl SystemEvent {
            Self::StreamStatsUpdate { .. } => "stream.stats_update",
            Self::StreamModeChanged { .. } => "stream.mode_changed",
            Self::StreamModeReady { .. } => "stream.mode_ready",
            Self::WebRTCIceCandidate { .. } => "webrtc.ice_candidate",
            Self::WebRTCIceComplete { .. } => "webrtc.ice_complete",
            Self::HidStateChanged { .. } => "hid.state_changed",
            Self::HidBackendSwitching { .. } => "hid.backend_switching",
            Self::HidDeviceLost { .. } => "hid.device_lost",
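For reference, a sketch of the JSON a client might receive for the new trickle-ICE event; only the `webrtc.ice_candidate` tag comes from the serde rename above — the `IceCandidate` fields live in `crate::webrtc::signaling` and are assumed here:

```rust
// Illustrative wire shape only; the real IceCandidate fields may differ.
let event = serde_json::json!({
    "type": "webrtc.ice_candidate",      // from #[serde(rename = ...)]
    "session_id": "sess-42",
    "candidate": {
        "candidate": "candidate:0 1 UDP 2122252543 192.0.2.10 50000 typ host",
        "sdp_mline_index": 0
    }
});
```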

View File

@@ -109,13 +109,13 @@ impl LedState {
/// reopened on the next operation attempt.
pub struct OtgBackend {
    /// Keyboard device path (/dev/hidg0)
-   keyboard_path: PathBuf,
    keyboard_path: Option<PathBuf>,
    /// Relative mouse device path (/dev/hidg1)
-   mouse_rel_path: PathBuf,
    mouse_rel_path: Option<PathBuf>,
    /// Absolute mouse device path (/dev/hidg2)
-   mouse_abs_path: PathBuf,
    mouse_abs_path: Option<PathBuf>,
    /// Consumer control device path (/dev/hidg3)
-   consumer_path: PathBuf,
    consumer_path: Option<PathBuf>,
    /// Keyboard device file
    keyboard_dev: Mutex<Option<File>>,
    /// Relative mouse device file
@@ -157,9 +157,7 @@ impl OtgBackend {
            keyboard_path: paths.keyboard,
            mouse_rel_path: paths.mouse_relative,
            mouse_abs_path: paths.mouse_absolute,
-           consumer_path: paths
-               .consumer
-               .unwrap_or_else(|| PathBuf::from("/dev/hidg3")),
            consumer_path: paths.consumer,
            keyboard_dev: Mutex::new(None),
            mouse_rel_dev: Mutex::new(None),
            mouse_abs_dev: Mutex::new(None),
@@ -300,13 +298,25 @@ impl OtgBackend {
    /// 2. If handle is None but path exists, reopen the device
    /// 3. Return whether the device is ready for I/O
    fn ensure_device(&self, device_type: DeviceType) -> Result<()> {
-       let (path, dev_mutex) = match device_type {
        let (path_opt, dev_mutex) = match device_type {
            DeviceType::Keyboard => (&self.keyboard_path, &self.keyboard_dev),
            DeviceType::MouseRelative => (&self.mouse_rel_path, &self.mouse_rel_dev),
            DeviceType::MouseAbsolute => (&self.mouse_abs_path, &self.mouse_abs_dev),
            DeviceType::ConsumerControl => (&self.consumer_path, &self.consumer_dev),
        };
let path = match path_opt {
Some(p) => p,
None => {
self.online.store(false, Ordering::Relaxed);
return Err(AppError::HidError {
backend: "otg".to_string(),
reason: "Device disabled".to_string(),
error_code: "disabled".to_string(),
});
}
};
        // Check if device path exists
        if !path.exists() {
            // Close the device if open (device was removed)
@@ -383,20 +393,40 @@ impl OtgBackend {
    /// Check if all HID device files exist
    pub fn check_devices_exist(&self) -> bool {
-       self.keyboard_path.exists() && self.mouse_rel_path.exists() && self.mouse_abs_path.exists()
        self.keyboard_path
.as_ref()
.map_or(true, |p| p.exists())
&& self
.mouse_rel_path
.as_ref()
.map_or(true, |p| p.exists())
&& self
.mouse_abs_path
.as_ref()
.map_or(true, |p| p.exists())
&& self
.consumer_path
.as_ref()
.map_or(true, |p| p.exists())
    }

    /// Get list of missing device paths
    pub fn get_missing_devices(&self) -> Vec<String> {
        let mut missing = Vec::new();
-       if !self.keyboard_path.exists() {
-           missing.push(self.keyboard_path.display().to_string());
-       }
-       if !self.mouse_rel_path.exists() {
-           missing.push(self.mouse_rel_path.display().to_string());
-       }
-       if !self.mouse_abs_path.exists() {
-           missing.push(self.mouse_abs_path.display().to_string());
-       }
        if let Some(ref path) = self.keyboard_path {
            if !path.exists() {
                missing.push(path.display().to_string());
            }
        }
        if let Some(ref path) = self.mouse_rel_path {
            if !path.exists() {
                missing.push(path.display().to_string());
            }
        }
        if let Some(ref path) = self.mouse_abs_path {
            if !path.exists() {
                missing.push(path.display().to_string());
            }
        }
        missing
    }
@@ -407,6 +437,10 @@ impl OtgBackend {
    /// ESHUTDOWN errors by closing the device handle for later reconnection.
    /// Uses write_with_timeout to avoid blocking on busy devices.
    fn send_keyboard_report(&self, report: &KeyboardReport) -> Result<()> {
if self.keyboard_path.is_none() {
return Ok(());
}
        // Ensure device is ready
        self.ensure_device(DeviceType::Keyboard)?;
@@ -472,6 +506,10 @@ impl OtgBackend {
    /// ESHUTDOWN errors by closing the device handle for later reconnection.
    /// Uses write_with_timeout to avoid blocking on busy devices.
    fn send_mouse_report_relative(&self, buttons: u8, dx: i8, dy: i8, wheel: i8) -> Result<()> {
if self.mouse_rel_path.is_none() {
return Ok(());
}
        // Ensure device is ready
        self.ensure_device(DeviceType::MouseRelative)?;
@@ -534,6 +572,10 @@ impl OtgBackend {
    /// ESHUTDOWN errors by closing the device handle for later reconnection.
    /// Uses write_with_timeout to avoid blocking on busy devices.
    fn send_mouse_report_absolute(&self, buttons: u8, x: u16, y: u16, wheel: i8) -> Result<()> {
if self.mouse_abs_path.is_none() {
return Ok(());
}
        // Ensure device is ready
        self.ensure_device(DeviceType::MouseAbsolute)?;
@@ -600,6 +642,10 @@ impl OtgBackend {
    ///
    /// Sends a consumer control usage code and then releases it (sends 0x0000).
    fn send_consumer_report(&self, usage: u16) -> Result<()> {
if self.consumer_path.is_none() {
return Ok(());
}
        // Ensure device is ready
        self.ensure_device(DeviceType::ConsumerControl)?;
@@ -708,71 +754,72 @@ impl HidBackend for OtgBackend {
        }

        // Wait for devices to appear (they should already exist from OtgService)
-       let device_paths = vec![
-           self.keyboard_path.clone(),
-           self.mouse_rel_path.clone(),
-           self.mouse_abs_path.clone(),
-       ];
        let mut device_paths = Vec::new();
        if let Some(ref path) = self.keyboard_path {
            device_paths.push(path.clone());
        }
        if let Some(ref path) = self.mouse_rel_path {
            device_paths.push(path.clone());
        }
        if let Some(ref path) = self.mouse_abs_path {
            device_paths.push(path.clone());
        }
        if let Some(ref path) = self.consumer_path {
            device_paths.push(path.clone());
        }
        if device_paths.is_empty() {
            return Err(AppError::Internal(
                "No HID devices configured for OTG backend".into(),
            ));
        }
        if !wait_for_hid_devices(&device_paths, 2000).await {
            return Err(AppError::Internal("HID devices did not appear".into()));
        }
        // Open keyboard device
-       if self.keyboard_path.exists() {
-           let file = Self::open_device(&self.keyboard_path)?;
-           *self.keyboard_dev.lock() = Some(file);
-           info!("Keyboard device opened: {}", self.keyboard_path.display());
-       } else {
-           warn!(
-               "Keyboard device not found: {}",
-               self.keyboard_path.display()
-           );
-       }
        if let Some(ref path) = self.keyboard_path {
            if path.exists() {
                let file = Self::open_device(path)?;
                *self.keyboard_dev.lock() = Some(file);
                info!("Keyboard device opened: {}", path.display());
            } else {
                warn!("Keyboard device not found: {}", path.display());
            }
        }

        // Open relative mouse device
-       if self.mouse_rel_path.exists() {
-           let file = Self::open_device(&self.mouse_rel_path)?;
-           *self.mouse_rel_dev.lock() = Some(file);
-           info!(
-               "Relative mouse device opened: {}",
-               self.mouse_rel_path.display()
-           );
-       } else {
-           warn!(
-               "Relative mouse device not found: {}",
-               self.mouse_rel_path.display()
-           );
-       }
        if let Some(ref path) = self.mouse_rel_path {
            if path.exists() {
                let file = Self::open_device(path)?;
                *self.mouse_rel_dev.lock() = Some(file);
                info!("Relative mouse device opened: {}", path.display());
            } else {
                warn!("Relative mouse device not found: {}", path.display());
            }
        }

        // Open absolute mouse device
-       if self.mouse_abs_path.exists() {
-           let file = Self::open_device(&self.mouse_abs_path)?;
-           *self.mouse_abs_dev.lock() = Some(file);
-           info!(
-               "Absolute mouse device opened: {}",
-               self.mouse_abs_path.display()
-           );
-       } else {
-           warn!(
-               "Absolute mouse device not found: {}",
-               self.mouse_abs_path.display()
-           );
-       }
        if let Some(ref path) = self.mouse_abs_path {
            if path.exists() {
                let file = Self::open_device(path)?;
                *self.mouse_abs_dev.lock() = Some(file);
                info!("Absolute mouse device opened: {}", path.display());
            } else {
                warn!("Absolute mouse device not found: {}", path.display());
            }
        }

        // Open consumer control device (optional, may not exist on older setups)
-       if self.consumer_path.exists() {
-           let file = Self::open_device(&self.consumer_path)?;
-           *self.consumer_dev.lock() = Some(file);
-           info!(
-               "Consumer control device opened: {}",
-               self.consumer_path.display()
-           );
-       } else {
-           debug!(
-               "Consumer control device not found: {}",
-               self.consumer_path.display()
-           );
-       }
        if let Some(ref path) = self.consumer_path {
            if path.exists() {
                let file = Self::open_device(path)?;
                *self.consumer_dev.lock() = Some(file);
                info!("Consumer control device opened: {}", path.display());
            } else {
                debug!("Consumer control device not found: {}", path.display());
            }
        }

        // Mark as online if all devices opened successfully
@@ -905,7 +952,9 @@ impl HidBackend for OtgBackend {
    }

    fn supports_absolute_mouse(&self) -> bool {
-       self.mouse_abs_path.exists()
        self.mouse_abs_path
            .as_ref()
            .map_or(false, |p| p.exists())
    }

    async fn send_consumer(&self, event: ConsumerEvent) -> Result<()> {
@@ -928,7 +977,7 @@ pub fn is_otg_available() -> bool {
    let mouse_rel = PathBuf::from("/dev/hidg1");
    let mouse_abs = PathBuf::from("/dev/hidg2");
-   kb.exists() && mouse_rel.exists() && mouse_abs.exists()
    kb.exists() || mouse_rel.exists() || mouse_abs.exists()
}

/// Implement Drop for OtgBackend to close device files
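The availability probe flips from `&&` to `||` because a gadget may now expose any subset of HID functions. A condensed sketch of the relaxed check:

```rust
// Any one hidg node is enough to consider the OTG backend usable.
use std::path::Path;

pub fn is_otg_available() -> bool {
    ["/dev/hidg0", "/dev/hidg1", "/dev/hidg2"]
        .iter()
        .any(|p| Path::new(p).exists())
}
```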

View File

@@ -309,12 +309,9 @@ async fn main() -> anyhow::Result<()> {
    // Pre-enable OTG functions to avoid gadget recreation (prevents kernel crashes)
    let will_use_otg_hid = matches!(config.hid.backend, config::HidBackend::Otg);
-   let will_use_msd = config.msd.enabled || will_use_otg_hid;
    let will_use_msd = config.msd.enabled;
    if will_use_otg_hid {
-       if !config.msd.enabled {
-           tracing::info!("OTG HID enabled, automatically enabling MSD functionality");
-       }
        if let Err(e) = otg_service.enable_hid().await {
            tracing::warn!("Failed to pre-enable HID: {}", e);
        }
@@ -448,13 +445,11 @@ async fn main() -> anyhow::Result<()> {
        }
    }

-   // Set up frame source from video streamer (if capturer is available)
-   // The frame source allows WebRTC sessions to receive live video frames
-   if let Some(frame_tx) = streamer.frame_sender().await {
-       // Synchronize WebRTC config with actual capture format before connecting
-       let (actual_format, actual_resolution, actual_fps) = streamer.current_video_config().await;
    // Configure direct capture for WebRTC encoder pipeline
    let (device_path, actual_resolution, actual_format, actual_fps, jpeg_quality) =
        streamer.current_capture_config().await;

    tracing::info!(
-       "Initial video config from capturer: {}x{} {:?} @ {}fps",
        "Initial video config: {}x{} {:?} @ {}fps",
        actual_resolution.width,
        actual_resolution.height,
        actual_format,
@@ -463,12 +458,13 @@ async fn main() -> anyhow::Result<()> {
    webrtc_streamer
        .update_video_config(actual_resolution, actual_format, actual_fps)
        .await;
-   webrtc_streamer.set_video_source(frame_tx).await;
-   tracing::info!("WebRTC streamer connected to video frame source");
    if let Some(device_path) = device_path {
        webrtc_streamer
            .set_capture_device(device_path, jpeg_quality)
            .await;
        tracing::info!("WebRTC streamer configured for direct capture");
    } else {
-       tracing::warn!(
-           "Video capturer not ready, WebRTC will connect to frame source when available"
-       );
        tracing::warn!("No capture device configured for WebRTC");
    }

    // Create video stream manager (unified MJPEG/WebRTC management)
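A condensed view of the new startup wiring (all functions shown are introduced in this diff): WebRTC no longer subscribes to a shared frame broadcast; it receives the device path and opens its own capture pipeline:

```rust
// Condensed sketch of the direct-capture handoff.
let (device_path, resolution, format, fps, jpeg_quality) =
    streamer.current_capture_config().await;
webrtc_streamer.update_video_config(resolution, format, fps).await;
if let Some(path) = device_path {
    // WebRTC opens the V4L2 device itself instead of sharing a broadcast.
    webrtc_streamer.set_capture_device(path, jpeg_quality).await;
}
```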

View File

@@ -27,7 +27,7 @@ use tracing::{debug, info, warn};
use super::manager::{wait_for_hid_devices, GadgetDescriptor, OtgGadgetManager};
use super::msd::MsdFunction;
-use crate::config::OtgDescriptorConfig;
use crate::config::{OtgDescriptorConfig, OtgHidFunctions};
use crate::error::{AppError, Result};

/// Bitflags for requested functions (lock-free)
@@ -37,23 +37,42 @@ const FLAG_MSD: u8 = 0b10;
/// HID device paths
#[derive(Debug, Clone)]
pub struct HidDevicePaths {
-   pub keyboard: PathBuf,
-   pub mouse_relative: PathBuf,
-   pub mouse_absolute: PathBuf,
    pub keyboard: Option<PathBuf>,
    pub mouse_relative: Option<PathBuf>,
    pub mouse_absolute: Option<PathBuf>,
    pub consumer: Option<PathBuf>,
}

impl Default for HidDevicePaths {
    fn default() -> Self {
        Self {
-           keyboard: PathBuf::from("/dev/hidg0"),
-           mouse_relative: PathBuf::from("/dev/hidg1"),
-           mouse_absolute: PathBuf::from("/dev/hidg2"),
-           consumer: Some(PathBuf::from("/dev/hidg3")),
            keyboard: None,
            mouse_relative: None,
            mouse_absolute: None,
            consumer: None,
        }
    }
}
impl HidDevicePaths {
pub fn existing_paths(&self) -> Vec<PathBuf> {
let mut paths = Vec::new();
if let Some(ref p) = self.keyboard {
paths.push(p.clone());
}
if let Some(ref p) = self.mouse_relative {
paths.push(p.clone());
}
if let Some(ref p) = self.mouse_absolute {
paths.push(p.clone());
}
if let Some(ref p) = self.consumer {
paths.push(p.clone());
}
paths
}
}
/// OTG Service state
#[derive(Debug, Clone, Default)]
pub struct OtgServiceState {
@@ -65,6 +84,8 @@ pub struct OtgServiceState {
    pub msd_enabled: bool,
    /// HID device paths (set after gadget setup)
    pub hid_paths: Option<HidDevicePaths>,
    /// HID function selection (set after gadget setup)
    pub hid_functions: Option<OtgHidFunctions>,
    /// Error message if setup failed
    pub error: Option<String>,
}
@@ -83,6 +104,8 @@ pub struct OtgService {
    msd_function: RwLock<Option<MsdFunction>>,
    /// Requested functions flags (atomic, lock-free read/write)
    requested_flags: AtomicU8,
    /// Requested HID function set
    hid_functions: RwLock<OtgHidFunctions>,
    /// Current descriptor configuration
    current_descriptor: RwLock<GadgetDescriptor>,
}
@@ -95,6 +118,7 @@ impl OtgService {
            state: RwLock::new(OtgServiceState::default()),
            msd_function: RwLock::new(None),
            requested_flags: AtomicU8::new(0),
            hid_functions: RwLock::new(OtgHidFunctions::default()),
            current_descriptor: RwLock::new(GadgetDescriptor::default()),
        }
    }
@@ -167,6 +191,35 @@ impl OtgService {
        self.state.read().await.hid_paths.clone()
    }
/// Get current HID function selection
pub async fn hid_functions(&self) -> OtgHidFunctions {
self.hid_functions.read().await.clone()
}
/// Update HID function selection
pub async fn update_hid_functions(&self, functions: OtgHidFunctions) -> Result<()> {
if functions.is_empty() {
return Err(AppError::BadRequest(
"OTG HID functions cannot be empty".to_string(),
));
}
{
let mut current = self.hid_functions.write().await;
if *current == functions {
return Ok(());
}
*current = functions;
}
// If HID is active, recreate gadget with new function set
if self.is_hid_requested() {
self.recreate_gadget().await?;
}
Ok(())
}
    /// Get MSD function handle (for LUN configuration)
    pub async fn msd_function(&self) -> Option<MsdFunction> {
        self.msd_function.read().await.clone()
@@ -182,16 +235,19 @@ impl OtgService {
        // Mark HID as requested (lock-free)
        self.set_hid_requested(true);

-       // Check if already enabled
        // Check if already enabled and function set unchanged
        let requested_functions = self.hid_functions.read().await.clone();
        {
            let state = self.state.read().await;
            if state.hid_enabled {
                if state.hid_functions.as_ref() == Some(&requested_functions) {
                    if let Some(ref paths) = state.hid_paths {
                        info!("HID already enabled, returning existing paths");
                        return Ok(paths.clone());
                    }
                }
            }
        }

        // Recreate gadget with both HID and MSD if needed
        self.recreate_gadget().await?;
@@ -294,6 +350,11 @@ impl OtgService {
        // Read requested flags atomically (lock-free)
        let hid_requested = self.is_hid_requested();
        let msd_requested = self.is_msd_requested();
        let hid_functions = if hid_requested {
            self.hid_functions.read().await.clone()
        } else {
            OtgHidFunctions::default()
        };

        info!(
            "Recreating gadget with: HID={}, MSD={}",
@@ -303,9 +364,15 @@ impl OtgService {
        // Check if gadget already matches requested state
        {
            let state = self.state.read().await;
            let functions_match = if hid_requested {
                state.hid_functions.as_ref() == Some(&hid_functions)
            } else {
                state.hid_functions.is_none()
            };
            if state.gadget_active
                && state.hid_enabled == hid_requested
                && state.msd_enabled == msd_requested
                && functions_match
            {
                info!("Gadget already has requested functions, skipping recreate");
                return Ok(());
@@ -333,6 +400,7 @@ impl OtgService {
            state.hid_enabled = false;
            state.msd_enabled = false;
            state.hid_paths = None;
            state.hid_functions = None;
            state.error = None;
        }
@@ -361,23 +429,20 @@ impl OtgService {
        // Add HID functions if requested
        if hid_requested {
-           match (
-               manager.add_keyboard(),
-               manager.add_mouse_relative(),
-               manager.add_mouse_absolute(),
-               manager.add_consumer_control(),
-           ) {
-               (Ok(kb), Ok(rel), Ok(abs), Ok(consumer)) => {
-                   hid_paths = Some(HidDevicePaths {
-                       keyboard: kb,
-                       mouse_relative: rel,
-                       mouse_absolute: abs,
-                       consumer: Some(consumer),
-                   });
-                   debug!("HID functions added to gadget");
-               }
-               (Err(e), _, _, _) | (_, Err(e), _, _) | (_, _, Err(e), _) | (_, _, _, Err(e)) => {
-                   let error = format!("Failed to add HID functions: {}", e);
            if hid_functions.is_empty() {
                let error = "HID functions set is empty".to_string();
                let mut state = self.state.write().await;
                state.error = Some(error.clone());
                return Err(AppError::BadRequest(error));
            }

            let mut paths = HidDevicePaths::default();
            if hid_functions.keyboard {
                match manager.add_keyboard() {
                    Ok(kb) => paths.keyboard = Some(kb),
                    Err(e) => {
                        let error = format!("Failed to add keyboard HID function: {}", e);
                        let mut state = self.state.write().await;
                        state.error = Some(error.clone());
                        return Err(AppError::Internal(error));
@@ -385,6 +450,46 @@ impl OtgService {
                    }
                }
if hid_functions.mouse_relative {
match manager.add_mouse_relative() {
Ok(rel) => paths.mouse_relative = Some(rel),
Err(e) => {
let error = format!("Failed to add relative mouse HID function: {}", e);
let mut state = self.state.write().await;
state.error = Some(error.clone());
return Err(AppError::Internal(error));
}
}
}
if hid_functions.mouse_absolute {
match manager.add_mouse_absolute() {
Ok(abs) => paths.mouse_absolute = Some(abs),
Err(e) => {
let error = format!("Failed to add absolute mouse HID function: {}", e);
let mut state = self.state.write().await;
state.error = Some(error.clone());
return Err(AppError::Internal(error));
}
}
}
if hid_functions.consumer {
match manager.add_consumer_control() {
Ok(consumer) => paths.consumer = Some(consumer),
Err(e) => {
let error = format!("Failed to add consumer HID function: {}", e);
let mut state = self.state.write().await;
state.error = Some(error.clone());
return Err(AppError::Internal(error));
}
}
}
hid_paths = Some(paths);
debug!("HID functions added to gadget");
}
        // Add MSD function if requested
        let msd_func = if msd_requested {
            match manager.add_msd() {
@@ -423,12 +528,8 @@ impl OtgService {
        // Wait for HID devices to appear
        if let Some(ref paths) = hid_paths {
-           let device_paths = vec![
-               paths.keyboard.clone(),
-               paths.mouse_relative.clone(),
-               paths.mouse_absolute.clone(),
-           ];
-           if !wait_for_hid_devices(&device_paths, 2000).await {
            let device_paths = paths.existing_paths();
            if !device_paths.is_empty() && !wait_for_hid_devices(&device_paths, 2000).await {
                warn!("HID devices did not appear after gadget setup");
            }
        }
@@ -448,6 +549,11 @@ impl OtgService {
            state.hid_enabled = hid_requested;
            state.msd_enabled = msd_requested;
            state.hid_paths = hid_paths;
            state.hid_functions = if hid_requested {
                Some(hid_functions)
            } else {
                None
            };
            state.error = None;
        }
@@ -509,6 +615,7 @@ impl OtgService {
            state.hid_enabled = false;
            state.msd_enabled = false;
            state.hid_paths = None;
            state.hid_functions = None;
            state.error = None;
        }

View File

@@ -1600,8 +1600,15 @@ async fn run_video_streaming(
            }

            result = encoded_frame_rx.recv() => {
-               match result {
-                   Ok(frame) => {
                let frame = match result {
                    Some(frame) => frame,
None => {
info!("Video pipeline closed for connection {}, re-subscribing...", conn_id);
tokio::time::sleep(Duration::from_millis(100)).await;
continue 'subscribe_loop;
}
};
                // Convert EncodedVideoFrame to RustDesk VideoFrame message
                // Use zero-copy version: Bytes.clone() only increments refcount
                let msg_bytes = video_adapter.encode_frame_bytes_zero_copy(
@@ -1611,15 +1618,15 @@ async fn run_video_streaming(
                );

                // Send to connection (blocks if channel is full, providing backpressure)
-               if video_tx.send(msg_bytes).await.is_err() {
-                   debug!("Video channel closed for connection {}", conn_id);
-                   break 'subscribe_loop;
                if video_tx.try_send(msg_bytes).is_err() {
                    // Drop when channel is full to avoid backpressure
                    continue;
                }

                encoded_count += 1;

                // Log stats periodically
-               if last_log_time.elapsed().as_secs() >= 10 {
                if last_log_time.elapsed().as_secs() >= 30 {
                    info!(
                        "Video streaming stats for connection {}: {} frames forwarded",
                        conn_id, encoded_count
@@ -1627,18 +1634,6 @@ async fn run_video_streaming(
                        last_log_time = Instant::now();
                    }
                }
-               Err(broadcast::error::RecvError::Lagged(n)) => {
-                   debug!("Connection {} lagged {} encoded frames", conn_id, n);
-               }
-               Err(broadcast::error::RecvError::Closed) => {
-                   // Pipeline was restarted (e.g., bitrate/codec change)
-                   // Re-subscribe to the new pipeline
-                   info!("Video pipeline closed for connection {}, re-subscribing...", conn_id);
-                   tokio::time::sleep(Duration::from_millis(100)).await;
-                   continue 'subscribe_loop;
-               }
-           }
-       }
            }
        }
    }
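The switch from `send().await` to `try_send()` trades backpressure for latency: a slow connection drops frames instead of stalling the encoder. The same pattern as a standalone sketch (tokio and bytes assumed):

```rust
// Standalone sketch of drop-on-full forwarding.
use tokio::sync::mpsc;

async fn forward_video(
    mut frames: mpsc::Receiver<bytes::Bytes>,
    conn: mpsc::Sender<bytes::Bytes>,
) {
    while let Some(frame) = frames.recv().await {
        // try_send never awaits: when the per-connection queue is full the
        // frame is discarded and the loop moves on to a fresher one.
        let _ = conn.try_send(frame);
    }
}
```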
@@ -1725,9 +1720,20 @@ async fn run_audio_streaming(
                break 'subscribe_loop;
            }

-           result = opus_rx.recv() => {
-               match result {
-                   Ok(opus_frame) => {
            result = opus_rx.changed() => {
                if result.is_err() {
                    // Pipeline was restarted
info!("Audio pipeline closed for connection {}, re-subscribing...", conn_id);
audio_adapter.reset();
tokio::time::sleep(Duration::from_millis(100)).await;
continue 'subscribe_loop;
}
let opus_frame = match opus_rx.borrow().clone() {
Some(frame) => frame,
None => continue,
};
                // Convert OpusFrame to RustDesk AudioFrame message
                let msg_bytes = audio_adapter.encode_opus_bytes(&opus_frame.data);
@@ -1748,18 +1754,6 @@ async fn run_audio_streaming(
                        last_log_time = Instant::now();
                    }
                }
-               Err(broadcast::error::RecvError::Lagged(n)) => {
-                   debug!("Connection {} lagged {} audio frames", conn_id, n);
-               }
-               Err(broadcast::error::RecvError::Closed) => {
-                   // Pipeline was restarted
-                   info!("Audio pipeline closed for connection {}, re-subscribing...", conn_id);
-                   audio_adapter.reset();
-                   tokio::time::sleep(Duration::from_millis(100)).await;
-                   continue 'subscribe_loop;
-               }
-           }
-       }
            }
        }
    }
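The audio path moves from a broadcast to a watch channel, i.e. latest-value semantics: a slow consumer skips straight to the newest Opus frame rather than lagging behind a queue. A minimal sketch with a placeholder `OpusFrame` type:

```rust
// Latest-value audio consumption sketch; OpusFrame is a stand-in type.
use tokio::sync::watch;

#[derive(Clone)]
struct OpusFrame {
    data: Vec<u8>,
}

async fn consume_audio(mut rx: watch::Receiver<Option<OpusFrame>>) {
    // changed() resolves once per new value; Err means the sender was
    // dropped (pipeline restart), which is the cue to re-subscribe.
    while rx.changed().await.is_ok() {
        // Clone out quickly: borrow() holds a read lock on the slot.
        let frame = match rx.borrow().clone() {
            Some(frame) => frame,
            None => continue,
        };
        let _ = frame.data.len(); // encode/forward the frame here
    }
}
```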

View File

@@ -1,4 +1,4 @@
-use std::sync::Arc;
use std::{collections::VecDeque, sync::Arc};
use tokio::sync::{broadcast, RwLock};

use crate::atx::AtxController;
@@ -56,6 +56,8 @@ pub struct AppState {
    pub events: Arc<EventBus>,
    /// Shutdown signal sender
    pub shutdown_tx: broadcast::Sender<()>,
    /// Recently revoked session IDs (for client kick detection)
    pub revoked_sessions: Arc<RwLock<VecDeque<String>>>,
    /// Data directory path
    data_dir: std::path::PathBuf,
}
@@ -92,6 +94,7 @@ impl AppState {
            extensions,
            events,
            shutdown_tx,
            revoked_sessions: Arc::new(RwLock::new(VecDeque::new())),
            data_dir,
        })
    }
@@ -106,6 +109,26 @@ impl AppState {
        self.shutdown_tx.subscribe()
    }
/// Record revoked session IDs (bounded queue)
pub async fn remember_revoked_sessions(&self, session_ids: Vec<String>) {
if session_ids.is_empty() {
return;
}
let mut guard = self.revoked_sessions.write().await;
for id in session_ids {
guard.push_back(id);
}
while guard.len() > 32 {
guard.pop_front();
}
}
/// Check if a session ID was revoked (kicked)
pub async fn is_session_revoked(&self, session_id: &str) -> bool {
let guard = self.revoked_sessions.read().await;
guard.iter().any(|id| id == session_id)
}
    /// Get complete device information for WebSocket clients
    ///
    /// This method collects the current state of all devices (video, HID, MSD, ATX, Audio)
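A hypothetical consumer sketch: a WebSocket task can distinguish a kick from an ordinary expiry by consulting the bounded revocation list added above:

```rust
// Hypothetical disconnect handling in a WebSocket task.
async fn classify_disconnect(state: &AppState, session_id: &str) -> &'static str {
    if state.is_session_revoked(session_id).await {
        "kicked"   // revoked via delete_all / admin action
    } else {
        "expired"  // ordinary TTL expiry or logout
    }
}
```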

View File

@@ -157,6 +157,8 @@ pub struct MjpegStreamHandler {
    max_drop_same_frames: AtomicU64,
    /// JPEG encoder for non-JPEG input formats
    jpeg_encoder: ParkingMutex<Option<JpegEncoder>>,
    /// JPEG quality for software encoding (1-100)
    jpeg_quality: AtomicU64,
}

impl MjpegStreamHandler {
@@ -179,9 +181,16 @@ impl MjpegStreamHandler {
            last_frame_ts: ParkingRwLock::new(None),
            dropped_same_frames: AtomicU64::new(0),
            max_drop_same_frames: AtomicU64::new(max_drop),
            jpeg_quality: AtomicU64::new(80),
        }
    }
/// Set JPEG quality for software encoding (1-100)
pub fn set_jpeg_quality(&self, quality: u8) {
let clamped = quality.clamp(1, 100) as u64;
self.jpeg_quality.store(clamped, Ordering::Relaxed);
}
    /// Update current frame
    pub fn update_frame(&self, frame: VideoFrame) {
        // Fast path: if no MJPEG clients are connected, do minimal bookkeeping and avoid
@@ -260,23 +269,24 @@ impl MjpegStreamHandler {
    fn encode_to_jpeg(&self, frame: &VideoFrame) -> Result<VideoFrame, String> {
        let resolution = frame.resolution;
        let sequence = self.sequence.load(Ordering::Relaxed);
        let desired_quality = self.jpeg_quality.load(Ordering::Relaxed) as u32;

        // Get or create encoder
        let mut encoder_guard = self.jpeg_encoder.lock();
        let encoder = encoder_guard.get_or_insert_with(|| {
-           let config = EncoderConfig::jpeg(resolution, 85);
            let config = EncoderConfig::jpeg(resolution, desired_quality);
            match JpegEncoder::new(config) {
                Ok(enc) => {
                    debug!(
-                       "Created JPEG encoder for MJPEG stream: {}x{}",
-                       resolution.width, resolution.height
                        "Created JPEG encoder for MJPEG stream: {}x{} (q={})",
                        resolution.width, resolution.height, desired_quality
                    );
                    enc
                }
                Err(e) => {
                    warn!("Failed to create JPEG encoder: {}, using default", e);
                    // Try with default config
-                   JpegEncoder::new(EncoderConfig::jpeg(resolution, 85))
                    JpegEncoder::new(EncoderConfig::jpeg(resolution, desired_quality))
                        .expect("Failed to create default JPEG encoder")
                }
            }
@@ -288,9 +298,16 @@ impl MjpegStreamHandler {
"Resolution changed, recreating JPEG encoder: {}x{}", "Resolution changed, recreating JPEG encoder: {}x{}",
resolution.width, resolution.height resolution.width, resolution.height
); );
let config = EncoderConfig::jpeg(resolution, 85); let config = EncoderConfig::jpeg(resolution, desired_quality);
*encoder = *encoder =
JpegEncoder::new(config).map_err(|e| format!("Failed to create encoder: {}", e))?; JpegEncoder::new(config).map_err(|e| format!("Failed to create encoder: {}", e))?;
} else if encoder.config().quality != desired_quality {
if let Err(e) = encoder.set_quality(desired_quality) {
warn!("Failed to set JPEG quality: {}, recreating encoder", e);
let config = EncoderConfig::jpeg(resolution, desired_quality);
*encoder = JpegEncoder::new(config)
.map_err(|e| format!("Failed to create encoder: {}", e))?;
}
        }

        // Encode based on input format
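Because the quality lives in an atomic, callers can adjust it mid-stream without taking the encoder lock; values are clamped to 1-100. A quick usage sketch:

```rust
// Takes effect on the next software-encoded frame.
handler.set_jpeg_quality(70);  // q=70
handler.set_jpeg_quality(0);   // clamped up to 1
handler.set_jpeg_quality(200); // clamped down to 100
```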

View File

@@ -15,11 +15,18 @@
//!
//! Note: Audio WebSocket is handled separately by audio_ws.rs (/api/ws/audio)

use std::io;
use std::path::PathBuf;
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
-use tokio::sync::{broadcast, RwLock};
use tokio::sync::{Mutex, RwLock};
-use tracing::info;
use tracing::{error, info, warn};
use v4l::buffer::Type as BufferType;
use v4l::io::traits::CaptureStream;
use v4l::prelude::*;
use v4l::video::Capture;
use v4l::video::capture::Parameters;
use v4l::Format;
use crate::audio::AudioController;
use crate::error::{AppError, Result};
@@ -28,11 +35,16 @@ use crate::hid::HidController;
use crate::video::capture::{CaptureConfig, VideoCapturer};
use crate::video::device::{enumerate_devices, find_best_device, VideoDeviceInfo};
use crate::video::format::{PixelFormat, Resolution};
-use crate::video::frame::VideoFrame;
use crate::video::frame::{FrameBuffer, FrameBufferPool, VideoFrame};

use super::mjpeg::MjpegStreamHandler;
use super::ws_hid::WsHidHandler;
/// Minimum valid frame size for capture
const MIN_CAPTURE_FRAME_SIZE: usize = 128;
/// Validate JPEG header every N frames to reduce overhead
const JPEG_VALIDATE_INTERVAL: u64 = 30;
/// MJPEG streamer configuration
#[derive(Debug, Clone)]
pub struct MjpegStreamerConfig {
@@ -104,8 +116,6 @@ pub struct MjpegStreamerStats {
    pub mjpeg_clients: u64,
    /// WebSocket HID client count
    pub ws_hid_clients: usize,
-   /// Total frames captured
-   pub frames_captured: u64,
}

/// MJPEG Streamer
@@ -130,6 +140,9 @@ pub struct MjpegStreamer {
    // === Control ===
    start_lock: tokio::sync::Mutex<()>,
    direct_stop: AtomicBool,
    direct_active: AtomicBool,
    direct_handle: Mutex<Option<tokio::task::JoinHandle<()>>>,
    events: RwLock<Option<Arc<EventBus>>>,
    config_changing: AtomicBool,
}
@@ -148,6 +161,9 @@ impl MjpegStreamer {
            ws_hid_handler: WsHidHandler::new(),
            hid_controller: RwLock::new(None),
            start_lock: tokio::sync::Mutex::new(()),
            direct_stop: AtomicBool::new(false),
            direct_active: AtomicBool::new(false),
            direct_handle: Mutex::new(None),
            events: RwLock::new(None),
            config_changing: AtomicBool::new(false),
        })
@@ -166,6 +182,9 @@ impl MjpegStreamer {
            ws_hid_handler: WsHidHandler::new(),
            hid_controller: RwLock::new(None),
            start_lock: tokio::sync::Mutex::new(()),
            direct_stop: AtomicBool::new(false),
            direct_active: AtomicBool::new(false),
            direct_handle: Mutex::new(None),
            events: RwLock::new(None),
            config_changing: AtomicBool::new(false),
        })
@@ -228,16 +247,21 @@ impl MjpegStreamer {
        let device = self.current_device.read().await;
        let config = self.config.read().await;

-       let (resolution, format, frames_captured) =
-           if let Some(ref cap) = *self.capturer.read().await {
-               let stats = cap.stats().await;
-               (
-                   Some((config.resolution.width, config.resolution.height)),
-                   Some(config.format.to_string()),
-                   stats.frames_captured,
-               )
-           } else {
-               (None, None, 0)
-           };
        let (resolution, format) = {
            if self.direct_active.load(Ordering::Relaxed) {
                (
                    Some((config.resolution.width, config.resolution.height)),
                    Some(config.format.to_string()),
                )
            } else if let Some(ref cap) = *self.capturer.read().await {
                let _ = cap;
                (
                    Some((config.resolution.width, config.resolution.height)),
                    Some(config.format.to_string()),
                )
            } else {
                (None, None)
            }
        };
        MjpegStreamerStats {
@@ -248,7 +272,6 @@ impl MjpegStreamer {
            fps: config.fps,
            mjpeg_clients: self.mjpeg_handler.client_count(),
            ws_hid_clients: self.ws_hid_handler.client_count(),
-           frames_captured,
        }
    }
@@ -266,15 +289,6 @@ impl MjpegStreamer {
        self.ws_hid_handler.clone()
    }

-   /// Get frame sender for WebRTC integration
-   pub async fn frame_sender(&self) -> Option<broadcast::Sender<VideoFrame>> {
-       if let Some(ref cap) = *self.capturer.read().await {
-           Some(cap.frame_sender())
-       } else {
-           None
-       }
-   }

    // ========================================================================
    // Initialization
    // ========================================================================
@@ -293,6 +307,7 @@ impl MjpegStreamer {
        );

        let config = self.config.read().await.clone();
        self.mjpeg_handler.set_jpeg_quality(config.jpeg_quality);

        // Create capture config
        let capture_config = CaptureConfig {
@@ -336,22 +351,23 @@ impl MjpegStreamer {
            return Ok(());
        }

-       // Get capturer
-       let capturer = self.capturer.read().await.clone();
-       let capturer =
-           capturer.ok_or_else(|| AppError::VideoError("Not initialized".to_string()))?;
-
-       // Start capture
-       capturer.start().await?;
-
-       // Start frame forwarding task
-       let handler = self.mjpeg_handler.clone();
-       let mut frame_rx = capturer.frame_sender().subscribe();
-       tokio::spawn(async move {
-           while let Ok(frame) = frame_rx.recv().await {
-               handler.update_frame(frame);
-           }
-       });
        let device = self
            .current_device
            .read()
            .await
            .clone()
            .ok_or_else(|| AppError::VideoError("Not initialized".to_string()))?;

        let config = self.config.read().await.clone();

        self.direct_stop.store(false, Ordering::SeqCst);
        self.direct_active.store(true, Ordering::SeqCst);

        let streamer = self.clone();
        let handle = tokio::task::spawn_blocking(move || {
            streamer.run_direct_capture(device.path, config);
        });
        *self.direct_handle.lock().await = Some(handle);

        // Note: Audio WebSocket is handled separately by audio_ws.rs (/api/ws/audio)
@@ -370,7 +386,14 @@ impl MjpegStreamer {
            return Ok(());
        }

        self.direct_stop.store(true, Ordering::SeqCst);
        if let Some(handle) = self.direct_handle.lock().await.take() {
            let _ = handle.await;
        }
        self.direct_active.store(false, Ordering::SeqCst);

-       // Stop capturer
        // Stop capturer (legacy path)
        if let Some(ref cap) = *self.capturer.read().await {
            let _ = cap.stop().await;
        }
@@ -412,6 +435,7 @@ impl MjpegStreamer {
        // Update config
        *self.config.write().await = config.clone();
        self.mjpeg_handler.set_jpeg_quality(config.jpeg_quality);

        // Re-initialize if device path is set
        if let Some(ref path) = config.device_path {
@@ -448,6 +472,202 @@ impl MjpegStreamer {
            });
        }
    }
/// Direct capture loop for MJPEG mode (single loop, no broadcast)
fn run_direct_capture(self: Arc<Self>, device_path: PathBuf, config: MjpegStreamerConfig) {
const MAX_RETRIES: u32 = 5;
const RETRY_DELAY_MS: u64 = 200;
let handle = tokio::runtime::Handle::current();
let mut last_state = MjpegStreamerState::Streaming;
let mut set_state = |new_state: MjpegStreamerState| {
if new_state != last_state {
handle.block_on(async {
*self.state.write().await = new_state;
self.publish_state_change().await;
});
last_state = new_state;
}
};
let mut device_opt: Option<Device> = None;
let mut format_opt: Option<Format> = None;
let mut last_error: Option<String> = None;
for attempt in 0..MAX_RETRIES {
if self.direct_stop.load(Ordering::Relaxed) {
self.direct_active.store(false, Ordering::SeqCst);
return;
}
let device = match Device::with_path(&device_path) {
Ok(d) => d,
Err(e) => {
let err_str = e.to_string();
if err_str.contains("busy") || err_str.contains("resource") {
warn!(
"Device busy on attempt {}/{}, retrying in {}ms...",
attempt + 1,
MAX_RETRIES,
RETRY_DELAY_MS
);
std::thread::sleep(std::time::Duration::from_millis(RETRY_DELAY_MS));
last_error = Some(err_str);
continue;
}
last_error = Some(err_str);
break;
}
};
let requested = Format::new(
config.resolution.width,
config.resolution.height,
config.format.to_fourcc(),
);
match device.set_format(&requested) {
Ok(actual) => {
device_opt = Some(device);
format_opt = Some(actual);
break;
}
Err(e) => {
let err_str = e.to_string();
if err_str.contains("busy") || err_str.contains("resource") {
warn!(
"Device busy on set_format attempt {}/{}, retrying in {}ms...",
attempt + 1,
MAX_RETRIES,
RETRY_DELAY_MS
);
std::thread::sleep(std::time::Duration::from_millis(RETRY_DELAY_MS));
last_error = Some(err_str);
continue;
}
last_error = Some(err_str);
break;
}
}
}
let (device, actual_format) = match (device_opt, format_opt) {
(Some(d), Some(f)) => (d, f),
_ => {
error!(
"Failed to open device {:?}: {}",
device_path,
last_error.unwrap_or_else(|| "unknown error".to_string())
);
set_state(MjpegStreamerState::Error);
self.mjpeg_handler.set_offline();
self.direct_active.store(false, Ordering::SeqCst);
return;
}
};
info!(
"Capture format: {}x{} {:?} stride={}",
actual_format.width, actual_format.height, actual_format.fourcc, actual_format.stride
);
let resolution = Resolution::new(actual_format.width, actual_format.height);
let pixel_format =
PixelFormat::from_fourcc(actual_format.fourcc).unwrap_or(config.format);
if config.fps > 0 {
if let Err(e) = device.set_params(&Parameters::with_fps(config.fps)) {
warn!("Failed to set hardware FPS: {}", e);
}
}
let mut stream = match MmapStream::with_buffers(&device, BufferType::VideoCapture, 4) {
Ok(s) => s,
Err(e) => {
error!("Failed to create capture stream: {}", e);
set_state(MjpegStreamerState::Error);
self.mjpeg_handler.set_offline();
self.direct_active.store(false, Ordering::SeqCst);
return;
}
};
let buffer_pool = Arc::new(FrameBufferPool::new(8));
let mut signal_present = true;
let mut sequence: u64 = 0;
let mut validate_counter: u64 = 0;
while !self.direct_stop.load(Ordering::Relaxed) {
let (buf, meta) = match stream.next() {
Ok(frame_data) => frame_data,
Err(e) => {
if e.kind() == io::ErrorKind::TimedOut {
if signal_present {
signal_present = false;
set_state(MjpegStreamerState::NoSignal);
}
std::thread::sleep(std::time::Duration::from_millis(100));
continue;
}
let is_device_lost = match e.raw_os_error() {
Some(6) => true, // ENXIO
Some(19) => true, // ENODEV
Some(5) => true, // EIO
Some(32) => true, // EPIPE
Some(108) => true, // ESHUTDOWN
_ => false,
};
if is_device_lost {
error!("Video device lost: {} - {}", device_path.display(), e);
set_state(MjpegStreamerState::Error);
self.mjpeg_handler.set_offline();
self.direct_active.store(false, Ordering::SeqCst);
return;
}
error!("Capture error: {}", e);
continue;
}
};
let frame_size = meta.bytesused as usize;
if frame_size < MIN_CAPTURE_FRAME_SIZE {
continue;
}
validate_counter = validate_counter.wrapping_add(1);
if pixel_format.is_compressed()
&& validate_counter % JPEG_VALIDATE_INTERVAL == 0
&& !VideoFrame::is_valid_jpeg_bytes(&buf[..frame_size])
{
continue;
}
let mut owned = buffer_pool.take(frame_size);
owned.resize(frame_size, 0);
owned[..frame_size].copy_from_slice(&buf[..frame_size]);
let frame = VideoFrame::from_pooled(
Arc::new(FrameBuffer::new(owned, Some(buffer_pool.clone()))),
resolution,
pixel_format,
actual_format.stride,
sequence,
);
sequence = sequence.wrapping_add(1);
if !signal_present {
signal_present = true;
set_state(MjpegStreamerState::Streaming);
}
self.mjpeg_handler.update_frame(frame);
}
self.direct_active.store(false, Ordering::SeqCst);
}
}

impl Default for MjpegStreamer {
@@ -463,6 +683,9 @@ impl Default for MjpegStreamer {
            ws_hid_handler: WsHidHandler::new(),
            hid_controller: RwLock::new(None),
            start_lock: tokio::sync::Mutex::new(()),
            direct_stop: AtomicBool::new(false),
            direct_active: AtomicBool::new(false),
            direct_handle: Mutex::new(None),
            events: RwLock::new(None),
            config_changing: AtomicBool::new(false),
        }
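At the heart of the new direct loop is the error classification shown above: timeouts are treated as "no signal" and retried, while a handful of errnos mean the device node itself is gone. The same test as a standalone sketch:

```rust
// Standalone version of the device-loss test used by the capture loop.
fn is_device_lost(err: &std::io::Error) -> bool {
    matches!(
        err.raw_os_error(),
        Some(6)         // ENXIO: no such device or address
            | Some(19)  // ENODEV: device removed
            | Some(5)   // EIO: low-level I/O failure
            | Some(32)  // EPIPE
            | Some(108) // ESHUTDOWN
    )
}
```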

View File

@@ -2,13 +2,13 @@
//!
//! Provides async video capture using memory-mapped buffers.

-use bytes::Bytes;
use std::io;
use std::path::{Path, PathBuf};
use bytes::Bytes;
-use std::sync::atomic::{AtomicBool, AtomicU64, Ordering};
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use std::time::{Duration, Instant};
-use tokio::sync::{broadcast, watch, Mutex};
use tokio::sync::{watch, Mutex};
use tracing::{debug, error, info, warn};
use v4l::buffer::Type as BufferType;
use v4l::io::traits::CaptureStream;
@@ -92,20 +92,8 @@ impl CaptureConfig {
/// Capture statistics
#[derive(Debug, Clone, Default)]
pub struct CaptureStats {
-   /// Total frames captured
-   pub frames_captured: u64,
-   /// Frames dropped (invalid/too small)
-   pub frames_dropped: u64,
    /// Current FPS (calculated)
    pub current_fps: f32,
-   /// Average frame size in bytes
-   pub avg_frame_size: usize,
-   /// Capture errors
-   pub errors: u64,
-   /// Last frame timestamp
-   pub last_frame_ts: Option<Instant>,
-   /// Whether signal is present
-   pub signal_present: bool,
}

/// Video capturer state
@@ -131,9 +119,7 @@ pub struct VideoCapturer {
    state: Arc<watch::Sender<CaptureState>>,
    state_rx: watch::Receiver<CaptureState>,
    stats: Arc<Mutex<CaptureStats>>,
-   frame_tx: broadcast::Sender<VideoFrame>,
    stop_flag: Arc<AtomicBool>,
-   sequence: Arc<AtomicU64>,
    capture_handle: Mutex<Option<tokio::task::JoinHandle<()>>>,
    /// Last error that occurred (device path, reason)
    last_error: Arc<parking_lot::RwLock<Option<(String, String)>>>,
@@ -143,16 +129,13 @@ impl VideoCapturer {
    /// Create a new video capturer
    pub fn new(config: CaptureConfig) -> Self {
        let (state_tx, state_rx) = watch::channel(CaptureState::Stopped);
-       let (frame_tx, _) = broadcast::channel(16); // Buffer size 16 for low latency

        Self {
            config,
            state: Arc::new(state_tx),
            state_rx,
            stats: Arc::new(Mutex::new(CaptureStats::default())),
-           frame_tx,
            stop_flag: Arc::new(AtomicBool::new(false)),
-           sequence: Arc::new(AtomicU64::new(0)),
            capture_handle: Mutex::new(None),
            last_error: Arc::new(parking_lot::RwLock::new(None)),
        }
@@ -178,16 +161,6 @@ impl VideoCapturer {
        *self.last_error.write() = None;
    }

-   /// Subscribe to frames
-   pub fn subscribe(&self) -> broadcast::Receiver<VideoFrame> {
-       self.frame_tx.subscribe()
-   }
-
-   /// Get frame sender (for sharing with other components like WebRTC)
-   pub fn frame_sender(&self) -> broadcast::Sender<VideoFrame> {
-       self.frame_tx.clone()
-   }

    /// Get capture statistics
    pub async fn stats(&self) -> CaptureStats {
        self.stats.lock().await.clone()
@@ -225,15 +198,11 @@ impl VideoCapturer {
        let config = self.config.clone();
        let state = self.state.clone();
        let stats = self.stats.clone();
-       let frame_tx = self.frame_tx.clone();
        let stop_flag = self.stop_flag.clone();
-       let sequence = self.sequence.clone();
        let last_error = self.last_error.clone();

        let handle = tokio::task::spawn_blocking(move || {
-           capture_loop(
-               config, state, stats, frame_tx, stop_flag, sequence, last_error,
-           );
            capture_loop(config, state, stats, stop_flag, last_error);
        });

        *self.capture_handle.lock().await = Some(handle);
@@ -272,12 +241,10 @@ fn capture_loop(
config: CaptureConfig, config: CaptureConfig,
state: Arc<watch::Sender<CaptureState>>, state: Arc<watch::Sender<CaptureState>>,
stats: Arc<Mutex<CaptureStats>>, stats: Arc<Mutex<CaptureStats>>,
frame_tx: broadcast::Sender<VideoFrame>,
stop_flag: Arc<AtomicBool>, stop_flag: Arc<AtomicBool>,
sequence: Arc<AtomicU64>,
error_holder: Arc<parking_lot::RwLock<Option<(String, String)>>>, error_holder: Arc<parking_lot::RwLock<Option<(String, String)>>>,
) { ) {
let result = run_capture(&config, &state, &stats, &frame_tx, &stop_flag, &sequence); let result = run_capture(&config, &state, &stats, &stop_flag);
match result { match result {
Ok(_) => { Ok(_) => {
@@ -300,9 +267,7 @@ fn run_capture(
config: &CaptureConfig, config: &CaptureConfig,
state: &watch::Sender<CaptureState>, state: &watch::Sender<CaptureState>,
stats: &Arc<Mutex<CaptureStats>>, stats: &Arc<Mutex<CaptureStats>>,
frame_tx: &broadcast::Sender<VideoFrame>,
stop_flag: &AtomicBool, stop_flag: &AtomicBool,
sequence: &AtomicU64,
) -> Result<()> { ) -> Result<()> {
// Retry logic for device busy errors // Retry logic for device busy errors
const MAX_RETRIES: u32 = 5; const MAX_RETRIES: u32 = 5;
@@ -368,16 +333,7 @@ fn run_capture(
}; };
// Device opened and format set successfully - proceed with capture // Device opened and format set successfully - proceed with capture
return run_capture_inner( return run_capture_inner(config, state, stats, stop_flag, device, actual_format);
config,
state,
stats,
frame_tx,
stop_flag,
sequence,
device,
actual_format,
);
} }
// All retries exhausted // All retries exhausted
@@ -391,9 +347,7 @@ fn run_capture_inner(
config: &CaptureConfig, config: &CaptureConfig,
state: &watch::Sender<CaptureState>, state: &watch::Sender<CaptureState>,
stats: &Arc<Mutex<CaptureStats>>, stats: &Arc<Mutex<CaptureStats>>,
frame_tx: &broadcast::Sender<VideoFrame>,
stop_flag: &AtomicBool, stop_flag: &AtomicBool,
sequence: &AtomicU64,
device: Device, device: Device,
actual_format: Format, actual_format: Format,
) -> Result<()> { ) -> Result<()> {
@@ -402,8 +356,6 @@ fn run_capture_inner(
actual_format.width, actual_format.height, actual_format.fourcc, actual_format.stride actual_format.width, actual_format.height, actual_format.fourcc, actual_format.stride
); );
let resolution = Resolution::new(actual_format.width, actual_format.height);
let pixel_format = PixelFormat::from_fourcc(actual_format.fourcc).unwrap_or(config.format);
// Try to set hardware FPS (V4L2 VIDIOC_S_PARM) // Try to set hardware FPS (V4L2 VIDIOC_S_PARM)
if config.fps > 0 { if config.fps > 0 {
@@ -449,18 +401,13 @@ fn run_capture_inner(
// Main capture loop // Main capture loop
while !stop_flag.load(Ordering::Relaxed) { while !stop_flag.load(Ordering::Relaxed) {
// Try to capture a frame // Try to capture a frame
let (buf, meta) = match stream.next() { let (_buf, meta) = match stream.next() {
Ok(frame_data) => frame_data, Ok(frame_data) => frame_data,
Err(e) => { Err(e) => {
if e.kind() == io::ErrorKind::TimedOut { if e.kind() == io::ErrorKind::TimedOut {
warn!("Capture timeout - no signal?"); warn!("Capture timeout - no signal?");
let _ = state.send(CaptureState::NoSignal); let _ = state.send(CaptureState::NoSignal);
// Update stats
if let Ok(mut s) = stats.try_lock() {
s.signal_present = false;
}
// Wait a bit before retrying // Wait a bit before retrying
std::thread::sleep(Duration::from_millis(100)); std::thread::sleep(Duration::from_millis(100));
continue; continue;
@@ -486,9 +433,6 @@ fn run_capture_inner(
} }
error!("Capture error: {}", e); error!("Capture error: {}", e);
if let Ok(mut s) = stats.try_lock() {
s.errors += 1;
}
continue; continue;
} }
}; };
@@ -502,54 +446,16 @@ fn run_capture_inner(
"Dropping small frame: {} bytes (bytesused={})", "Dropping small frame: {} bytes (bytesused={})",
frame_size, meta.bytesused frame_size, meta.bytesused
); );
if let Ok(mut s) = stats.try_lock() {
s.frames_dropped += 1;
}
continue; continue;
} }
// For JPEG formats, validate header
if pixel_format.is_compressed() && !is_valid_jpeg(&buf[..frame_size]) {
debug!("Dropping invalid JPEG frame (size={})", frame_size);
if let Ok(mut s) = stats.try_lock() {
s.frames_dropped += 1;
}
continue;
}
// Create frame with actual data size
let seq = sequence.fetch_add(1, Ordering::Relaxed);
let frame = VideoFrame::new(
Bytes::copy_from_slice(&buf[..frame_size]),
resolution,
pixel_format,
actual_format.stride,
seq,
);
// Update state if was no signal // Update state if was no signal
if *state.borrow() == CaptureState::NoSignal { if *state.borrow() == CaptureState::NoSignal {
let _ = state.send(CaptureState::Running); let _ = state.send(CaptureState::Running);
} }
// Send frame to subscribers
let receiver_count = frame_tx.receiver_count();
if receiver_count > 0 {
if let Err(e) = frame_tx.send(frame) {
debug!("No active receivers for frame: {}", e);
}
} else if seq % 60 == 0 {
// Log every 60 frames (about 1 second at 60fps) when no receivers
debug!("No receivers for video frames (receiver_count=0)");
}
// Update stats
if let Ok(mut s) = stats.try_lock() {
s.frames_captured += 1;
s.signal_present = true;
s.last_frame_ts = Some(Instant::now());
// Update FPS calculation // Update FPS calculation
if let Ok(mut s) = stats.try_lock() {
fps_frame_count += 1; fps_frame_count += 1;
let elapsed = fps_window_start.elapsed(); let elapsed = fps_window_start.elapsed();
@@ -571,6 +477,7 @@ fn run_capture_inner(
} }
/// Validate JPEG frame data /// Validate JPEG frame data
#[cfg(test)]
fn is_valid_jpeg(data: &[u8]) -> bool { fn is_valid_jpeg(data: &[u8]) -> bool {
if data.len() < 125 { if data.len() < 125 {
return false; return false;
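
With frame distribution removed from `VideoCapturer`, the capturer's remaining job is state and stats reporting over the `watch` channel. A minimal consumer sketch — the `CaptureState` enum is re-declared locally to keep the example self-contained, and the accessor returning the receiver is an assumption (the diff only shows the `state_rx` field):

```rust
use tokio::sync::watch;

#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum CaptureState { Stopped, Running, NoSignal }

// Sketch: a consumer awaits state transitions instead of polling.
async fn observe(mut rx: watch::Receiver<CaptureState>) {
    while rx.changed().await.is_ok() {
        match *rx.borrow() {
            CaptureState::NoSignal => eprintln!("capture: signal lost"),
            CaptureState::Running => eprintln!("capture: running"),
            CaptureState::Stopped => break,
        }
    }
}
```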

View File

@@ -511,21 +511,6 @@ impl Encoder for H264Encoder {
} }
} }
/// Encoder statistics
#[derive(Debug, Clone, Default)]
pub struct EncoderStats {
/// Total frames encoded
pub frames_encoded: u64,
/// Total bytes output
pub bytes_output: u64,
/// Current encoding FPS
pub fps: f32,
/// Average encoding time per frame (ms)
pub avg_encode_time_ms: f32,
/// Keyframes encoded
pub keyframes: u64,
}
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use super::*; use super::*;

View File

@@ -1,17 +1,110 @@
//! Video frame data structures //! Video frame data structures
use bytes::Bytes; use bytes::Bytes;
use parking_lot::Mutex;
use std::sync::Arc; use std::sync::Arc;
use std::sync::OnceLock; use std::sync::OnceLock;
use std::time::Instant; use std::time::Instant;
use super::format::{PixelFormat, Resolution}; use super::format::{PixelFormat, Resolution};
#[derive(Clone)]
enum FrameData {
Bytes(Bytes),
Pooled(Arc<FrameBuffer>),
}
impl std::fmt::Debug for FrameData {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
FrameData::Bytes(bytes) => f
.debug_struct("FrameData::Bytes")
.field("len", &bytes.len())
.finish(),
FrameData::Pooled(buf) => f
.debug_struct("FrameData::Pooled")
.field("len", &buf.len())
.finish(),
}
}
}
#[derive(Debug)]
pub struct FrameBufferPool {
pool: Mutex<Vec<Vec<u8>>>,
max_buffers: usize,
}
impl FrameBufferPool {
pub fn new(max_buffers: usize) -> Self {
Self {
pool: Mutex::new(Vec::new()),
max_buffers: max_buffers.max(1),
}
}
pub fn take(&self, min_capacity: usize) -> Vec<u8> {
let mut pool = self.pool.lock();
if let Some(mut buf) = pool.pop() {
if buf.capacity() < min_capacity {
buf.reserve(min_capacity - buf.capacity());
}
buf
} else {
Vec::with_capacity(min_capacity)
}
}
pub fn put(&self, mut buf: Vec<u8>) {
buf.clear();
let mut pool = self.pool.lock();
if pool.len() < self.max_buffers {
pool.push(buf);
}
}
}
pub struct FrameBuffer {
data: Vec<u8>,
pool: Option<Arc<FrameBufferPool>>,
}
impl FrameBuffer {
pub fn new(data: Vec<u8>, pool: Option<Arc<FrameBufferPool>>) -> Self {
Self { data, pool }
}
pub fn as_slice(&self) -> &[u8] {
&self.data
}
pub fn len(&self) -> usize {
self.data.len()
}
}
impl std::fmt::Debug for FrameBuffer {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.debug_struct("FrameBuffer")
.field("len", &self.data.len())
.finish()
}
}
impl Drop for FrameBuffer {
fn drop(&mut self) {
if let Some(pool) = self.pool.take() {
let data = std::mem::take(&mut self.data);
pool.put(data);
}
}
}
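
A usage sketch of the pool round-trip, using only the `FrameBufferPool::take`, `FrameBuffer::new`, and `Drop` APIs defined above (the function name is illustrative):

```rust
use std::sync::Arc;

// Take a recycled Vec, fill it, wrap it; Drop returns it to the pool.
fn pooled_roundtrip(pool: &Arc<FrameBufferPool>, payload: &[u8]) {
    let mut buf = pool.take(payload.len()); // empty Vec, capacity >= len
    buf.extend_from_slice(payload);
    let frame = Arc::new(FrameBuffer::new(buf, Some(pool.clone())));
    assert_eq!(frame.len(), payload.len());
    drop(frame); // FrameBuffer::drop clears the Vec and calls pool.put()
}
```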
/// A video frame with metadata /// A video frame with metadata
#[derive(Debug, Clone)] #[derive(Debug, Clone)]
pub struct VideoFrame { pub struct VideoFrame {
/// Raw frame data /// Raw frame data
data: Arc<Bytes>, data: FrameData,
/// Cached xxHash64 of frame data (lazy computed for deduplication) /// Cached xxHash64 of frame data (lazy computed for deduplication)
hash: Arc<OnceLock<u64>>, hash: Arc<OnceLock<u64>>,
/// Frame resolution /// Frame resolution
@@ -40,7 +133,7 @@ impl VideoFrame {
sequence: u64, sequence: u64,
) -> Self { ) -> Self {
Self { Self {
data: Arc::new(data), data: FrameData::Bytes(data),
hash: Arc::new(OnceLock::new()), hash: Arc::new(OnceLock::new()),
resolution, resolution,
format, format,
@@ -63,24 +156,51 @@ impl VideoFrame {
Self::new(Bytes::from(data), resolution, format, stride, sequence) Self::new(Bytes::from(data), resolution, format, stride, sequence)
} }
/// Create a frame from pooled buffer
pub fn from_pooled(
data: Arc<FrameBuffer>,
resolution: Resolution,
format: PixelFormat,
stride: u32,
sequence: u64,
) -> Self {
Self {
data: FrameData::Pooled(data),
hash: Arc::new(OnceLock::new()),
resolution,
format,
stride,
key_frame: true,
sequence,
capture_ts: Instant::now(),
online: true,
}
}
/// Get frame data as bytes slice /// Get frame data as bytes slice
pub fn data(&self) -> &[u8] { pub fn data(&self) -> &[u8] {
&self.data match &self.data {
FrameData::Bytes(bytes) => bytes,
FrameData::Pooled(buf) => buf.as_slice(),
}
} }
/// Get frame data as Bytes (cheap clone) /// Get frame data as Bytes (cheap clone)
pub fn data_bytes(&self) -> Bytes { pub fn data_bytes(&self) -> Bytes {
(*self.data).clone() match &self.data {
FrameData::Bytes(bytes) => bytes.clone(),
FrameData::Pooled(buf) => Bytes::copy_from_slice(buf.as_slice()),
}
} }
/// Get data length /// Get data length
pub fn len(&self) -> usize { pub fn len(&self) -> usize {
self.data.len() self.data().len()
} }
/// Check if frame is empty /// Check if frame is empty
pub fn is_empty(&self) -> bool { pub fn is_empty(&self) -> bool {
self.data.is_empty() self.data().is_empty()
} }
/// Get width /// Get width
@@ -108,7 +228,7 @@ impl VideoFrame {
pub fn get_hash(&self) -> u64 { pub fn get_hash(&self) -> u64 {
*self *self
.hash .hash
.get_or_init(|| xxhash_rust::xxh64::xxh64(self.data.as_ref(), 0)) .get_or_init(|| xxhash_rust::xxh64::xxh64(self.data(), 0))
} }
/// Check if format is JPEG/MJPEG /// Check if format is JPEG/MJPEG
@@ -121,25 +241,27 @@ impl VideoFrame {
if !self.is_jpeg() { if !self.is_jpeg() {
return false; return false;
} }
if self.data.len() < 125 { Self::is_valid_jpeg_bytes(self.data())
}
/// Validate JPEG bytes without constructing a frame
pub fn is_valid_jpeg_bytes(data: &[u8]) -> bool {
if data.len() < 125 {
return false; return false;
} }
// Check JPEG header let start_marker = ((data[0] as u16) << 8) | data[1] as u16;
let start_marker = ((self.data[0] as u16) << 8) | self.data[1] as u16;
if start_marker != 0xFFD8 { if start_marker != 0xFFD8 {
return false; return false;
} }
// Check JPEG end marker let end = data.len();
let end = self.data.len(); let end_marker = ((data[end - 2] as u16) << 8) | data[end - 1] as u16;
let end_marker = ((self.data[end - 2] as u16) << 8) | self.data[end - 1] as u16;
// Valid end markers: 0xFFD9, 0xD900, 0x0000 (padded)
matches!(end_marker, 0xFFD9 | 0xD900 | 0x0000) matches!(end_marker, 0xFFD9 | 0xD900 | 0x0000)
} }
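
A quick test sketch against the marker rules above, in a hypothetical test module (0xFFD8 is the JPEG SOI marker, 0xFFD9 the EOI marker):

```rust
#[cfg(test)]
mod jpeg_marker_tests {
    use super::*;

    #[test]
    fn minimal_soi_eoi_frame_is_accepted() {
        let mut data = vec![0u8; 126]; // meets the 125-byte minimum
        data[0] = 0xFF; // SOI high byte
        data[1] = 0xD8; // SOI low byte
        data[124] = 0xFF; // EOI high byte
        data[125] = 0xD9; // EOI low byte
        assert!(VideoFrame::is_valid_jpeg_bytes(&data));
        assert!(!VideoFrame::is_valid_jpeg_bytes(&data[..100])); // too short
    }
}
```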
/// Create an offline placeholder frame /// Create an offline placeholder frame
pub fn offline(resolution: Resolution, format: PixelFormat) -> Self { pub fn offline(resolution: Resolution, format: PixelFormat) -> Self {
Self { Self {
data: Arc::new(Bytes::new()), data: FrameData::Bytes(Bytes::new()),
hash: Arc::new(OnceLock::new()), hash: Arc::new(OnceLock::new()),
resolution, resolution,
format, format,
@@ -175,65 +297,3 @@ impl From<&VideoFrame> for FrameMeta {
} }
} }
} }
/// Ring buffer for storing recent frames
pub struct FrameRing {
frames: Vec<Option<VideoFrame>>,
capacity: usize,
write_pos: usize,
count: usize,
}
impl FrameRing {
/// Create a new frame ring with specified capacity
pub fn new(capacity: usize) -> Self {
assert!(capacity > 0, "Ring capacity must be > 0");
Self {
frames: (0..capacity).map(|_| None).collect(),
capacity,
write_pos: 0,
count: 0,
}
}
/// Push a frame into the ring
pub fn push(&mut self, frame: VideoFrame) {
self.frames[self.write_pos] = Some(frame);
self.write_pos = (self.write_pos + 1) % self.capacity;
if self.count < self.capacity {
self.count += 1;
}
}
/// Get the latest frame
pub fn latest(&self) -> Option<&VideoFrame> {
if self.count == 0 {
return None;
}
let pos = if self.write_pos == 0 {
self.capacity - 1
} else {
self.write_pos - 1
};
self.frames[pos].as_ref()
}
/// Get number of frames in ring
pub fn len(&self) -> usize {
self.count
}
/// Check if ring is empty
pub fn is_empty(&self) -> bool {
self.count == 0
}
/// Clear all frames
pub fn clear(&mut self) {
for frame in &mut self.frames {
*frame = None;
}
self.write_pos = 0;
self.count = 0;
}
}

View File

@@ -53,22 +53,8 @@ impl Default for H264PipelineConfig {
/// H264 pipeline statistics /// H264 pipeline statistics
#[derive(Debug, Clone, Default)] #[derive(Debug, Clone, Default)]
pub struct H264PipelineStats { pub struct H264PipelineStats {
/// Total frames captured
pub frames_captured: u64,
/// Total frames encoded
pub frames_encoded: u64,
/// Frames dropped (encoding too slow)
pub frames_dropped: u64,
/// Total bytes encoded
pub bytes_encoded: u64,
/// Keyframes encoded
pub keyframes_encoded: u64,
/// Average encoding time per frame (ms)
pub avg_encode_time_ms: f32,
/// Current encoding FPS /// Current encoding FPS
pub current_fps: f32, pub current_fps: f32,
/// Errors encountered
pub errors: u64,
} }
/// H264 video encoding pipeline /// H264 video encoding pipeline
@@ -84,8 +70,6 @@ pub struct H264Pipeline {
stats: Arc<Mutex<H264PipelineStats>>, stats: Arc<Mutex<H264PipelineStats>>,
/// Running state /// Running state
running: watch::Sender<bool>, running: watch::Sender<bool>,
/// Encode time accumulator for averaging
encode_times: Arc<Mutex<Vec<f32>>>,
} }
impl H264Pipeline { impl H264Pipeline {
@@ -183,7 +167,6 @@ impl H264Pipeline {
video_track, video_track,
stats: Arc::new(Mutex::new(H264PipelineStats::default())), stats: Arc::new(Mutex::new(H264PipelineStats::default())),
running: running_tx, running: running_tx,
encode_times: Arc::new(Mutex::new(Vec::with_capacity(100))),
}) })
} }
@@ -222,7 +205,6 @@ impl H264Pipeline {
let nv12_converter = self.nv12_converter.lock().await.take(); let nv12_converter = self.nv12_converter.lock().await.take();
let video_track = self.video_track.clone(); let video_track = self.video_track.clone();
let stats = self.stats.clone(); let stats = self.stats.clone();
let encode_times = self.encode_times.clone();
let config = self.config.clone(); let config = self.config.clone();
let mut running_rx = self.running.subscribe(); let mut running_rx = self.running.subscribe();
@@ -275,12 +257,6 @@ impl H264Pipeline {
} }
} }
// Update captured count
{
let mut s = stats.lock().await;
s.frames_captured += 1;
}
// Convert to NV12 for VAAPI encoder // Convert to NV12 for VAAPI encoder
// BGR24/RGB24/YUYV -> NV12 (via NV12 converter) // BGR24/RGB24/YUYV -> NV12 (via NV12 converter)
// NV12 -> pass through // NV12 -> pass through
@@ -297,8 +273,6 @@ impl H264Pipeline {
Ok(nv12_data) => encoder.encode_raw(nv12_data, pts_ms), Ok(nv12_data) => encoder.encode_raw(nv12_data, pts_ms),
Err(e) => { Err(e) => {
error!("NV12 conversion failed: {}", e); error!("NV12 conversion failed: {}", e);
let mut s = stats.lock().await;
s.errors += 1;
continue; continue;
} }
} }
@@ -323,35 +297,13 @@ impl H264Pipeline {
.await .await
{ {
error!("Failed to write frame to track: {}", e); error!("Failed to write frame to track: {}", e);
let mut s = stats.lock().await;
s.errors += 1;
} else { } else {
// Update stats let _ = start;
let encode_time = start.elapsed().as_secs_f32() * 1000.0;
let mut s = stats.lock().await;
s.frames_encoded += 1;
s.bytes_encoded += frame.data.len() as u64;
if is_keyframe {
s.keyframes_encoded += 1;
}
// Update encode time average
let mut times = encode_times.lock().await;
times.push(encode_time);
if times.len() > 100 {
times.remove(0);
}
if !times.is_empty() {
s.avg_encode_time_ms =
times.iter().sum::<f32>() / times.len() as f32;
}
} }
} }
} }
Err(e) => { Err(e) => {
error!("Encoding failed: {}", e); error!("Encoding failed: {}", e);
let mut s = stats.lock().await;
s.errors += 1;
} }
} }
@@ -365,8 +317,7 @@ impl H264Pipeline {
} }
} }
Err(broadcast::error::RecvError::Lagged(n)) => { Err(broadcast::error::RecvError::Lagged(n)) => {
let mut s = stats.lock().await; let _ = n;
s.frames_dropped += n;
} }
Err(broadcast::error::RecvError::Closed) => { Err(broadcast::error::RecvError::Closed) => {
info!("Frame channel closed, stopping H264 pipeline"); info!("Frame channel closed, stopping H264 pipeline");

View File

@@ -17,20 +17,33 @@
//! ``` //! ```
use bytes::Bytes; use bytes::Bytes;
use parking_lot::RwLock as ParkingRwLock;
use std::sync::atomic::{AtomicBool, AtomicI64, AtomicU64, Ordering}; use std::sync::atomic::{AtomicBool, AtomicI64, AtomicU64, Ordering};
use std::sync::Arc; use std::sync::Arc;
use std::time::{Duration, Instant}; use std::time::{Duration, Instant};
use tokio::sync::{broadcast, watch, Mutex, RwLock}; use tokio::sync::{broadcast, mpsc, watch, Mutex, RwLock};
use tracing::{debug, error, info, trace, warn}; use tracing::{debug, error, info, trace, warn};
/// Grace period before auto-stopping pipeline when no subscribers (in seconds) /// Grace period before auto-stopping pipeline when no subscribers (in seconds)
const AUTO_STOP_GRACE_PERIOD_SECS: u64 = 3; const AUTO_STOP_GRACE_PERIOD_SECS: u64 = 3;
/// Minimum valid frame size for capture
const MIN_CAPTURE_FRAME_SIZE: usize = 128;
/// Validate JPEG header every N frames to reduce overhead
const JPEG_VALIDATE_INTERVAL: u64 = 30;
use crate::error::{AppError, Result}; use crate::error::{AppError, Result};
use crate::video::convert::{Nv12Converter, PixelConverter}; use crate::video::convert::{Nv12Converter, PixelConverter};
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))] #[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
use crate::video::decoder::MjpegRkmppDecoder; use crate::video::decoder::MjpegRkmppDecoder;
use crate::video::decoder::MjpegTurboDecoder; use crate::video::decoder::MjpegTurboDecoder;
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
use hwcodec::ffmpeg_hw::{last_error_message as ffmpeg_hw_last_error, HwMjpegH264Config, HwMjpegH264Pipeline};
use v4l::buffer::Type as BufferType;
use v4l::io::traits::CaptureStream;
use v4l::prelude::*;
use v4l::video::Capture;
use v4l::video::capture::Parameters;
use v4l::Format;
use crate::video::encoder::h264::{detect_best_encoder, H264Config, H264Encoder, H264InputFormat}; use crate::video::encoder::h264::{detect_best_encoder, H264Config, H264Encoder, H264InputFormat};
use crate::video::encoder::h265::{ use crate::video::encoder::h265::{
detect_best_h265_encoder, H265Config, H265Encoder, H265InputFormat, detect_best_h265_encoder, H265Config, H265Encoder, H265InputFormat,
@@ -40,7 +53,7 @@ use crate::video::encoder::traits::EncoderConfig;
use crate::video::encoder::vp8::{detect_best_vp8_encoder, VP8Config, VP8Encoder}; use crate::video::encoder::vp8::{detect_best_vp8_encoder, VP8Config, VP8Encoder};
use crate::video::encoder::vp9::{detect_best_vp9_encoder, VP9Config, VP9Encoder}; use crate::video::encoder::vp9::{detect_best_vp9_encoder, VP9Config, VP9Encoder};
use crate::video::format::{PixelFormat, Resolution}; use crate::video::format::{PixelFormat, Resolution};
use crate::video::frame::VideoFrame; use crate::video::frame::{FrameBuffer, FrameBufferPool, VideoFrame};
/// Encoded video frame for distribution /// Encoded video frame for distribution
#[derive(Debug, Clone)] #[derive(Debug, Clone)]
@@ -59,6 +72,10 @@ pub struct EncodedVideoFrame {
pub codec: VideoEncoderType, pub codec: VideoEncoderType,
} }
enum PipelineCmd {
SetBitrate { bitrate_kbps: u32, gop: u32 },
}
/// Shared video pipeline configuration /// Shared video pipeline configuration
#[derive(Debug, Clone)] #[derive(Debug, Clone)]
pub struct SharedVideoPipelineConfig { pub struct SharedVideoPipelineConfig {
@@ -150,16 +167,22 @@ impl SharedVideoPipelineConfig {
/// Pipeline statistics /// Pipeline statistics
#[derive(Debug, Clone, Default)] #[derive(Debug, Clone, Default)]
pub struct SharedVideoPipelineStats { pub struct SharedVideoPipelineStats {
pub frames_captured: u64,
pub frames_encoded: u64,
pub frames_dropped: u64,
pub frames_skipped: u64,
pub bytes_encoded: u64,
pub keyframes_encoded: u64,
pub avg_encode_time_ms: f32,
pub current_fps: f32, pub current_fps: f32,
pub errors: u64, }
pub subscribers: u64,
struct EncoderThreadState {
encoder: Option<Box<dyn VideoEncoderTrait + Send>>,
mjpeg_decoder: Option<MjpegDecoderKind>,
nv12_converter: Option<Nv12Converter>,
yuv420p_converter: Option<PixelConverter>,
encoder_needs_yuv420p: bool,
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
ffmpeg_hw_pipeline: Option<HwMjpegH264Pipeline>,
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
ffmpeg_hw_enabled: bool,
fps: u32,
codec: VideoEncoderType,
input_format: PixelFormat,
} }
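
`EncoderThreadState` bundles everything the encode loop mutates, so the whole struct can be moved into the worker and touched without locks. A generic sketch of that ownership pattern, with placeholder types:

```rust
use tokio::sync::mpsc;

// State owned by exactly one task needs no Mutex; other tasks influence
// it only via messages.
fn spawn_owned_loop<S, T, F>(
    mut state: S,
    mut rx: mpsc::Receiver<T>,
    mut step: F,
) -> tokio::task::JoinHandle<()>
where
    S: Send + 'static,
    T: Send + 'static,
    F: FnMut(&mut S, T) + Send + 'static,
{
    tokio::spawn(async move {
        while let Some(item) = rx.recv().await {
            step(&mut state, item); // exclusive access, no locking
        }
    })
}
```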
/// Universal video encoder trait object /// Universal video encoder trait object
@@ -314,18 +337,13 @@ impl MjpegDecoderKind {
/// Universal shared video pipeline /// Universal shared video pipeline
pub struct SharedVideoPipeline { pub struct SharedVideoPipeline {
config: RwLock<SharedVideoPipelineConfig>, config: RwLock<SharedVideoPipelineConfig>,
encoder: Mutex<Option<Box<dyn VideoEncoderTrait + Send>>>, subscribers: ParkingRwLock<Vec<mpsc::Sender<Arc<EncodedVideoFrame>>>>,
mjpeg_decoder: Mutex<Option<MjpegDecoderKind>>,
nv12_converter: Mutex<Option<Nv12Converter>>,
yuv420p_converter: Mutex<Option<PixelConverter>>,
/// Whether the encoder needs YUV420P (true) or NV12 (false)
encoder_needs_yuv420p: AtomicBool,
/// Whether YUYV direct input is enabled (RKMPP optimization)
direct_input: AtomicBool,
frame_tx: broadcast::Sender<EncodedVideoFrame>,
stats: Mutex<SharedVideoPipelineStats>, stats: Mutex<SharedVideoPipelineStats>,
running: watch::Sender<bool>, running: watch::Sender<bool>,
running_rx: watch::Receiver<bool>, running_rx: watch::Receiver<bool>,
cmd_tx: ParkingRwLock<Option<tokio::sync::mpsc::UnboundedSender<PipelineCmd>>>,
/// Fast running flag for blocking capture loop
running_flag: AtomicBool,
/// Frame sequence counter (atomic for lock-free access) /// Frame sequence counter (atomic for lock-free access)
sequence: AtomicU64, sequence: AtomicU64,
/// Atomic flag for keyframe request (avoids lock contention) /// Atomic flag for keyframe request (avoids lock contention)
@@ -347,21 +365,16 @@ impl SharedVideoPipeline {
config.input_format config.input_format
); );
let (frame_tx, _) = broadcast::channel(16); // Reduced from 64 for lower latency
let (running_tx, running_rx) = watch::channel(false); let (running_tx, running_rx) = watch::channel(false);
let pipeline = Arc::new(Self { let pipeline = Arc::new(Self {
config: RwLock::new(config), config: RwLock::new(config),
encoder: Mutex::new(None), subscribers: ParkingRwLock::new(Vec::new()),
mjpeg_decoder: Mutex::new(None),
nv12_converter: Mutex::new(None),
yuv420p_converter: Mutex::new(None),
encoder_needs_yuv420p: AtomicBool::new(false),
direct_input: AtomicBool::new(false),
frame_tx,
stats: Mutex::new(SharedVideoPipelineStats::default()), stats: Mutex::new(SharedVideoPipelineStats::default()),
running: running_tx, running: running_tx,
running_rx, running_rx,
cmd_tx: ParkingRwLock::new(None),
running_flag: AtomicBool::new(false),
sequence: AtomicU64::new(0), sequence: AtomicU64::new(0),
keyframe_requested: AtomicBool::new(false), keyframe_requested: AtomicBool::new(false),
pipeline_start_time_ms: AtomicI64::new(0), pipeline_start_time_ms: AtomicI64::new(0),
@@ -370,9 +383,7 @@ impl SharedVideoPipeline {
Ok(pipeline) Ok(pipeline)
} }
/// Initialize encoder based on config fn build_encoder_state(config: &SharedVideoPipelineConfig) -> Result<EncoderThreadState> {
async fn init_encoder(&self) -> Result<()> {
let config = self.config.read().await.clone();
let registry = EncoderRegistry::global(); let registry = EncoderRegistry::global();
// Helper to get codec name for specific backend // Helper to get codec name for specific backend
@@ -506,6 +517,43 @@ impl SharedVideoPipeline {
|| selected_codec_name.contains("libx265") || selected_codec_name.contains("libx265")
|| selected_codec_name.contains("libvpx"); || selected_codec_name.contains("libvpx");
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
if needs_mjpeg_decode && is_rkmpp_encoder && config.output_codec == VideoEncoderType::H264 {
info!("Initializing FFmpeg HW MJPEG->H264 pipeline (no fallback)");
let hw_config = HwMjpegH264Config {
decoder: "mjpeg_rkmpp".to_string(),
encoder: selected_codec_name.clone(),
width: config.resolution.width as i32,
height: config.resolution.height as i32,
fps: config.fps as i32,
bitrate_kbps: config.bitrate_kbps() as i32,
gop: config.gop_size() as i32,
thread_count: 1,
};
let pipeline = HwMjpegH264Pipeline::new(hw_config).map_err(|e| {
let detail = if e.is_empty() { ffmpeg_hw_last_error() } else { e };
AppError::VideoError(format!(
"FFmpeg HW MJPEG->H264 init failed: {}",
detail
))
})?;
info!("Using FFmpeg HW MJPEG->H264 pipeline");
return Ok(EncoderThreadState {
encoder: None,
mjpeg_decoder: None,
nv12_converter: None,
yuv420p_converter: None,
encoder_needs_yuv420p: false,
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
ffmpeg_hw_pipeline: Some(pipeline),
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
ffmpeg_hw_enabled: true,
fps: config.fps,
codec: config.output_codec,
input_format: config.input_format,
});
}
let pipeline_input_format = if needs_mjpeg_decode { let pipeline_input_format = if needs_mjpeg_decode {
if is_rkmpp_encoder { if is_rkmpp_encoder {
info!( info!(
@@ -515,8 +563,8 @@ impl SharedVideoPipeline {
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))] #[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
{ {
let decoder = MjpegRkmppDecoder::new(config.resolution)?; let decoder = MjpegRkmppDecoder::new(config.resolution)?;
*self.mjpeg_decoder.lock().await = Some(MjpegDecoderKind::Rkmpp(decoder)); let pipeline_format = PixelFormat::Nv12;
PixelFormat::Nv12 (Some(MjpegDecoderKind::Rkmpp(decoder)), pipeline_format)
} }
#[cfg(not(any(target_arch = "aarch64", target_arch = "arm")))] #[cfg(not(any(target_arch = "aarch64", target_arch = "arm")))]
{ {
@@ -530,17 +578,16 @@ impl SharedVideoPipeline {
config.input_format config.input_format
); );
let decoder = MjpegTurboDecoder::new(config.resolution)?; let decoder = MjpegTurboDecoder::new(config.resolution)?;
*self.mjpeg_decoder.lock().await = Some(MjpegDecoderKind::Turbo(decoder)); (Some(MjpegDecoderKind::Turbo(decoder)), PixelFormat::Rgb24)
PixelFormat::Rgb24
} else { } else {
return Err(AppError::VideoError( return Err(AppError::VideoError(
"MJPEG input requires RKMPP or software encoder".to_string(), "MJPEG input requires RKMPP or software encoder".to_string(),
)); ));
} }
} else { } else {
*self.mjpeg_decoder.lock().await = None; (None, config.input_format)
config.input_format
}; };
let (mjpeg_decoder, pipeline_input_format) = pipeline_input_format;
// Create encoder based on codec type // Create encoder based on codec type
let encoder: Box<dyn VideoEncoderTrait + Send> = match config.output_codec { let encoder: Box<dyn VideoEncoderTrait + Send> = match config.output_codec {
@@ -856,24 +903,32 @@ impl SharedVideoPipeline {
} }
}; };
*self.encoder.lock().await = Some(encoder); Ok(EncoderThreadState {
*self.nv12_converter.lock().await = nv12_converter; encoder: Some(encoder),
*self.yuv420p_converter.lock().await = yuv420p_converter; mjpeg_decoder,
self.encoder_needs_yuv420p nv12_converter,
.store(needs_yuv420p, Ordering::Release); yuv420p_converter,
self.direct_input.store(use_direct_input, Ordering::Release); encoder_needs_yuv420p: needs_yuv420p,
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
Ok(()) ffmpeg_hw_pipeline: None,
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
ffmpeg_hw_enabled: false,
fps: config.fps,
codec: config.output_codec,
input_format: config.input_format,
})
} }
/// Subscribe to encoded frames /// Subscribe to encoded frames
pub fn subscribe(&self) -> broadcast::Receiver<EncodedVideoFrame> { pub fn subscribe(&self) -> mpsc::Receiver<Arc<EncodedVideoFrame>> {
self.frame_tx.subscribe() let (tx, rx) = mpsc::channel(4);
self.subscribers.write().push(tx);
rx
} }
/// Get subscriber count /// Get subscriber count
pub fn subscriber_count(&self) -> usize { pub fn subscriber_count(&self) -> usize {
self.frame_tx.receiver_count() self.subscribers.read().iter().filter(|tx| !tx.is_closed()).count()
} }
/// Report that a receiver has lagged behind /// Report that a receiver has lagged behind
@@ -899,11 +954,50 @@ impl SharedVideoPipeline {
info!("[Pipeline] Keyframe requested for new client"); info!("[Pipeline] Keyframe requested for new client");
} }
fn send_cmd(&self, cmd: PipelineCmd) {
let tx = self.cmd_tx.read().clone();
if let Some(tx) = tx {
let _ = tx.send(cmd);
}
}
fn clear_cmd_tx(&self) {
let mut guard = self.cmd_tx.write();
*guard = None;
}
fn apply_cmd(&self, state: &mut EncoderThreadState, cmd: PipelineCmd) -> Result<()> {
match cmd {
PipelineCmd::SetBitrate { bitrate_kbps, gop } => {
#[cfg(not(any(target_arch = "aarch64", target_arch = "arm")))]
let _ = gop;
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
if state.ffmpeg_hw_enabled {
if let Some(ref mut pipeline) = state.ffmpeg_hw_pipeline {
pipeline
.reconfigure(bitrate_kbps as i32, gop as i32)
.map_err(|e| {
let detail = if e.is_empty() { ffmpeg_hw_last_error() } else { e };
AppError::VideoError(format!(
"FFmpeg HW reconfigure failed: {}",
detail
))
})?;
return Ok(());
}
}
if let Some(ref mut encoder) = state.encoder {
encoder.set_bitrate(bitrate_kbps)?;
}
}
}
Ok(())
}
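
`send_cmd`/`apply_cmd` form a control channel: callers never lock the encoder, and the worker drains queued commands between frames via `try_recv()`. A condensed sketch of the pattern, with placeholder names:

```rust
use tokio::sync::mpsc;

enum Cmd { SetBitrate { bitrate_kbps: u32, gop: u32 } }

// Control plane: fire-and-forget, never blocks on the encoder.
fn request_bitrate(tx: &mpsc::UnboundedSender<Cmd>, kbps: u32, gop: u32) {
    let _ = tx.send(Cmd::SetBitrate { bitrate_kbps: kbps, gop });
}

// Worker side: apply everything queued since the last frame.
fn drain_cmds(rx: &mut mpsc::UnboundedReceiver<Cmd>) {
    while let Ok(Cmd::SetBitrate { bitrate_kbps, gop }) = rx.try_recv() {
        // reconfigure the encoder in-thread; no cross-task locks held
        let _ = (bitrate_kbps, gop);
    }
}
```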
/// Get current stats /// Get current stats
pub async fn stats(&self) -> SharedVideoPipelineStats { pub async fn stats(&self) -> SharedVideoPipelineStats {
let mut stats = self.stats.lock().await.clone(); self.stats.lock().await.clone()
stats.subscribers = self.frame_tx.receiver_count() as u64;
stats
} }
/// Check if running /// Check if running
@@ -919,6 +1013,27 @@ impl SharedVideoPipeline {
self.running_rx.clone() self.running_rx.clone()
} }
async fn broadcast_encoded(&self, frame: Arc<EncodedVideoFrame>) {
let subscribers = {
let guard = self.subscribers.read();
if guard.is_empty() {
return;
}
guard.iter().cloned().collect::<Vec<_>>()
};
for tx in &subscribers {
if tx.send(frame.clone()).await.is_err() {
// Receiver dropped; cleanup happens below.
}
}
if subscribers.iter().any(|tx| tx.is_closed()) {
let mut guard = self.subscribers.write();
guard.retain(|tx| !tx.is_closed());
}
}
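
On the consumer side of the per-subscriber `mpsc` queues, each `subscribe()` call above hands back a bounded receiver (capacity 4) of `Arc<EncodedVideoFrame>`, and dropped receivers are pruned on the next broadcast. A minimal sketch, assuming module-internal access:

```rust
use std::sync::Arc;

// Receive encoded frames until the pipeline drops the sender.
async fn consume(pipeline: Arc<SharedVideoPipeline>) {
    let mut rx = pipeline.subscribe();
    while let Some(frame) = rx.recv().await {
        // frame: Arc<EncodedVideoFrame>; cloning is a refcount bump
        let _ = frame;
    }
    // Dropping `rx` closes its sender; broadcast_encoded prunes it.
}
```

Note that `broadcast_encoded` awaits each send in turn, so a consumer that stalls with a full queue slows the fan-out for everyone; consumers are expected to drain promptly.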
/// Get current codec /// Get current codec
pub async fn current_codec(&self) -> VideoEncoderType { pub async fn current_codec(&self) -> VideoEncoderType {
self.config.read().await.output_codec self.config.read().await.output_codec
@@ -938,12 +1053,7 @@ impl SharedVideoPipeline {
config.output_codec = codec; config.output_codec = codec;
} }
// Clear encoder state self.clear_cmd_tx();
*self.encoder.lock().await = None;
*self.mjpeg_decoder.lock().await = None;
*self.nv12_converter.lock().await = None;
*self.yuv420p_converter.lock().await = None;
self.encoder_needs_yuv420p.store(false, Ordering::Release);
info!("Switched to {} codec", codec); info!("Switched to {} codec", codec);
Ok(()) Ok(())
@@ -959,10 +1069,10 @@ impl SharedVideoPipeline {
return Ok(()); return Ok(());
} }
self.init_encoder().await?;
let _ = self.running.send(true);
let config = self.config.read().await.clone(); let config = self.config.read().await.clone();
let mut encoder_state = Self::build_encoder_state(&config)?;
let _ = self.running.send(true);
self.running_flag.store(true, Ordering::Release);
let gop_size = config.gop_size(); let gop_size = config.gop_size();
info!( info!(
"Starting {} pipeline (GOP={})", "Starting {} pipeline (GOP={})",
@@ -970,6 +1080,11 @@ impl SharedVideoPipeline {
); );
let pipeline = self.clone(); let pipeline = self.clone();
let (cmd_tx, mut cmd_rx) = tokio::sync::mpsc::unbounded_channel();
{
let mut guard = self.cmd_tx.write();
*guard = Some(cmd_tx);
}
tokio::spawn(async move { tokio::spawn(async move {
let mut frame_count: u64 = 0; let mut frame_count: u64 = 0;
@@ -977,13 +1092,6 @@ impl SharedVideoPipeline {
let mut fps_frame_count: u64 = 0; let mut fps_frame_count: u64 = 0;
let mut running_rx = pipeline.running_rx.clone(); let mut running_rx = pipeline.running_rx.clone();
// Local counters for batch stats update (reduce lock contention)
let mut local_frames_encoded: u64 = 0;
let mut local_bytes_encoded: u64 = 0;
let mut local_keyframes: u64 = 0;
let mut local_errors: u64 = 0;
let mut local_dropped: u64 = 0;
let mut local_skipped: u64 = 0;
// Track when we last had subscribers for auto-stop feature // Track when we last had subscribers for auto-stop feature
let mut no_subscribers_since: Option<Instant> = None; let mut no_subscribers_since: Option<Instant> = None;
let grace_period = Duration::from_secs(AUTO_STOP_GRACE_PERIOD_SECS); let grace_period = Duration::from_secs(AUTO_STOP_GRACE_PERIOD_SECS);
@@ -1001,7 +1109,12 @@ impl SharedVideoPipeline {
result = frame_rx.recv() => { result = frame_rx.recv() => {
match result { match result {
Ok(video_frame) => { Ok(video_frame) => {
let subscriber_count = pipeline.frame_tx.receiver_count(); while let Ok(cmd) = cmd_rx.try_recv() {
if let Err(e) = pipeline.apply_cmd(&mut encoder_state, cmd) {
error!("Failed to apply pipeline command: {}", e);
}
}
let subscriber_count = pipeline.subscriber_count();
if subscriber_count == 0 { if subscriber_count == 0 {
// Track when we started having no subscribers // Track when we started having no subscribers
@@ -1019,6 +1132,9 @@ impl SharedVideoPipeline {
); );
// Signal stop and break out of loop // Signal stop and break out of loop
let _ = pipeline.running.send(false); let _ = pipeline.running.send(false);
pipeline
.running_flag
.store(false, Ordering::Release);
break; break;
} }
} }
@@ -1033,18 +1149,10 @@ impl SharedVideoPipeline {
} }
} }
match pipeline.encode_frame(&video_frame, frame_count).await { match pipeline.encode_frame_sync(&mut encoder_state, &video_frame, frame_count) {
Ok(Some(encoded_frame)) => { Ok(Some(encoded_frame)) => {
// Send frame to all subscribers let encoded_arc = Arc::new(encoded_frame);
// Note: broadcast::send is non-blocking pipeline.broadcast_encoded(encoded_arc).await;
let _ = pipeline.frame_tx.send(encoded_frame.clone());
// Update local counters (no lock)
local_frames_encoded += 1;
local_bytes_encoded += encoded_frame.data.len() as u64;
if encoded_frame.is_keyframe {
local_keyframes += 1;
}
frame_count += 1; frame_count += 1;
fps_frame_count += 1; fps_frame_count += 1;
@@ -1052,11 +1160,10 @@ impl SharedVideoPipeline {
Ok(None) => {} Ok(None) => {}
Err(e) => { Err(e) => {
error!("Encoding failed: {}", e); error!("Encoding failed: {}", e);
local_errors += 1;
} }
} }
// Batch update stats every second (reduces lock contention) // Update FPS every second (reduces lock contention)
let fps_elapsed = last_fps_time.elapsed(); let fps_elapsed = last_fps_time.elapsed();
if fps_elapsed >= Duration::from_secs(1) { if fps_elapsed >= Duration::from_secs(1) {
let current_fps = let current_fps =
@@ -1064,27 +1171,13 @@ impl SharedVideoPipeline {
fps_frame_count = 0; fps_frame_count = 0;
last_fps_time = Instant::now(); last_fps_time = Instant::now();
// Single lock acquisition for all stats // Single lock acquisition for FPS
let mut s = pipeline.stats.lock().await; let mut s = pipeline.stats.lock().await;
s.frames_encoded += local_frames_encoded;
s.bytes_encoded += local_bytes_encoded;
s.keyframes_encoded += local_keyframes;
s.errors += local_errors;
s.frames_dropped += local_dropped;
s.frames_skipped += local_skipped;
s.current_fps = current_fps; s.current_fps = current_fps;
// Reset local counters
local_frames_encoded = 0;
local_bytes_encoded = 0;
local_keyframes = 0;
local_errors = 0;
local_dropped = 0;
local_skipped = 0;
} }
} }
Err(broadcast::error::RecvError::Lagged(n)) => { Err(broadcast::error::RecvError::Lagged(n)) => {
local_dropped += n; let _ = n;
} }
Err(broadcast::error::RecvError::Closed) => { Err(broadcast::error::RecvError::Closed) => {
break; break;
@@ -1094,37 +1187,277 @@ impl SharedVideoPipeline {
} }
} }
pipeline.clear_cmd_tx();
pipeline.running_flag.store(false, Ordering::Release);
info!("Video pipeline stopped"); info!("Video pipeline stopped");
}); });
Ok(()) Ok(())
} }
/// Encode a single frame /// Start the pipeline by owning capture and encode directly.
async fn encode_frame( ///
/// This avoids the raw-frame broadcast path: a dedicated capture thread
/// publishes into a latest-frame slot that the encoder task drains,
/// keeping per-frame overhead low.
pub async fn start_with_device(
self: &Arc<Self>,
device_path: std::path::PathBuf,
buffer_count: u32,
_jpeg_quality: u8,
) -> Result<()> {
if *self.running_rx.borrow() {
warn!("Pipeline already running");
return Ok(());
}
let config = self.config.read().await.clone();
let mut encoder_state = Self::build_encoder_state(&config)?;
let _ = self.running.send(true);
self.running_flag.store(true, Ordering::Release);
let pipeline = self.clone();
let latest_frame: Arc<ParkingRwLock<Option<Arc<VideoFrame>>>> =
Arc::new(ParkingRwLock::new(None));
let (frame_seq_tx, mut frame_seq_rx) = watch::channel(0u64);
let buffer_pool = Arc::new(FrameBufferPool::new(buffer_count.max(4) as usize));
let (cmd_tx, mut cmd_rx) = tokio::sync::mpsc::unbounded_channel();
{
let mut guard = self.cmd_tx.write();
*guard = Some(cmd_tx);
}
// Encoder loop (runs on tokio, consumes latest frame)
{
let pipeline = pipeline.clone();
let latest_frame = latest_frame.clone();
tokio::spawn(async move {
let mut frame_count: u64 = 0;
let mut last_fps_time = Instant::now();
let mut fps_frame_count: u64 = 0;
let mut last_seq = *frame_seq_rx.borrow();
while pipeline.running_flag.load(Ordering::Acquire) {
if frame_seq_rx.changed().await.is_err() {
break;
}
if !pipeline.running_flag.load(Ordering::Acquire) {
break;
}
let seq = *frame_seq_rx.borrow();
if seq == last_seq {
continue;
}
last_seq = seq;
if pipeline.subscriber_count() == 0 {
continue;
}
while let Ok(cmd) = cmd_rx.try_recv() {
if let Err(e) = pipeline.apply_cmd(&mut encoder_state, cmd) {
error!("Failed to apply pipeline command: {}", e);
}
}
let frame = {
let guard = latest_frame.read();
guard.clone()
};
let frame = match frame {
Some(f) => f,
None => continue,
};
match pipeline.encode_frame_sync(&mut encoder_state, &frame, frame_count) {
Ok(Some(encoded_frame)) => {
let encoded_arc = Arc::new(encoded_frame);
pipeline.broadcast_encoded(encoded_arc).await;
frame_count += 1;
fps_frame_count += 1;
}
Ok(None) => {}
Err(e) => {
error!("Encoding failed: {}", e);
}
}
let fps_elapsed = last_fps_time.elapsed();
if fps_elapsed >= Duration::from_secs(1) {
let current_fps = fps_frame_count as f32 / fps_elapsed.as_secs_f32();
fps_frame_count = 0;
last_fps_time = Instant::now();
let mut s = pipeline.stats.lock().await;
s.current_fps = current_fps;
}
}
pipeline.clear_cmd_tx();
});
}
// Capture loop (runs on thread, updates latest frame)
{
let pipeline = pipeline.clone();
let latest_frame = latest_frame.clone();
let frame_seq_tx = frame_seq_tx.clone();
let buffer_pool = buffer_pool.clone();
std::thread::spawn(move || {
let device = match Device::with_path(&device_path) {
Ok(d) => d,
Err(e) => {
error!("Failed to open device {:?}: {}", device_path, e);
let _ = pipeline.running.send(false);
pipeline.running_flag.store(false, Ordering::Release);
let _ = frame_seq_tx.send(1);
return;
}
};
let requested_format = Format::new(
config.resolution.width,
config.resolution.height,
config.input_format.to_fourcc(),
);
let actual_format = match device.set_format(&requested_format) {
Ok(f) => f,
Err(e) => {
error!("Failed to set capture format: {}", e);
let _ = pipeline.running.send(false);
pipeline.running_flag.store(false, Ordering::Release);
let _ = frame_seq_tx.send(1);
return;
}
};
let resolution = Resolution::new(actual_format.width, actual_format.height);
let pixel_format =
PixelFormat::from_fourcc(actual_format.fourcc).unwrap_or(config.input_format);
let stride = actual_format.stride;
if config.fps > 0 {
if let Err(e) = device.set_params(&Parameters::with_fps(config.fps)) {
warn!("Failed to set hardware FPS: {}", e);
}
}
let mut stream = match MmapStream::with_buffers(
&device,
BufferType::VideoCapture,
buffer_count.max(1),
) {
Ok(s) => s,
Err(e) => {
error!("Failed to create capture stream: {}", e);
let _ = pipeline.running.send(false);
pipeline.running_flag.store(false, Ordering::Release);
let _ = frame_seq_tx.send(1);
return;
}
};
let mut no_subscribers_since: Option<Instant> = None;
let grace_period = Duration::from_secs(AUTO_STOP_GRACE_PERIOD_SECS);
let mut sequence: u64 = 0;
let mut validate_counter: u64 = 0;
while pipeline.running_flag.load(Ordering::Acquire) {
let subscriber_count = pipeline.subscriber_count();
if subscriber_count == 0 {
if no_subscribers_since.is_none() {
no_subscribers_since = Some(Instant::now());
trace!("No subscribers, starting grace period timer");
}
if let Some(since) = no_subscribers_since {
if since.elapsed() >= grace_period {
info!(
"No subscribers for {}s, auto-stopping video pipeline",
grace_period.as_secs()
);
let _ = pipeline.running.send(false);
pipeline.running_flag.store(false, Ordering::Release);
let _ = frame_seq_tx.send(sequence.wrapping_add(1));
break;
}
}
std::thread::sleep(Duration::from_millis(5));
continue;
} else if no_subscribers_since.is_some() {
trace!("Subscriber connected, resetting grace period timer");
no_subscribers_since = None;
}
let (buf, meta) = match stream.next() {
Ok(frame_data) => frame_data,
Err(e) => {
if e.kind() == std::io::ErrorKind::TimedOut {
warn!("Capture timeout - no signal?");
} else {
error!("Capture error: {}", e);
}
continue;
}
};
let frame_size = meta.bytesused as usize;
if frame_size < MIN_CAPTURE_FRAME_SIZE {
continue;
}
validate_counter = validate_counter.wrapping_add(1);
if pixel_format.is_compressed()
&& validate_counter % JPEG_VALIDATE_INTERVAL == 0
&& !VideoFrame::is_valid_jpeg_bytes(&buf[..frame_size])
{
continue;
}
let mut owned = buffer_pool.take(frame_size);
owned.resize(frame_size, 0);
owned[..frame_size].copy_from_slice(&buf[..frame_size]);
let frame = Arc::new(VideoFrame::from_pooled(
Arc::new(FrameBuffer::new(owned, Some(buffer_pool.clone()))),
resolution,
pixel_format,
stride,
sequence,
));
sequence = sequence.wrapping_add(1);
{
let mut guard = latest_frame.write();
*guard = Some(frame);
}
let _ = frame_seq_tx.send(sequence);
}
pipeline.running_flag.store(false, Ordering::Release);
let _ = pipeline.running.send(false);
let _ = frame_seq_tx.send(sequence.wrapping_add(1));
info!("Video pipeline stopped");
});
}
Ok(())
}
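
The handoff above is latest-frame-wins: the capture thread overwrites a single slot and bumps a `watch` sequence, and the encoder task always reads the newest frame, so bursts are coalesced rather than queued. A distilled sketch of the two halves:

```rust
use std::sync::Arc;
use parking_lot::RwLock;
use tokio::sync::watch;

// Producer: overwrite the slot, then signal via the sequence channel.
fn publish<T>(
    slot: &RwLock<Option<Arc<T>>>,
    seq_tx: &watch::Sender<u64>,
    item: Arc<T>,
    seq: u64,
) {
    *slot.write() = Some(item);
    let _ = seq_tx.send(seq);
}

// Consumer: wake on changes, always read the newest item.
async fn consume<T>(slot: Arc<RwLock<Option<Arc<T>>>>, mut seq_rx: watch::Receiver<u64>) {
    while seq_rx.changed().await.is_ok() {
        let newest = slot.read().clone(); // Option<Arc<T>>, cheap clone
        if let Some(item) = newest {
            let _ = item; // encode here; intermediate frames are skipped
        }
    }
}
```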
/// Encode a single frame (synchronous, no async locks)
fn encode_frame_sync(
&self, &self,
state: &mut EncoderThreadState,
frame: &VideoFrame, frame: &VideoFrame,
frame_count: u64, frame_count: u64,
) -> Result<Option<EncodedVideoFrame>> { ) -> Result<Option<EncodedVideoFrame>> {
let (fps, codec, input_format) = { let fps = state.fps;
let config = self.config.read().await; let codec = state.codec;
(config.fps, config.output_codec, config.input_format) let input_format = state.input_format;
};
let raw_frame = frame.data(); let raw_frame = frame.data();
let decoded_buf = if input_format.is_compressed() {
let decoded = {
let mut decoder_guard = self.mjpeg_decoder.lock().await;
let decoder = decoder_guard.as_mut().ok_or_else(|| {
AppError::VideoError("MJPEG decoder not initialized".to_string())
})?;
decoder.decode(raw_frame)?
};
Some(decoded)
} else {
None
};
let raw_frame = decoded_buf.as_deref().unwrap_or(raw_frame);
// Calculate PTS from real capture timestamp (lock-free using AtomicI64) // Calculate PTS from real capture timestamp (lock-free using AtomicI64)
// This ensures smooth playback even when capture timing varies // This ensures smooth playback even when capture timing varies
@@ -1149,6 +1482,53 @@ impl SharedVideoPipeline {
current_ts_ms - start_ts current_ts_ms - start_ts
}; };
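
A sketch of the lock-free PTS scheme used here: the first frame publishes a start timestamp into an `AtomicI64`, and later frames derive `pts_ms` as the wall-clock delta. The orderings below are illustrative; the real code may differ:

```rust
use std::sync::atomic::{AtomicI64, Ordering};
use std::time::{SystemTime, UNIX_EPOCH};

fn pts_ms(start_ms: &AtomicI64) -> i64 {
    let now_ms = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .map(|d| d.as_millis() as i64)
        .unwrap_or(0);
    // First caller wins; 0 means "not started yet".
    let _ = start_ms.compare_exchange(0, now_ms, Ordering::AcqRel, Ordering::Acquire);
    now_ms - start_ms.load(Ordering::Acquire)
}
```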
#[cfg(any(target_arch = "aarch64", target_arch = "arm"))]
if state.ffmpeg_hw_enabled {
if input_format != PixelFormat::Mjpeg {
return Err(AppError::VideoError(
"FFmpeg HW pipeline requires MJPEG input".to_string(),
));
}
let pipeline = state.ffmpeg_hw_pipeline.as_mut().ok_or_else(|| {
AppError::VideoError("FFmpeg HW pipeline not initialized".to_string())
})?;
if self.keyframe_requested.swap(false, Ordering::AcqRel) {
pipeline.request_keyframe();
debug!("[Pipeline] FFmpeg HW keyframe requested");
}
let packet = pipeline.encode(raw_frame, pts_ms).map_err(|e| {
let detail = if e.is_empty() { ffmpeg_hw_last_error() } else { e };
AppError::VideoError(format!("FFmpeg HW encode failed: {}", detail))
})?;
if let Some((data, is_keyframe)) = packet {
let sequence = self.sequence.fetch_add(1, Ordering::Relaxed) + 1;
return Ok(Some(EncodedVideoFrame {
data: Bytes::from(data),
pts_ms,
is_keyframe,
sequence,
duration: Duration::from_millis(1000 / fps as u64),
codec,
}));
}
return Ok(None);
}
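
The fast path above delegates both decode and encode to `HwMjpegH264Pipeline` from the project's `hwcodec` crate. A minimal driver sketch; the `encode` signature (`Result<Option<(Vec<u8>, bool)>, String>`) is inferred from this diff and is an assumption:

```rust
// Feed one MJPEG frame through the hardware pipeline and handle the
// three possible outcomes, mirroring the branch above.
fn drive(pipeline: &mut HwMjpegH264Pipeline, mjpeg_frame: &[u8], pts_ms: i64) {
    match pipeline.encode(mjpeg_frame, pts_ms) {
        Ok(Some((h264, is_keyframe))) => {
            // hand the encoded packet to the track / muxer
            let _ = (h264, is_keyframe);
        }
        Ok(None) => { /* encoder buffered the frame; nothing to emit yet */ }
        Err(e) => eprintln!("hw encode failed: {}", e),
    }
}
```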
let decoded_buf = if input_format.is_compressed() {
let decoder = state.mjpeg_decoder.as_mut().ok_or_else(|| {
AppError::VideoError("MJPEG decoder not initialized".to_string())
})?;
let decoded = decoder.decode(raw_frame)?;
Some(decoded)
} else {
None
};
let raw_frame = decoded_buf.as_deref().unwrap_or(raw_frame);
// Debug log for H265 // Debug log for H265
if codec == VideoEncoderType::H265 && frame_count % 30 == 1 { if codec == VideoEncoderType::H265 && frame_count % 30 == 1 {
debug!( debug!(
@@ -1159,12 +1539,9 @@ impl SharedVideoPipeline {
); );
} }
let mut nv12_converter = self.nv12_converter.lock().await; let needs_yuv420p = state.encoder_needs_yuv420p;
let mut yuv420p_converter = self.yuv420p_converter.lock().await; let encoder = state
let needs_yuv420p = self.encoder_needs_yuv420p.load(Ordering::Acquire); .encoder
let mut encoder_guard = self.encoder.lock().await;
let encoder = encoder_guard
.as_mut() .as_mut()
.ok_or_else(|| AppError::VideoError("Encoder not initialized".to_string()))?; .ok_or_else(|| AppError::VideoError("Encoder not initialized".to_string()))?;
@@ -1174,16 +1551,16 @@ impl SharedVideoPipeline {
debug!("[Pipeline] Keyframe will be generated for this frame"); debug!("[Pipeline] Keyframe will be generated for this frame");
} }
let encode_result = if needs_yuv420p && yuv420p_converter.is_some() { let encode_result = if needs_yuv420p && state.yuv420p_converter.is_some() {
// Software encoder with direct input conversion to YUV420P // Software encoder with direct input conversion to YUV420P
let conv = yuv420p_converter.as_mut().unwrap(); let conv = state.yuv420p_converter.as_mut().unwrap();
let yuv420p_data = conv let yuv420p_data = conv
.convert(raw_frame) .convert(raw_frame)
.map_err(|e| AppError::VideoError(format!("YUV420P conversion failed: {}", e)))?; .map_err(|e| AppError::VideoError(format!("YUV420P conversion failed: {}", e)))?;
encoder.encode_raw(yuv420p_data, pts_ms) encoder.encode_raw(yuv420p_data, pts_ms)
} else if nv12_converter.is_some() { } else if state.nv12_converter.is_some() {
// Hardware encoder with input conversion to NV12 // Hardware encoder with input conversion to NV12
let conv = nv12_converter.as_mut().unwrap(); let conv = state.nv12_converter.as_mut().unwrap();
let nv12_data = conv let nv12_data = conv
.convert(raw_frame) .convert(raw_frame)
.map_err(|e| AppError::VideoError(format!("NV12 conversion failed: {}", e)))?; .map_err(|e| AppError::VideoError(format!("NV12 conversion failed: {}", e)))?;
@@ -1193,10 +1570,6 @@ impl SharedVideoPipeline {
encoder.encode_raw(raw_frame, pts_ms) encoder.encode_raw(raw_frame, pts_ms)
}; };
drop(encoder_guard);
drop(nv12_converter);
drop(yuv420p_converter);
match encode_result { match encode_result {
Ok(frames) => { Ok(frames) => {
if !frames.is_empty() { if !frames.is_empty() {
@@ -1255,6 +1628,8 @@ impl SharedVideoPipeline {
pub fn stop(&self) { pub fn stop(&self) {
if *self.running_rx.borrow() { if *self.running_rx.borrow() {
let _ = self.running.send(false); let _ = self.running.send(false);
self.running_flag.store(false, Ordering::Release);
self.clear_cmd_tx();
info!("Stopping video pipeline"); info!("Stopping video pipeline");
} }
} }
@@ -1265,10 +1640,12 @@ impl SharedVideoPipeline {
preset: crate::video::encoder::BitratePreset, preset: crate::video::encoder::BitratePreset,
) -> Result<()> { ) -> Result<()> {
let bitrate_kbps = preset.bitrate_kbps(); let bitrate_kbps = preset.bitrate_kbps();
if let Some(ref mut encoder) = *self.encoder.lock().await { let gop = {
encoder.set_bitrate(bitrate_kbps)?; let mut config = self.config.write().await;
self.config.write().await.bitrate_preset = preset; config.bitrate_preset = preset;
} config.gop_size()
};
self.send_cmd(PipelineCmd::SetBitrate { bitrate_kbps, gop });
Ok(()) Ok(())
} }

View File

@@ -135,7 +135,8 @@ impl VideoStreamManager {
/// Set event bus for notifications /// Set event bus for notifications
pub async fn set_event_bus(&self, events: Arc<EventBus>) { pub async fn set_event_bus(&self, events: Arc<EventBus>) {
*self.events.write().await = Some(events); *self.events.write().await = Some(events.clone());
self.webrtc_streamer.set_event_bus(events).await;
} }
/// Set configuration store /// Set configuration store
@@ -199,19 +200,20 @@ impl VideoStreamManager {
} }
} }
// Always reconnect frame source after initialization // Configure WebRTC capture source after initialization
// This ensures WebRTC has the correct frame_tx from the current capturer let (device_path, resolution, format, fps, jpeg_quality) =
if let Some(frame_tx) = self.streamer.frame_sender().await { self.streamer.current_capture_config().await;
// Synchronize WebRTC config with actual capture format
let (format, resolution, fps) = self.streamer.current_video_config().await;
info!( info!(
"Reconnecting frame source to WebRTC after init: {}x{} {:?} @ {}fps (receiver_count={})", "WebRTC capture config after init: {}x{} {:?} @ {}fps",
resolution.width, resolution.height, format, fps, frame_tx.receiver_count() resolution.width, resolution.height, format, fps
); );
self.webrtc_streamer self.webrtc_streamer
.update_video_config(resolution, format, fps) .update_video_config(resolution, format, fps)
.await; .await;
self.webrtc_streamer.set_video_source(frame_tx).await; if let Some(device_path) = device_path {
self.webrtc_streamer
.set_capture_device(device_path, jpeg_quality)
.await;
} }
Ok(()) Ok(())
@@ -329,7 +331,7 @@ impl VideoStreamManager {
/// Ensure video capture is running (for WebRTC mode) /// Ensure video capture is running (for WebRTC mode)
async fn ensure_video_capture_running(self: &Arc<Self>) -> Result<()> { async fn ensure_video_capture_running(self: &Arc<Self>) -> Result<()> {
// Initialize streamer if not already initialized // Initialize streamer if not already initialized (for config discovery)
if self.streamer.state().await == StreamerState::Uninitialized { if self.streamer.state().await == StreamerState::Uninitialized {
info!("Initializing video capture for WebRTC (ensure)"); info!("Initializing video capture for WebRTC (ensure)");
if let Err(e) = self.streamer.init_auto().await { if let Err(e) = self.streamer.init_auto().await {
@@ -338,29 +340,19 @@ impl VideoStreamManager {
} }
} }
// Start video capture if not streaming let (device_path, resolution, format, fps, jpeg_quality) =
if self.streamer.state().await != StreamerState::Streaming { self.streamer.current_capture_config().await;
info!("Starting video capture for WebRTC (ensure)");
if let Err(e) = self.streamer.start().await {
error!("Failed to start video capture: {}", e);
return Err(e);
}
// Wait a bit for capture to stabilize
tokio::time::sleep(tokio::time::Duration::from_millis(100)).await;
}
// Reconnect frame source to WebRTC
if let Some(frame_tx) = self.streamer.frame_sender().await {
let (format, resolution, fps) = self.streamer.current_video_config().await;
info!( info!(
"Reconnecting frame source to WebRTC: {}x{} {:?} @ {}fps", "Configuring WebRTC capture: {}x{} {:?} @ {}fps",
resolution.width, resolution.height, format, fps resolution.width, resolution.height, format, fps
); );
self.webrtc_streamer self.webrtc_streamer
.update_video_config(resolution, format, fps) .update_video_config(resolution, format, fps)
.await; .await;
self.webrtc_streamer.set_video_source(frame_tx).await; if let Some(device_path) = device_path {
self.webrtc_streamer
.set_capture_device(device_path, jpeg_quality)
.await;
} }
Ok(()) Ok(())
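
The wiring in this file now reduces to three steps: read the capture tuple, sync the WebRTC video config, hand over the device path. A condensed sketch of that flow (private field access assumes code inside `VideoStreamManager`'s module):

```rust
// Same pattern as the hunks above and below, stated once.
async fn wire_webrtc(mgr: &VideoStreamManager) {
    let (device_path, resolution, format, fps, jpeg_quality) =
        mgr.streamer.current_capture_config().await;
    mgr.webrtc_streamer
        .update_video_config(resolution, format, fps)
        .await;
    if let Some(device_path) = device_path {
        mgr.webrtc_streamer
            .set_capture_device(device_path, jpeg_quality)
            .await;
    }
}
```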
@@ -403,7 +395,6 @@ impl VideoStreamManager {
match current_mode { match current_mode {
StreamMode::Mjpeg => { StreamMode::Mjpeg => {
info!("Stopping MJPEG streaming"); info!("Stopping MJPEG streaming");
// Only stop MJPEG distribution, keep video capture running for WebRTC
self.streamer.mjpeg_handler().set_offline(); self.streamer.mjpeg_handler().set_offline();
if let Err(e) = self.streamer.stop().await { if let Err(e) = self.streamer.stop().await {
warn!("Error stopping MJPEG streamer: {}", e); warn!("Error stopping MJPEG streamer: {}", e);
@@ -458,10 +449,9 @@ impl VideoStreamManager {
} }
} }
StreamMode::WebRTC => { StreamMode::WebRTC => {
// WebRTC mode: ensure video capture is running for H264 encoding // WebRTC mode: configure direct capture for encoder pipeline
info!("Activating WebRTC mode"); info!("Activating WebRTC mode");
// Initialize streamer if not already initialized
if self.streamer.state().await == StreamerState::Uninitialized { if self.streamer.state().await == StreamerState::Uninitialized {
info!("Initializing video capture for WebRTC"); info!("Initializing video capture for WebRTC");
if let Err(e) = self.streamer.init_auto().await { if let Err(e) = self.streamer.init_auto().await {
@@ -470,63 +460,23 @@ impl VideoStreamManager {
                    }
                }
-                // Auto-switch to non-compressed format if current format is MJPEG/JPEG
-                if let Some(device) = self.streamer.current_device().await {
-                    let (current_format, resolution, fps) =
-                        self.streamer.current_video_config().await;
-                    if current_format.is_compressed() {
-                        let available_formats: Vec<PixelFormat> =
-                            device.formats.iter().map(|f| f.format).collect();
-                        // Determine if using hardware encoding
-                        let is_hardware = self.webrtc_streamer.is_hardware_encoding().await;
-                        if let Some(recommended) =
-                            PixelFormat::recommended_for_encoding(&available_formats, is_hardware)
-                        {
-                            info!(
-                                "Auto-switching from {:?} to {:?} for WebRTC encoding (hardware={})",
-                                current_format, recommended, is_hardware
-                            );
-                            let device_path = device.path.to_string_lossy().to_string();
-                            if let Err(e) = self
-                                .streamer
-                                .apply_video_config(&device_path, recommended, resolution, fps)
-                                .await
-                            {
-                                warn!("Failed to auto-switch format for WebRTC: {}, keeping current format", e);
-                            }
-                        }
-                    }
-                }
-                // Start video capture if not streaming
-                if self.streamer.state().await != StreamerState::Streaming {
-                    info!("Starting video capture for WebRTC");
-                    if let Err(e) = self.streamer.start().await {
-                        error!("Failed to start video capture for WebRTC: {}", e);
-                        return Err(e);
-                    }
-                }
-                // Wait a bit for capture to stabilize
-                tokio::time::sleep(tokio::time::Duration::from_millis(100)).await;
-                // Connect frame source to WebRTC with correct format
-                if let Some(frame_tx) = self.streamer.frame_sender().await {
-                    // Synchronize WebRTC config with actual capture format
-                    let (format, resolution, fps) = self.streamer.current_video_config().await;
-                    info!(
-                        "Connecting frame source to WebRTC pipeline: {}x{} {:?} @ {}fps",
-                        resolution.width, resolution.height, format, fps
-                    );
+                let (device_path, resolution, format, fps, jpeg_quality) =
+                    self.streamer.current_capture_config().await;
+                info!(
+                    "Configuring WebRTC capture pipeline: {}x{} {:?} @ {}fps",
+                    resolution.width, resolution.height, format, fps
+                );
                self.webrtc_streamer
                    .update_video_config(resolution, format, fps)
                    .await;
-                self.webrtc_streamer.set_video_source(frame_tx).await;
+                if let Some(device_path) = device_path {
+                    self.webrtc_streamer
+                        .set_capture_device(device_path, jpeg_quality)
+                        .await;
+                } else {
+                    warn!("No capture device configured for WebRTC");
+                }
-                // Publish WebRTCReady event - frame source is now connected
                let codec = self.webrtc_streamer.current_video_codec().await;
                let is_hardware = self.webrtc_streamer.is_hardware_encoding().await;
                self.publish_event(SystemEvent::WebRTCReady {
@@ -535,11 +485,6 @@ impl VideoStreamManager {
                    hardware: is_hardware,
                })
                .await;
-                } else {
-                    warn!(
-                        "No frame source available for WebRTC - sessions may fail to receive video"
-                    );
-                }

                info!("WebRTC mode activated (sessions created on-demand)");
            }
@@ -587,22 +532,8 @@ impl VideoStreamManager {
            .update_video_config(resolution, format, fps)
            .await;

-        // Restart video capture for WebRTC (it was stopped during config change)
-        info!("Restarting video capture for WebRTC after config change");
-        if let Err(e) = self.streamer.start().await {
-            error!("Failed to restart video capture for WebRTC: {}", e);
-            return Err(e);
-        }
-        // Wait a bit for capture to stabilize
-        tokio::time::sleep(tokio::time::Duration::from_millis(100)).await;
-        // Reconnect frame source with the new capturer
-        if let Some(frame_tx) = self.streamer.frame_sender().await {
-            // Note: update_video_config was already called above with the requested config,
-            // but verify that actual capture matches
-            let (actual_format, actual_resolution, actual_fps) =
-                self.streamer.current_video_config().await;
+        let (device_path, actual_resolution, actual_format, actual_fps, jpeg_quality) =
+            self.streamer.current_capture_config().await;
        if actual_format != format || actual_resolution != resolution || actual_fps != fps {
            info!(
                "Actual capture config differs from requested, updating WebRTC: {}x{} {:?} @ {}fps",
@@ -612,11 +543,23 @@ impl VideoStreamManager {
                .update_video_config(actual_resolution, actual_format, actual_fps)
                .await;
        }
-        info!("Reconnecting frame source to WebRTC after config change");
-        self.webrtc_streamer.set_video_source(frame_tx).await;
+        if let Some(device_path) = device_path {
+            info!("Configuring direct capture for WebRTC after config change");
+            self.webrtc_streamer
+                .set_capture_device(device_path, jpeg_quality)
+                .await;
        } else {
-            warn!("No frame source available after config change");
+            warn!("No capture device configured for WebRTC after config change");
        }
+        let codec = self.webrtc_streamer.current_video_codec().await;
+        let is_hardware = self.webrtc_streamer.is_hardware_encoding().await;
+        self.publish_event(SystemEvent::WebRTCReady {
+            transition_id: None,
+            codec: codec_to_string(codec),
+            hardware: is_hardware,
+        })
+        .await;
    }

    Ok(())
@@ -631,22 +574,23 @@ impl VideoStreamManager {
                self.streamer.start().await?;
            }
            StreamMode::WebRTC => {
-                // Ensure video capture is running
+                // Ensure device is initialized for config discovery
                if self.streamer.state().await == StreamerState::Uninitialized {
                    self.streamer.init_auto().await?;
                }
-                if self.streamer.state().await != StreamerState::Streaming {
-                    self.streamer.start().await?;
-                }
-                // Connect frame source with correct format
-                if let Some(frame_tx) = self.streamer.frame_sender().await {
-                    // Synchronize WebRTC config with actual capture format
-                    let (format, resolution, fps) = self.streamer.current_video_config().await;
+                // Synchronize WebRTC config with current capture config
+                let (device_path, resolution, format, fps, jpeg_quality) =
+                    self.streamer.current_capture_config().await;
                self.webrtc_streamer
                    .update_video_config(resolution, format, fps)
                    .await;
-                self.webrtc_streamer.set_video_source(frame_tx).await;
+                if let Some(device_path) = device_path {
+                    self.webrtc_streamer
+                        .set_capture_device(device_path, jpeg_quality)
+                        .await;
+                } else {
+                    warn!("No capture device configured for WebRTC");
                }
            }
        }
@@ -764,13 +708,6 @@ impl VideoStreamManager {
        self.streamer.is_streaming().await
    }

-    /// Get frame sender for video frames
-    pub async fn frame_sender(
-        &self,
-    ) -> Option<tokio::sync::broadcast::Sender<crate::video::frame::VideoFrame>> {
-        self.streamer.frame_sender().await
-    }

    /// Subscribe to encoded video frames from the shared video pipeline
    ///
    /// This allows RustDesk (and other consumers) to receive H264/H265/VP8/VP9
@@ -781,10 +718,10 @@ impl VideoStreamManager {
    /// Returns None if video capture cannot be started or pipeline creation fails.
    pub async fn subscribe_encoded_frames(
        &self,
-    ) -> Option<
-        tokio::sync::broadcast::Receiver<crate::video::shared_video_pipeline::EncodedVideoFrame>,
-    > {
+    ) -> Option<tokio::sync::mpsc::Receiver<std::sync::Arc<
+        crate::video::shared_video_pipeline::EncodedVideoFrame,
+    >>> {
-        // 1. Ensure video capture is initialized
+        // 1. Ensure video capture is initialized (for config discovery)
        if self.streamer.state().await == StreamerState::Uninitialized {
            tracing::info!("Initializing video capture for encoded frame subscription");
            if let Err(e) = self.streamer.init_auto().await {
@@ -796,28 +733,9 @@ impl VideoStreamManager {
            }
        }

-        // 2. Ensure video capture is running (streaming)
-        if self.streamer.state().await != StreamerState::Streaming {
-            tracing::info!("Starting video capture for encoded frame subscription");
-            if let Err(e) = self.streamer.start().await {
-                tracing::error!("Failed to start video capture for encoded frames: {}", e);
-                return None;
-            }
-            // Wait for capture to stabilize
-            tokio::time::sleep(tokio::time::Duration::from_millis(100)).await;
-        }
-        // 3. Get frame sender from running capture
-        let frame_tx = match self.streamer.frame_sender().await {
-            Some(tx) => tx,
-            None => {
-                tracing::warn!("Cannot subscribe to encoded frames: no frame sender available");
-                return None;
-            }
-        };
-        // 4. Synchronize WebRTC config with actual capture format
-        let (format, resolution, fps) = self.streamer.current_video_config().await;
+        // 2. Synchronize WebRTC config with capture config
+        let (device_path, resolution, format, fps, jpeg_quality) =
+            self.streamer.current_capture_config().await;
        tracing::info!(
            "Connecting encoded frame subscription: {}x{} {:?} @ {}fps",
            resolution.width,
@@ -828,14 +746,17 @@ impl VideoStreamManager {
        self.webrtc_streamer
            .update_video_config(resolution, format, fps)
            .await;
+        if let Some(device_path) = device_path {
+            self.webrtc_streamer
+                .set_capture_device(device_path, jpeg_quality)
+                .await;
+        } else {
+            tracing::warn!("No capture device configured for encoded frames");
+            return None;
+        }

-        // 5. Use WebRtcStreamer to ensure the shared video pipeline is running
-        // This will create the pipeline if needed
-        match self
-            .webrtc_streamer
-            .ensure_video_pipeline_for_external(frame_tx)
-            .await
-        {
+        // 3. Use WebRtcStreamer to ensure the shared video pipeline is running
+        match self.webrtc_streamer.ensure_video_pipeline_for_external().await {
            Ok(pipeline) => Some(pipeline.subscribe()),
            Err(e) => {
                tracing::error!("Failed to start shared video pipeline: {}", e);

View File

@@ -4,17 +4,28 @@
//! managing the lifecycle of the capture thread and MJPEG/WebRTC distribution.

use std::path::PathBuf;
+use std::sync::atomic::{AtomicBool, AtomicU32, Ordering};
use std::sync::Arc;
-use tokio::sync::{broadcast, RwLock};
+use tokio::sync::RwLock;
use tracing::{debug, error, info, trace, warn};

-use super::capture::{CaptureConfig, CaptureState, VideoCapturer};
use super::device::{enumerate_devices, find_best_device, VideoDeviceInfo};
use super::format::{PixelFormat, Resolution};
-use super::frame::VideoFrame;
+use super::frame::{FrameBuffer, FrameBufferPool, VideoFrame};
use crate::error::{AppError, Result};
use crate::events::{EventBus, SystemEvent};
use crate::stream::MjpegStreamHandler;
+use v4l::buffer::Type as BufferType;
+use v4l::io::traits::CaptureStream;
+use v4l::prelude::*;
+use v4l::video::capture::Parameters;
+use v4l::video::Capture;
+use v4l::Format;

+/// Minimum valid frame size for capture
+const MIN_CAPTURE_FRAME_SIZE: usize = 128;
+/// Validate JPEG header every N frames to reduce overhead
+const JPEG_VALIDATE_INTERVAL: u64 = 30;
/// Streamer configuration
#[derive(Debug, Clone)]
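`JPEG_VALIDATE_INTERVAL` trades safety for throughput: only every Nth compressed frame gets a header check, and corrupt frames between checks pass through, which MJPEG clients tolerate. A sketch of the kind of cheap check this enables (the SOI-marker test is an assumption; the crate's actual `is_valid_jpeg_bytes` may check more, and the interval is shortened here so the demo triggers):

```rust
/// A JPEG stream must start with the SOI marker 0xFF 0xD8.
fn looks_like_jpeg(buf: &[u8]) -> bool {
    buf.len() >= 2 && buf[0] == 0xFF && buf[1] == 0xD8
}

fn main() {
    const VALIDATE_INTERVAL: u64 = 2; // 30 in the real code
    let mut counter: u64 = 0;
    for frame in [&[0xFFu8, 0xD8, 0xFF][..], &[0x00u8, 0x01][..]] {
        counter = counter.wrapping_add(1);
        // Validate only every Nth frame; drop a frame that fails the check.
        if counter % VALIDATE_INTERVAL == 0 && !looks_like_jpeg(frame) {
            println!("frame {} dropped as invalid JPEG", counter);
            continue;
        }
        println!("frame {} of {} bytes forwarded", counter, frame.len());
    }
}
```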
@@ -65,11 +76,14 @@ pub enum StreamerState {
/// Video streamer service
pub struct Streamer {
    config: RwLock<StreamerConfig>,
-    capturer: RwLock<Option<Arc<VideoCapturer>>>,
    mjpeg_handler: Arc<MjpegStreamHandler>,
    current_device: RwLock<Option<VideoDeviceInfo>>,
    state: RwLock<StreamerState>,
    start_lock: tokio::sync::Mutex<()>,
+    direct_stop: AtomicBool,
+    direct_active: AtomicBool,
+    direct_handle: tokio::sync::Mutex<Option<tokio::task::JoinHandle<()>>>,
+    current_fps: AtomicU32,
    /// Event bus for broadcasting state changes (optional)
    events: RwLock<Option<Arc<EventBus>>>,
    /// Last published state (for change detection)
@@ -94,11 +108,14 @@ impl Streamer {
    pub fn new() -> Arc<Self> {
        Arc::new(Self {
            config: RwLock::new(StreamerConfig::default()),
-            capturer: RwLock::new(None),
            mjpeg_handler: Arc::new(MjpegStreamHandler::new()),
            current_device: RwLock::new(None),
            state: RwLock::new(StreamerState::Uninitialized),
            start_lock: tokio::sync::Mutex::new(()),
+            direct_stop: AtomicBool::new(false),
+            direct_active: AtomicBool::new(false),
+            direct_handle: tokio::sync::Mutex::new(None),
+            current_fps: AtomicU32::new(0),
            events: RwLock::new(None),
            last_published_state: RwLock::new(None),
            config_changing: std::sync::atomic::AtomicBool::new(false),
@@ -114,11 +131,14 @@ impl Streamer {
    pub fn with_config(config: StreamerConfig) -> Arc<Self> {
        Arc::new(Self {
            config: RwLock::new(config),
-            capturer: RwLock::new(None),
            mjpeg_handler: Arc::new(MjpegStreamHandler::new()),
            current_device: RwLock::new(None),
            state: RwLock::new(StreamerState::Uninitialized),
            start_lock: tokio::sync::Mutex::new(()),
+            direct_stop: AtomicBool::new(false),
+            direct_active: AtomicBool::new(false),
+            direct_handle: tokio::sync::Mutex::new(None),
+            current_fps: AtomicU32::new(0),
            events: RwLock::new(None),
            last_published_state: RwLock::new(None),
            config_changing: std::sync::atomic::AtomicBool::new(false),
@@ -176,20 +196,6 @@ impl Streamer {
        self.mjpeg_handler.clone()
    }

-    /// Get frame sender for WebRTC integration
-    /// Returns None if no capturer is initialized
-    pub async fn frame_sender(&self) -> Option<broadcast::Sender<VideoFrame>> {
-        let capturer = self.capturer.read().await;
-        capturer.as_ref().map(|c| c.frame_sender())
-    }
-
-    /// Subscribe to video frames
-    /// Returns None if no capturer is initialized
-    pub async fn subscribe_frames(&self) -> Option<broadcast::Receiver<VideoFrame>> {
-        let capturer = self.capturer.read().await;
-        capturer.as_ref().map(|c| c.subscribe())
-    }

    /// Get current device info
    pub async fn current_device(&self) -> Option<VideoDeviceInfo> {
        self.current_device.read().await.clone()
@@ -201,6 +207,20 @@ impl Streamer {
        (config.format, config.resolution, config.fps)
    }

+    /// Get current capture configuration for direct pipelines
+    pub async fn current_capture_config(
+        &self,
+    ) -> (Option<PathBuf>, Resolution, PixelFormat, u32, u8) {
+        let config = self.config.read().await;
+        (
+            config.device_path.clone(),
+            config.resolution,
+            config.format,
+            config.fps,
+            config.jpeg_quality,
+        )
+    }

    /// List available video devices
    pub async fn list_devices(&self) -> Result<Vec<VideoDeviceInfo>> {
        enumerate_devices()
@@ -278,18 +298,11 @@ impl Streamer {
        // Give clients time to receive the disconnect signal and close their connections
        tokio::time::sleep(std::time::Duration::from_millis(100)).await;

-        // Stop existing capturer and wait for device release
-        {
-            // Take ownership of the old capturer to ensure it's dropped
-            let old_capturer = self.capturer.write().await.take();
-            if let Some(capturer) = old_capturer {
+        // Stop active capture and wait for device release
+        if self.direct_active.load(Ordering::SeqCst) {
            info!("Stopping existing capture before applying new config...");
-                if let Err(e) = capturer.stop().await {
-                    warn!("Error stopping old capturer: {}", e);
-                }
-                // Explicitly drop the capturer to release V4L2 resources
-                drop(capturer);
-            }
+            self.stop().await?;
+            tokio::time::sleep(std::time::Duration::from_millis(100)).await;
        }

        // Update config
@@ -301,18 +314,6 @@ impl Streamer {
            cfg.fps = fps;
        }

-        // Recreate capturer
-        let capture_config = CaptureConfig {
-            device_path: device.path.clone(),
-            resolution,
-            format,
-            fps,
-            jpeg_quality: self.config.read().await.jpeg_quality,
-            ..Default::default()
-        };
-        let capturer = Arc::new(VideoCapturer::new(capture_config));
-        *self.capturer.write().await = Some(capturer.clone());

        *self.current_device.write().await = Some(device.clone());
        *self.state.write().await = StreamerState::Ready;
@@ -374,21 +375,6 @@ impl Streamer {
        // Store device info
        *self.current_device.write().await = Some(device.clone());

-        // Create capturer
-        let config = self.config.read().await;
-        let capture_config = CaptureConfig {
-            device_path: device.path.clone(),
-            resolution: config.resolution,
-            format: config.format,
-            fps: config.fps,
-            jpeg_quality: config.jpeg_quality,
-            ..Default::default()
-        };
-        drop(config);
-        let capturer = Arc::new(VideoCapturer::new(capture_config));
-        *self.capturer.write().await = Some(capturer);

        *self.state.write().await = StreamerState::Ready;
        info!("Streamer initialized: {} @ {}", format, resolution);
@@ -445,43 +431,30 @@ impl Streamer {
.ok_or_else(|| AppError::VideoError("No resolutions available".to_string())) .ok_or_else(|| AppError::VideoError("No resolutions available".to_string()))
} }
/// Restart the capturer only (for recovery - doesn't spawn new monitor) /// Restart capture for recovery (direct capture path)
/// async fn restart_capture(self: &Arc<Self>) -> Result<()> {
/// This is a simpler version of start() used during device recovery. self.direct_stop.store(false, Ordering::SeqCst);
/// It doesn't spawn a new state monitor since the existing one is still active. self.start().await?;
async fn restart_capturer(&self) -> Result<()> {
let capturer = self.capturer.read().await;
let capturer = capturer
.as_ref()
.ok_or_else(|| AppError::VideoError("Capturer not initialized".to_string()))?;
// Start capture // Wait briefly for the capture thread to initialize the device.
capturer.start().await?; // If it fails immediately, the state will flip to Error/DeviceLost.
for _ in 0..5 {
// Set MJPEG handler online tokio::time::sleep(std::time::Duration::from_millis(100)).await;
self.mjpeg_handler.set_online(); let state = *self.state.read().await;
match state {
// Start frame distribution task StreamerState::Streaming | StreamerState::NoSignal => return Ok(()),
let mjpeg_handler = self.mjpeg_handler.clone(); StreamerState::Error | StreamerState::DeviceLost => {
let mut frame_rx = capturer.subscribe(); return Err(AppError::VideoError(
"Failed to restart capture".to_string(),
tokio::spawn(async move { ))
debug!("Recovery frame distribution task started");
loop {
match frame_rx.recv().await {
Ok(frame) => {
mjpeg_handler.update_frame(frame);
} }
Err(tokio::sync::broadcast::error::RecvError::Lagged(_)) => {} _ => {}
Err(tokio::sync::broadcast::error::RecvError::Closed) => {
debug!("Frame channel closed");
break;
} }
} }
}
});
Ok(()) Err(AppError::VideoError(
"Capture restart timed out".to_string(),
))
} }
    /// Start streaming
@@ -498,138 +471,26 @@ impl Streamer {
            self.init_auto().await?;
        }

-        let capturer = self.capturer.read().await;
-        let capturer = capturer
-            .as_ref()
-            .ok_or_else(|| AppError::VideoError("Capturer not initialized".to_string()))?;
-
-        // Start capture
-        capturer.start().await?;
-
-        // Set MJPEG handler online before starting frame distribution
-        // This is important after config changes where disconnect_all_clients() set it offline
-        self.mjpeg_handler.set_online();
-
-        // Start frame distribution task
-        let mjpeg_handler = self.mjpeg_handler.clone();
-        let mut frame_rx = capturer.subscribe();
-        let state_ref = Arc::downgrade(self);
-        let frame_tx = capturer.frame_sender();
-
-        tokio::spawn(async move {
-            info!("Frame distribution task started");
-            // Track when we started having no active consumers
-            let mut idle_since: Option<std::time::Instant> = None;
-            const IDLE_STOP_DELAY_SECS: u64 = 5;
-            loop {
-                match frame_rx.recv().await {
-                    Ok(frame) => {
-                        mjpeg_handler.update_frame(frame);
-                        // Check if there are any active consumers:
-                        // - MJPEG clients via mjpeg_handler
-                        // - Other subscribers (WebRTC/RustDesk) via frame_tx receiver_count
-                        // Note: receiver_count includes this task, so > 1 means other subscribers
-                        let mjpeg_clients = mjpeg_handler.client_count();
-                        let other_subscribers = frame_tx.receiver_count().saturating_sub(1);
-                        if mjpeg_clients == 0 && other_subscribers == 0 {
-                            if idle_since.is_none() {
-                                idle_since = Some(std::time::Instant::now());
-                                trace!("No active video consumers, starting idle timer");
-                            } else if let Some(since) = idle_since {
-                                if since.elapsed().as_secs() >= IDLE_STOP_DELAY_SECS {
-                                    info!(
-                                        "No active video consumers for {}s, stopping frame distribution",
-                                        IDLE_STOP_DELAY_SECS
-                                    );
-                                    // Stop the streamer
-                                    if let Some(streamer) = state_ref.upgrade() {
-                                        if let Err(e) = streamer.stop().await {
-                                            warn!(
-                                                "Failed to stop streamer during idle cleanup: {}",
-                                                e
-                                            );
-                                        }
-                                    }
-                                    break;
-                                }
-                            }
-                        } else {
-                            // Reset idle timer when we have consumers
-                            if idle_since.is_some() {
-                                trace!("Video consumers active, resetting idle timer");
-                                idle_since = None;
-                            }
-                        }
-                    }
-                    Err(tokio::sync::broadcast::error::RecvError::Lagged(_)) => {}
-                    Err(tokio::sync::broadcast::error::RecvError::Closed) => {
-                        debug!("Frame channel closed");
-                        break;
-                    }
-                }
-                // Check if streamer still exists
-                if state_ref.upgrade().is_none() {
-                    break;
-                }
-            }
-            info!("Frame distribution task ended");
-        });
-
-        // Monitor capture state
-        let mut state_rx = capturer.state_watch();
-        let state_ref = Arc::downgrade(self);
-        let mjpeg_handler = self.mjpeg_handler.clone();
-        tokio::spawn(async move {
-            while state_rx.changed().await.is_ok() {
-                let capture_state = *state_rx.borrow();
-                match capture_state {
-                    CaptureState::Running => {
-                        if let Some(streamer) = state_ref.upgrade() {
-                            *streamer.state.write().await = StreamerState::Streaming;
-                        }
-                    }
-                    CaptureState::NoSignal => {
-                        mjpeg_handler.set_offline();
-                        if let Some(streamer) = state_ref.upgrade() {
-                            *streamer.state.write().await = StreamerState::NoSignal;
-                        }
-                    }
-                    CaptureState::Stopped => {
-                        mjpeg_handler.set_offline();
-                        if let Some(streamer) = state_ref.upgrade() {
-                            *streamer.state.write().await = StreamerState::Ready;
-                        }
-                    }
-                    CaptureState::Error => {
-                        mjpeg_handler.set_offline();
-                        if let Some(streamer) = state_ref.upgrade() {
-                            *streamer.state.write().await = StreamerState::Error;
-                        }
-                    }
-                    CaptureState::DeviceLost => {
-                        mjpeg_handler.set_offline();
-                        if let Some(streamer) = state_ref.upgrade() {
-                            *streamer.state.write().await = StreamerState::DeviceLost;
-                            // Start device recovery task (fire and forget)
-                            let streamer_clone = Arc::clone(&streamer);
-                            tokio::spawn(async move {
-                                streamer_clone.start_device_recovery_internal().await;
-                            });
-                        }
-                    }
-                    CaptureState::Starting => {
-                        // Starting state - device is initializing, no action needed
-                    }
-                }
-            }
-        });
+        let device = self
+            .current_device
+            .read()
+            .await
+            .clone()
+            .ok_or_else(|| AppError::VideoError("No video device configured".to_string()))?;
+
+        let config = self.config.read().await.clone();
+        self.direct_stop.store(false, Ordering::SeqCst);
+        self.direct_active.store(true, Ordering::SeqCst);
+
+        let streamer = self.clone();
+        let handle = tokio::task::spawn_blocking(move || {
+            streamer.run_direct_capture(device.path, config);
+        });
+        *self.direct_handle.lock().await = Some(handle);
+
+        // Set MJPEG handler online before starting capture
+        self.mjpeg_handler.set_online();

        // Start background tasks only once per Streamer instance
        // Use compare_exchange to atomically check and set the flag
        if self
@@ -735,9 +596,11 @@ impl Streamer {
    /// Stop streaming
    pub async fn stop(&self) -> Result<()> {
-        if let Some(capturer) = self.capturer.read().await.as_ref() {
-            capturer.stop().await?;
+        self.direct_stop.store(true, Ordering::SeqCst);
+        if let Some(handle) = self.direct_handle.lock().await.take() {
+            let _ = handle.await;
        }
+        self.direct_active.store(false, Ordering::SeqCst);

        self.mjpeg_handler.set_offline();
        *self.state.write().await = StreamerState::Ready;
@@ -749,6 +612,258 @@ impl Streamer {
        Ok(())
    }
/// Direct capture loop for MJPEG mode (single loop, no broadcast)
fn run_direct_capture(self: Arc<Self>, device_path: PathBuf, config: StreamerConfig) {
const MAX_RETRIES: u32 = 5;
const RETRY_DELAY_MS: u64 = 200;
const IDLE_STOP_DELAY_SECS: u64 = 5;
const BUFFER_COUNT: u32 = 2;
let handle = tokio::runtime::Handle::current();
let mut last_state = StreamerState::Streaming;
let mut set_state = |new_state: StreamerState| {
if new_state != last_state {
handle.block_on(async {
*self.state.write().await = new_state;
self.publish_event(self.current_state_event().await).await;
});
last_state = new_state;
}
};
let mut device_opt: Option<Device> = None;
let mut format_opt: Option<Format> = None;
let mut last_error: Option<String> = None;
for attempt in 0..MAX_RETRIES {
if self.direct_stop.load(Ordering::Relaxed) {
self.direct_active.store(false, Ordering::SeqCst);
return;
}
let device = match Device::with_path(&device_path) {
Ok(d) => d,
Err(e) => {
let err_str = e.to_string();
if err_str.contains("busy") || err_str.contains("resource") {
warn!(
"Device busy on attempt {}/{}, retrying in {}ms...",
attempt + 1,
MAX_RETRIES,
RETRY_DELAY_MS
);
std::thread::sleep(std::time::Duration::from_millis(RETRY_DELAY_MS));
last_error = Some(err_str);
continue;
}
last_error = Some(err_str);
break;
}
};
let requested = Format::new(
config.resolution.width,
config.resolution.height,
config.format.to_fourcc(),
);
match device.set_format(&requested) {
Ok(actual) => {
device_opt = Some(device);
format_opt = Some(actual);
break;
}
Err(e) => {
let err_str = e.to_string();
if err_str.contains("busy") || err_str.contains("resource") {
warn!(
"Device busy on set_format attempt {}/{}, retrying in {}ms...",
attempt + 1,
MAX_RETRIES,
RETRY_DELAY_MS
);
std::thread::sleep(std::time::Duration::from_millis(RETRY_DELAY_MS));
last_error = Some(err_str);
continue;
}
last_error = Some(err_str);
break;
}
}
}
let (device, actual_format) = match (device_opt, format_opt) {
(Some(d), Some(f)) => (d, f),
_ => {
error!(
"Failed to open device {:?}: {}",
device_path,
last_error.unwrap_or_else(|| "unknown error".to_string())
);
self.mjpeg_handler.set_offline();
set_state(StreamerState::Error);
self.direct_active.store(false, Ordering::SeqCst);
self.current_fps.store(0, Ordering::Relaxed);
return;
}
};
info!(
"Capture format: {}x{} {:?} stride={}",
actual_format.width, actual_format.height, actual_format.fourcc, actual_format.stride
);
let resolution = Resolution::new(actual_format.width, actual_format.height);
let pixel_format =
PixelFormat::from_fourcc(actual_format.fourcc).unwrap_or(config.format);
if config.fps > 0 {
if let Err(e) = device.set_params(&Parameters::with_fps(config.fps)) {
warn!("Failed to set hardware FPS: {}", e);
}
}
let mut stream =
match MmapStream::with_buffers(&device, BufferType::VideoCapture, BUFFER_COUNT) {
Ok(s) => s,
Err(e) => {
error!("Failed to create capture stream: {}", e);
self.mjpeg_handler.set_offline();
set_state(StreamerState::Error);
self.direct_active.store(false, Ordering::SeqCst);
self.current_fps.store(0, Ordering::Relaxed);
return;
}
};
let buffer_pool = Arc::new(FrameBufferPool::new(BUFFER_COUNT.max(4) as usize));
let mut signal_present = true;
let mut sequence: u64 = 0;
let mut validate_counter: u64 = 0;
let mut idle_since: Option<std::time::Instant> = None;
let mut fps_frame_count: u64 = 0;
let mut last_fps_time = std::time::Instant::now();
while !self.direct_stop.load(Ordering::Relaxed) {
let mjpeg_clients = self.mjpeg_handler.client_count();
if mjpeg_clients == 0 {
if idle_since.is_none() {
idle_since = Some(std::time::Instant::now());
trace!("No active video consumers, starting idle timer");
} else if let Some(since) = idle_since {
if since.elapsed().as_secs() >= IDLE_STOP_DELAY_SECS {
info!(
"No active video consumers for {}s, stopping capture",
IDLE_STOP_DELAY_SECS
);
self.mjpeg_handler.set_offline();
set_state(StreamerState::Ready);
break;
}
}
} else if idle_since.is_some() {
trace!("Video consumers active, resetting idle timer");
idle_since = None;
}
let (buf, meta) = match stream.next() {
Ok(frame_data) => frame_data,
Err(e) => {
if e.kind() == std::io::ErrorKind::TimedOut {
if signal_present {
signal_present = false;
self.mjpeg_handler.set_offline();
set_state(StreamerState::NoSignal);
self.current_fps.store(0, Ordering::Relaxed);
fps_frame_count = 0;
last_fps_time = std::time::Instant::now();
}
std::thread::sleep(std::time::Duration::from_millis(100));
continue;
}
let is_device_lost = match e.raw_os_error() {
Some(6) => true, // ENXIO
Some(19) => true, // ENODEV
Some(5) => true, // EIO
Some(32) => true, // EPIPE
Some(108) => true, // ESHUTDOWN
_ => false,
};
if is_device_lost {
error!("Video device lost: {} - {}", device_path.display(), e);
self.mjpeg_handler.set_offline();
handle.block_on(async {
*self.last_lost_device.write().await =
Some(device_path.display().to_string());
*self.last_lost_reason.write().await = Some(e.to_string());
});
set_state(StreamerState::DeviceLost);
handle.block_on(async {
let streamer = Arc::clone(&self);
tokio::spawn(async move {
streamer.start_device_recovery_internal().await;
});
});
break;
}
error!("Capture error: {}", e);
continue;
}
};
let frame_size = meta.bytesused as usize;
if frame_size < MIN_CAPTURE_FRAME_SIZE {
continue;
}
validate_counter = validate_counter.wrapping_add(1);
if pixel_format.is_compressed()
&& validate_counter % JPEG_VALIDATE_INTERVAL == 0
&& !VideoFrame::is_valid_jpeg_bytes(&buf[..frame_size])
{
continue;
}
let mut owned = buffer_pool.take(frame_size);
owned.resize(frame_size, 0);
owned[..frame_size].copy_from_slice(&buf[..frame_size]);
let frame = VideoFrame::from_pooled(
Arc::new(FrameBuffer::new(owned, Some(buffer_pool.clone()))),
resolution,
pixel_format,
actual_format.stride,
sequence,
);
sequence = sequence.wrapping_add(1);
if !signal_present {
signal_present = true;
self.mjpeg_handler.set_online();
set_state(StreamerState::Streaming);
}
self.mjpeg_handler.update_frame(frame);
fps_frame_count += 1;
let fps_elapsed = last_fps_time.elapsed();
if fps_elapsed >= std::time::Duration::from_secs(1) {
let current_fps = fps_frame_count as f32 / fps_elapsed.as_secs_f32();
fps_frame_count = 0;
last_fps_time = std::time::Instant::now();
self.current_fps
.store((current_fps * 100.0) as u32, Ordering::Relaxed);
}
}
self.direct_active.store(false, Ordering::SeqCst);
self.current_fps.store(0, Ordering::Relaxed);
}
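The capture loop above distinguishes a transient read error from a genuinely lost device by the raw OS error code (`ENXIO`, `ENODEV`, `EIO`, `EPIPE`, `ESHUTDOWN`); only the latter triggers the recovery task. A standalone sketch of that classification, using `std::io::Error` to stand in for what `stream.next()` may return:

```rust
use std::io::Error;

/// Errno values that indicate the V4L2 device itself is gone,
/// as opposed to a recoverable read hiccup.
fn is_device_lost(e: &Error) -> bool {
    // 6=ENXIO, 19=ENODEV, 5=EIO, 32=EPIPE, 108=ESHUTDOWN
    matches!(e.raw_os_error(), Some(6 | 19 | 5 | 32 | 108))
}

fn main() {
    let unplugged = Error::from_raw_os_error(19); // ENODEV
    let transient = Error::from_raw_os_error(11); // EAGAIN
    assert!(is_device_lost(&unplugged));
    assert!(!is_device_lost(&transient));
    println!("unplugged -> spawn recovery task, transient -> retry read");
}
```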
    /// Check if streaming
    pub async fn is_streaming(&self) -> bool {
        self.state().await == StreamerState::Streaming
@@ -756,14 +871,8 @@ impl Streamer {
    /// Get stream statistics
    pub async fn stats(&self) -> StreamerStats {
-        let capturer = self.capturer.read().await;
-        let capture_stats = if let Some(c) = capturer.as_ref() {
-            Some(c.stats().await)
-        } else {
-            None
-        };
        let config = self.config.read().await;
+        let fps = self.current_fps.load(Ordering::Relaxed) as f32 / 100.0;

        StreamerStats {
            state: self.state().await,
@@ -772,15 +881,7 @@ impl Streamer {
            resolution: Some((config.resolution.width, config.resolution.height)),
            clients: self.mjpeg_handler.client_count(),
            target_fps: config.fps,
-            fps: capture_stats.as_ref().map(|s| s.current_fps).unwrap_or(0.0),
-            frames_captured: capture_stats
-                .as_ref()
-                .map(|s| s.frames_captured)
-                .unwrap_or(0),
-            frames_dropped: capture_stats
-                .as_ref()
-                .map(|s| s.frames_dropped)
-                .unwrap_or(0),
+            fps,
        }
    }
@@ -829,23 +930,23 @@ impl Streamer {
            return;
        }

-        // Get last lost device info from capturer
-        let (device, reason) = {
-            let capturer = self.capturer.read().await;
-            if let Some(cap) = capturer.as_ref() {
-                cap.last_error().unwrap_or_else(|| {
-                    let device_path = self
-                        .current_device
-                        .blocking_read()
-                        .as_ref()
-                        .map(|d| d.path.display().to_string())
-                        .unwrap_or_else(|| "unknown".to_string());
-                    (device_path, "Device lost".to_string())
-                })
-            } else {
-                ("unknown".to_string(), "Device lost".to_string())
-            }
-        };
+        // Get last lost device info (from direct capture)
+        let device = if let Some(device) = self.last_lost_device.read().await.clone() {
+            device
+        } else {
+            self.current_device
+                .read()
+                .await
+                .as_ref()
+                .map(|d| d.path.display().to_string())
+                .unwrap_or_else(|| "unknown".to_string())
+        };
+        let reason = self
+            .last_lost_reason
+            .read()
+            .await
+            .clone()
+            .unwrap_or_else(|| "Device lost".to_string());

        // Store error info
        *self.last_lost_device.write().await = Some(device.clone());
@@ -908,7 +1009,7 @@ impl Streamer {
        }

        // Try to restart capture
-        match streamer.restart_capturer().await {
+        match streamer.restart_capture().await {
            Ok(_) => {
                info!(
                    "Video device {} recovered after {} attempts",
@@ -947,11 +1048,14 @@ impl Default for Streamer {
    fn default() -> Self {
        Self {
            config: RwLock::new(StreamerConfig::default()),
-            capturer: RwLock::new(None),
            mjpeg_handler: Arc::new(MjpegStreamHandler::new()),
            current_device: RwLock::new(None),
            state: RwLock::new(StreamerState::Uninitialized),
            start_lock: tokio::sync::Mutex::new(()),
+            direct_stop: AtomicBool::new(false),
+            direct_active: AtomicBool::new(false),
+            direct_handle: tokio::sync::Mutex::new(None),
+            current_fps: AtomicU32::new(0),
            events: RwLock::new(None),
            last_published_state: RwLock::new(None),
            config_changing: std::sync::atomic::AtomicBool::new(false),
@@ -976,8 +1080,6 @@ pub struct StreamerStats {
    pub target_fps: u32,
    /// Current actual FPS
    pub fps: f32,
-    pub frames_captured: u64,
-    pub frames_dropped: u64,
}

impl serde::Serialize for StreamerState {
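`current_fps` stores a float in an `AtomicU32` by scaling it by 100, so the blocking capture thread can publish FPS without locks and `stats()` can read it back with two decimals. A minimal sketch of that fixed-point convention:

```rust
use std::sync::atomic::{AtomicU32, Ordering};

static CURRENT_FPS: AtomicU32 = AtomicU32::new(0);

/// Writer side (capture thread): scale by 100 and truncate.
fn publish_fps(fps: f32) {
    CURRENT_FPS.store((fps * 100.0) as u32, Ordering::Relaxed);
}

/// Reader side (`stats()`): divide back out.
fn read_fps() -> f32 {
    CURRENT_FPS.load(Ordering::Relaxed) as f32 / 100.0
}

fn main() {
    publish_fps(29.974);
    assert_eq!(read_fps(), 29.97); // two decimals survive the round-trip
    println!("fps = {}", read_fps());
}
```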

View File

@@ -83,7 +83,7 @@ struct VideoSession {
    /// Last activity time
    last_activity: Instant,
    /// Frame receiver
-    frame_rx: Option<broadcast::Receiver<EncodedVideoFrame>>,
+    frame_rx: Option<tokio::sync::mpsc::Receiver<std::sync::Arc<EncodedVideoFrame>>>,
    /// Stats
    frames_received: u64,
    bytes_received: u64,
@@ -243,7 +243,7 @@ impl VideoSessionManager {
    pub async fn start_session(
        &self,
        session_id: &str,
-    ) -> Result<broadcast::Receiver<EncodedVideoFrame>> {
+    ) -> Result<tokio::sync::mpsc::Receiver<std::sync::Arc<EncodedVideoFrame>>> {
        // Ensure pipeline is running with correct codec
        self.ensure_pipeline_for_session(session_id).await?;
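Because `start_session` now returns an owned `mpsc::Receiver`, the session's send loop simply drains it until the pipeline drops the sender; there is no `Lagged` case to handle, unlike with `broadcast`. A hedged sketch of the consuming side (names are illustrative, not the crate's):

```rust
use std::sync::Arc;
use tokio::sync::mpsc;

struct EncodedVideoFrame {
    payload: Vec<u8>,
}

async fn session_send_loop(mut frame_rx: mpsc::Receiver<Arc<EncodedVideoFrame>>) {
    // `recv()` returning None means the pipeline dropped the sender:
    // the session ends cleanly instead of observing a Lagged error.
    while let Some(frame) = frame_rx.recv().await {
        println!("forwarding {} bytes to the peer", frame.payload.len());
    }
    println!("pipeline closed, tearing down session");
}

#[tokio::main]
async fn main() {
    let (tx, rx) = mpsc::channel(8);
    tx.send(Arc::new(EncodedVideoFrame { payload: vec![0; 1200] }))
        .await
        .unwrap();
    drop(tx);
    session_send_loop(rx).await;
}
```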

View File

@@ -26,7 +26,6 @@ use axum::{
use futures::{SinkExt, StreamExt};
use std::sync::Arc;
use std::time::Instant;
-use tokio::sync::broadcast;
use tracing::{debug, info, warn};

use crate::audio::OpusFrame;
@@ -79,25 +78,23 @@ async fn handle_audio_socket(socket: WebSocket, state: Arc<AppState>) {
    loop {
        tokio::select! {
            // Receive Opus frames and send to client
-            opus_result = opus_rx.recv() => {
-                match opus_result {
-                    Ok(frame) => {
-                        let binary = encode_audio_packet(&frame, stream_start);
-                        if sender.send(Message::Binary(binary.into())).await.is_err() {
-                            debug!("Failed to send audio frame, client disconnected");
-                            break;
-                        }
-                    }
-                    Err(broadcast::error::RecvError::Lagged(n)) => {
-                        warn!("Audio WebSocket client lagged by {} frames", n);
-                        // Continue - just skip the missed frames
-                    }
-                    Err(broadcast::error::RecvError::Closed) => {
-                        info!("Audio stream closed");
-                        break;
-                    }
-                }
-            }
+            opus_result = opus_rx.changed() => {
+                if opus_result.is_err() {
+                    info!("Audio stream closed");
+                    break;
+                }
+                let frame = match opus_rx.borrow().clone() {
+                    Some(frame) => frame,
+                    None => continue,
+                };
+                let binary = encode_audio_packet(&frame, stream_start);
+                if sender.send(Message::Binary(binary.into())).await.is_err() {
+                    debug!("Failed to send audio frame, client disconnected");
+                    break;
+                }
+            }
            // Handle client messages (ping/close)
            msg = receiver.next() => {
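The audio socket now consumes a `tokio::sync::watch` channel instead of `broadcast`: the channel holds only the latest Opus frame, so a slow WebSocket client silently skips stale audio rather than accumulating a lagged backlog. A compact sketch of those semantics:

```rust
use tokio::sync::watch;

#[tokio::main]
async fn main() {
    // The watch channel holds exactly one value: the most recent frame.
    let (tx, mut rx) = watch::channel::<Option<Vec<u8>>>(None);

    // Fast producer: older frames are overwritten, never queued.
    for i in 0..3u8 {
        tx.send(Some(vec![i; 4])).unwrap();
    }

    // `changed()` wakes the consumer; `borrow()` yields only the latest value.
    rx.changed().await.unwrap();
    let latest = rx.borrow().clone();
    println!("consumer sees only the newest frame: {:?}", latest);

    drop(tx);
    // With the sender gone, `changed()` errors -> close the socket loop.
    assert!(rx.changed().await.is_err());
}
```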

View File

@@ -6,6 +6,7 @@ use std::sync::Arc;
use crate::config::*;
use crate::error::{AppError, Result};
+use crate::events::SystemEvent;
use crate::state::AppState;

/// Apply video config changes
@@ -57,27 +58,55 @@ pub async fn apply_video_config(
        .map_err(|e| AppError::VideoError(format!("Failed to apply video config: {}", e)))?;
    tracing::info!("Video config applied to streamer");

-    // Step 3: Restart the streamer
+    // Step 3: Restart the streamer (MJPEG mode only)
+    if !state.stream_manager.is_webrtc_enabled().await {
        if let Err(e) = state.stream_manager.start().await {
            tracing::error!("Failed to start streamer after config change: {}", e);
        } else {
            tracing::info!("Streamer started after config change");
        }
+    }

-    // Step 4: Update the WebRTC frame source
-    if let Some(frame_tx) = state.stream_manager.frame_sender().await {
-        let receiver_count = frame_tx.receiver_count();
-        state
-            .stream_manager
-            .webrtc_streamer()
-            .set_video_source(frame_tx)
-            .await;
-        tracing::info!(
-            "WebRTC streamer frame source updated (receiver_count={})",
-            receiver_count
-        );
-    } else {
-        tracing::warn!("No frame source available after config change");
-    }
+    // Configure WebRTC direct capture (applies uniformly to all modes)
+    let (device_path, _resolution, _format, _fps, jpeg_quality) = state
+        .stream_manager
+        .streamer()
+        .current_capture_config()
+        .await;
+    if let Some(device_path) = device_path {
+        state
+            .stream_manager
+            .webrtc_streamer()
+            .set_capture_device(device_path, jpeg_quality)
+            .await;
+    } else {
+        tracing::warn!("No capture device configured for WebRTC");
+    }
+
+    if state.stream_manager.is_webrtc_enabled().await {
+        use crate::video::encoder::VideoCodecType;
+        let codec = state
+            .stream_manager
+            .webrtc_streamer()
+            .current_video_codec()
+            .await;
+        let codec_str = match codec {
+            VideoCodecType::H264 => "h264",
+            VideoCodecType::H265 => "h265",
+            VideoCodecType::VP8 => "vp8",
+            VideoCodecType::VP9 => "vp9",
+        }
+        .to_string();
+        let is_hardware = state
+            .stream_manager
+            .webrtc_streamer()
+            .is_hardware_encoding()
+            .await;
+        state.events.publish(SystemEvent::WebRTCReady {
+            transition_id: None,
+            codec: codec_str,
+            hardware: is_hardware,
+        });
+    }

    tracing::info!("Video config applied successfully");
@@ -157,6 +186,15 @@ pub async fn apply_hid_config(
) -> Result<()> {
    // Check whether the OTG descriptor changed
    let descriptor_changed = old_config.otg_descriptor != new_config.otg_descriptor;
+    let old_hid_functions = old_config.effective_otg_functions();
+    let new_hid_functions = new_config.effective_otg_functions();
+    let hid_functions_changed = old_hid_functions != new_hid_functions;
+
+    if new_config.backend == HidBackend::Otg && new_hid_functions.is_empty() {
+        return Err(AppError::BadRequest(
+            "OTG HID functions cannot be empty".to_string(),
+        ));
+    }

    // If the descriptor changed and the OTG backend is in use, the gadget must be rebuilt
    if descriptor_changed && new_config.backend == HidBackend::Otg {
@@ -181,6 +219,7 @@ pub async fn apply_hid_config(
        && old_config.ch9329_baudrate == new_config.ch9329_baudrate
        && old_config.otg_udc == new_config.otg_udc
        && !descriptor_changed
+        && !hid_functions_changed
    {
        tracing::info!("HID config unchanged, skipping reload");
        return Ok(());
@@ -188,6 +227,16 @@ pub async fn apply_hid_config(
tracing::info!("Applying HID config changes..."); tracing::info!("Applying HID config changes...");
if new_config.backend == HidBackend::Otg
&& (hid_functions_changed || old_config.backend != HidBackend::Otg)
{
state
.otg_service
.update_hid_functions(new_hid_functions.clone())
.await
.map_err(|e| AppError::Config(format!("OTG HID function update failed: {}", e)))?;
}
let new_hid_backend = match new_config.backend { let new_hid_backend = match new_config.backend {
HidBackend::Otg => crate::hid::HidBackendType::Otg, HidBackend::Otg => crate::hid::HidBackendType::Otg,
HidBackend::Ch9329 => crate::hid::HidBackendType::Ch9329 { HidBackend::Ch9329 => crate::hid::HidBackendType::Ch9329 {
@@ -208,32 +257,6 @@ pub async fn apply_hid_config(
        new_config.backend
    );

-    // When switching to OTG backend, automatically enable MSD if not already enabled
-    // OTG HID and MSD share the same USB gadget, so it makes sense to enable both
-    if new_config.backend == HidBackend::Otg && old_config.backend != HidBackend::Otg {
-        let msd_guard = state.msd.read().await;
-        if msd_guard.is_none() {
-            drop(msd_guard); // Release read lock before acquiring write lock
-            tracing::info!("OTG HID enabled, automatically initializing MSD...");
-            // Get MSD config from store
-            let config = state.config.get();
-            let msd =
-                crate::msd::MsdController::new(state.otg_service.clone(), config.msd.msd_dir_path());
-            if let Err(e) = msd.init().await {
-                tracing::warn!("Failed to auto-initialize MSD for OTG: {}", e);
-            } else {
-                let events = state.events.clone();
-                msd.set_event_bus(events).await;
-                *state.msd.write().await = Some(msd);
-                tracing::info!("MSD automatically initialized for OTG mode");
-            }
-        }
-    }

    Ok(())
}
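The `VideoCodecType` → string match added above also appears verbatim in `update_config` later in this commit, and the stream manager side already calls a `codec_to_string` helper. If consolidation is wanted, a small shared helper would remove the duplication; a sketch under the assumption that the enum variants are exactly as shown in the diff:

```rust
#[derive(Clone, Copy, Debug)]
enum VideoCodecType {
    H264,
    H265,
    VP8,
    VP9,
}

/// Single source of truth for the wire name of each codec.
fn codec_to_string(codec: VideoCodecType) -> &'static str {
    match codec {
        VideoCodecType::H264 => "h264",
        VideoCodecType::H265 => "h265",
        VideoCodecType::VP8 => "vp8",
        VideoCodecType::VP9 => "vp9",
    }
}

fn main() {
    assert_eq!(codec_to_string(VideoCodecType::VP9), "vp9");
    println!("{}", codec_to_string(VideoCodecType::H264));
}
```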

View File

@@ -0,0 +1,33 @@
use axum::{extract::State, Json};
use std::sync::Arc;
use crate::config::AuthConfig;
use crate::error::Result;
use crate::state::AppState;
use super::types::AuthConfigUpdate;
/// Get auth configuration (sensitive fields are cleared)
pub async fn get_auth_config(State(state): State<Arc<AppState>>) -> Json<AuthConfig> {
let mut auth = state.config.get().auth.clone();
auth.totp_secret = None;
Json(auth)
}
/// Update auth configuration
pub async fn update_auth_config(
State(state): State<Arc<AppState>>,
Json(update): Json<AuthConfigUpdate>,
) -> Result<Json<AuthConfig>> {
update.validate()?;
state
.config
.update(|config| {
update.apply_to(&mut config.auth);
})
.await?;
let mut auth = state.config.get().auth.clone();
auth.totp_secret = None;
Ok(Json(auth))
}
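For reference, handlers with these signatures would typically be mounted on the config router alongside the existing ATX/audio/HID routes; a hedged sketch of the wiring (the route path and the stub handlers here are assumptions, not taken from this commit):

```rust
use axum::{routing::get, Router};

// Stand-in for the real AppState; the actual router shares Arc<AppState>.
#[derive(Clone, Default)]
struct AppState;

async fn get_auth_config() -> &'static str {
    r#"{"single_user_allow_multiple_sessions":false}"#
}

async fn update_auth_config() -> &'static str {
    "ok"
}

fn config_router() -> Router<AppState> {
    // GET returns the sanitized config, POST applies an AuthConfigUpdate patch.
    Router::new().route(
        "/api/config/auth",
        get(get_auth_config).post(update_auth_config),
    )
}

fn main() {
    let _app: Router<AppState> = config_router();
    println!("auth config routes mounted");
}
```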

View File

@@ -21,6 +21,7 @@ mod types;
mod atx;
mod audio;
+mod auth;
mod hid;
mod msd;
mod rustdesk;
@@ -31,6 +32,7 @@ mod web;
// Export handler functions
pub use atx::{get_atx_config, update_atx_config};
pub use audio::{get_audio_config, update_audio_config};
+pub use auth::{get_auth_config, update_auth_config};
pub use hid::{get_hid_config, update_hid_config};
pub use msd::{get_msd_config, update_msd_config};
pub use rustdesk::{

View File

@@ -6,6 +6,25 @@ use serde::Deserialize;
use std::path::Path;
use typeshare::typeshare;
// ===== Auth Config =====
#[typeshare]
#[derive(Debug, Deserialize)]
pub struct AuthConfigUpdate {
pub single_user_allow_multiple_sessions: Option<bool>,
}
impl AuthConfigUpdate {
pub fn validate(&self) -> crate::error::Result<()> {
Ok(())
}
pub fn apply_to(&self, config: &mut AuthConfig) {
if let Some(allow_multiple) = self.single_user_allow_multiple_sessions {
config.single_user_allow_multiple_sessions = allow_multiple;
}
}
}
// ===== Video Config =====
#[typeshare]
#[derive(Debug, Deserialize)]
@@ -252,6 +271,32 @@ impl OtgDescriptorConfigUpdate {
    }
}
#[typeshare]
#[derive(Debug, Deserialize)]
pub struct OtgHidFunctionsUpdate {
pub keyboard: Option<bool>,
pub mouse_relative: Option<bool>,
pub mouse_absolute: Option<bool>,
pub consumer: Option<bool>,
}
impl OtgHidFunctionsUpdate {
pub fn apply_to(&self, config: &mut OtgHidFunctions) {
if let Some(enabled) = self.keyboard {
config.keyboard = enabled;
}
if let Some(enabled) = self.mouse_relative {
config.mouse_relative = enabled;
}
if let Some(enabled) = self.mouse_absolute {
config.mouse_absolute = enabled;
}
if let Some(enabled) = self.consumer {
config.consumer = enabled;
}
}
}
#[typeshare]
#[derive(Debug, Deserialize)]
pub struct HidConfigUpdate {
@@ -260,6 +305,8 @@ pub struct HidConfigUpdate {
    pub ch9329_baudrate: Option<u32>,
    pub otg_udc: Option<String>,
    pub otg_descriptor: Option<OtgDescriptorConfigUpdate>,
+    pub otg_profile: Option<OtgHidProfile>,
+    pub otg_functions: Option<OtgHidFunctionsUpdate>,
    pub mouse_absolute: Option<bool>,
}
@@ -295,6 +342,12 @@ impl HidConfigUpdate {
        if let Some(ref desc) = self.otg_descriptor {
            desc.apply_to(&mut config.otg_descriptor);
        }
+        if let Some(profile) = self.otg_profile.clone() {
+            config.otg_profile = profile;
+        }
+        if let Some(ref functions) = self.otg_functions {
+            functions.apply_to(&mut config.otg_functions);
+        }
        if let Some(absolute) = self.mouse_absolute {
            config.mouse_absolute = absolute;
        }
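Every `*Update` type in this file follows the same partial-patch convention: an `Option` field left as `None` keeps the stored value, while `Some` overwrites it, so a JSON body only needs to name the keys it changes. A toy round-trip showing the convention (types trimmed to two fields; `serde`/`serde_json` assumed as dependencies):

```rust
use serde::Deserialize;

#[derive(Debug, Default)]
struct OtgHidFunctions {
    keyboard: bool,
    mouse_absolute: bool,
}

#[derive(Deserialize)]
struct OtgHidFunctionsUpdate {
    keyboard: Option<bool>,
    mouse_absolute: Option<bool>,
}

impl OtgHidFunctionsUpdate {
    fn apply_to(&self, config: &mut OtgHidFunctions) {
        if let Some(enabled) = self.keyboard {
            config.keyboard = enabled;
        }
        if let Some(enabled) = self.mouse_absolute {
            config.mouse_absolute = enabled;
        }
    }
}

fn main() {
    let mut config = OtgHidFunctions { keyboard: true, mouse_absolute: true };
    // The request only mentions `keyboard`; `mouse_absolute` is untouched.
    let update: OtgHidFunctionsUpdate =
        serde_json::from_str(r#"{"keyboard": false}"#).unwrap();
    update.apply_to(&mut config);
    assert!(!config.keyboard);
    assert!(config.mouse_absolute);
    println!("{:?}", config);
}
```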

View File

@@ -12,6 +12,7 @@ use tracing::{info, warn};
use crate::auth::{Session, SESSION_COOKIE};
use crate::config::{AppConfig, StreamMode};
use crate::error::{AppError, Result};
+use crate::events::SystemEvent;
use crate::state::AppState;
use crate::video::encoder::BitratePreset;
@@ -407,6 +408,13 @@ pub async fn login(
        .await?
        .ok_or_else(|| AppError::AuthError("Invalid username or password".to_string()))?;

+    if !config.auth.single_user_allow_multiple_sessions {
+        // Kick existing sessions before creating a new one.
+        let revoked_ids = state.sessions.list_ids().await?;
+        state.sessions.delete_all().await?;
+        state.remember_revoked_sessions(revoked_ids).await;
+    }

    // Create session
    let session = state.sessions.create(&user.id).await?;
@@ -465,15 +473,15 @@ pub async fn auth_check(
    axum::Extension(session): axum::Extension<Session>,
) -> Json<AuthCheckResponse> {
    // Get user info from user_id
-    let (username, is_admin) = match state.users.get(&session.user_id).await {
-        Ok(Some(user)) => (Some(user.username), user.is_admin),
-        _ => (Some(session.user_id.clone()), false), // Fallback to user_id if user not found
+    let username = match state.users.get(&session.user_id).await {
+        Ok(Some(user)) => Some(user.username),
+        _ => Some(session.user_id.clone()), // Fallback to user_id if user not found
    };

    Json(AuthCheckResponse {
        authenticated: true,
        user: username,
-        is_admin,
+        is_admin: true,
    })
}
@@ -797,7 +805,8 @@ pub async fn update_config(
        }
        tracing::info!("Video config applied successfully");

-        // Step 3: Start the streamer to begin capturing frames
+        // Step 3: Start the streamer to begin capturing frames (MJPEG mode only)
+        if !state.stream_manager.is_webrtc_enabled().await {
            // This is necessary because apply_video_config only creates the capturer but doesn't start it
            if let Err(e) = state.stream_manager.start().await {
                tracing::error!("Failed to start streamer after config change: {}", e);
@@ -805,26 +814,48 @@ pub async fn update_config(
            } else {
                tracing::info!("Streamer started after config change");
            }
+        }

-        // Update frame source from the NEW capturer
-        // This is critical - the old frame_tx is invalid after config change
-        // New sessions will use this frame_tx when they connect
-        if let Some(frame_tx) = state.stream_manager.frame_sender().await {
-            let receiver_count = frame_tx.receiver_count();
-            // Use WebRtcStreamer (new unified interface)
-            state
-                .stream_manager
-                .webrtc_streamer()
-                .set_video_source(frame_tx)
-                .await;
-            tracing::info!(
-                "WebRTC streamer frame source updated with new capturer (receiver_count={})",
-                receiver_count
-            );
-        } else {
-            tracing::warn!(
-                "No frame source available after config change - streamer may not be running"
-            );
-        }
+        // Configure WebRTC direct capture (all modes)
+        let (device_path, _resolution, _format, _fps, jpeg_quality) = state
+            .stream_manager
+            .streamer()
+            .current_capture_config()
+            .await;
+        if let Some(device_path) = device_path {
+            state
+                .stream_manager
+                .webrtc_streamer()
+                .set_capture_device(device_path, jpeg_quality)
+                .await;
+        } else {
+            tracing::warn!("No capture device configured for WebRTC");
+        }
+
+        if state.stream_manager.is_webrtc_enabled().await {
+            use crate::video::encoder::VideoCodecType;
+            let codec = state
+                .stream_manager
+                .webrtc_streamer()
+                .current_video_codec()
+                .await;
+            let codec_str = match codec {
+                VideoCodecType::H264 => "h264",
+                VideoCodecType::H265 => "h265",
+                VideoCodecType::VP8 => "vp8",
+                VideoCodecType::VP9 => "vp9",
+            }
+            .to_string();
+            let is_hardware = state
+                .stream_manager
+                .webrtc_streamer()
+                .is_hardware_encoding()
+                .await;
+            state.events.publish(SystemEvent::WebRTCReady {
+                transition_id: None,
+                codec: codec_str,
+                hardware: is_hardware,
+            });
+        }
    }
@@ -1388,8 +1419,9 @@ pub async fn stream_mode_set(
        }
    };

+    let no_switch_needed = !tx.accepted && !tx.switching && tx.transition_id.is_none();
    Ok(Json(StreamModeResponse {
-        success: tx.accepted,
+        success: tx.accepted || no_switch_needed,
        mode: if tx.accepted {
            requested_mode_str.to_string()
        } else {
@@ -1935,6 +1967,7 @@ pub async fn webrtc_close_session(
#[derive(Serialize)]
pub struct IceServersResponse {
    pub ice_servers: Vec<IceServerInfo>,
+    pub mdns_mode: String,
}

#[derive(Serialize)]
@@ -1950,6 +1983,7 @@ pub struct IceServerInfo {
/// Returns user-configured servers, or Google STUN as fallback if none configured
pub async fn webrtc_ice_servers(State(state): State<Arc<AppState>>) -> Json<IceServersResponse> {
    use crate::webrtc::config::public_ice;
+    use crate::webrtc::mdns::{mdns_mode, mdns_mode_label};

    let config = state.config.get();
    let mut ice_servers = Vec::new();
@@ -2005,7 +2039,13 @@ pub async fn webrtc_ice_servers(State(state): State<Arc<AppState>>) -> Json<IceS
        // Note: TURN servers are not provided - users must configure their own
    }

-    Json(IceServersResponse { ice_servers })
+    let mdns_mode = mdns_mode();
+    let mdns_mode = mdns_mode_label(mdns_mode).to_string();
+    Json(IceServersResponse {
+        ice_servers,
+        mdns_mode,
+    })
}

// ============================================================================
@@ -2661,200 +2701,9 @@ pub async fn list_audio_devices(
}

// ============================================================================
-// User Management
+// Password Management
// ============================================================================
use axum::extract::Path;
use axum::Extension;
/// User response (without password hash)
#[derive(Serialize)]
pub struct UserResponse {
pub id: String,
pub username: String,
pub is_admin: bool,
pub created_at: String,
pub updated_at: String,
}
impl From<crate::auth::User> for UserResponse {
fn from(user: crate::auth::User) -> Self {
Self {
id: user.id,
username: user.username,
is_admin: user.is_admin,
created_at: user.created_at.to_rfc3339(),
updated_at: user.updated_at.to_rfc3339(),
}
}
}
/// List all users (admin only)
pub async fn list_users(
State(state): State<Arc<AppState>>,
Extension(session): Extension<Session>,
) -> Result<Json<Vec<UserResponse>>> {
// Check if current user is admin
let current_user = state
.users
.get(&session.user_id)
.await?
.ok_or_else(|| AppError::AuthError("User not found".to_string()))?;
if !current_user.is_admin {
return Err(AppError::Forbidden("Admin access required".to_string()));
}
let users = state.users.list().await?;
let response: Vec<UserResponse> = users.into_iter().map(UserResponse::from).collect();
Ok(Json(response))
}
/// Create user request
#[derive(Deserialize)]
pub struct CreateUserRequest {
pub username: String,
pub password: String,
pub is_admin: bool,
}
/// Create new user (admin only)
pub async fn create_user(
State(state): State<Arc<AppState>>,
Extension(session): Extension<Session>,
Json(req): Json<CreateUserRequest>,
) -> Result<Json<UserResponse>> {
// Check if current user is admin
let current_user = state
.users
.get(&session.user_id)
.await?
.ok_or_else(|| AppError::AuthError("User not found".to_string()))?;
if !current_user.is_admin {
return Err(AppError::Forbidden("Admin access required".to_string()));
}
// Validate input
if req.username.len() < 2 {
return Err(AppError::BadRequest(
"Username must be at least 2 characters".to_string(),
));
}
if req.password.len() < 4 {
return Err(AppError::BadRequest(
"Password must be at least 4 characters".to_string(),
));
}
let user = state
.users
.create(&req.username, &req.password, req.is_admin)
.await?;
info!("User created: {} (admin: {})", user.username, user.is_admin);
Ok(Json(UserResponse::from(user)))
}
/// Update user request
#[derive(Deserialize)]
pub struct UpdateUserRequest {
pub username: Option<String>,
pub is_admin: Option<bool>,
}
/// Update user (admin only)
pub async fn update_user(
State(state): State<Arc<AppState>>,
Extension(session): Extension<Session>,
Path(user_id): Path<String>,
Json(req): Json<UpdateUserRequest>,
) -> Result<Json<UserResponse>> {
// Check if current user is admin
let current_user = state
.users
.get(&session.user_id)
.await?
.ok_or_else(|| AppError::AuthError("User not found".to_string()))?;
if !current_user.is_admin {
return Err(AppError::Forbidden("Admin access required".to_string()));
}
// Get target user
let mut user = state
.users
.get(&user_id)
.await?
.ok_or_else(|| AppError::NotFound("User not found".to_string()))?;
// Update fields if provided
if let Some(username) = req.username {
if username.len() < 2 {
return Err(AppError::BadRequest(
"Username must be at least 2 characters".to_string(),
));
}
user.username = username;
}
if let Some(is_admin) = req.is_admin {
user.is_admin = is_admin;
}
// Note: We need to add an update method to UserStore
// For now, return error
Err(AppError::Internal(
"User update not yet implemented".to_string(),
))
}
/// Delete user (admin only)
pub async fn delete_user(
State(state): State<Arc<AppState>>,
Extension(session): Extension<Session>,
Path(user_id): Path<String>,
) -> Result<Json<LoginResponse>> {
// Check if current user is admin
let current_user = state
.users
.get(&session.user_id)
.await?
.ok_or_else(|| AppError::AuthError("User not found".to_string()))?;
if !current_user.is_admin {
return Err(AppError::Forbidden("Admin access required".to_string()));
}
// Prevent deleting self
if user_id == session.user_id {
return Err(AppError::BadRequest(
"Cannot delete your own account".to_string(),
));
}
// Check if this is the last admin
let users = state.users.list().await?;
let admin_count = users.iter().filter(|u| u.is_admin).count();
let target_user = state
.users
.get(&user_id)
.await?
.ok_or_else(|| AppError::NotFound("User not found".to_string()))?;
if target_user.is_admin && admin_count <= 1 {
return Err(AppError::BadRequest(
"Cannot delete the last admin user".to_string(),
));
}
state.users.delete(&user_id).await?;
info!("User deleted: {}", target_user.username);
Ok(Json(LoginResponse {
success: true,
message: Some("User deleted successfully".to_string()),
}))
}
/// Change password request /// Change password request
#[derive(Deserialize)] #[derive(Deserialize)]
pub struct ChangePasswordRequest { pub struct ChangePasswordRequest {
@@ -2862,38 +2711,24 @@ pub struct ChangePasswordRequest {
pub new_password: String, pub new_password: String,
} }
/// Change user password /// Change current user's password
pub async fn change_user_password( pub async fn change_password(
State(state): State<Arc<AppState>>, State(state): State<Arc<AppState>>,
Extension(session): Extension<Session>, axum::Extension(session): axum::Extension<Session>,
Path(user_id): Path<String>,
Json(req): Json<ChangePasswordRequest>, Json(req): Json<ChangePasswordRequest>,
) -> Result<Json<LoginResponse>> { ) -> Result<Json<LoginResponse>> {
// Check if current user is admin or changing own password
let current_user = state let current_user = state
.users .users
.get(&session.user_id) .get(&session.user_id)
.await? .await?
.ok_or_else(|| AppError::AuthError("User not found".to_string()))?; .ok_or_else(|| AppError::AuthError("User not found".to_string()))?;
let is_self = user_id == session.user_id;
let is_admin = current_user.is_admin;
if !is_self && !is_admin {
return Err(AppError::Forbidden(
"Cannot change other user's password".to_string(),
));
}
// Validate new password
if req.new_password.len() < 4 { if req.new_password.len() < 4 {
return Err(AppError::BadRequest( return Err(AppError::BadRequest(
"Password must be at least 4 characters".to_string(), "Password must be at least 4 characters".to_string(),
)); ));
} }
// If changing own password, verify current password
if is_self {
let verified = state let verified = state
.users .users
.verify(&current_user.username, &req.current_password) .verify(&current_user.username, &req.current_password)
@@ -2903,13 +2738,12 @@ pub async fn change_user_password(
"Current password is incorrect".to_string(), "Current password is incorrect".to_string(),
)); ));
} }
}
state state
.users .users
.update_password(&user_id, &req.new_password) .update_password(&session.user_id, &req.new_password)
.await?; .await?;
info!("Password changed for user ID: {}", user_id); info!("Password changed for user ID: {}", session.user_id);
Ok(Json(LoginResponse { Ok(Json(LoginResponse {
success: true, success: true,
@@ -2917,6 +2751,55 @@ pub async fn change_user_password(
})) }))
} }
/// Change username request
#[derive(Deserialize)]
pub struct ChangeUsernameRequest {
pub username: String,
pub current_password: String,
}
/// Change current user's username
pub async fn change_username(
State(state): State<Arc<AppState>>,
axum::Extension(session): axum::Extension<Session>,
Json(req): Json<ChangeUsernameRequest>,
) -> Result<Json<LoginResponse>> {
let current_user = state
.users
.get(&session.user_id)
.await?
.ok_or_else(|| AppError::AuthError("User not found".to_string()))?;
if req.username.len() < 2 {
return Err(AppError::BadRequest(
"Username must be at least 2 characters".to_string(),
));
}
let verified = state
.users
.verify(&current_user.username, &req.current_password)
.await?;
if verified.is_none() {
return Err(AppError::AuthError(
"Current password is incorrect".to_string(),
));
}
if current_user.username != req.username {
state
.users
.update_username(&session.user_id, &req.username)
.await?;
}
info!("Username changed for user ID: {}", session.user_id);
Ok(Json(LoginResponse {
success: true,
message: Some("Username changed successfully".to_string()),
}))
}
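Note: the two self-service handlers above are the replacement for the admin-only user CRUD endpoints removed earlier in this diff. As a rough sketch, the JSON bodies they deserialize look like this (field names come from ChangePasswordRequest/ChangeUsernameRequest; the serde_json usage and the /api route prefix are assumptions, not shown in this diff):

    // Hypothetical client payloads for the new self-service endpoints.
    let change_password = serde_json::json!({
        "current_password": "old-pass",
        "new_password": "new-pass"      // rejected if shorter than 4 characters
    });
    let change_username = serde_json::json!({
        "username": "new-name",         // rejected if shorter than 2 characters
        "current_password": "old-pass"
    });
    // POST them to /api/auth/password and /api/auth/username with the
    // session cookie attached.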
// ============================================================================ // ============================================================================
// System Control // System Control
// ============================================================================ // ============================================================================
View File
@@ -1,7 +1,7 @@
use axum::{ use axum::{
extract::DefaultBodyLimit, extract::DefaultBodyLimit,
middleware, middleware,
routing::{any, delete, get, patch, post, put}, routing::{any, delete, get, patch, post},
Router, Router,
}; };
use std::sync::Arc; use std::sync::Arc;
@@ -13,7 +13,7 @@ use tower_http::{
use super::audio_ws::audio_ws_handler; use super::audio_ws::audio_ws_handler;
use super::handlers; use super::handlers;
use super::ws::ws_handler; use super::ws::ws_handler;
use crate::auth::{auth_middleware, require_admin}; use crate::auth::auth_middleware;
use crate::hid::websocket::ws_hid_handler; use crate::hid::websocket::ws_hid_handler;
use crate::state::AppState; use crate::state::AppState;
@@ -37,6 +37,8 @@ pub fn create_router(state: Arc<AppState>) -> Router {
.route("/info", get(handlers::system_info)) .route("/info", get(handlers::system_info))
.route("/auth/logout", post(handlers::logout)) .route("/auth/logout", post(handlers::logout))
.route("/auth/check", get(handlers::auth_check)) .route("/auth/check", get(handlers::auth_check))
.route("/auth/password", post(handlers::change_password))
.route("/auth/username", post(handlers::change_username))
.route("/devices", get(handlers::list_devices)) .route("/devices", get(handlers::list_devices))
// WebSocket endpoint for real-time events // WebSocket endpoint for real-time events
.route("/ws", any(ws_handler)) .route("/ws", any(ws_handler))
@@ -69,8 +71,7 @@ pub fn create_router(state: Arc<AppState>) -> Router {
.route("/audio/devices", get(handlers::list_audio_devices)) .route("/audio/devices", get(handlers::list_audio_devices))
// Audio WebSocket endpoint // Audio WebSocket endpoint
.route("/ws/audio", any(audio_ws_handler)) .route("/ws/audio", any(audio_ws_handler))
// User can change their own password (handler will check ownership) ;
.route("/users/{id}/password", post(handlers::change_user_password));
// Admin-only routes (require admin privileges) // Admin-only routes (require admin privileges)
let admin_routes = Router::new() let admin_routes = Router::new()
@@ -126,6 +127,9 @@ pub fn create_router(state: Arc<AppState>) -> Router {
// Web server configuration // Web server configuration
.route("/config/web", get(handlers::config::get_web_config)) .route("/config/web", get(handlers::config::get_web_config))
.route("/config/web", patch(handlers::config::update_web_config)) .route("/config/web", patch(handlers::config::update_web_config))
// Auth configuration
.route("/config/auth", get(handlers::config::get_auth_config))
.route("/config/auth", patch(handlers::config::update_auth_config))
// System control // System control
.route("/system/restart", post(handlers::system_restart)) .route("/system/restart", post(handlers::system_restart))
// MSD (Mass Storage Device) endpoints // MSD (Mass Storage Device) endpoints
@@ -160,11 +164,6 @@ pub fn create_router(state: Arc<AppState>) -> Router {
.route("/atx/wol", post(handlers::atx_wol)) .route("/atx/wol", post(handlers::atx_wol))
// Device discovery endpoints // Device discovery endpoints
.route("/devices/atx", get(handlers::devices::list_atx_devices)) .route("/devices/atx", get(handlers::devices::list_atx_devices))
// User management endpoints
.route("/users", get(handlers::list_users))
.route("/users", post(handlers::create_user))
.route("/users/{id}", put(handlers::update_user))
.route("/users/{id}", delete(handlers::delete_user))
// Extension management endpoints // Extension management endpoints
.route("/extensions", get(handlers::extensions::list_extensions)) .route("/extensions", get(handlers::extensions::list_extensions))
.route("/extensions/{id}", get(handlers::extensions::get_extension)) .route("/extensions/{id}", get(handlers::extensions::get_extension))
@@ -201,8 +200,7 @@ pub fn create_router(state: Arc<AppState>) -> Router {
.route("/terminal/", get(handlers::terminal::terminal_index)) .route("/terminal/", get(handlers::terminal::terminal_index))
.route("/terminal/ws", get(handlers::terminal::terminal_ws)) .route("/terminal/ws", get(handlers::terminal::terminal_ws))
.route("/terminal/{*path}", get(handlers::terminal::terminal_proxy)) .route("/terminal/{*path}", get(handlers::terminal::terminal_proxy))
// Apply admin middleware to all admin routes ;
.layer(middleware::from_fn_with_state(state.clone(), require_admin));
// Combine protected routes (user + admin) // Combine protected routes (user + admin)
let protected_routes = Router::new().merge(user_routes).merge(admin_routes); let protected_routes = Router::new().merge(user_routes).merge(admin_routes);
src/webrtc/mdns.rs (new file)
View File
@@ -0,0 +1,34 @@
use webrtc::ice::mdns::MulticastDnsMode;
pub fn mdns_mode_from_env() -> Option<MulticastDnsMode> {
let raw = std::env::var("ONE_KVM_WEBRTC_MDNS_MODE").ok()?;
let value = raw.trim().to_ascii_lowercase();
if value.is_empty() {
return None;
}
match value.as_str() {
"disabled" | "off" | "false" | "0" => Some(MulticastDnsMode::Disabled),
"query" | "query_only" | "query-only" => Some(MulticastDnsMode::QueryOnly),
"gather" | "query_and_gather" | "query-and-gather" | "on" | "true" | "1" => {
Some(MulticastDnsMode::QueryAndGather)
}
_ => None,
}
}
pub fn mdns_mode() -> MulticastDnsMode {
mdns_mode_from_env().unwrap_or(MulticastDnsMode::QueryAndGather)
}
pub fn mdns_mode_label(mode: MulticastDnsMode) -> &'static str {
match mode {
MulticastDnsMode::Disabled => "disabled",
MulticastDnsMode::QueryOnly => "query_only",
MulticastDnsMode::QueryAndGather => "query_and_gather",
}
}
pub fn default_mdns_host_name(session_id: &str) -> String {
format!("{session_id}.local")
}
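Note: the module above is driven entirely by one environment variable, ONE_KVM_WEBRTC_MDNS_MODE. A quick illustration of the mapping (MulticastDnsMode's Debug/PartialEq derives are already relied on by the callers below; the std::env calls are illustrative only):

    std::env::set_var("ONE_KVM_WEBRTC_MDNS_MODE", "query-only");
    assert_eq!(mdns_mode(), MulticastDnsMode::QueryOnly);

    std::env::set_var("ONE_KVM_WEBRTC_MDNS_MODE", "off");
    assert_eq!(mdns_mode(), MulticastDnsMode::Disabled);

    // Unset, empty, or unrecognized values fall back to QueryAndGather.
    std::env::remove_var("ONE_KVM_WEBRTC_MDNS_MODE");
    assert_eq!(mdns_mode(), MulticastDnsMode::QueryAndGather);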
View File
@@ -27,6 +27,7 @@
pub mod config; pub mod config;
pub mod h265_payloader; pub mod h265_payloader;
pub(crate) mod mdns;
pub mod peer; pub mod peer;
pub mod rtp; pub mod rtp;
pub mod session; pub mod session;
@@ -42,7 +43,5 @@ pub use rtp::{H264VideoTrack, H264VideoTrackConfig, OpusAudioTrack};
pub use session::WebRtcSessionManager; pub use session::WebRtcSessionManager;
pub use signaling::{ConnectionState, IceCandidate, SdpAnswer, SdpOffer, SignalingMessage}; pub use signaling::{ConnectionState, IceCandidate, SdpAnswer, SdpOffer, SignalingMessage};
pub use universal_session::{UniversalSession, UniversalSessionConfig, UniversalSessionInfo}; pub use universal_session::{UniversalSession, UniversalSessionConfig, UniversalSessionInfo};
pub use video_track::{ pub use video_track::{UniversalVideoTrack, UniversalVideoTrackConfig, VideoCodec};
UniversalVideoTrack, UniversalVideoTrackConfig, VideoCodec, VideoTrackStats,
};
pub use webrtc_streamer::{SessionInfo, WebRtcStreamer, WebRtcStreamerConfig, WebRtcStreamerStats}; pub use webrtc_streamer::{SessionInfo, WebRtcStreamer, WebRtcStreamerConfig, WebRtcStreamerStats};
View File
@@ -5,9 +5,11 @@ use tokio::sync::{broadcast, watch, Mutex, RwLock};
use tracing::{debug, info}; use tracing::{debug, info};
use webrtc::api::interceptor_registry::register_default_interceptors; use webrtc::api::interceptor_registry::register_default_interceptors;
use webrtc::api::media_engine::MediaEngine; use webrtc::api::media_engine::MediaEngine;
use webrtc::api::setting_engine::SettingEngine;
use webrtc::api::APIBuilder; use webrtc::api::APIBuilder;
use webrtc::data_channel::data_channel_message::DataChannelMessage; use webrtc::data_channel::data_channel_message::DataChannelMessage;
use webrtc::data_channel::RTCDataChannel; use webrtc::data_channel::RTCDataChannel;
use webrtc::ice::mdns::MulticastDnsMode;
use webrtc::ice_transport::ice_candidate::RTCIceCandidate; use webrtc::ice_transport::ice_candidate::RTCIceCandidate;
use webrtc::ice_transport::ice_server::RTCIceServer; use webrtc::ice_transport::ice_server::RTCIceServer;
use webrtc::interceptor::registry::Registry; use webrtc::interceptor::registry::Registry;
@@ -17,6 +19,7 @@ use webrtc::peer_connection::sdp::session_description::RTCSessionDescription;
use webrtc::peer_connection::RTCPeerConnection; use webrtc::peer_connection::RTCPeerConnection;
use super::config::WebRtcConfig; use super::config::WebRtcConfig;
use super::mdns::{default_mdns_host_name, mdns_mode};
use super::signaling::{ConnectionState, IceCandidate, SdpAnswer, SdpOffer}; use super::signaling::{ConnectionState, IceCandidate, SdpAnswer, SdpOffer};
use super::track::{VideoTrack, VideoTrackConfig}; use super::track::{VideoTrack, VideoTrackConfig};
use crate::error::{AppError, Result}; use crate::error::{AppError, Result};
@@ -60,8 +63,17 @@ impl PeerConnection {
registry = register_default_interceptors(registry, &mut media_engine) registry = register_default_interceptors(registry, &mut media_engine)
.map_err(|e| AppError::VideoError(format!("Failed to register interceptors: {}", e)))?; .map_err(|e| AppError::VideoError(format!("Failed to register interceptors: {}", e)))?;
// Create API // Create API (with optional mDNS settings)
let mut setting_engine = SettingEngine::default();
let mode = mdns_mode();
setting_engine.set_ice_multicast_dns_mode(mode);
if mode == MulticastDnsMode::QueryAndGather {
setting_engine.set_multicast_dns_host_name(default_mdns_host_name(&session_id));
}
info!("WebRTC mDNS mode: {:?} (session {})", mode, session_id);
let api = APIBuilder::new() let api = APIBuilder::new()
.with_setting_engine(setting_engine)
.with_media_engine(media_engine) .with_media_engine(media_engine)
.with_interceptor_registry(registry) .with_interceptor_registry(registry)
.build(); .build();
@@ -418,7 +430,7 @@ pub struct PeerConnectionManager {
impl PeerConnectionManager { impl PeerConnectionManager {
/// Create a new peer connection manager /// Create a new peer connection manager
pub fn new(config: WebRtcConfig) -> Self { pub fn new(config: WebRtcConfig) -> Self {
let (frame_tx, _) = broadcast::channel(16); // Buffer size 16 for low latency let (frame_tx, _) = broadcast::channel(16);
Self { Self {
config, config,
@@ -430,7 +442,7 @@ impl PeerConnectionManager {
/// Create a new peer connection manager with HID controller /// Create a new peer connection manager with HID controller
pub fn with_hid(config: WebRtcConfig, hid: Arc<HidController>) -> Self { pub fn with_hid(config: WebRtcConfig, hid: Arc<HidController>) -> Self {
let (frame_tx, _) = broadcast::channel(16); // Buffer size 16 for low latency let (frame_tx, _) = broadcast::channel(16);
Self { Self {
config, config,
View File
@@ -42,8 +42,6 @@ pub struct H264VideoTrack {
config: H264VideoTrackConfig, config: H264VideoTrackConfig,
/// H264 payloader for manual packetization (if needed) /// H264 payloader for manual packetization (if needed)
payloader: Mutex<H264Payloader>, payloader: Mutex<H264Payloader>,
/// Statistics
stats: Mutex<H264TrackStats>,
/// Cached SPS NAL unit for injection before IDR frames /// Cached SPS NAL unit for injection before IDR frames
/// Some hardware encoders don't repeat SPS/PPS with every keyframe /// Some hardware encoders don't repeat SPS/PPS with every keyframe
cached_sps: Mutex<Option<Bytes>>, cached_sps: Mutex<Option<Bytes>>,
@@ -83,21 +81,6 @@ impl Default for H264VideoTrackConfig {
} }
} }
/// H264 track statistics
#[derive(Debug, Clone, Default)]
pub struct H264TrackStats {
/// Frames sent
pub frames_sent: u64,
/// Bytes sent
pub bytes_sent: u64,
/// Packets sent (RTP packets)
pub packets_sent: u64,
/// Key frames sent
pub keyframes_sent: u64,
/// Errors encountered
pub errors: u64,
}
impl H264VideoTrack { impl H264VideoTrack {
/// Create a new H264 video track /// Create a new H264 video track
/// ///
@@ -134,7 +117,6 @@ impl H264VideoTrack {
track, track,
config, config,
payloader: Mutex::new(H264Payloader::default()), payloader: Mutex::new(H264Payloader::default()),
stats: Mutex::new(H264TrackStats::default()),
cached_sps: Mutex::new(None), cached_sps: Mutex::new(None),
cached_pps: Mutex::new(None), cached_pps: Mutex::new(None),
} }
@@ -150,11 +132,6 @@ impl H264VideoTrack {
self.track.clone() self.track.clone()
} }
/// Get current statistics
pub async fn stats(&self) -> H264TrackStats {
self.stats.lock().await.clone()
}
/// Write an H264 encoded frame to the track /// Write an H264 encoded frame to the track
/// ///
/// The frame data should be H264 Annex B format (with start codes 0x00000001 or 0x000001). /// The frame data should be H264 Annex B format (with start codes 0x00000001 or 0x000001).
@@ -288,16 +265,6 @@ impl H264VideoTrack {
nal_count += 1; nal_count += 1;
} }
// Update statistics
if nal_count > 0 {
let mut stats = self.stats.lock().await;
stats.frames_sent += 1;
stats.bytes_sent += total_bytes;
if is_keyframe {
stats.keyframes_sent += 1;
}
}
trace!( trace!(
"Sent frame: {} NAL units, {} bytes, keyframe={}", "Sent frame: {} NAL units, {} bytes, keyframe={}",
nal_count, nal_count,
@@ -344,19 +311,6 @@ impl H264VideoTrack {
pub struct OpusAudioTrack { pub struct OpusAudioTrack {
/// The underlying WebRTC track /// The underlying WebRTC track
track: Arc<TrackLocalStaticSample>, track: Arc<TrackLocalStaticSample>,
/// Statistics
stats: Mutex<OpusTrackStats>,
}
/// Opus track statistics
#[derive(Debug, Clone, Default)]
pub struct OpusTrackStats {
/// Packets sent
pub packets_sent: u64,
/// Bytes sent
pub bytes_sent: u64,
/// Errors
pub errors: u64,
} }
impl OpusAudioTrack { impl OpusAudioTrack {
@@ -378,7 +332,6 @@ impl OpusAudioTrack {
Self { Self {
track, track,
stats: Mutex::new(OpusTrackStats::default()),
} }
} }
@@ -392,11 +345,6 @@ impl OpusAudioTrack {
self.track.clone() self.track.clone()
} }
/// Get statistics
pub async fn stats(&self) -> OpusTrackStats {
self.stats.lock().await.clone()
}
/// Write Opus encoded audio data /// Write Opus encoded audio data
/// ///
/// # Arguments /// # Arguments
@@ -417,23 +365,13 @@ impl OpusAudioTrack {
..Default::default() ..Default::default()
}; };
match self.track.write_sample(&sample).await { self.track
Ok(_) => { .write_sample(&sample)
let mut stats = self.stats.lock().await; .await
stats.packets_sent += 1; .map_err(|e| {
stats.bytes_sent += data.len() as u64;
Ok(())
}
Err(e) => {
let mut stats = self.stats.lock().await;
stats.errors += 1;
error!("Failed to write Opus sample: {}", e); error!("Failed to write Opus sample: {}", e);
Err(AppError::WebRtcError(format!( AppError::WebRtcError(format!("Failed to write audio sample: {}", e))
"Failed to write audio sample: {}", })
e
)))
}
}
} }
} }
View File
@@ -2,7 +2,7 @@
use std::sync::Arc; use std::sync::Arc;
use std::time::Instant; use std::time::Instant;
use tokio::sync::{broadcast, watch, Mutex}; use tokio::sync::{broadcast, watch};
use tracing::{debug, error, info}; use tracing::{debug, error, info};
use webrtc::rtp_transceiver::rtp_codec::RTCRtpCodecCapability; use webrtc::rtp_transceiver::rtp_codec::RTCRtpCodecCapability;
use webrtc::track::track_local::track_local_static_rtp::TrackLocalStaticRTP; use webrtc::track::track_local::track_local_static_rtp::TrackLocalStaticRTP;
@@ -87,38 +87,11 @@ pub fn audio_codec_capability() -> RTCRtpCodecCapability {
} }
} }
/// Video track statistics
#[derive(Debug, Clone, Default)]
pub struct VideoTrackStats {
/// Frames sent
pub frames_sent: u64,
/// Bytes sent
pub bytes_sent: u64,
/// Packets sent
pub packets_sent: u64,
/// Packets lost (RTCP feedback)
pub packets_lost: u64,
/// Current bitrate (bps)
pub current_bitrate: u64,
/// Round trip time (ms)
pub rtt_ms: f64,
/// Jitter (ms)
pub jitter_ms: f64,
}
/// Video track for WebRTC streaming /// Video track for WebRTC streaming
pub struct VideoTrack { pub struct VideoTrack {
config: VideoTrackConfig, config: VideoTrackConfig,
/// RTP track /// RTP track
track: Arc<TrackLocalStaticRTP>, track: Arc<TrackLocalStaticRTP>,
/// Statistics
stats: Arc<Mutex<VideoTrackStats>>,
/// Sequence number for RTP
sequence_number: Arc<Mutex<u16>>,
/// Timestamp for RTP
timestamp: Arc<Mutex<u32>>,
/// Last frame time
last_frame_time: Arc<Mutex<Option<Instant>>>,
/// Running flag /// Running flag
running: Arc<watch::Sender<bool>>, running: Arc<watch::Sender<bool>>,
} }
@@ -139,10 +112,6 @@ impl VideoTrack {
Self { Self {
config, config,
track, track,
stats: Arc::new(Mutex::new(VideoTrackStats::default())),
sequence_number: Arc::new(Mutex::new(0)),
timestamp: Arc::new(Mutex::new(0)),
last_frame_time: Arc::new(Mutex::new(None)),
running: Arc::new(running_tx), running: Arc::new(running_tx),
} }
} }
@@ -152,25 +121,17 @@ impl VideoTrack {
self.track.clone() self.track.clone()
} }
/// Get current statistics
pub async fn stats(&self) -> VideoTrackStats {
self.stats.lock().await.clone()
}
/// Start sending frames from a broadcast receiver /// Start sending frames from a broadcast receiver
pub async fn start_sending(&self, mut frame_rx: broadcast::Receiver<VideoFrame>) { pub async fn start_sending(&self, mut frame_rx: broadcast::Receiver<VideoFrame>) {
let _ = self.running.send(true); let _ = self.running.send(true);
let track = self.track.clone(); let track = self.track.clone();
let stats = self.stats.clone();
let sequence_number = self.sequence_number.clone();
let timestamp = self.timestamp.clone();
let last_frame_time = self.last_frame_time.clone();
let clock_rate = self.config.clock_rate; let clock_rate = self.config.clock_rate;
let mut running_rx = self.running.subscribe(); let mut running_rx = self.running.subscribe();
info!("Starting video track sender"); info!("Starting video track sender");
tokio::spawn(async move { tokio::spawn(async move {
let mut state = SendState::default();
loop { loop {
tokio::select! { tokio::select! {
result = frame_rx.recv() => { result = frame_rx.recv() => {
@@ -179,10 +140,7 @@ impl VideoTrack {
if let Err(e) = Self::send_frame( if let Err(e) = Self::send_frame(
&track, &track,
&frame, &frame,
&stats, &mut state,
&sequence_number,
&timestamp,
&last_frame_time,
clock_rate, clock_rate,
).await { ).await {
debug!("Failed to send frame: {}", e); debug!("Failed to send frame: {}", e);
@@ -219,29 +177,22 @@ impl VideoTrack {
async fn send_frame( async fn send_frame(
track: &TrackLocalStaticRTP, track: &TrackLocalStaticRTP,
frame: &VideoFrame, frame: &VideoFrame,
stats: &Mutex<VideoTrackStats>, state: &mut SendState,
sequence_number: &Mutex<u16>,
timestamp: &Mutex<u32>,
last_frame_time: &Mutex<Option<Instant>>,
clock_rate: u32, clock_rate: u32,
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> { ) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
// Calculate timestamp increment based on frame timing // Calculate timestamp increment based on frame timing
let now = Instant::now(); let now = Instant::now();
let mut last_time = last_frame_time.lock().await; let timestamp_increment = if let Some(last) = state.last_frame_time {
let timestamp_increment = if let Some(last) = *last_time {
let elapsed = now.duration_since(last); let elapsed = now.duration_since(last);
((elapsed.as_secs_f64() * clock_rate as f64) as u32).min(clock_rate / 10) ((elapsed.as_secs_f64() * clock_rate as f64) as u32).min(clock_rate / 10)
} else { } else {
clock_rate / 30 // Default to 30 fps clock_rate / 30 // Default to 30 fps
}; };
*last_time = Some(now); state.last_frame_time = Some(now);
drop(last_time);
// Update timestamp // Update timestamp
let mut ts = timestamp.lock().await; state.timestamp = state.timestamp.wrapping_add(timestamp_increment);
*ts = ts.wrapping_add(timestamp_increment); let _current_ts = state.timestamp;
let _current_ts = *ts;
drop(ts);
// For H.264, we need to packetize into RTP // For H.264, we need to packetize into RTP
// This is a simplified implementation - real implementation needs proper NAL unit handling // This is a simplified implementation - real implementation needs proper NAL unit handling
@@ -257,33 +208,34 @@ impl VideoTrack {
let _is_last = i == packet_count - 1; let _is_last = i == packet_count - 1;
// Get sequence number // Get sequence number
let mut seq = sequence_number.lock().await; let _seq_num = state.sequence_number;
let _seq_num = *seq; state.sequence_number = state.sequence_number.wrapping_add(1);
*seq = seq.wrapping_add(1);
drop(seq);
// Build RTP packet payload // Build RTP packet payload
// For simplicity, just send raw data - real implementation needs proper RTP packetization // For simplicity, just send raw data - real implementation needs proper RTP packetization
let payload = data[start..end].to_vec(); let payload = &data[start..end];
bytes_sent += payload.len() as u64; bytes_sent += payload.len() as u64;
// Write sample (the track handles RTP header construction) // Write sample (the track handles RTP header construction)
if let Err(e) = track.write(&payload).await { if let Err(e) = track.write(payload).await {
error!("Failed to write RTP packet: {}", e); error!("Failed to write RTP packet: {}", e);
return Err(e.into()); return Err(e.into());
} }
} }
// Update stats let _ = bytes_sent;
let mut s = stats.lock().await;
s.frames_sent += 1;
s.bytes_sent += bytes_sent;
s.packets_sent += packet_count as u64;
Ok(()) Ok(())
} }
} }
#[derive(Debug, Default)]
struct SendState {
sequence_number: u16,
timestamp: u32,
last_frame_time: Option<Instant>,
}
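Note: folding sequence_number, timestamp, and last_frame_time into this plain SendState, owned by the single sender task, removes three Mutex lock round-trips per packet; only that task ever touches the state, so no synchronization is needed.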
/// Audio track configuration /// Audio track configuration
#[derive(Debug, Clone)] #[derive(Debug, Clone)]
pub struct AudioTrackConfig { pub struct AudioTrackConfig {
View File
@@ -123,15 +123,6 @@ impl Default for UnifiedVideoTrackConfig {
} }
} }
/// Unified video track statistics
#[derive(Debug, Clone, Default)]
pub struct UnifiedVideoTrackStats {
pub frames_sent: u64,
pub bytes_sent: u64,
pub keyframes_sent: u64,
pub errors: u64,
}
/// Cached NAL parameter sets for H264 /// Cached NAL parameter sets for H264
struct H264ParameterSets { struct H264ParameterSets {
sps: Option<Bytes>, sps: Option<Bytes>,
@@ -179,8 +170,6 @@ pub struct UnifiedVideoTrack {
track: Arc<TrackLocalStaticSample>, track: Arc<TrackLocalStaticSample>,
/// Track configuration /// Track configuration
config: UnifiedVideoTrackConfig, config: UnifiedVideoTrackConfig,
/// Statistics
stats: Mutex<UnifiedVideoTrackStats>,
/// H264 parameter set cache /// H264 parameter set cache
h264_params: Mutex<H264ParameterSets>, h264_params: Mutex<H264ParameterSets>,
/// H265 parameter set cache /// H265 parameter set cache
@@ -207,7 +196,6 @@ impl UnifiedVideoTrack {
Self { Self {
track, track,
config, config,
stats: Mutex::new(UnifiedVideoTrackStats::default()),
h264_params: Mutex::new(H264ParameterSets { sps: None, pps: None }), h264_params: Mutex::new(H264ParameterSets { sps: None, pps: None }),
h265_params: Mutex::new(H265ParameterSets { vps: None, sps: None, pps: None }), h265_params: Mutex::new(H265ParameterSets { vps: None, sps: None, pps: None }),
} }
@@ -277,9 +265,6 @@ impl UnifiedVideoTrack {
} }
/// Get statistics /// Get statistics
pub async fn stats(&self) -> UnifiedVideoTrackStats {
self.stats.lock().await.clone()
}
/// Write an encoded frame to the track /// Write an encoded frame to the track
/// ///
@@ -504,13 +489,6 @@ impl UnifiedVideoTrack {
debug!("VP8 write_sample failed: {}", e); debug!("VP8 write_sample failed: {}", e);
} }
let mut stats = self.stats.lock().await;
stats.frames_sent += 1;
stats.bytes_sent += data.len() as u64;
if is_keyframe {
stats.keyframes_sent += 1;
}
trace!("VP8 frame: {} bytes, keyframe={}", data.len(), is_keyframe); trace!("VP8 frame: {} bytes, keyframe={}", data.len(), is_keyframe);
Ok(()) Ok(())
} }
@@ -531,13 +509,6 @@ impl UnifiedVideoTrack {
debug!("VP9 write_sample failed: {}", e); debug!("VP9 write_sample failed: {}", e);
} }
let mut stats = self.stats.lock().await;
stats.frames_sent += 1;
stats.bytes_sent += data.len() as u64;
if is_keyframe {
stats.keyframes_sent += 1;
}
trace!("VP9 frame: {} bytes, keyframe={}", data.len(), is_keyframe); trace!("VP9 frame: {} bytes, keyframe={}", data.len(), is_keyframe);
Ok(()) Ok(())
} }
@@ -572,15 +543,6 @@ impl UnifiedVideoTrack {
total_bytes += nal_data.len() as u64; total_bytes += nal_data.len() as u64;
} }
if nal_count > 0 {
let mut stats = self.stats.lock().await;
stats.frames_sent += 1;
stats.bytes_sent += total_bytes;
if is_keyframe {
stats.keyframes_sent += 1;
}
}
trace!("Sent {} NAL units, {} bytes, keyframe={}", nal_count, total_bytes, is_keyframe); trace!("Sent {} NAL units, {} bytes, keyframe={}", nal_count, total_bytes, is_keyframe);
Ok(()) Ok(())
} }
View File
@@ -4,13 +4,16 @@
//! Replaces the H264-only H264Session with a more flexible implementation. //! Replaces the H264-only H264Session with a more flexible implementation.
use std::sync::Arc; use std::sync::Arc;
use tokio::sync::{broadcast, watch, Mutex, RwLock}; use std::time::{Duration, Instant};
use tracing::{debug, info, trace, warn}; use tokio::sync::{watch, Mutex, RwLock};
use tracing::{debug, info, warn};
use webrtc::api::interceptor_registry::register_default_interceptors; use webrtc::api::interceptor_registry::register_default_interceptors;
use webrtc::api::media_engine::MediaEngine; use webrtc::api::media_engine::MediaEngine;
use webrtc::api::setting_engine::SettingEngine;
use webrtc::api::APIBuilder; use webrtc::api::APIBuilder;
use webrtc::data_channel::data_channel_message::DataChannelMessage; use webrtc::data_channel::data_channel_message::DataChannelMessage;
use webrtc::data_channel::RTCDataChannel; use webrtc::data_channel::RTCDataChannel;
use webrtc::ice::mdns::MulticastDnsMode;
use webrtc::ice_transport::ice_candidate::RTCIceCandidate; use webrtc::ice_transport::ice_candidate::RTCIceCandidate;
use webrtc::ice_transport::ice_server::RTCIceServer; use webrtc::ice_transport::ice_server::RTCIceServer;
use webrtc::interceptor::registry::Registry; use webrtc::interceptor::registry::Registry;
@@ -24,17 +27,21 @@ use webrtc::rtp_transceiver::rtp_codec::{
use webrtc::rtp_transceiver::RTCPFeedback; use webrtc::rtp_transceiver::RTCPFeedback;
use super::config::WebRtcConfig; use super::config::WebRtcConfig;
use super::mdns::{default_mdns_host_name, mdns_mode};
use super::rtp::OpusAudioTrack; use super::rtp::OpusAudioTrack;
use super::signaling::{ConnectionState, IceCandidate, SdpAnswer, SdpOffer}; use super::signaling::{ConnectionState, IceCandidate, SdpAnswer, SdpOffer};
use super::video_track::{UniversalVideoTrack, UniversalVideoTrackConfig, VideoCodec}; use super::video_track::{UniversalVideoTrack, UniversalVideoTrackConfig, VideoCodec};
use crate::audio::OpusFrame; use crate::audio::OpusFrame;
use crate::error::{AppError, Result}; use crate::error::{AppError, Result};
use crate::events::{EventBus, SystemEvent};
use crate::hid::datachannel::{parse_hid_message, HidChannelEvent}; use crate::hid::datachannel::{parse_hid_message, HidChannelEvent};
use crate::hid::HidController; use crate::hid::HidController;
use crate::video::encoder::registry::VideoEncoderType; use crate::video::encoder::registry::VideoEncoderType;
use crate::video::encoder::BitratePreset; use crate::video::encoder::BitratePreset;
use crate::video::format::{PixelFormat, Resolution}; use crate::video::format::{PixelFormat, Resolution};
use crate::video::shared_video_pipeline::EncodedVideoFrame; use crate::video::shared_video_pipeline::EncodedVideoFrame;
use std::sync::atomic::AtomicBool;
use webrtc::ice_transport::ice_gatherer_state::RTCIceGathererState;
/// H.265/HEVC MIME type (RFC 7798) /// H.265/HEVC MIME type (RFC 7798)
const MIME_TYPE_H265: &str = "video/H265"; const MIME_TYPE_H265: &str = "video/H265";
@@ -117,6 +124,8 @@ pub struct UniversalSession {
ice_candidates: Arc<Mutex<Vec<IceCandidate>>>, ice_candidates: Arc<Mutex<Vec<IceCandidate>>>,
/// HID controller reference /// HID controller reference
hid_controller: Option<Arc<HidController>>, hid_controller: Option<Arc<HidController>>,
/// Event bus for WebRTC signaling events (optional)
event_bus: Option<Arc<EventBus>>,
/// Video frame receiver handle /// Video frame receiver handle
video_receiver_handle: Mutex<Option<tokio::task::JoinHandle<()>>>, video_receiver_handle: Mutex<Option<tokio::task::JoinHandle<()>>>,
/// Audio frame receiver handle /// Audio frame receiver handle
@@ -127,7 +136,11 @@ pub struct UniversalSession {
impl UniversalSession { impl UniversalSession {
/// Create a new universal WebRTC session /// Create a new universal WebRTC session
pub async fn new(config: UniversalSessionConfig, session_id: String) -> Result<Self> { pub async fn new(
config: UniversalSessionConfig,
session_id: String,
event_bus: Option<Arc<EventBus>>,
) -> Result<Self> {
info!( info!(
"Creating {} session: {} @ {}x{} (audio={})", "Creating {} session: {} @ {}x{} (audio={})",
config.codec, config.codec,
@@ -243,8 +256,17 @@ impl UniversalSession {
registry = register_default_interceptors(registry, &mut media_engine) registry = register_default_interceptors(registry, &mut media_engine)
.map_err(|e| AppError::VideoError(format!("Failed to register interceptors: {}", e)))?; .map_err(|e| AppError::VideoError(format!("Failed to register interceptors: {}", e)))?;
// Create API // Create API (with optional mDNS settings)
let mut setting_engine = SettingEngine::default();
let mode = mdns_mode();
setting_engine.set_ice_multicast_dns_mode(mode);
if mode == MulticastDnsMode::QueryAndGather {
setting_engine.set_multicast_dns_host_name(default_mdns_host_name(&session_id));
}
info!("WebRTC mDNS mode: {:?} (session {})", mode, session_id);
let api = APIBuilder::new() let api = APIBuilder::new()
.with_setting_engine(setting_engine)
.with_media_engine(media_engine) .with_media_engine(media_engine)
.with_interceptor_registry(registry) .with_interceptor_registry(registry)
.build(); .build();
@@ -321,6 +343,7 @@ impl UniversalSession {
state_rx, state_rx,
ice_candidates: Arc::new(Mutex::new(vec![])), ice_candidates: Arc::new(Mutex::new(vec![])),
hid_controller: None, hid_controller: None,
event_bus,
video_receiver_handle: Mutex::new(None), video_receiver_handle: Mutex::new(None),
audio_receiver_handle: Mutex::new(None), audio_receiver_handle: Mutex::new(None),
fps: config.fps, fps: config.fps,
@@ -337,6 +360,7 @@ impl UniversalSession {
let state = self.state.clone(); let state = self.state.clone();
let session_id = self.session_id.clone(); let session_id = self.session_id.clone();
let codec = self.codec; let codec = self.codec;
let event_bus = self.event_bus.clone();
// Connection state change handler // Connection state change handler
self.pc self.pc
@@ -372,33 +396,57 @@ impl UniversalSession {
// ICE gathering state handler // ICE gathering state handler
let session_id_gather = self.session_id.clone(); let session_id_gather = self.session_id.clone();
let event_bus_gather = event_bus.clone();
self.pc self.pc
.on_ice_gathering_state_change(Box::new(move |state| { .on_ice_gathering_state_change(Box::new(move |state| {
let session_id = session_id_gather.clone(); let session_id = session_id_gather.clone();
let event_bus = event_bus_gather.clone();
Box::pin(async move { Box::pin(async move {
debug!("[ICE] Session {} gathering state: {:?}", session_id, state); if matches!(state, RTCIceGathererState::Complete) {
if let Some(bus) = event_bus.as_ref() {
bus.publish(SystemEvent::WebRTCIceComplete { session_id });
}
}
}) })
})); }));
// ICE candidate handler // ICE candidate handler
let ice_candidates = self.ice_candidates.clone(); let ice_candidates = self.ice_candidates.clone();
let session_id_candidate = self.session_id.clone();
let event_bus_candidate = event_bus.clone();
self.pc self.pc
.on_ice_candidate(Box::new(move |candidate: Option<RTCIceCandidate>| { .on_ice_candidate(Box::new(move |candidate: Option<RTCIceCandidate>| {
let ice_candidates = ice_candidates.clone(); let ice_candidates = ice_candidates.clone();
let session_id = session_id_candidate.clone();
let event_bus = event_bus_candidate.clone();
Box::pin(async move { Box::pin(async move {
if let Some(c) = candidate { if let Some(c) = candidate {
let candidate_str = c.to_json().map(|j| j.candidate).unwrap_or_default(); let candidate_json = c.to_json().ok();
debug!("ICE candidate: {}", candidate_str); let candidate_str = candidate_json
.as_ref()
.map(|j| j.candidate.clone())
.unwrap_or_default();
let candidate = IceCandidate {
candidate: candidate_str,
sdp_mid: candidate_json.as_ref().and_then(|j| j.sdp_mid.clone()),
sdp_mline_index: candidate_json.as_ref().and_then(|j| j.sdp_mline_index),
username_fragment: candidate_json
.as_ref()
.and_then(|j| j.username_fragment.clone()),
};
let mut candidates = ice_candidates.lock().await; let mut candidates = ice_candidates.lock().await;
candidates.push(IceCandidate { candidates.push(candidate.clone());
candidate: candidate_str, drop(candidates);
sdp_mid: c.to_json().ok().and_then(|j| j.sdp_mid),
sdp_mline_index: c.to_json().ok().and_then(|j| j.sdp_mline_index), if let Some(bus) = event_bus.as_ref() {
username_fragment: None, bus.publish(SystemEvent::WebRTCIceCandidate {
session_id,
candidate,
}); });
} }
}
}) })
})); }));
@@ -488,13 +536,11 @@ impl UniversalSession {
/// ///
/// The `on_connected` callback is called when ICE connection is established, /// The `on_connected` callback is called when ICE connection is established,
/// allowing the caller to request a keyframe at the right time. /// allowing the caller to request a keyframe at the right time.
pub async fn start_from_video_pipeline<F>( pub async fn start_from_video_pipeline(
&self, &self,
mut frame_rx: broadcast::Receiver<EncodedVideoFrame>, mut frame_rx: tokio::sync::mpsc::Receiver<std::sync::Arc<EncodedVideoFrame>>,
on_connected: F, request_keyframe: Arc<dyn Fn() + Send + Sync + 'static>,
) where ) {
F: FnOnce() + Send + 'static,
{
info!( info!(
"Starting {} session {} with shared encoder", "Starting {} session {} with shared encoder",
self.codec, self.session_id self.codec, self.session_id
@@ -505,6 +551,7 @@ impl UniversalSession {
let session_id = self.session_id.clone(); let session_id = self.session_id.clone();
let _fps = self.fps; let _fps = self.fps;
let expected_codec = self.codec; let expected_codec = self.codec;
let send_in_flight = Arc::new(AtomicBool::new(false));
let handle = tokio::spawn(async move { let handle = tokio::spawn(async move {
info!( info!(
@@ -536,7 +583,10 @@ impl UniversalSession {
); );
// Request keyframe now that connection is established // Request keyframe now that connection is established
on_connected(); request_keyframe();
let mut waiting_for_keyframe = true;
let mut last_sequence: Option<u64> = None;
let mut last_keyframe_request = Instant::now() - Duration::from_secs(1);
let mut frames_sent: u64 = 0; let mut frames_sent: u64 = 0;
@@ -556,8 +606,14 @@ impl UniversalSession {
} }
result = frame_rx.recv() => { result = frame_rx.recv() => {
match result { let encoded_frame = match result {
Ok(encoded_frame) => { Some(frame) => frame,
None => {
info!("Video frame channel closed for session {}", session_id);
break;
}
};
// Verify codec matches // Verify codec matches
let frame_codec = match encoded_frame.codec { let frame_codec = match encoded_frame.codec {
VideoEncoderType::H264 => VideoEncoderType::H264, VideoEncoderType::H264 => VideoEncoderType::H264,
@@ -567,7 +623,6 @@ impl UniversalSession {
}; };
if frame_codec != expected_codec { if frame_codec != expected_codec {
trace!("Skipping frame with codec {:?}, expected {:?}", frame_codec, expected_codec);
continue; continue;
} }
@@ -584,36 +639,48 @@ impl UniversalSession {
} }
} }
// Send encoded frame via RTP // Ensure decoder starts from a keyframe and recover on gaps.
if let Err(e) = video_track let mut gap_detected = false;
if let Some(prev) = last_sequence {
if encoded_frame.sequence > prev.saturating_add(1) {
gap_detected = true;
}
}
if waiting_for_keyframe || gap_detected {
if encoded_frame.is_keyframe {
waiting_for_keyframe = false;
} else {
if gap_detected {
waiting_for_keyframe = true;
}
let now = Instant::now();
if now.duration_since(last_keyframe_request)
>= Duration::from_millis(200)
{
request_keyframe();
last_keyframe_request = now;
}
continue;
}
}
let _ = send_in_flight;
// Send encoded frame via RTP // Send encoded frame via RTP (send_in_flight is a placeholder; drop-if-busy is not wired up yet)
let send_result = video_track
.write_frame_bytes( .write_frame_bytes(
encoded_frame.data.clone(), encoded_frame.data.clone(),
encoded_frame.is_keyframe, encoded_frame.is_keyframe,
) )
.await .await;
{ let _ = send_in_flight;
if frames_sent % 100 == 0 {
debug!("Failed to write frame to track: {}", e); if send_result.is_err() {
} // Intentionally silent: failed sends are dropped without logging
} else { } else {
frames_sent += 1; frames_sent += 1;
last_sequence = Some(encoded_frame.sequence);
// Log successful H265 frame send
if expected_codec == VideoEncoderType::H265 && (encoded_frame.is_keyframe || frames_sent % 30 == 0) {
debug!(
"[Session-H265] Frame #{} sent successfully",
frames_sent
);
}
}
}
Err(broadcast::error::RecvError::Lagged(n)) => {
debug!("Session {} lagged by {} frames", session_id, n);
}
Err(broadcast::error::RecvError::Closed) => {
info!("Video frame channel closed for session {}", session_id);
break;
}
} }
} }
} }
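Note: the reworked send loop above enforces two invariants per subscriber: the decoder never sees a delta frame before its first keyframe, and a sequence gap re-arms that wait while keyframe requests are rate-limited to one per 200 ms. The gate is small enough to sketch in isolation (types here are illustrative, not from the codebase):

    /// Illustrative keyframe gate mirroring the logic above.
    struct KeyframeGate {
        waiting_for_keyframe: bool,
        last_sequence: Option<u64>,
    }

    impl KeyframeGate {
        fn new() -> Self {
            Self { waiting_for_keyframe: true, last_sequence: None }
        }

        /// Returns true if the frame may be forwarded to the video track.
        fn admit(&mut self, sequence: u64, is_keyframe: bool) -> bool {
            // A jump in sequence numbers means frames were dropped upstream.
            let gap = self
                .last_sequence
                .is_some_and(|prev| sequence > prev.saturating_add(1));
            if gap {
                self.waiting_for_keyframe = true;
            }
            if self.waiting_for_keyframe && !is_keyframe {
                return false; // caller requests a keyframe, rate-limited
            }
            self.waiting_for_keyframe = false;
            self.last_sequence = Some(sequence);
            true
        }
    }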
@@ -629,7 +696,10 @@ impl UniversalSession {
} }
/// Start receiving Opus audio frames /// Start receiving Opus audio frames
pub async fn start_audio_from_opus(&self, mut opus_rx: broadcast::Receiver<OpusFrame>) { pub async fn start_audio_from_opus(
&self,
mut opus_rx: tokio::sync::watch::Receiver<Option<std::sync::Arc<OpusFrame>>>,
) {
let audio_track = match &self.audio_track { let audio_track = match &self.audio_track {
Some(track) => track.clone(), Some(track) => track.clone(),
None => { None => {
@@ -684,9 +754,17 @@ impl UniversalSession {
} }
} }
result = opus_rx.recv() => { result = opus_rx.changed() => {
match result { if result.is_err() {
Ok(opus_frame) => { info!("Opus channel closed for session {}", session_id);
break;
}
let opus_frame = match opus_rx.borrow().clone() {
Some(frame) => frame,
None => continue,
};
// 20ms at 48kHz = 960 samples // 20ms at 48kHz = 960 samples
let samples = 960u32; let samples = 960u32;
if let Err(e) = audio_track.write_packet(&opus_frame.data, samples).await { if let Err(e) = audio_track.write_packet(&opus_frame.data, samples).await {
@@ -697,15 +775,6 @@ impl UniversalSession {
packets_sent += 1; packets_sent += 1;
} }
} }
Err(broadcast::error::RecvError::Lagged(n)) => {
warn!("Session {} audio lagged by {} packets", session_id, n);
}
Err(broadcast::error::RecvError::Closed) => {
info!("Opus channel closed for session {}", session_id);
break;
}
}
}
} }
} }
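Note: audio delivery switches from broadcast::Receiver (a bounded queue that can return Lagged) to watch::Receiver (a latest-value cell), so a slow session simply reads the newest Opus frame instead of handling lag errors. The pattern in isolation (names illustrative; assumes tokio with the rt and macros features):

    use std::sync::Arc;
    use tokio::sync::watch;

    #[tokio::main]
    async fn main() {
        // The producer publishes only the most recent frame; None = nothing yet.
        let (tx, mut rx) = watch::channel::<Option<Arc<Vec<u8>>>>(None);

        tokio::spawn(async move {
            for i in 0u8..3 {
                tx.send(Some(Arc::new(vec![i; 4]))).ok();
            }
        });

        // changed() waits for a new value; borrow() reads it without queueing,
        // so there is no broadcast-style Lagged error to handle.
        while rx.changed().await.is_ok() {
            if let Some(frame) = rx.borrow().clone() {
                println!("latest frame: {:?}", frame);
            }
        }
    }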
View File
@@ -186,19 +186,6 @@ impl UniversalVideoTrackConfig {
} }
} }
/// Track statistics
#[derive(Debug, Clone, Default)]
pub struct VideoTrackStats {
/// Frames sent
pub frames_sent: u64,
/// Bytes sent
pub bytes_sent: u64,
/// Keyframes sent
pub keyframes_sent: u64,
/// Errors
pub errors: u64,
}
/// Track type wrapper to support different underlying track implementations /// Track type wrapper to support different underlying track implementations
enum TrackType { enum TrackType {
/// Sample-based track with built-in payloader (H264, VP8, VP9) /// Sample-based track with built-in payloader (H264, VP8, VP9)
@@ -227,8 +214,6 @@ pub struct UniversalVideoTrack {
codec: VideoCodec, codec: VideoCodec,
/// Configuration /// Configuration
config: UniversalVideoTrackConfig, config: UniversalVideoTrackConfig,
/// Statistics
stats: Mutex<VideoTrackStats>,
/// H265 RTP state (only used for H265) /// H265 RTP state (only used for H265)
h265_state: Option<Mutex<H265RtpState>>, h265_state: Option<Mutex<H265RtpState>>,
} }
@@ -277,7 +262,6 @@ impl UniversalVideoTrack {
track, track,
codec: config.codec, codec: config.codec,
config, config,
stats: Mutex::new(VideoTrackStats::default()),
h265_state, h265_state,
} }
} }
@@ -301,9 +285,6 @@ impl UniversalVideoTrack {
} }
/// Get current statistics /// Get current statistics
pub async fn stats(&self) -> VideoTrackStats {
self.stats.lock().await.clone()
}
/// Write an encoded frame to the track /// Write an encoded frame to the track
/// ///
@@ -332,7 +313,7 @@ impl UniversalVideoTrack {
/// ///
/// Sends the entire Annex B frame as a single Sample to allow the /// Sends the entire Annex B frame as a single Sample to allow the
/// H264Payloader to aggregate SPS+PPS into STAP-A packets. /// H264Payloader to aggregate SPS+PPS into STAP-A packets.
async fn write_h264_frame(&self, data: Bytes, is_keyframe: bool) -> Result<()> { async fn write_h264_frame(&self, data: Bytes, _is_keyframe: bool) -> Result<()> {
// Send entire Annex B frame as one Sample // Send entire Annex B frame as one Sample
// The H264Payloader in rtp crate will: // The H264Payloader in rtp crate will:
// 1. Parse NAL units from Annex B format // 1. Parse NAL units from Annex B format
@@ -340,7 +321,6 @@ impl UniversalVideoTrack {
// 3. Aggregate SPS+PPS+IDR into STAP-A when possible // 3. Aggregate SPS+PPS+IDR into STAP-A when possible
// 4. Fragment large NALs using FU-A // 4. Fragment large NALs using FU-A
let frame_duration = Duration::from_micros(1_000_000 / self.config.fps.max(1) as u64); let frame_duration = Duration::from_micros(1_000_000 / self.config.fps.max(1) as u64);
let data_len = data.len();
let sample = Sample { let sample = Sample {
data, data,
duration: frame_duration, duration: frame_duration,
@@ -358,14 +338,6 @@ impl UniversalVideoTrack {
} }
} }
// Update stats
let mut stats = self.stats.lock().await;
stats.frames_sent += 1;
stats.bytes_sent += data_len as u64;
if is_keyframe {
stats.keyframes_sent += 1;
}
Ok(()) Ok(())
} }
@@ -379,11 +351,10 @@ impl UniversalVideoTrack {
} }
/// Write VP8 frame /// Write VP8 frame
async fn write_vp8_frame(&self, data: Bytes, is_keyframe: bool) -> Result<()> { async fn write_vp8_frame(&self, data: Bytes, _is_keyframe: bool) -> Result<()> {
// VP8 frames are sent directly without NAL parsing // VP8 frames are sent directly without NAL parsing
// Calculate frame duration based on configured FPS // Calculate frame duration based on configured FPS
let frame_duration = Duration::from_micros(1_000_000 / self.config.fps.max(1) as u64); let frame_duration = Duration::from_micros(1_000_000 / self.config.fps.max(1) as u64);
let data_len = data.len();
let sample = Sample { let sample = Sample {
data, data,
duration: frame_duration, duration: frame_duration,
@@ -401,23 +372,14 @@ impl UniversalVideoTrack {
} }
} }
// Update stats
let mut stats = self.stats.lock().await;
stats.frames_sent += 1;
stats.bytes_sent += data_len as u64;
if is_keyframe {
stats.keyframes_sent += 1;
}
Ok(()) Ok(())
} }
/// Write VP9 frame /// Write VP9 frame
async fn write_vp9_frame(&self, data: Bytes, is_keyframe: bool) -> Result<()> { async fn write_vp9_frame(&self, data: Bytes, _is_keyframe: bool) -> Result<()> {
// VP9 frames are sent directly without NAL parsing // VP9 frames are sent directly without NAL parsing
// Calculate frame duration based on configured FPS // Calculate frame duration based on configured FPS
let frame_duration = Duration::from_micros(1_000_000 / self.config.fps.max(1) as u64); let frame_duration = Duration::from_micros(1_000_000 / self.config.fps.max(1) as u64);
let data_len = data.len();
let sample = Sample { let sample = Sample {
data, data,
duration: frame_duration, duration: frame_duration,
@@ -435,19 +397,11 @@ impl UniversalVideoTrack {
} }
} }
// Update stats
let mut stats = self.stats.lock().await;
stats.frames_sent += 1;
stats.bytes_sent += data_len as u64;
if is_keyframe {
stats.keyframes_sent += 1;
}
Ok(()) Ok(())
} }
/// Send H265 NAL units via custom H265Payloader /// Send H265 NAL units via custom H265Payloader
async fn send_h265_rtp(&self, payload: Bytes, is_keyframe: bool) -> Result<()> { async fn send_h265_rtp(&self, payload: Bytes, _is_keyframe: bool) -> Result<()> {
let rtp_track = match &self.track { let rtp_track = match &self.track {
TrackType::Rtp(t) => t, TrackType::Rtp(t) => t,
TrackType::Sample(_) => { TrackType::Sample(_) => {
@@ -486,8 +440,6 @@ impl UniversalVideoTrack {
(payloads, timestamp, seq_start, num_payloads) (payloads, timestamp, seq_start, num_payloads)
}; // Lock released here, before network I/O }; // Lock released here, before network I/O
let mut total_bytes = 0u64;
// Send RTP packets without holding the lock // Send RTP packets without holding the lock
for (i, payload_data) in payloads.into_iter().enumerate() { for (i, payload_data) in payloads.into_iter().enumerate() {
let seq = seq_start.wrapping_add(i as u16); let seq = seq_start.wrapping_add(i as u16);
@@ -513,15 +465,6 @@ impl UniversalVideoTrack {
trace!("H265 write_rtp failed: {}", e); trace!("H265 write_rtp failed: {}", e);
} }
total_bytes += payload_data.len() as u64;
}
// Update stats
let mut stats = self.stats.lock().await;
stats.frames_sent += 1;
stats.bytes_sent += total_bytes;
if is_keyframe {
stats.keyframes_sent += 1;
} }
Ok(()) Ok(())
View File
@@ -15,10 +15,6 @@
//! | +-- VP8 Encoder (hardware only - VAAPI) //! | +-- VP8 Encoder (hardware only - VAAPI)
//! | +-- VP9 Encoder (hardware only - VAAPI) //! | +-- VP9 Encoder (hardware only - VAAPI)
//! | //! |
//! +-- Audio Pipeline
//! | +-- SharedAudioPipeline
//! | +-- OpusEncoder
//! |
//! +-- UniversalSession[] (video + audio tracks + DataChannel) //! +-- UniversalSession[] (video + audio tracks + DataChannel)
//! +-- UniversalVideoTrack (H264/H265/VP8/VP9) //! +-- UniversalVideoTrack (H264/H265/VP8/VP9)
//! +-- Audio Track (RTP/Opus) //! +-- Audio Track (RTP/Opus)
@@ -29,23 +25,23 @@
//! //!
//! - **Single encoder**: All sessions share one video encoder //! - **Single encoder**: All sessions share one video encoder
//! - **Multi-codec support**: H264, H265, VP8, VP9 //! - **Multi-codec support**: H264, H265, VP8, VP9
//! - **Audio support**: Opus audio streaming via SharedAudioPipeline //! - **Audio support**: Opus audio streaming via AudioController
//! - **HID via DataChannel**: Keyboard/mouse events through WebRTC DataChannel //! - **HID via DataChannel**: Keyboard/mouse events through WebRTC DataChannel
use std::collections::HashMap; use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::Arc; use std::sync::Arc;
use tokio::sync::{broadcast, RwLock}; use tokio::sync::RwLock;
use tracing::{debug, error, info, trace, warn}; use tracing::{debug, info, trace, warn};
use crate::audio::shared_pipeline::{SharedAudioPipeline, SharedAudioPipelineConfig};
use crate::audio::{AudioController, OpusFrame}; use crate::audio::{AudioController, OpusFrame};
use crate::events::EventBus;
use crate::error::{AppError, Result}; use crate::error::{AppError, Result};
use crate::hid::HidController; use crate::hid::HidController;
use crate::video::encoder::registry::EncoderBackend; use crate::video::encoder::registry::EncoderBackend;
use crate::video::encoder::registry::VideoEncoderType; use crate::video::encoder::registry::VideoEncoderType;
use crate::video::encoder::VideoCodecType; use crate::video::encoder::VideoCodecType;
use crate::video::format::{PixelFormat, Resolution}; use crate::video::format::{PixelFormat, Resolution};
use crate::video::frame::VideoFrame;
use crate::video::shared_video_pipeline::{ use crate::video::shared_video_pipeline::{
SharedVideoPipeline, SharedVideoPipelineConfig, SharedVideoPipelineStats, SharedVideoPipeline, SharedVideoPipelineConfig, SharedVideoPipelineStats,
}; };
@@ -91,6 +87,14 @@ impl Default for WebRtcStreamerConfig {
} }
} }
/// Capture device configuration for direct capture pipeline
#[derive(Debug, Clone)]
pub struct CaptureDeviceConfig {
pub device_path: PathBuf,
pub buffer_count: u32,
pub jpeg_quality: u8,
}
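Note: this struct is what ensure_video_pipeline() below feeds into SharedVideoPipeline::start_with_device(). A hypothetical construction (all values illustrative; /dev/video0 is not taken from this diff):

    let capture = CaptureDeviceConfig {
        device_path: std::path::PathBuf::from("/dev/video0"),
        buffer_count: 4,
        jpeg_quality: 80,
    };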
/// WebRTC streamer statistics /// WebRTC streamer statistics
#[derive(Debug, Clone, Default)] #[derive(Debug, Clone, Default)]
pub struct WebRtcStreamerStats { pub struct WebRtcStreamerStats {
@@ -102,30 +106,12 @@ pub struct WebRtcStreamerStats {
pub video_pipeline: Option<VideoPipelineStats>, pub video_pipeline: Option<VideoPipelineStats>,
/// Audio enabled /// Audio enabled
pub audio_enabled: bool, pub audio_enabled: bool,
/// Audio pipeline stats (if available)
pub audio_pipeline: Option<AudioPipelineStats>,
} }
/// Video pipeline statistics /// Video pipeline statistics
#[derive(Debug, Clone, Default)] #[derive(Debug, Clone, Default)]
pub struct VideoPipelineStats { pub struct VideoPipelineStats {
pub frames_encoded: u64,
pub frames_dropped: u64,
pub bytes_encoded: u64,
pub keyframes_encoded: u64,
pub avg_encode_time_ms: f32,
pub current_fps: f32, pub current_fps: f32,
pub subscribers: u64,
}
/// Audio pipeline statistics
#[derive(Debug, Clone, Default)]
pub struct AudioPipelineStats {
pub frames_encoded: u64,
pub frames_dropped: u64,
pub bytes_encoded: u64,
pub avg_encode_time_ms: f32,
pub subscribers: u64,
} }
/// Session info for listing /// Session info for listing
@@ -151,20 +137,21 @@ pub struct WebRtcStreamer {
video_pipeline: RwLock<Option<Arc<SharedVideoPipeline>>>, video_pipeline: RwLock<Option<Arc<SharedVideoPipeline>>>,
/// All sessions (unified management) /// All sessions (unified management)
sessions: Arc<RwLock<HashMap<String, Arc<UniversalSession>>>>, sessions: Arc<RwLock<HashMap<String, Arc<UniversalSession>>>>,
/// Video frame source /// Capture device configuration for direct capture mode
video_frame_tx: RwLock<Option<broadcast::Sender<VideoFrame>>>, capture_device: RwLock<Option<CaptureDeviceConfig>>,
// === Audio === // === Audio ===
/// Audio enabled flag /// Audio enabled flag
audio_enabled: RwLock<bool>, audio_enabled: RwLock<bool>,
/// Shared audio pipeline for Opus encoding
audio_pipeline: RwLock<Option<Arc<SharedAudioPipeline>>>,
/// Audio controller reference /// Audio controller reference
audio_controller: RwLock<Option<Arc<AudioController>>>, audio_controller: RwLock<Option<Arc<AudioController>>>,
// === Controllers === // === Controllers ===
/// HID controller for DataChannel /// HID controller for DataChannel
hid_controller: RwLock<Option<Arc<HidController>>>, hid_controller: RwLock<Option<Arc<HidController>>>,
/// Event bus for WebRTC signaling (optional)
events: RwLock<Option<Arc<EventBus>>>,
} }
impl WebRtcStreamer { impl WebRtcStreamer {
@@ -180,11 +167,11 @@ impl WebRtcStreamer {
video_codec: RwLock::new(config.video_codec), video_codec: RwLock::new(config.video_codec),
video_pipeline: RwLock::new(None), video_pipeline: RwLock::new(None),
sessions: Arc::new(RwLock::new(HashMap::new())), sessions: Arc::new(RwLock::new(HashMap::new())),
video_frame_tx: RwLock::new(None), capture_device: RwLock::new(None),
audio_enabled: RwLock::new(config.audio_enabled), audio_enabled: RwLock::new(config.audio_enabled),
audio_pipeline: RwLock::new(None),
audio_controller: RwLock::new(None), audio_controller: RwLock::new(None),
hid_controller: RwLock::new(None), hid_controller: RwLock::new(None),
events: RwLock::new(None),
}) })
} }
@@ -219,9 +206,10 @@ impl WebRtcStreamer {
// Update codec // Update codec
*self.video_codec.write().await = codec; *self.video_codec.write().await = codec;
// Create new pipeline with new codec // Create new pipeline with new codec if capture source is configured
if let Some(ref tx) = *self.video_frame_tx.read().await { let has_capture = self.capture_device.read().await.is_some();
self.ensure_video_pipeline(tx.clone()).await?; if has_capture {
self.ensure_video_pipeline().await?;
} }
info!("Video codec switched to {:?}", codec); info!("Video codec switched to {:?}", codec);
@@ -263,10 +251,7 @@ impl WebRtcStreamer {
} }
/// Ensure video pipeline is initialized and running /// Ensure video pipeline is initialized and running
async fn ensure_video_pipeline( async fn ensure_video_pipeline(self: &Arc<Self>) -> Result<Arc<SharedVideoPipeline>> {
self: &Arc<Self>,
tx: broadcast::Sender<VideoFrame>,
) -> Result<Arc<SharedVideoPipeline>> {
let mut pipeline_guard = self.video_pipeline.write().await; let mut pipeline_guard = self.video_pipeline.write().await;
if let Some(ref pipeline) = *pipeline_guard { if let Some(ref pipeline) = *pipeline_guard {
@@ -290,7 +275,16 @@ impl WebRtcStreamer {
info!("Creating shared video pipeline for {:?}", codec); info!("Creating shared video pipeline for {:?}", codec);
let pipeline = SharedVideoPipeline::new(pipeline_config)?; let pipeline = SharedVideoPipeline::new(pipeline_config)?;
pipeline.start(tx.subscribe()).await?; let capture_device = self.capture_device.read().await.clone();
if let Some(device) = capture_device {
pipeline
.start_with_device(device.device_path, device.buffer_count, device.jpeg_quality)
.await?;
} else {
return Err(AppError::VideoError(
"No capture device configured".to_string(),
));
}
// Start a monitor task to detect when pipeline auto-stops // Start a monitor task to detect when pipeline auto-stops
let pipeline_weak = Arc::downgrade(&pipeline); let pipeline_weak = Arc::downgrade(&pipeline);
@@ -317,11 +311,7 @@ impl WebRtcStreamer {
} }
drop(pipeline_guard); drop(pipeline_guard);
// NOTE: Don't clear video_frame_tx here! info!("Video pipeline stopped, but keeping capture config for new sessions");
// The frame source is managed by stream_manager and should
// remain available for new sessions. Only stream_manager
// should clear it during mode switches.
info!("Video pipeline stopped, but keeping frame source for new sessions");
} }
break; break;
} }
@@ -339,9 +329,8 @@ impl WebRtcStreamer {
/// components (like RustDesk) that need to share the encoded video stream. /// components (like RustDesk) that need to share the encoded video stream.
pub async fn ensure_video_pipeline_for_external( pub async fn ensure_video_pipeline_for_external(
self: &Arc<Self>, self: &Arc<Self>,
tx: broadcast::Sender<VideoFrame>,
) -> Result<Arc<SharedVideoPipeline>> { ) -> Result<Arc<SharedVideoPipeline>> {
self.ensure_video_pipeline(tx).await self.ensure_video_pipeline().await
} }
/// Get the current pipeline configuration (if pipeline is running) /// Get the current pipeline configuration (if pipeline is running)
@@ -367,13 +356,10 @@ impl WebRtcStreamer {
self.config.write().await.audio_enabled = enabled; self.config.write().await.audio_enabled = enabled;
if enabled && !was_enabled { if enabled && !was_enabled {
// Start audio pipeline if we have an audio controller // Reconnect audio for existing sessions if we have a controller
if let Some(ref controller) = *self.audio_controller.read().await { if let Some(ref _controller) = *self.audio_controller.read().await {
self.start_audio_pipeline(controller.clone()).await?; self.reconnect_audio_sources().await;
} }
} else if !enabled && was_enabled {
// Stop audio pipeline
self.stop_audio_pipeline().await;
} }
info!("WebRTC audio enabled: {}", enabled); info!("WebRTC audio enabled: {}", enabled);
@@ -385,61 +371,16 @@ impl WebRtcStreamer {
info!("Setting audio controller for WebRTC streamer"); info!("Setting audio controller for WebRTC streamer");
*self.audio_controller.write().await = Some(controller.clone()); *self.audio_controller.write().await = Some(controller.clone());
// Start audio pipeline if audio is enabled // Reconnect audio for existing sessions if audio is enabled
if *self.audio_enabled.read().await { if *self.audio_enabled.read().await {
if let Err(e) = self.start_audio_pipeline(controller).await {
error!("Failed to start audio pipeline: {}", e);
}
}
}
/// Start the shared audio pipeline
async fn start_audio_pipeline(&self, controller: Arc<AudioController>) -> Result<()> {
// Check if already running
if let Some(ref pipeline) = *self.audio_pipeline.read().await {
if pipeline.is_running() {
debug!("Audio pipeline already running");
return Ok(());
}
}
// Get Opus frame receiver from audio controller
let _opus_rx = match controller.subscribe_opus_async().await {
Some(rx) => rx,
None => {
warn!("Audio controller not streaming, cannot start audio pipeline");
return Ok(());
}
};
// Create shared audio pipeline config
let config = SharedAudioPipelineConfig::default();
let pipeline = SharedAudioPipeline::new(config)?;
// Note: SharedAudioPipeline expects raw AudioFrame, but AudioController
// already provides encoded OpusFrame. We'll pass the OpusFrame directly
// to sessions instead of re-encoding.
// For now, store the pipeline reference for future use
*self.audio_pipeline.write().await = Some(pipeline);
// Reconnect audio for all existing sessions
self.reconnect_audio_sources().await; self.reconnect_audio_sources().await;
info!("WebRTC audio pipeline started");
Ok(())
} }
/// Stop the shared audio pipeline
async fn stop_audio_pipeline(&self) {
if let Some(ref pipeline) = *self.audio_pipeline.read().await {
pipeline.stop();
}
*self.audio_pipeline.write().await = None;
info!("WebRTC audio pipeline stopped");
} }
/// Subscribe to encoded Opus frames (for sessions) /// Subscribe to encoded Opus frames (for sessions)
pub async fn subscribe_opus(&self) -> Option<broadcast::Receiver<OpusFrame>> { pub async fn subscribe_opus(
&self,
) -> Option<tokio::sync::watch::Receiver<Option<std::sync::Arc<OpusFrame>>>> {
if let Some(ref controller) = *self.audio_controller.read().await { if let Some(ref controller) = *self.audio_controller.read().await {
controller.subscribe_opus_async().await controller.subscribe_opus_async().await
} else { } else {
@@ -463,38 +404,22 @@ impl WebRtcStreamer {
} }
} }
// === Video Frame Source === /// Set capture device for direct capture pipeline
pub async fn set_capture_device(&self, device_path: PathBuf, jpeg_quality: u8) {
/// Set video frame source
pub async fn set_video_source(&self, tx: broadcast::Sender<VideoFrame>) {
info!( info!(
"Setting video source for WebRTC streamer (receiver_count={})", "Setting direct capture device for WebRTC: {:?}",
tx.receiver_count() device_path
); );
*self.video_frame_tx.write().await = Some(tx.clone()); *self.capture_device.write().await = Some(CaptureDeviceConfig {
device_path,
buffer_count: 2,
jpeg_quality,
});
}
// Start or restart pipeline if it exists /// Clear direct capture device configuration
if let Some(ref pipeline) = *self.video_pipeline.read().await { pub async fn clear_capture_device(&self) {
if !pipeline.is_running() { *self.capture_device.write().await = None;
info!("Starting video pipeline with new frame source");
if let Err(e) = pipeline.start(tx.subscribe()).await {
error!("Failed to start video pipeline: {}", e);
}
} else {
// Pipeline is already running but may have old frame source
// We need to restart it with the new frame source
info!("Video pipeline already running, restarting with new frame source");
pipeline.stop();
tokio::time::sleep(tokio::time::Duration::from_millis(50)).await;
if let Err(e) = pipeline.start(tx.subscribe()).await {
error!("Failed to restart video pipeline: {}", e);
}
}
} else {
info!(
"No video pipeline exists yet, frame source will be used when pipeline is created"
);
}
} }
/// Prepare for configuration change /// Prepare for configuration change
@@ -509,11 +434,6 @@ impl WebRtcStreamer {
self.close_all_sessions().await; self.close_all_sessions().await;
} }
/// Reconnect video source after configuration change
pub async fn reconnect_video_source(&self, tx: broadcast::Sender<VideoFrame>) {
self.set_video_source(tx).await;
}
// === Configuration === // === Configuration ===
/// Update video configuration /// Update video configuration
@@ -690,6 +610,11 @@ impl WebRtcStreamer {
*self.hid_controller.write().await = Some(hid); *self.hid_controller.write().await = Some(hid);
} }
/// Set event bus for WebRTC signaling events
pub async fn set_event_bus(&self, events: Arc<EventBus>) {
*self.events.write().await = Some(events);
}
// === Session Management === // === Session Management ===
/// Create a new WebRTC session /// Create a new WebRTC session
@@ -698,13 +623,7 @@ impl WebRtcStreamer {
let codec = *self.video_codec.read().await; let codec = *self.video_codec.read().await;
// Ensure video pipeline is running // Ensure video pipeline is running
let frame_tx = self let pipeline = self.ensure_video_pipeline().await?;
.video_frame_tx
.read()
.await
.clone()
.ok_or_else(|| AppError::VideoError("No video frame source".to_string()))?;
let pipeline = self.ensure_video_pipeline(frame_tx).await?;
// Create session config // Create session config
let config = self.config.read().await; let config = self.config.read().await;
@@ -720,7 +639,9 @@ impl WebRtcStreamer {
drop(config); drop(config);
// Create universal session // Create universal session
let mut session = UniversalSession::new(session_config.clone(), session_id.clone()).await?; let event_bus = self.events.read().await.clone();
let mut session =
UniversalSession::new(session_config.clone(), session_id.clone(), event_bus).await?;
// Set HID controller if available // Set HID controller if available
// Note: We DON'T create a data channel here - the frontend creates it. // Note: We DON'T create a data channel here - the frontend creates it.
@@ -734,14 +655,12 @@ impl WebRtcStreamer {
let session = Arc::new(session); let session = Arc::new(session);
// Subscribe to video pipeline frames // Subscribe to video pipeline frames
// Request keyframe after ICE connection is established (via callback) // Request keyframe after ICE connection is established and on gaps
let pipeline_for_callback = pipeline.clone(); let pipeline_for_callback = pipeline.clone();
let session_id_for_callback = session_id.clone(); let session_id_for_callback = session_id.clone();
session let request_keyframe = Arc::new(move || {
.start_from_video_pipeline(pipeline.subscribe(), move || { let pipeline = pipeline_for_callback.clone();
// Spawn async task to request keyframe let sid = session_id_for_callback.clone();
let pipeline = pipeline_for_callback;
let sid = session_id_for_callback;
tokio::spawn(async move { tokio::spawn(async move {
info!( info!(
"Requesting keyframe for session {} after ICE connected", "Requesting keyframe for session {} after ICE connected",
@@ -749,7 +668,9 @@ impl WebRtcStreamer {
); );
pipeline.request_keyframe().await; pipeline.request_keyframe().await;
}); });
}) });
session
.start_from_video_pipeline(pipeline.subscribe(), request_keyframe)
.await; .await;
// Start audio if enabled // Start audio if enabled
@@ -913,27 +834,7 @@ impl WebRtcStreamer {
let video_pipeline = if let Some(ref pipeline) = *self.video_pipeline.read().await { let video_pipeline = if let Some(ref pipeline) = *self.video_pipeline.read().await {
let s = pipeline.stats().await; let s = pipeline.stats().await;
Some(VideoPipelineStats { Some(VideoPipelineStats {
frames_encoded: s.frames_encoded,
frames_dropped: s.frames_dropped,
bytes_encoded: s.bytes_encoded,
keyframes_encoded: s.keyframes_encoded,
avg_encode_time_ms: s.avg_encode_time_ms,
current_fps: s.current_fps, current_fps: s.current_fps,
subscribers: s.subscribers,
})
} else {
None
};
// Get audio pipeline stats
let audio_pipeline = if let Some(ref pipeline) = *self.audio_pipeline.read().await {
let stats = pipeline.stats().await;
Some(AudioPipelineStats {
frames_encoded: stats.frames_encoded,
frames_dropped: stats.frames_dropped,
bytes_encoded: stats.bytes_encoded,
avg_encode_time_ms: stats.avg_encode_time_ms,
subscribers: stats.subscribers,
}) })
} else { } else {
None None
@@ -944,7 +845,6 @@ impl WebRtcStreamer {
video_codec: format!("{:?}", codec), video_codec: format!("{:?}", codec),
video_pipeline, video_pipeline,
audio_enabled: *self.audio_enabled.read().await, audio_enabled: *self.audio_enabled.read().await,
audio_pipeline,
} }
} }
@@ -984,9 +884,6 @@ impl WebRtcStreamer {
if pipeline_running { if pipeline_running {
info!("Restarting video pipeline to apply new bitrate: {}", preset); info!("Restarting video pipeline to apply new bitrate: {}", preset);
// Save video_frame_tx BEFORE stopping pipeline (monitor task will clear it)
let saved_frame_tx = self.video_frame_tx.read().await.clone();
// Stop existing pipeline // Stop existing pipeline
if let Some(ref pipeline) = *self.video_pipeline.read().await { if let Some(ref pipeline) = *self.video_pipeline.read().await {
pipeline.stop(); pipeline.stop();
@@ -998,28 +895,24 @@ impl WebRtcStreamer {
// Clear pipeline reference - will be recreated // Clear pipeline reference - will be recreated
*self.video_pipeline.write().await = None; *self.video_pipeline.write().await = None;
// Recreate pipeline with new config if we have a frame source let has_source = self.capture_device.read().await.is_some();
if let Some(tx) = saved_frame_tx { if !has_source {
// Get existing sessions that need to be reconnected return Ok(());
}
let session_ids: Vec<String> = self.sessions.read().await.keys().cloned().collect(); let session_ids: Vec<String> = self.sessions.read().await.keys().cloned().collect();
if !session_ids.is_empty() { if !session_ids.is_empty() {
// Restore video_frame_tx before recreating pipeline let pipeline = self.ensure_video_pipeline().await?;
*self.video_frame_tx.write().await = Some(tx.clone());
// Recreate pipeline
let pipeline = self.ensure_video_pipeline(tx).await?;
// Reconnect all sessions to new pipeline
let sessions = self.sessions.read().await; let sessions = self.sessions.read().await;
for session_id in &session_ids { for session_id in &session_ids {
if let Some(session) = sessions.get(session_id) { if let Some(session) = sessions.get(session_id) {
info!("Reconnecting session {} to new pipeline", session_id); info!("Reconnecting session {} to new pipeline", session_id);
let pipeline_for_callback = pipeline.clone(); let pipeline_for_callback = pipeline.clone();
let sid = session_id.clone(); let sid = session_id.clone();
session let request_keyframe = Arc::new(move || {
.start_from_video_pipeline(pipeline.subscribe(), move || { let pipeline = pipeline_for_callback.clone();
let pipeline = pipeline_for_callback; let sid = sid.clone();
tokio::spawn(async move { tokio::spawn(async move {
info!( info!(
"Requesting keyframe for session {} after reconnect", "Requesting keyframe for session {} after reconnect",
@@ -1027,7 +920,9 @@ impl WebRtcStreamer {
); );
pipeline.request_keyframe().await; pipeline.request_keyframe().await;
}); });
}) });
session
.start_from_video_pipeline(pipeline.subscribe(), request_keyframe)
.await; .await;
} }
} }
@@ -1038,7 +933,6 @@ impl WebRtcStreamer {
session_ids.len() session_ids.len()
); );
} }
}
} else { } else {
debug!( debug!(
"Pipeline not running, bitrate {} will apply on next start", "Pipeline not running, bitrate {} will apply on next start",
@@ -1057,11 +951,11 @@ impl Default for WebRtcStreamer {
video_codec: RwLock::new(VideoCodecType::H264), video_codec: RwLock::new(VideoCodecType::H264),
video_pipeline: RwLock::new(None), video_pipeline: RwLock::new(None),
sessions: Arc::new(RwLock::new(HashMap::new())), sessions: Arc::new(RwLock::new(HashMap::new())),
video_frame_tx: RwLock::new(None), capture_device: RwLock::new(None),
audio_enabled: RwLock::new(false), audio_enabled: RwLock::new(false),
audio_pipeline: RwLock::new(None),
audio_controller: RwLock::new(None), audio_controller: RwLock::new(None),
hid_controller: RwLock::new(None), hid_controller: RwLock::new(None),
events: RwLock::new(None),
} }
} }
} }
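The streamer now owns its capture source as plain shared state instead of holding a `broadcast::Sender<VideoFrame>`. A minimal sketch of the config introduced above; field types are assumed (`buffer_count` only appears as the literal `2` in this diff) and `/dev/video0` is a hypothetical path:

```rust
use std::path::PathBuf;
use std::sync::RwLock;

#[derive(Clone, Debug)]
struct CaptureDeviceConfig {
    device_path: PathBuf, // e.g. /dev/video0 (hypothetical)
    buffer_count: u32,    // capture buffer queue depth; the diff sets 2
    jpeg_quality: u8,     // MJPEG quality passed to start_with_device
}

fn main() {
    // The streamer keeps the config as replaceable shared state so a new
    // session can (re)create the pipeline on demand after an auto-stop.
    let capture: RwLock<Option<CaptureDeviceConfig>> = RwLock::new(None);
    *capture.write().unwrap() = Some(CaptureDeviceConfig {
        device_path: PathBuf::from("/dev/video0"),
        buffer_count: 2,
        jpeg_quality: 80, // illustrative value
    });
    if let Some(cfg) = capture.read().unwrap().clone() {
        println!("pipeline would open {:?} at quality {}", cfg.device_path, cfg.jpeg_quality);
    }
}
```

Because the device config survives a pipeline auto-stop, the monitor task no longer has to preserve a frame source; `ensure_video_pipeline()` simply rebuilds from the stored config.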

View File

@@ -2,7 +2,7 @@
<html lang="en"> <html lang="en">
<head> <head>
<meta charset="UTF-8" /> <meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" /> <link rel="icon" type="image/png" href="/favicon.png" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" /> <meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>One-KVM</title> <title>One-KVM</title>
</head> </head>

web/package-lock.json (generated, 14 changes)
View File

@@ -1,12 +1,12 @@
{ {
"name": "web", "name": "web",
"version": "0.0.0", "version": "0.1.1",
"lockfileVersion": 3, "lockfileVersion": 3,
"requires": true, "requires": true,
"packages": { "packages": {
"": { "": {
"name": "web", "name": "web",
"version": "0.0.0", "version": "0.1.1",
"dependencies": { "dependencies": {
"@vueuse/core": "^14.1.0", "@vueuse/core": "^14.1.0",
"class-variance-authority": "^0.7.1", "class-variance-authority": "^0.7.1",
@@ -1368,6 +1368,7 @@
"integrity": "sha512-GNWcUTRBgIRJD5zj+Tq0fKOJ5XZajIiBroOF0yvj2bSU1WvNdYS/dn9UxwsujGW4JX06dnHyjV2y9rRaybH0iQ==", "integrity": "sha512-GNWcUTRBgIRJD5zj+Tq0fKOJ5XZajIiBroOF0yvj2bSU1WvNdYS/dn9UxwsujGW4JX06dnHyjV2y9rRaybH0iQ==",
"dev": true, "dev": true,
"license": "MIT", "license": "MIT",
"peer": true,
"dependencies": { "dependencies": {
"undici-types": "~7.16.0" "undici-types": "~7.16.0"
} }
@@ -1782,6 +1783,7 @@
} }
], ],
"license": "MIT", "license": "MIT",
"peer": true,
"dependencies": { "dependencies": {
"baseline-browser-mapping": "^2.9.0", "baseline-browser-mapping": "^2.9.0",
"caniuse-lite": "^1.0.30001759", "caniuse-lite": "^1.0.30001759",
@@ -2448,6 +2450,7 @@
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==", "integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
"dev": true, "dev": true,
"license": "MIT", "license": "MIT",
"peer": true,
"engines": { "engines": {
"node": ">=12" "node": ">=12"
}, },
@@ -2495,6 +2498,7 @@
} }
], ],
"license": "MIT", "license": "MIT",
"peer": true,
"dependencies": { "dependencies": {
"nanoid": "^3.3.11", "nanoid": "^3.3.11",
"picocolors": "^1.1.1", "picocolors": "^1.1.1",
@@ -2787,7 +2791,8 @@
"resolved": "https://registry.npmmirror.com/tailwindcss/-/tailwindcss-4.1.17.tgz", "resolved": "https://registry.npmmirror.com/tailwindcss/-/tailwindcss-4.1.17.tgz",
"integrity": "sha512-j9Ee2YjuQqYT9bbRTfTZht9W/ytp5H+jJpZKiYdP/bpnXARAuELt9ofP0lPnmHjbga7SNQIxdTAXCmtKVYjN+Q==", "integrity": "sha512-j9Ee2YjuQqYT9bbRTfTZht9W/ytp5H+jJpZKiYdP/bpnXARAuELt9ofP0lPnmHjbga7SNQIxdTAXCmtKVYjN+Q==",
"dev": true, "dev": true,
"license": "MIT" "license": "MIT",
"peer": true
}, },
"node_modules/tapable": { "node_modules/tapable": {
"version": "2.3.0", "version": "2.3.0",
@@ -2841,6 +2846,7 @@
"integrity": "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw==", "integrity": "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw==",
"devOptional": true, "devOptional": true,
"license": "Apache-2.0", "license": "Apache-2.0",
"peer": true,
"bin": { "bin": {
"tsc": "bin/tsc", "tsc": "bin/tsc",
"tsserver": "bin/tsserver" "tsserver": "bin/tsserver"
@@ -2906,6 +2912,7 @@
"integrity": "sha512-tI2l/nFHC5rLh7+5+o7QjKjSR04ivXDF4jcgV0f/bTQ+OJiITy5S6gaynVsEM+7RqzufMnVbIon6Sr5x1SDYaQ==", "integrity": "sha512-tI2l/nFHC5rLh7+5+o7QjKjSR04ivXDF4jcgV0f/bTQ+OJiITy5S6gaynVsEM+7RqzufMnVbIon6Sr5x1SDYaQ==",
"dev": true, "dev": true,
"license": "MIT", "license": "MIT",
"peer": true,
"dependencies": { "dependencies": {
"esbuild": "^0.25.0", "esbuild": "^0.25.0",
"fdir": "^6.5.0", "fdir": "^6.5.0",
@@ -2987,6 +2994,7 @@
"resolved": "https://registry.npmmirror.com/vue/-/vue-3.5.25.tgz", "resolved": "https://registry.npmmirror.com/vue/-/vue-3.5.25.tgz",
"integrity": "sha512-YLVdgv2K13WJ6n+kD5owehKtEXwdwXuj2TTyJMsO7pSeKw2bfRNZGjhB7YzrpbMYj5b5QsUebHpOqR3R3ziy/g==", "integrity": "sha512-YLVdgv2K13WJ6n+kD5owehKtEXwdwXuj2TTyJMsO7pSeKw2bfRNZGjhB7YzrpbMYj5b5QsUebHpOqR3R3ziy/g==",
"license": "MIT", "license": "MIT",
"peer": true,
"dependencies": { "dependencies": {
"@vue/compiler-dom": "3.5.25", "@vue/compiler-dom": "3.5.25",
"@vue/compiler-sfc": "3.5.25", "@vue/compiler-sfc": "3.5.25",

web/public/favicon.png (new binary file, PNG, 37 KiB, not shown)

View File

@@ -7,6 +7,8 @@
import type { import type {
AppConfig, AppConfig,
AuthConfig,
AuthConfigUpdate,
VideoConfig, VideoConfig,
VideoConfigUpdate, VideoConfigUpdate,
StreamConfigResponse, StreamConfigResponse,
@@ -41,6 +43,24 @@ export const configApi = {
getAll: () => request<AppConfig>('/config'), getAll: () => request<AppConfig>('/config'),
} }
// ===== Auth Config API =====
export const authConfigApi = {
/**
* Get the authentication configuration
*/
get: () => request<AuthConfig>('/config/auth'),
/**
* Update the authentication configuration
* @param config Fields to update
*/
update: (config: AuthConfigUpdate) =>
request<AuthConfig>('/config/auth', {
method: 'PATCH',
body: JSON.stringify(config),
}),
}
// ===== Video Config API ===== // ===== Video Config API =====
export const videoConfigApi = { export const videoConfigApi = {
/** /**

View File

@@ -17,6 +17,18 @@ export const authApi = {
check: () => check: () =>
request<{ authenticated: boolean; user?: string; is_admin?: boolean }>('/auth/check'), request<{ authenticated: boolean; user?: string; is_admin?: boolean }>('/auth/check'),
changePassword: (currentPassword: string, newPassword: string) =>
request<{ success: boolean }>('/auth/password', {
method: 'POST',
body: JSON.stringify({ current_password: currentPassword, new_password: newPassword }),
}),
changeUsername: (username: string, currentPassword: string) =>
request<{ success: boolean }>('/auth/username', {
method: 'POST',
body: JSON.stringify({ username, current_password: currentPassword }),
}),
} }
// System API // System API
@@ -121,8 +133,6 @@ export const streamApi = {
clients: number clients: number
target_fps: number target_fps: number
fps: number fps: number
frames_captured: number
frames_dropped: number
}>('/stream/status'), }>('/stream/status'),
start: () => start: () =>
@@ -200,7 +210,7 @@ export const webrtcApi = {
}), }),
getIceServers: () => getIceServers: () =>
request<{ ice_servers: IceServerConfig[] }>('/webrtc/ice-servers'), request<{ ice_servers: IceServerConfig[]; mdns_mode: string }>('/webrtc/ice-servers'),
} }
// HID API // HID API
@@ -516,6 +526,7 @@ export const configApi = {
// Export the new domain-scoped config APIs // Export the new domain-scoped config APIs
export { export {
authConfigApi,
videoConfigApi, videoConfigApi,
streamConfigApi, streamConfigApi,
hidConfigApi, hidConfigApi,
@@ -535,6 +546,8 @@ export {
// Export generated types // Export generated types
export type { export type {
AppConfig, AppConfig,
AuthConfig,
AuthConfigUpdate,
VideoConfig, VideoConfig,
VideoConfigUpdate, VideoConfigUpdate,
StreamConfig, StreamConfig,
@@ -588,53 +601,4 @@ export const audioApi = {
}), }),
} }
// User Management API
export interface User {
id: string
username: string
role: 'admin' | 'user'
created_at: string
}
interface UserApiResponse {
id: string
username: string
is_admin: boolean
created_at: string
}
export const userApi = {
list: async () => {
const rawUsers = await request<UserApiResponse[]>('/users')
const users: User[] = rawUsers.map(u => ({
id: u.id,
username: u.username,
role: u.is_admin ? 'admin' : 'user',
created_at: u.created_at,
}))
return { success: true, users }
},
create: (username: string, password: string, role: 'admin' | 'user' = 'user') =>
request<UserApiResponse>('/users', {
method: 'POST',
body: JSON.stringify({ username, password, is_admin: role === 'admin' }),
}),
update: (id: string, data: { username?: string; role?: 'admin' | 'user' }) =>
request<{ success: boolean }>(`/users/${id}`, {
method: 'PUT',
body: JSON.stringify({ username: data.username, is_admin: data.role === 'admin' }),
}),
delete: (id: string) =>
request<{ success: boolean }>(`/users/${id}`, { method: 'DELETE' }),
changePassword: (id: string, newPassword: string, currentPassword?: string) =>
request<{ success: boolean }>(`/users/${id}/password`, {
method: 'POST',
body: JSON.stringify({ new_password: newPassword, current_password: currentPassword }),
}),
}
export { ApiError } export { ApiError }
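The two endpoints above replace the removed multi-user API. A sketch of the request payloads they imply, assuming the backend deserializes the same JSON keys the frontend sends (the actual Rust handler types are not part of this diff):

```rust
use serde::Deserialize;

// Assumed shapes; field names mirror the JSON bodies built above.
#[derive(Deserialize, Debug)]
struct ChangePasswordRequest {
    current_password: String,
    new_password: String,
}

#[derive(Deserialize, Debug)]
struct ChangeUsernameRequest {
    username: String,
    current_password: String,
}

fn main() {
    let req: ChangePasswordRequest =
        serde_json::from_str(r#"{"current_password":"old","new_password":"new"}"#).unwrap();
    println!("{req:?}");
}
```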

View File

@@ -6,6 +6,7 @@ const API_BASE = '/api'
// Toast debounce mechanism - prevent toast spam (5 seconds) // Toast debounce mechanism - prevent toast spam (5 seconds)
const toastDebounceMap = new Map<string, number>() const toastDebounceMap = new Map<string, number>()
const TOAST_DEBOUNCE_TIME = 5000 const TOAST_DEBOUNCE_TIME = 5000
let sessionExpiredNotified = false
function shouldShowToast(key: string): boolean { function shouldShowToast(key: string): boolean {
const now = Date.now() const now = Date.now()
@@ -81,7 +82,26 @@ export async function request<T>(
// Handle HTTP errors (in case backend returns non-2xx) // Handle HTTP errors (in case backend returns non-2xx)
if (!response.ok) { if (!response.ok) {
const message = getErrorMessage(data, `HTTP ${response.status}`) const message = getErrorMessage(data, `HTTP ${response.status}`)
if (toastOnError && shouldShowToast(toastKey)) { const normalized = message.toLowerCase()
const isNotAuthenticated = normalized.includes('not authenticated')
if (response.status === 401 && !sessionExpiredNotified) {
const isLoggedInElsewhere = normalized.includes('logged in elsewhere')
const isSessionExpired = normalized.includes('session expired')
if (isLoggedInElsewhere || isSessionExpired) {
sessionExpiredNotified = true
const titleKey = isLoggedInElsewhere ? 'auth.loggedInElsewhere' : 'auth.sessionExpired'
if (toastOnError && shouldShowToast('error_session_expired')) {
toast.error(t(titleKey), {
description: message,
duration: 3000,
})
}
setTimeout(() => {
window.location.reload()
}, 1200)
}
}
if (toastOnError && shouldShowToast(toastKey) && !(response.status === 401 && isNotAuthenticated)) {
toast.error(t('api.operationFailed'), { toast.error(t('api.operationFailed'), {
description: message, description: message,
duration: 4000, duration: 4000,
@@ -130,4 +150,3 @@ export async function request<T>(
throw new ApiError(0, t('api.networkError')) throw new ApiError(0, t('api.networkError'))
} }
} }
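The reload-on-401 logic keys off message substrings, which makes the backend wording part of the contract. A hedged sketch of the server side, assuming errors are serialized as JSON with a single `error` field (the real error type is not shown in this diff):

```rust
use serde::Serialize;

#[derive(Serialize)]
struct ApiError {
    error: String,
}

// Returns (status, body). The frontend lower-cases the message and matches
// the substrings "session expired" / "logged in elsewhere" /
// "not authenticated", so this wording must stay stable.
fn unauthorized(reason: &str) -> (u16, String) {
    (401, serde_json::to_string(&ApiError { error: reason.to_string() }).unwrap())
}

fn main() {
    let (status, body) = unauthorized("Session expired");
    println!("{status} {body}");
}
```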

View File

@@ -32,6 +32,10 @@ function createVideoSession() {
resolve: (ready: boolean) => void resolve: (ready: boolean) => void
timer: ReturnType<typeof setTimeout> timer: ReturnType<typeof setTimeout>
} | null = null } | null = null
let webrtcReadyAnyWaiter: {
resolve: (ready: boolean) => void
timer: ReturnType<typeof setTimeout>
} | null = null
let modeReadyWaiter: { let modeReadyWaiter: {
transitionId: string transitionId: string
@@ -62,6 +66,11 @@ function createVideoSession() {
webrtcReadyWaiter.resolve(false) webrtcReadyWaiter.resolve(false)
webrtcReadyWaiter = null webrtcReadyWaiter = null
} }
if (webrtcReadyAnyWaiter) {
clearTimeout(webrtcReadyAnyWaiter.timer)
webrtcReadyAnyWaiter.resolve(false)
webrtcReadyAnyWaiter = null
}
if (modeReadyWaiter) { if (modeReadyWaiter) {
clearTimeout(modeReadyWaiter.timer) clearTimeout(modeReadyWaiter.timer)
modeReadyWaiter.resolve(null) modeReadyWaiter.resolve(null)
@@ -104,6 +113,28 @@ function createVideoSession() {
}) })
} }
function waitForWebRTCReadyAny(timeoutMs = 3000): Promise<boolean> {
if (webrtcReadyAnyWaiter) {
clearTimeout(webrtcReadyAnyWaiter.timer)
webrtcReadyAnyWaiter.resolve(false)
webrtcReadyAnyWaiter = null
}
return new Promise((resolve) => {
const timer = setTimeout(() => {
if (webrtcReadyAnyWaiter) {
webrtcReadyAnyWaiter = null
}
resolve(false)
}, timeoutMs)
webrtcReadyAnyWaiter = {
resolve,
timer,
}
})
}
function waitForModeReady(transitionId: string, timeoutMs = 5000): Promise<string | null> { function waitForModeReady(transitionId: string, timeoutMs = 5000): Promise<string | null> {
if (modeReadyWaiter) { if (modeReadyWaiter) {
clearTimeout(modeReadyWaiter.timer) clearTimeout(modeReadyWaiter.timer)
@@ -156,6 +187,10 @@ function createVideoSession() {
clearTimeout(webrtcReadyWaiter.timer) clearTimeout(webrtcReadyWaiter.timer)
webrtcReadyWaiter.resolve(true) webrtcReadyWaiter.resolve(true)
webrtcReadyWaiter = null webrtcReadyWaiter = null
} else if (!data.transition_id && webrtcReadyAnyWaiter) {
clearTimeout(webrtcReadyAnyWaiter.timer)
webrtcReadyAnyWaiter.resolve(true)
webrtcReadyAnyWaiter = null
} }
} }
@@ -170,6 +205,7 @@ function createVideoSession() {
clearWaiters, clearWaiters,
registerTransition, registerTransition,
waitForWebRTCReady, waitForWebRTCReady,
waitForWebRTCReadyAny,
waitForModeReady, waitForModeReady,
onModeSwitching, onModeSwitching,
onModeReady, onModeReady,
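`waitForWebRTCReadyAny` is a single-waiter promise with a timeout that resolves `false` rather than rejecting. The same shape in Rust, purely as a conceptual parallel (not code from this repo):

```rust
use tokio::sync::oneshot;
use tokio::time::{timeout, Duration};

#[tokio::main]
async fn main() {
    let (ready_tx, ready_rx) = oneshot::channel::<bool>();

    // Stand-in for the "webrtc ready" WebSocket event arriving later.
    tokio::spawn(async move {
        let _ = ready_tx.send(true);
    });

    // Resolve false on timeout or if the sender side was dropped.
    let ready = timeout(Duration::from_secs(3), ready_rx)
        .await
        .map(|r| r.unwrap_or(false))
        .unwrap_or(false);
    println!("webrtc ready: {ready}");
}
```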

View File

@@ -2,7 +2,7 @@
// Provides low-latency video via WebRTC with DataChannel for HID // Provides low-latency video via WebRTC with DataChannel for HID
import { ref, onUnmounted, computed, type Ref } from 'vue' import { ref, onUnmounted, computed, type Ref } from 'vue'
import { webrtcApi } from '@/api' import { webrtcApi, type IceCandidate } from '@/api'
import { generateUUID } from '@/lib/utils' import { generateUUID } from '@/lib/utils'
import { import {
type HidKeyboardEvent, type HidKeyboardEvent,
@@ -10,6 +10,7 @@ import {
encodeKeyboardEvent, encodeKeyboardEvent,
encodeMouseEvent, encodeMouseEvent,
} from '@/types/hid' } from '@/types/hid'
import { useWebSocket } from '@/composables/useWebSocket'
export type { HidKeyboardEvent, HidMouseEvent } export type { HidKeyboardEvent, HidMouseEvent }
@@ -39,10 +40,25 @@ export interface WebRTCStats {
// Cached ICE servers from backend API // Cached ICE servers from backend API
let cachedIceServers: RTCIceServer[] | null = null let cachedIceServers: RTCIceServer[] | null = null
interface WebRTCIceCandidateEvent {
session_id: string
candidate: IceCandidate
}
interface WebRTCIceCompleteEvent {
session_id: string
}
// Fetch ICE servers from backend API // Fetch ICE servers from backend API
async function fetchIceServers(): Promise<RTCIceServer[]> { async function fetchIceServers(): Promise<RTCIceServer[]> {
try { try {
const response = await webrtcApi.getIceServers() const response = await webrtcApi.getIceServers()
if (response.mdns_mode) {
allowMdnsHostCandidates = response.mdns_mode !== 'disabled'
} else if (response.ice_servers) {
allowMdnsHostCandidates = response.ice_servers.length === 0
}
if (response.ice_servers && response.ice_servers.length > 0) { if (response.ice_servers && response.ice_servers.length > 0) {
cachedIceServers = response.ice_servers.map(server => ({ cachedIceServers = response.ice_servers.map(server => ({
urls: server.urls, urls: server.urls,
@@ -65,6 +81,7 @@ async function fetchIceServers(): Promise<RTCIceServer[]> {
window.location.hostname.startsWith('10.')) window.location.hostname.startsWith('10.'))
if (isLocalConnection) { if (isLocalConnection) {
allowMdnsHostCandidates = false
console.log('[WebRTC] Local connection detected, using host candidates only') console.log('[WebRTC] Local connection detected, using host candidates only')
return [] return []
} }
@@ -83,8 +100,16 @@ let sessionId: string | null = null
let statsInterval: number | null = null let statsInterval: number | null = null
let isConnecting = false // Lock to prevent concurrent connect calls let isConnecting = false // Lock to prevent concurrent connect calls
let pendingIceCandidates: RTCIceCandidate[] = [] // Queue for ICE candidates before sessionId is set let pendingIceCandidates: RTCIceCandidate[] = [] // Queue for ICE candidates before sessionId is set
let pendingRemoteCandidates: WebRTCIceCandidateEvent[] = [] // Queue for server ICE candidates
let pendingRemoteIceComplete = new Set<string>() // Session IDs waiting for end-of-candidates
let seenRemoteCandidates = new Set<string>() // Deduplicate server ICE candidates
let cachedMediaStream: MediaStream | null = null // Cached MediaStream to avoid recreating let cachedMediaStream: MediaStream | null = null // Cached MediaStream to avoid recreating
let allowMdnsHostCandidates = false
let wsHandlersRegistered = false
const { on: wsOn } = useWebSocket()
const state = ref<WebRTCState>('disconnected') const state = ref<WebRTCState>('disconnected')
const videoTrack = ref<MediaStreamTrack | null>(null) const videoTrack = ref<MediaStreamTrack | null>(null)
const audioTrack = ref<MediaStreamTrack | null>(null) const audioTrack = ref<MediaStreamTrack | null>(null)
@@ -148,6 +173,7 @@ function createPeerConnection(iceServers: RTCIceServer[]): RTCPeerConnection {
// Handle ICE candidates // Handle ICE candidates
pc.onicecandidate = async (event) => { pc.onicecandidate = async (event) => {
if (!event.candidate) return if (!event.candidate) return
if (shouldSkipLocalCandidate(event.candidate)) return
const currentSessionId = sessionId const currentSessionId = sessionId
if (currentSessionId && pc.connectionState !== 'closed') { if (currentSessionId && pc.connectionState !== 'closed') {
@@ -218,6 +244,99 @@ function createDataChannel(pc: RTCPeerConnection): RTCDataChannel {
return channel return channel
} }
function registerWebSocketHandlers() {
if (wsHandlersRegistered) return
wsHandlersRegistered = true
wsOn('webrtc.ice_candidate', handleRemoteIceCandidate)
wsOn('webrtc.ice_complete', handleRemoteIceComplete)
}
function shouldSkipLocalCandidate(candidate: RTCIceCandidate): boolean {
if (allowMdnsHostCandidates) return false
const value = candidate.candidate || ''
return value.includes(' typ host') && value.includes('.local')
}
async function handleRemoteIceCandidate(data: WebRTCIceCandidateEvent) {
if (!data || !data.candidate) return
// Queue until session is ready and remote description is set
if (!sessionId) {
pendingRemoteCandidates.push(data)
return
}
if (data.session_id !== sessionId) return
if (!peerConnection || !peerConnection.remoteDescription) {
pendingRemoteCandidates.push(data)
return
}
await addRemoteIceCandidate(data.candidate)
}
async function handleRemoteIceComplete(data: WebRTCIceCompleteEvent) {
if (!data || !data.session_id) return
if (!sessionId) {
pendingRemoteIceComplete.add(data.session_id)
return
}
if (data.session_id !== sessionId) return
if (!peerConnection || !peerConnection.remoteDescription) {
pendingRemoteIceComplete.add(data.session_id)
return
}
try {
await peerConnection.addIceCandidate(null)
} catch {
// End-of-candidates failures are non-fatal
}
}
async function addRemoteIceCandidate(candidate: IceCandidate) {
if (!peerConnection) return
if (!candidate.candidate) return
if (seenRemoteCandidates.has(candidate.candidate)) return
seenRemoteCandidates.add(candidate.candidate)
const iceCandidate: RTCIceCandidateInit = {
candidate: candidate.candidate,
sdpMid: candidate.sdpMid ?? undefined,
sdpMLineIndex: candidate.sdpMLineIndex ?? undefined,
usernameFragment: candidate.usernameFragment ?? undefined,
}
try {
await peerConnection.addIceCandidate(iceCandidate)
} catch {
// ICE candidate add failures are non-fatal
}
}
async function flushPendingRemoteIce() {
if (!peerConnection || !sessionId || !peerConnection.remoteDescription) return
for (const event of pendingRemoteCandidates) {
if (event.session_id === sessionId) {
await addRemoteIceCandidate(event.candidate)
}
// Candidates queued for stale sessions are simply dropped
}
pendingRemoteCandidates = []
if (pendingRemoteIceComplete.has(sessionId)) {
pendingRemoteIceComplete.delete(sessionId)
try {
await peerConnection.addIceCandidate(null)
} catch {
// Ignore end-of-candidates errors
}
}
}
// Start collecting stats // Start collecting stats
function startStatsCollection() { function startStatsCollection() {
if (statsInterval) return if (statsInterval) return
@@ -315,6 +434,7 @@ async function flushPendingIceCandidates() {
pendingIceCandidates = [] pendingIceCandidates = []
for (const candidate of candidates) { for (const candidate of candidates) {
if (shouldSkipLocalCandidate(candidate)) continue
try { try {
await webrtcApi.addIceCandidate(sessionId, { await webrtcApi.addIceCandidate(sessionId, {
candidate: candidate.candidate, candidate: candidate.candidate,
@@ -330,6 +450,8 @@ async function flushPendingIceCandidates() {
// Connect to WebRTC server // Connect to WebRTC server
async function connect(): Promise<boolean> { async function connect(): Promise<boolean> {
registerWebSocketHandlers()
// Prevent concurrent connection attempts // Prevent concurrent connection attempts
if (isConnecting) { if (isConnecting) {
return false return false
@@ -384,19 +506,13 @@ async function connect(): Promise<boolean> {
} }
await peerConnection.setRemoteDescription(answer) await peerConnection.setRemoteDescription(answer)
// Flush any pending server ICE candidates once remote description is set
await flushPendingRemoteIce()
// Add any ICE candidates from the response // Add any ICE candidates from the response
if (response.ice_candidates && response.ice_candidates.length > 0) { if (response.ice_candidates && response.ice_candidates.length > 0) {
for (const candidateObj of response.ice_candidates) { for (const candidateObj of response.ice_candidates) {
try { await addRemoteIceCandidate(candidateObj)
const iceCandidate: RTCIceCandidateInit = {
candidate: candidateObj.candidate,
sdpMid: candidateObj.sdpMid ?? '0',
sdpMLineIndex: candidateObj.sdpMLineIndex ?? 0,
}
await peerConnection.addIceCandidate(iceCandidate)
} catch {
// ICE candidate add failures are non-fatal
}
} }
} }
@@ -440,6 +556,9 @@ async function disconnect() {
sessionId = null sessionId = null
isConnecting = false isConnecting = false
pendingIceCandidates = [] pendingIceCandidates = []
pendingRemoteCandidates = []
pendingRemoteIceComplete.clear()
seenRemoteCandidates.clear()
if (dataChannel) { if (dataChannel) {
dataChannel.close() dataChannel.close()

View File

@@ -76,6 +76,8 @@ export default {
passwordTooShort: 'Password must be at least 4 characters', passwordTooShort: 'Password must be at least 4 characters',
passwordChanged: 'Password changed successfully', passwordChanged: 'Password changed successfully',
userNotFound: 'User not found', userNotFound: 'User not found',
sessionExpired: 'Session expired',
loggedInElsewhere: 'Logged in elsewhere',
}, },
status: { status: {
connected: 'Connected', connected: 'Connected',
@@ -270,8 +272,6 @@ export default {
extensionsDescription: 'Choose which extensions to auto-start', extensionsDescription: 'Choose which extensions to auto-start',
ttydTitle: 'Web Terminal (ttyd)', ttydTitle: 'Web Terminal (ttyd)',
ttydDescription: 'Access device command line in browser', ttydDescription: 'Access device command line in browser',
rustdeskTitle: 'RustDesk Remote Desktop',
rustdeskDescription: 'Remote access via RustDesk client',
extensionsHint: 'These settings can be changed later in Settings', extensionsHint: 'These settings can be changed later in Settings',
notInstalled: 'Not installed', notInstalled: 'Not installed',
// Password strength // Password strength
@@ -427,6 +427,8 @@ export default {
basic: 'Basic', basic: 'Basic',
general: 'General', general: 'General',
appearance: 'Appearance', appearance: 'Appearance',
account: 'User',
access: 'Access',
video: 'Video', video: 'Video',
encoder: 'Encoder', encoder: 'Encoder',
hid: 'HID', hid: 'HID',
@@ -436,6 +438,7 @@ export default {
users: 'Users', users: 'Users',
hardware: 'Hardware', hardware: 'Hardware',
system: 'System', system: 'System',
other: 'Other',
extensions: 'Extensions', extensions: 'Extensions',
configured: 'Configured', configured: 'Configured',
security: 'Security', security: 'Security',
@@ -457,6 +460,8 @@ export default {
changePassword: 'Change Password', changePassword: 'Change Password',
currentPassword: 'Current Password', currentPassword: 'Current Password',
newPassword: 'New Password', newPassword: 'New Password',
usernameDesc: 'Change your login username',
passwordDesc: 'Change your login password',
version: 'Version', version: 'Version',
buildInfo: 'Build Info', buildInfo: 'Build Info',
detectDevices: 'Detect Devices', detectDevices: 'Detect Devices',
@@ -470,7 +475,7 @@ export default {
httpPort: 'HTTP Port', httpPort: 'HTTP Port',
configureHttpPort: 'Configure HTTP server port', configureHttpPort: 'Configure HTTP server port',
// Web server // Web server
webServer: 'Basic', webServer: 'Access Address',
webServerDesc: 'Configure HTTP/HTTPS ports and bind address. Restart required for changes to take effect.', webServerDesc: 'Configure HTTP/HTTPS ports and bind address. Restart required for changes to take effect.',
httpsPort: 'HTTPS Port', httpsPort: 'HTTPS Port',
bindAddress: 'Bind Address', bindAddress: 'Bind Address',
@@ -480,6 +485,13 @@ export default {
restartRequired: 'Restart Required', restartRequired: 'Restart Required',
restartMessage: 'Web server configuration saved. A restart is required for changes to take effect.', restartMessage: 'Web server configuration saved. A restart is required for changes to take effect.',
restarting: 'Restarting...', restarting: 'Restarting...',
// Auth
auth: 'Access',
authSettings: 'Access Settings',
authSettingsDesc: 'Single-user access and session behavior',
allowMultipleSessions: 'Allow multiple web sessions',
allowMultipleSessionsDesc: 'When disabled, a new login signs out the previous session.',
singleUserSessionNote: 'Single-user mode is enforced; only session concurrency is configurable.',
// User management // User management
userManagement: 'User Management', userManagement: 'User Management',
userManagementDesc: 'Manage user accounts and permissions', userManagementDesc: 'Manage user accounts and permissions',
@@ -569,6 +581,25 @@ export default {
hidBackend: 'HID Backend', hidBackend: 'HID Backend',
serialDevice: 'Serial Device', serialDevice: 'Serial Device',
baudRate: 'Baud Rate', baudRate: 'Baud Rate',
otgHidProfile: 'OTG HID Profile',
otgHidProfileDesc: 'Select which HID functions are exposed to the host',
profile: 'Profile',
otgProfileFull: 'Full (keyboard + relative mouse + absolute mouse + consumer)',
otgProfileLegacyKeyboard: 'Legacy: keyboard only',
otgProfileLegacyMouseRelative: 'Legacy: relative mouse only',
otgProfileCustom: 'Custom',
otgFunctionKeyboard: 'Keyboard',
otgFunctionKeyboardDesc: 'Standard HID keyboard device',
otgFunctionMouseRelative: 'Relative Mouse',
otgFunctionMouseRelativeDesc: 'Traditional mouse movement (HID boot mouse)',
otgFunctionMouseAbsolute: 'Absolute Mouse',
otgFunctionMouseAbsoluteDesc: 'Absolute positioning (touchscreen-like)',
otgFunctionConsumer: 'Consumer Control',
otgFunctionConsumerDesc: 'Media keys like volume/play/pause',
otgFunctionMsd: 'Mass Storage (MSD)',
otgFunctionMsdDesc: 'Expose USB storage to the host',
otgProfileWarning: 'Changing HID functions will reconnect the USB device',
otgFunctionMinWarning: 'Enable at least one HID function before saving',
// OTG Descriptor // OTG Descriptor
otgDescriptor: 'USB Device Descriptor', otgDescriptor: 'USB Device Descriptor',
otgDescriptorDesc: 'Configure USB device identification', otgDescriptorDesc: 'Configure USB device identification',
@@ -678,6 +709,10 @@ export default {
viewLogs: 'View Logs', viewLogs: 'View Logs',
noLogs: 'No logs available', noLogs: 'No logs available',
binaryNotFound: '{path} not found, please install the required program', binaryNotFound: '{path} not found, please install the required program',
remoteAccess: {
title: 'Remote Access',
desc: 'GOSTC NAT traversal and Easytier networking',
},
// ttyd // ttyd
ttyd: { ttyd: {
title: 'Ttyd Web Terminal', title: 'Ttyd Web Terminal',

View File

@@ -76,6 +76,8 @@ export default {
passwordTooShort: '密码至少需要4个字符', passwordTooShort: '密码至少需要4个字符',
passwordChanged: '密码修改成功', passwordChanged: '密码修改成功',
userNotFound: '用户不存在', userNotFound: '用户不存在',
sessionExpired: '会话已过期',
loggedInElsewhere: '已在别处登录',
}, },
status: { status: {
connected: '已连接', connected: '已连接',
@@ -270,8 +272,6 @@ export default {
extensionsDescription: '选择要自动启动的扩展服务', extensionsDescription: '选择要自动启动的扩展服务',
ttydTitle: 'Web 终端 (ttyd)', ttydTitle: 'Web 终端 (ttyd)',
ttydDescription: '在浏览器中访问设备的命令行终端', ttydDescription: '在浏览器中访问设备的命令行终端',
rustdeskTitle: 'RustDesk 远程桌面',
rustdeskDescription: '通过 RustDesk 客户端远程访问设备',
extensionsHint: '这些设置可以在设置页面中随时更改', extensionsHint: '这些设置可以在设置页面中随时更改',
notInstalled: '未安装', notInstalled: '未安装',
// Password strength // Password strength
@@ -427,6 +427,8 @@ export default {
basic: '基础', basic: '基础',
general: '通用', general: '通用',
appearance: '外观', appearance: '外观',
account: '用户',
access: '访问',
video: '视频', video: '视频',
encoder: '编码器', encoder: '编码器',
hid: 'HID', hid: 'HID',
@@ -436,6 +438,7 @@ export default {
users: '用户', users: '用户',
hardware: '硬件', hardware: '硬件',
system: '系统', system: '系统',
other: '其他',
extensions: '扩展', extensions: '扩展',
configured: '已配置', configured: '已配置',
security: '安全', security: '安全',
@@ -457,6 +460,8 @@ export default {
changePassword: '修改密码', changePassword: '修改密码',
currentPassword: '当前密码', currentPassword: '当前密码',
newPassword: '新密码', newPassword: '新密码',
usernameDesc: '修改登录用户名',
passwordDesc: '修改登录密码',
version: '版本', version: '版本',
buildInfo: '构建信息', buildInfo: '构建信息',
detectDevices: '探测设备', detectDevices: '探测设备',
@@ -470,7 +475,7 @@ export default {
httpPort: 'HTTP 端口', httpPort: 'HTTP 端口',
configureHttpPort: '配置 HTTP 服务器端口', configureHttpPort: '配置 HTTP 服务器端口',
// Web server // Web server
webServer: '基础', webServer: '访问地址',
webServerDesc: '配置 HTTP/HTTPS 端口和绑定地址,修改后需要重启生效', webServerDesc: '配置 HTTP/HTTPS 端口和绑定地址,修改后需要重启生效',
httpsPort: 'HTTPS 端口', httpsPort: 'HTTPS 端口',
bindAddress: '绑定地址', bindAddress: '绑定地址',
@@ -480,6 +485,13 @@ export default {
restartRequired: '需要重启', restartRequired: '需要重启',
restartMessage: 'Web 服务器配置已保存,需要重启程序才能生效。', restartMessage: 'Web 服务器配置已保存,需要重启程序才能生效。',
restarting: '正在重启...', restarting: '正在重启...',
// Auth
auth: '访问控制',
authSettings: '访问设置',
authSettingsDesc: '单用户访问与会话策略',
allowMultipleSessions: '允许多个 Web 会话',
allowMultipleSessionsDesc: '关闭后,新登录会踢掉旧会话。',
singleUserSessionNote: '系统固定为单用户模式,仅可配置会话并发方式。',
// User management // User management
userManagement: '用户管理', userManagement: '用户管理',
userManagementDesc: '管理用户账号和权限', userManagementDesc: '管理用户账号和权限',
@@ -569,6 +581,25 @@ export default {
hidBackend: 'HID 后端', hidBackend: 'HID 后端',
serialDevice: '串口设备', serialDevice: '串口设备',
baudRate: '波特率', baudRate: '波特率',
otgHidProfile: 'OTG HID 组合',
otgHidProfileDesc: '选择对目标主机暴露的 HID 功能',
profile: '组合',
otgProfileFull: '完整(键盘 + 相对鼠标 + 绝对鼠标 + 多媒体)',
otgProfileLegacyKeyboard: '兼容:仅键盘',
otgProfileLegacyMouseRelative: '兼容:仅相对鼠标',
otgProfileCustom: '自定义',
otgFunctionKeyboard: '键盘',
otgFunctionKeyboardDesc: '标准 HID 键盘设备',
otgFunctionMouseRelative: '相对鼠标',
otgFunctionMouseRelativeDesc: '传统鼠标移动HID 启动鼠标)',
otgFunctionMouseAbsolute: '绝对鼠标',
otgFunctionMouseAbsoluteDesc: '绝对定位(类似触控)',
otgFunctionConsumer: '多媒体控制',
otgFunctionConsumerDesc: '音量/播放/暂停等按键',
otgFunctionMsd: 'U盘MSD',
otgFunctionMsdDesc: '向目标主机暴露 USB 存储',
otgProfileWarning: '修改 HID 功能将导致 USB 设备重新连接',
otgFunctionMinWarning: '请至少启用一个 HID 功能后再保存',
// OTG Descriptor // OTG Descriptor
otgDescriptor: 'USB 设备描述符', otgDescriptor: 'USB 设备描述符',
otgDescriptorDesc: '配置 USB 设备标识信息', otgDescriptorDesc: '配置 USB 设备标识信息',
@@ -678,6 +709,10 @@ export default {
viewLogs: '查看日志', viewLogs: '查看日志',
noLogs: '暂无日志', noLogs: '暂无日志',
binaryNotFound: '未找到 {path},请先安装对应程序', binaryNotFound: '未找到 {path},请先安装对应程序',
remoteAccess: {
title: '远程访问',
desc: 'GOSTC 内网穿透与 Easytier 组网',
},
// ttyd // ttyd
ttyd: { ttyd: {
title: 'Ttyd 网页终端', title: 'Ttyd 网页终端',

View File

@@ -24,7 +24,7 @@ const routes: RouteRecordRaw[] = [
path: '/settings', path: '/settings',
name: 'Settings', name: 'Settings',
component: () => import('@/views/SettingsView.vue'), component: () => import('@/views/SettingsView.vue'),
meta: { requiresAuth: true, requiresAdmin: true }, meta: { requiresAuth: true },
}, },
] ]
@@ -63,11 +63,6 @@ router.beforeEach(async (to, _from, next) => {
} }
} }
// Check admin requirement
if (to.meta.requiresAdmin && !authStore.isAdmin) {
// Redirect non-admin users to console
return next({ name: 'Console' })
}
} }
next() next()

View File

@@ -24,8 +24,6 @@ interface StreamState {
resolution: [number, number] | null resolution: [number, number] | null
targetFps: number targetFps: number
clients: number clients: number
framesCaptured: number
framesDropped: number
streamMode: string // 'mjpeg' or 'webrtc' streamMode: string // 'mjpeg' or 'webrtc'
error: string | null error: string | null
} }
@@ -277,8 +275,6 @@ export const useSystemStore = defineStore('system', () => {
resolution: data.video.resolution, resolution: data.video.resolution,
targetFps: data.video.fps, targetFps: data.video.fps,
clients: stream.value?.clients ?? 0, clients: stream.value?.clients ?? 0,
framesCaptured: stream.value?.framesCaptured ?? 0,
framesDropped: stream.value?.framesDropped ?? 0,
streamMode: data.video.stream_mode || 'mjpeg', streamMode: data.video.stream_mode || 'mjpeg',
error: data.video.error, error: data.video.error,
} }

View File

@@ -6,6 +6,8 @@
export interface AuthConfig { export interface AuthConfig {
/** Session timeout in seconds */ /** Session timeout in seconds */
session_timeout_secs: number; session_timeout_secs: number;
/** Allow multiple concurrent web sessions (single-user mode) */
single_user_allow_multiple_sessions: boolean;
/** Enable 2FA */ /** Enable 2FA */
totp_enabled: boolean; totp_enabled: boolean;
/** TOTP secret (encrypted) */ /** TOTP secret (encrypted) */
@@ -52,6 +54,26 @@ export interface OtgDescriptorConfig {
serial_number?: string; serial_number?: string;
} }
/** OTG HID function profile */
export enum OtgHidProfile {
/** Full HID device set (keyboard + relative mouse + absolute mouse + consumer control) */
Full = "full",
/** Legacy profile: only keyboard */
LegacyKeyboard = "legacy_keyboard",
/** Legacy profile: only relative mouse */
LegacyMouseRelative = "legacy_mouse_relative",
/** Custom function selection */
Custom = "custom",
}
/** OTG HID function selection (used when profile is Custom) */
export interface OtgHidFunctions {
keyboard: boolean;
mouse_relative: boolean;
mouse_absolute: boolean;
consumer: boolean;
}
/** HID configuration */ /** HID configuration */
export interface HidConfig { export interface HidConfig {
/** HID backend type */ /** HID backend type */
@@ -64,6 +86,10 @@ export interface HidConfig {
otg_udc?: string; otg_udc?: string;
/** OTG USB device descriptor configuration */ /** OTG USB device descriptor configuration */
otg_descriptor?: OtgDescriptorConfig; otg_descriptor?: OtgDescriptorConfig;
/** OTG HID function profile */
otg_profile?: OtgHidProfile;
/** OTG HID function selection (used when profile is Custom) */
otg_functions?: OtgHidFunctions;
/** CH9329 serial port */ /** CH9329 serial port */
ch9329_port: string; ch9329_port: string;
/** CH9329 baud rate */ /** CH9329 baud rate */
@@ -392,6 +418,10 @@ export interface AudioConfigUpdate {
quality?: string; quality?: string;
} }
export interface AuthConfigUpdate {
single_user_allow_multiple_sessions?: boolean;
}
/** Update easytier config */ /** Update easytier config */
export interface EasytierConfigUpdate { export interface EasytierConfigUpdate {
enabled?: boolean; enabled?: boolean;
@@ -496,12 +526,21 @@ export interface OtgDescriptorConfigUpdate {
serial_number?: string; serial_number?: string;
} }
export interface OtgHidFunctionsUpdate {
keyboard?: boolean;
mouse_relative?: boolean;
mouse_absolute?: boolean;
consumer?: boolean;
}
export interface HidConfigUpdate { export interface HidConfigUpdate {
backend?: HidBackend; backend?: HidBackend;
ch9329_port?: string; ch9329_port?: string;
ch9329_baudrate?: number; ch9329_baudrate?: number;
otg_udc?: string; otg_udc?: string;
otg_descriptor?: OtgDescriptorConfigUpdate; otg_descriptor?: OtgDescriptorConfigUpdate;
otg_profile?: OtgHidProfile;
otg_functions?: OtgHidFunctionsUpdate;
mouse_absolute?: boolean; mouse_absolute?: boolean;
} }
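These generated TS types read like serde-derived Rust enums. A plausible source definition, assumed rather than taken from this diff, that round-trips the same wire strings:

```rust
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Clone, Copy, Debug)]
#[serde(rename_all = "snake_case")]
enum OtgHidProfile {
    Full,
    LegacyKeyboard,
    LegacyMouseRelative,
    Custom,
}

#[derive(Serialize, Deserialize, Clone, Debug)]
struct OtgHidFunctions {
    keyboard: bool,
    mouse_relative: bool,
    mouse_absolute: bool,
    consumer: bool,
}

fn main() {
    // "legacy_keyboard" on the wire maps to OtgHidProfile::LegacyKeyboard.
    let p: OtgHidProfile = serde_json::from_str("\"legacy_keyboard\"").unwrap();
    println!("{p:?}");
}
```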

View File

@@ -10,7 +10,7 @@ import { useHidWebSocket } from '@/composables/useHidWebSocket'
import { useWebRTC } from '@/composables/useWebRTC' import { useWebRTC } from '@/composables/useWebRTC'
import { useVideoSession } from '@/composables/useVideoSession' import { useVideoSession } from '@/composables/useVideoSession'
import { getUnifiedAudio } from '@/composables/useUnifiedAudio' import { getUnifiedAudio } from '@/composables/useUnifiedAudio'
import { streamApi, hidApi, atxApi, extensionsApi, atxConfigApi, userApi } from '@/api' import { streamApi, hidApi, atxApi, extensionsApi, atxConfigApi, authApi } from '@/api'
import type { HidKeyboardEvent, HidMouseEvent } from '@/types/hid' import type { HidKeyboardEvent, HidMouseEvent } from '@/types/hid'
import { toast } from 'vue-sonner' import { toast } from 'vue-sonner'
import { generateUUID } from '@/lib/utils' import { generateUUID } from '@/lib/utils'
@@ -641,7 +641,7 @@ function handleStreamConfigChanging(data: any) {
}) })
} }
function handleStreamConfigApplied(data: any) { async function handleStreamConfigApplied(data: any) {
// Reset consecutive error counter for new config // Reset consecutive error counter for new config
consecutiveErrors = 0 consecutiveErrors = 0
@@ -662,6 +662,10 @@ function handleStreamConfigApplied(data: any) {
if (videoMode.value !== 'mjpeg') { if (videoMode.value !== 'mjpeg') {
// In WebRTC mode, reconnect WebRTC (session was closed due to config change) // In WebRTC mode, reconnect WebRTC (session was closed due to config change)
const ready = await videoSession.waitForWebRTCReadyAny(3000)
if (!ready) {
console.warn('[WebRTC] Backend not ready after timeout (config change), attempting connection anyway')
}
switchToWebRTC(videoMode.value) switchToWebRTC(videoMode.value)
} else { } else {
// In MJPEG mode, refresh the MJPEG stream // In MJPEG mode, refresh the MJPEG stream
@@ -1259,16 +1263,7 @@ async function handleChangePassword() {
changingPassword.value = true changingPassword.value = true
try { try {
// Get current user ID - we need to fetch user list first await authApi.changePassword(currentPassword.value, newPassword.value)
const result = await userApi.list()
const currentUser = result.users.find(u => u.username === authStore.user)
if (!currentUser) {
toast.error(t('auth.userNotFound'))
return
}
await userApi.changePassword(currentUser.id, newPassword.value, currentPassword.value)
toast.success(t('auth.passwordChanged')) toast.success(t('auth.passwordChanged'))
// Reset form and close dialog // Reset form and close dialog

View File

@@ -1,11 +1,13 @@
<script setup lang="ts"> <script setup lang="ts">
import { ref, computed, onMounted } from 'vue' import { ref, computed, onMounted, watch } from 'vue'
import { useI18n } from 'vue-i18n' import { useI18n } from 'vue-i18n'
import { useSystemStore } from '@/stores/system' import { useSystemStore } from '@/stores/system'
import { useAuthStore } from '@/stores/auth'
import { import {
authApi,
authConfigApi,
configApi, configApi,
streamApi, streamApi,
userApi,
videoConfigApi, videoConfigApi,
streamConfigApi, streamConfigApi,
hidConfigApi, hidConfigApi,
@@ -16,7 +18,7 @@ import {
webConfigApi, webConfigApi,
   systemApi,
   type EncoderBackendInfo,
-  type User as UserType,
+  type AuthConfig,
   type RustDeskConfigResponse,
   type RustDeskStatusResponse,
   type RustDeskPasswordResponse,
@@ -28,6 +30,8 @@ import type {
   AtxDriverType,
   ActiveLevel,
   AtxDevices,
+  OtgHidProfile,
+  OtgHidFunctions,
 } from '@/types/generated'
 import { setLanguage } from '@/i18n'
 import { useClipboard } from '@/composables/useClipboard'
@@ -57,16 +61,11 @@ import {
   EyeOff,
   Save,
   Check,
-  Network,
   HardDrive,
   Power,
-  UserPlus,
-  User,
-  Pencil,
-  Trash2,
   Menu,
-  Users,
-  Globe,
+  Lock,
+  User,
   RefreshCw,
   Terminal,
   Play,
@@ -80,6 +79,7 @@ import {
 const { t, locale } = useI18n()
 const systemStore = useSystemStore()
+const authStore = useAuthStore()
 // Settings state
 const activeSection = ref('appearance')
@@ -90,9 +90,11 @@ const saved = ref(false)
 // Navigation structure
 const navGroups = computed(() => [
   {
-    title: t('settings.general'),
+    title: t('settings.system'),
     items: [
       { id: 'appearance', label: t('settings.appearance'), icon: Sun },
+      { id: 'account', label: t('settings.account'), icon: User },
+      { id: 'access', label: t('settings.access'), icon: Lock },
     ]
   },
   {
@@ -100,7 +102,7 @@ const navGroups = computed(() => [
     items: [
       { id: 'video', label: t('settings.video'), icon: Monitor, status: config.value.video_device ? t('settings.configured') : null },
       { id: 'hid', label: t('settings.hid'), icon: Keyboard, status: config.value.hid_backend.toUpperCase() },
-      { id: 'msd', label: t('settings.msd'), icon: HardDrive },
+      ...(config.value.msd_enabled ? [{ id: 'msd', label: t('settings.msd'), icon: HardDrive }] : []),
       { id: 'atx', label: t('settings.atx'), icon: Power },
     ]
   },
@@ -108,16 +110,13 @@ const navGroups = computed(() => [
     title: t('settings.extensions'),
     items: [
       { id: 'ext-rustdesk', label: t('extensions.rustdesk.title'), icon: ScreenShare },
+      { id: 'ext-remote-access', label: t('extensions.remoteAccess.title'), icon: ExternalLink },
       { id: 'ext-ttyd', label: t('extensions.ttyd.title'), icon: Terminal },
-      { id: 'ext-gostc', label: t('extensions.gostc.title'), icon: Globe },
-      { id: 'ext-easytier', label: t('extensions.easytier.title'), icon: Network },
     ]
   },
   {
-    title: t('settings.system'),
+    title: t('settings.other'),
     items: [
-      { id: 'web-server', label: t('settings.webServer'), icon: Globe },
-      { id: 'users', label: t('settings.users'), icon: Users },
       { id: 'about', label: t('settings.about'), icon: Info },
     ]
   }
@@ -131,22 +130,28 @@ function selectSection(id: string) {
 // Theme
 const theme = ref<'light' | 'dark' | 'system'>('system')
-// Password change
-const showPasswordDialog = ref(false)
+// Account settings
+const usernameInput = ref('')
+const usernamePassword = ref('')
+const usernameSaving = ref(false)
+const usernameSaved = ref(false)
+const usernameError = ref('')
 const currentPassword = ref('')
 const newPassword = ref('')
 const confirmPassword = ref('')
-const showPasswords = ref(false)
+const passwordSaving = ref(false)
+const passwordSaved = ref(false)
 const passwordError = ref('')
+const showPasswords = ref(false)
-// User management
-const users = ref<UserType[]>([])
-const usersLoading = ref(false)
-const showAddUserDialog = ref(false)
-const showEditUserDialog = ref(false)
-const editingUser = ref<UserType | null>(null)
-const newUser = ref({ username: '', password: '', role: 'user' as 'admin' | 'user' })
-const editUserData = ref({ username: '', role: 'user' as 'admin' | 'user' })
+// Auth config state
+const authConfig = ref<AuthConfig>({
+  session_timeout_secs: 3600 * 24,
+  single_user_allow_multiple_sessions: false,
+  totp_enabled: false,
+  totp_secret: undefined,
+})
+const authConfigLoading = ref(false)
 // Extensions management
 const extensions = ref<ExtensionsStatus | null>(null)
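The new account and OTG state above leans on a few generated types. A minimal sketch of what `@/types/generated` plausibly exports for them, inferred only from the fields this diff touches (the real file is produced by codegen and may differ):

```ts
// Sketch of the generated types used by the new settings state.
// Field names and the profile variants come straight from this diff;
// everything else about the real generated file is an assumption.
export type OtgHidProfile =
  | 'full'                   // composite gadget with all HID functions + MSD
  | 'legacy_keyboard'        // single-function keyboard gadget
  | 'legacy_mouse_relative'  // single-function relative-mouse gadget
  | 'custom'                 // user-selected function set

export interface OtgHidFunctions {
  keyboard: boolean
  mouse_relative: boolean
  mouse_absolute: boolean
  consumer: boolean          // consumer-control (media keys) endpoint
}

export interface AuthConfig {
  session_timeout_secs: number
  single_user_allow_multiple_sessions: boolean
  totp_enabled: boolean
  totp_secret?: string
}
```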
@@ -232,6 +237,13 @@ const config = ref({
   hid_backend: 'ch9329',
   hid_serial_device: '',
   hid_serial_baudrate: 9600,
+  hid_otg_profile: 'full' as OtgHidProfile,
+  hid_otg_functions: {
+    keyboard: true,
+    mouse_relative: true,
+    mouse_absolute: true,
+    consumer: true,
+  } as OtgHidFunctions,
   msd_enabled: false,
   msd_dir: '',
   network_port: 8080,
@@ -246,6 +258,13 @@ const config = ref({
 // Track whether the server already has a TURN password configured
 const hasTurnPassword = ref(false)
+const isHidFunctionSelectionValid = computed(() => {
+  if (config.value.hid_backend !== 'otg') return true
+  if (config.value.hid_otg_profile !== 'custom') return true
+  const f = config.value.hid_otg_functions
+  return !!(f.keyboard || f.mouse_relative || f.mouse_absolute || f.consumer)
+})
 // OTG Descriptor settings
 const otgVendorIdHex = ref('1d6b')
 const otgProductIdHex = ref('0104')
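The `isHidFunctionSelectionValid` guard above is the single rule the HID save path and the sticky save button depend on. Restated as a standalone function, purely as a sketch for clarity (types from the generated-types sketch earlier; the component itself uses the computed, not this helper):

```ts
// Same rule as the computed, as a pure function.
function hidSelectionValid(backend: string, profile: OtgHidProfile, f: OtgHidFunctions): boolean {
  if (backend !== 'otg') return true   // only the OTG backend is constrained
  if (profile !== 'custom') return true // preset profiles are always consistent
  // a custom gadget must expose at least one HID function
  return f.keyboard || f.mouse_relative || f.mouse_absolute || f.consumer
}

// e.g. hidSelectionValid('otg', 'custom',
//   { keyboard: false, mouse_relative: false, mouse_absolute: false, consumer: false })
// === false, so saving stays disabled until one function is re-enabled.
```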
@@ -259,6 +278,12 @@ const validateHex = (event: Event, _field: string) => {
   input.value = input.value.replace(/[^0-9a-fA-F]/g, '').toLowerCase()
 }
+watch(() => config.value.msd_enabled, (enabled) => {
+  if (!enabled && activeSection.value === 'msd') {
+    activeSection.value = 'hid'
+  }
+})
 // ATX config state
 const atxConfig = ref({
   enabled: false,
@@ -300,9 +325,6 @@ const selectedBackendFormats = computed(() => {
 const isCh9329Backend = computed(() => config.value.hid_backend === 'ch9329')
-// Video selection computed properties
-import { watch } from 'vue'
 const selectedDevice = computed(() => {
   return devices.value.video.find(d => d.path === config.value.video_device)
 })
@@ -384,6 +406,12 @@ watch(() => [config.value.video_width, config.value.video_height], () => {
   }
 })
+watch(() => authStore.user, (value) => {
+  if (value) {
+    usernameInput.value = value
+  }
+})
 // Format bytes to human readable string
 function formatBytes(bytes: number): string {
@@ -414,39 +442,71 @@ function handleLanguageChange(lang: string) {
   }
 }
-// Password change
+// Account updates
+async function changeUsername() {
+  usernameError.value = ''
+  usernameSaved.value = false
+  if (usernameInput.value.length < 2) {
+    usernameError.value = t('auth.enterUsername')
+    return
+  }
+  if (!usernamePassword.value) {
+    usernameError.value = t('auth.enterPassword')
+    return
+  }
+  usernameSaving.value = true
+  try {
+    await authApi.changeUsername(usernameInput.value, usernamePassword.value)
+    usernameSaved.value = true
+    usernamePassword.value = ''
+    await authStore.checkAuth()
+    usernameInput.value = authStore.user || usernameInput.value
+    setTimeout(() => {
+      usernameSaved.value = false
+    }, 2000)
+  } catch (e) {
+    usernameError.value = t('auth.invalidPassword')
+  } finally {
+    usernameSaving.value = false
+  }
+}
 async function changePassword() {
   passwordError.value = ''
+  passwordSaved.value = false
+  if (!currentPassword.value) {
+    passwordError.value = t('auth.enterPassword')
+    return
+  }
   if (newPassword.value.length < 4) {
     passwordError.value = t('setup.passwordHint')
     return
   }
   if (newPassword.value !== confirmPassword.value) {
     passwordError.value = t('setup.passwordMismatch')
     return
   }
+  passwordSaving.value = true
   try {
-    await configApi.update({
-      current_password: currentPassword.value,
-      new_password: newPassword.value,
-    })
-    showPasswordDialog.value = false
+    await authApi.changePassword(currentPassword.value, newPassword.value)
     currentPassword.value = ''
     newPassword.value = ''
     confirmPassword.value = ''
+    passwordSaved.value = true
+    setTimeout(() => {
+      passwordSaved.value = false
+    }, 2000)
   } catch (e) {
     passwordError.value = t('auth.invalidPassword')
+  } finally {
+    passwordSaving.value = false
   }
 }
-// Handle MSD switch changes
-function onMsdEnabledChange(val: boolean) {
-  config.value.msd_enabled = val
-}
 // Save config - using the domain-separated APIs
 async function saveConfig() {
   loading.value = true
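`authApi.changeUsername` and `authApi.changePassword` replace the old `configApi.update` round-trip for credentials. The method names come from this diff; the sketch below is a hypothetical client shape consistent with those calls, where the endpoint paths and payload field names are assumptions, not taken from the repo:

```ts
// Hypothetical auth client: only the two method signatures are grounded
// in this diff; URLs and JSON field names are illustrative assumptions.
export const authApi = {
  async changeUsername(newUsername: string, currentPassword: string): Promise<void> {
    const res = await fetch('/api/auth/username', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ new_username: newUsername, current_password: currentPassword }),
    })
    if (!res.ok) throw new Error(`change username failed: ${res.status}`)
  },
  async changePassword(currentPassword: string, newPassword: string): Promise<void> {
    const res = await fetch('/api/auth/password', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ current_password: currentPassword, new_password: newPassword }),
    })
    if (!res.ok) throw new Error(`change password failed: ${res.status}`)
  },
}
```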
@@ -481,6 +541,20 @@ async function saveConfig() {
   // HID config
   if (activeSection.value === 'hid') {
+    if (!isHidFunctionSelectionValid.value) {
+      return
+    }
+    let desiredMsdEnabled = config.value.msd_enabled
+    if (config.value.hid_backend === 'otg') {
+      if (config.value.hid_otg_profile === 'full') {
+        desiredMsdEnabled = true
+      } else if (
+        config.value.hid_otg_profile === 'legacy_keyboard'
+        || config.value.hid_otg_profile === 'legacy_mouse_relative'
+      ) {
+        desiredMsdEnabled = false
+      }
+    }
     const hidUpdate: any = {
       backend: config.value.hid_backend as any,
       ch9329_port: config.value.hid_serial_device || undefined,
@@ -495,15 +569,25 @@
         product: otgProduct.value || 'One-KVM USB Device',
         serial_number: otgSerialNumber.value || undefined,
       }
+      hidUpdate.otg_profile = config.value.hid_otg_profile
+      hidUpdate.otg_functions = { ...config.value.hid_otg_functions }
     }
     savePromises.push(hidConfigApi.update(hidUpdate))
+    if (config.value.msd_enabled !== desiredMsdEnabled) {
+      config.value.msd_enabled = desiredMsdEnabled
+    }
+    savePromises.push(
+      msdConfigApi.update({
+        enabled: desiredMsdEnabled,
+      })
+    )
   }
   // MSD config
   if (activeSection.value === 'msd') {
     savePromises.push(
       msdConfigApi.update({
-        enabled: config.value.msd_enabled,
+        msd_dir: config.value.msd_dir || undefined,
       })
     )
   }
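The profile-to-MSD coupling in the hunk above reads as a pure function: `full` always bundles the USB mass-storage function, the two legacy single-function profiles exclude it, and `custom` leaves the user's explicit choice alone. A sketch of that rule in isolation:

```ts
// The derived-MSD rule from saveConfig(), restated for clarity.
// Exhaustive over the OtgHidProfile union, so no default branch is needed.
function desiredMsd(profile: OtgHidProfile, current: boolean): boolean {
  switch (profile) {
    case 'full':
      return true            // composite gadget always includes mass storage
    case 'legacy_keyboard':
    case 'legacy_mouse_relative':
      return false           // single-function gadgets cannot carry MSD
    case 'custom':
      return current         // the per-function Switch decides
  }
}
```

Pushing the matching `msdConfigApi.update({ enabled })` in the same `Promise.all` batch keeps the backend's gadget composition and the MSD service in sync with a single save click.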
@@ -538,6 +622,13 @@ async function loadConfig() {
   hid_backend: hid.backend || 'none',
   hid_serial_device: hid.ch9329_port || '',
   hid_serial_baudrate: hid.ch9329_baudrate || 9600,
+  hid_otg_profile: (hid.otg_profile || 'full') as OtgHidProfile,
+  hid_otg_functions: {
+    keyboard: hid.otg_functions?.keyboard ?? true,
+    mouse_relative: hid.otg_functions?.mouse_relative ?? true,
+    mouse_absolute: hid.otg_functions?.mouse_absolute ?? true,
+    consumer: hid.otg_functions?.consumer ?? true,
+  } as OtgHidFunctions,
   msd_enabled: msd.enabled || false,
   msd_dir: msd.msd_dir || '',
   network_port: 8080, // loaded from the legacy API
@@ -591,56 +682,29 @@ async function loadBackends() {
   }
 }
-// User management functions
-async function loadUsers() {
-  usersLoading.value = true
-  try {
-    const result = await userApi.list()
-    users.value = result.users || []
-  } catch (e) {
-    console.error('Failed to load users:', e)
-  } finally {
-    usersLoading.value = false
-  }
-}
-async function createUser() {
-  if (!newUser.value.username || !newUser.value.password) return
-  try {
-    await userApi.create(newUser.value.username, newUser.value.password, newUser.value.role)
-    showAddUserDialog.value = false
-    newUser.value = { username: '', password: '', role: 'user' }
-    await loadUsers()
-  } catch (e) {
-    console.error('Failed to create user:', e)
-  }
-}
-function openEditUserDialog(user: UserType) {
-  editingUser.value = user
-  editUserData.value = { username: user.username, role: user.role }
-  showEditUserDialog.value = true
-}
-async function updateUser() {
-  if (!editingUser.value) return
-  try {
-    await userApi.update(editingUser.value.id, editUserData.value)
-    showEditUserDialog.value = false
-    editingUser.value = null
-    await loadUsers()
-  } catch (e) {
-    console.error('Failed to update user:', e)
-  }
-}
-async function confirmDeleteUser(user: UserType) {
-  if (!confirm(`Delete user "${user.username}"?`)) return
-  try {
-    await userApi.delete(user.id)
-    await loadUsers()
-  } catch (e) {
-    console.error('Failed to delete user:', e)
-  }
-}
+// Auth config functions
+async function loadAuthConfig() {
+  authConfigLoading.value = true
+  try {
+    authConfig.value = await authConfigApi.get()
+  } catch (e) {
+    console.error('Failed to load auth config:', e)
+  } finally {
+    authConfigLoading.value = false
+  }
+}
+async function saveAuthConfig() {
+  authConfigLoading.value = true
+  try {
+    await authConfigApi.update({
+      single_user_allow_multiple_sessions: authConfig.value.single_user_allow_multiple_sessions,
+    })
+    await loadAuthConfig()
+  } catch (e) {
+    console.error('Failed to save auth config:', e)
+  } finally {
+    authConfigLoading.value = false
+  }
+}
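`authConfigApi` is new in this diff; only `get()` and a partial `update()` are exercised. A hypothetical shape consistent with those two calls (the endpoint path and HTTP method here are assumptions):

```ts
// Hypothetical auth-config client: get/update and the Partial<AuthConfig>
// patch payload mirror the calls above; the URL and verb are illustrative.
export const authConfigApi = {
  get: (): Promise<AuthConfig> =>
    fetch('/api/config/auth').then(r => r.json()),
  update: (patch: Partial<AuthConfig>): Promise<void> =>
    fetch('/api/config/auth', {
      method: 'PATCH',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(patch),
    }).then(() => undefined),
}
```

Sending a partial patch (only `single_user_allow_multiple_sessions`) keeps the TOTP fields server-owned, which matches `saveAuthConfig()` re-fetching the full config afterwards.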
@@ -1052,7 +1116,7 @@ onMounted(async () => {
     loadConfig(),
     loadDevices(),
     loadBackends(),
-    loadUsers(),
+    loadAuthConfig(),
     loadExtensions(),
     loadAtxConfig(),
     loadAtxDevices(),
@@ -1060,6 +1124,7 @@
     loadRustdeskPassword(),
     loadWebServerConfig(),
   ])
+  usernameInput.value = authStore.user || ''
 })
 </script>
@@ -1171,6 +1236,63 @@ onMounted(async () => {
         </Card>
       </div>
+      <!-- Account Section -->
+      <div v-show="activeSection === 'account'" class="space-y-6">
+        <Card>
+          <CardHeader>
+            <CardTitle>{{ t('settings.username') }}</CardTitle>
+            <CardDescription>{{ t('settings.usernameDesc') }}</CardDescription>
+          </CardHeader>
+          <CardContent class="space-y-4">
+            <div class="space-y-2">
+              <Label for="account-username">{{ t('settings.username') }}</Label>
+              <Input id="account-username" v-model="usernameInput" />
+            </div>
+            <div class="space-y-2">
+              <Label for="account-username-password">{{ t('settings.currentPassword') }}</Label>
+              <Input id="account-username-password" v-model="usernamePassword" type="password" />
+            </div>
+            <p v-if="usernameError" class="text-xs text-destructive">{{ usernameError }}</p>
+            <p v-else-if="usernameSaved" class="text-xs text-emerald-600">{{ t('common.success') }}</p>
+            <div class="flex justify-end">
+              <Button @click="changeUsername" :disabled="usernameSaving">
+                <Save class="h-4 w-4 mr-2" />
+                {{ t('common.save') }}
+              </Button>
+            </div>
+          </CardContent>
+        </Card>
+        <Card>
+          <CardHeader>
+            <CardTitle>{{ t('settings.changePassword') }}</CardTitle>
+            <CardDescription>{{ t('settings.passwordDesc') }}</CardDescription>
+          </CardHeader>
+          <CardContent class="space-y-4">
+            <div class="space-y-2">
+              <Label for="account-current-password">{{ t('settings.currentPassword') }}</Label>
+              <Input id="account-current-password" v-model="currentPassword" type="password" />
+            </div>
+            <div class="space-y-2">
+              <Label for="account-new-password">{{ t('settings.newPassword') }}</Label>
+              <Input id="account-new-password" v-model="newPassword" type="password" />
+            </div>
+            <div class="space-y-2">
+              <Label for="account-confirm-password">{{ t('auth.confirmPassword') }}</Label>
+              <Input id="account-confirm-password" v-model="confirmPassword" type="password" />
+            </div>
+            <p v-if="passwordError" class="text-xs text-destructive">{{ passwordError }}</p>
+            <p v-else-if="passwordSaved" class="text-xs text-emerald-600">{{ t('common.success') }}</p>
+            <div class="flex justify-end">
+              <Button @click="changePassword" :disabled="passwordSaving">
+                <Save class="h-4 w-4 mr-2" />
+                {{ t('common.save') }}
+              </Button>
+            </div>
+          </CardContent>
+        </Card>
+      </div>
       <!-- Video Section -->
       <div v-show="activeSection === 'video'" class="space-y-6">
         <!-- Video Device Settings -->
@@ -1345,6 +1467,66 @@
             <!-- OTG Descriptor Settings -->
             <template v-if="config.hid_backend === 'otg'">
+              <Separator class="my-4" />
+              <div class="space-y-4">
+                <div>
+                  <h4 class="text-sm font-medium">{{ t('settings.otgHidProfile') }}</h4>
+                  <p class="text-sm text-muted-foreground">{{ t('settings.otgHidProfileDesc') }}</p>
+                </div>
+                <div class="space-y-2">
+                  <Label for="otg-profile">{{ t('settings.profile') }}</Label>
+                  <select id="otg-profile" v-model="config.hid_otg_profile" class="w-full h-9 px-3 rounded-md border border-input bg-background text-sm">
+                    <option value="full">{{ t('settings.otgProfileFull') }}</option>
+                    <option value="legacy_keyboard">{{ t('settings.otgProfileLegacyKeyboard') }}</option>
+                    <option value="legacy_mouse_relative">{{ t('settings.otgProfileLegacyMouseRelative') }}</option>
+                    <option value="custom">{{ t('settings.otgProfileCustom') }}</option>
+                  </select>
+                </div>
+                <div v-if="config.hid_otg_profile === 'custom'" class="space-y-3 rounded-md border border-border/60 p-3">
+                  <div class="flex items-center justify-between">
+                    <div>
+                      <Label>{{ t('settings.otgFunctionKeyboard') }}</Label>
+                      <p class="text-xs text-muted-foreground">{{ t('settings.otgFunctionKeyboardDesc') }}</p>
+                    </div>
+                    <Switch v-model="config.hid_otg_functions.keyboard" />
+                  </div>
+                  <Separator />
+                  <div class="flex items-center justify-between">
+                    <div>
+                      <Label>{{ t('settings.otgFunctionMouseRelative') }}</Label>
+                      <p class="text-xs text-muted-foreground">{{ t('settings.otgFunctionMouseRelativeDesc') }}</p>
+                    </div>
+                    <Switch v-model="config.hid_otg_functions.mouse_relative" />
+                  </div>
+                  <Separator />
+                  <div class="flex items-center justify-between">
+                    <div>
+                      <Label>{{ t('settings.otgFunctionMouseAbsolute') }}</Label>
+                      <p class="text-xs text-muted-foreground">{{ t('settings.otgFunctionMouseAbsoluteDesc') }}</p>
+                    </div>
+                    <Switch v-model="config.hid_otg_functions.mouse_absolute" />
+                  </div>
+                  <Separator />
+                  <div class="flex items-center justify-between">
+                    <div>
+                      <Label>{{ t('settings.otgFunctionConsumer') }}</Label>
+                      <p class="text-xs text-muted-foreground">{{ t('settings.otgFunctionConsumerDesc') }}</p>
+                    </div>
+                    <Switch v-model="config.hid_otg_functions.consumer" />
+                  </div>
+                  <Separator />
+                  <div class="flex items-center justify-between">
+                    <div>
+                      <Label>{{ t('settings.otgFunctionMsd') }}</Label>
+                      <p class="text-xs text-muted-foreground">{{ t('settings.otgFunctionMsdDesc') }}</p>
+                    </div>
+                    <Switch v-model="config.msd_enabled" />
+                  </div>
+                </div>
+                <p class="text-xs text-amber-600 dark:text-amber-400">
+                  {{ t('settings.otgProfileWarning') }}
+                </p>
+              </div>
               <Separator class="my-4" />
               <div class="space-y-4">
                 <div>
@@ -1409,8 +1591,8 @@
         </Card>
       </div>
-      <!-- Web Server Section -->
-      <div v-show="activeSection === 'web-server'" class="space-y-6">
+      <!-- Access Section -->
+      <div v-show="activeSection === 'access'" class="space-y-6">
         <Card>
           <CardHeader>
             <CardTitle>{{ t('settings.webServer') }}</CardTitle>
@@ -1452,51 +1634,36 @@
             </div>
           </CardContent>
         </Card>
-      </div>
-      <!-- Users Section -->
-      <div v-show="activeSection === 'users'" class="space-y-6">
         <Card>
-          <CardHeader class="flex flex-row items-center justify-between space-y-0 pb-4">
-            <div class="space-y-1.5">
-              <CardTitle>{{ t('settings.userManagement') }}</CardTitle>
-              <CardDescription>{{ t('settings.userManagementDesc') }}</CardDescription>
-            </div>
-            <Button size="sm" @click="showAddUserDialog = true">
-              <UserPlus class="h-4 w-4 mr-2" />{{ t('settings.addUser') }}
-            </Button>
+          <CardHeader>
+            <CardTitle>{{ t('settings.authSettings') }}</CardTitle>
+            <CardDescription>{{ t('settings.authSettingsDesc') }}</CardDescription>
           </CardHeader>
-          <CardContent>
-            <div v-if="usersLoading" class="text-center py-8">
-              <p class="text-sm text-muted-foreground">{{ t('settings.loadingUsers') }}</p>
-            </div>
-            <div v-else-if="users.length === 0" class="text-center py-8">
-              <User class="h-8 w-8 mx-auto mb-2 text-muted-foreground" />
-              <p class="text-sm text-muted-foreground">{{ t('settings.noUsers') }}</p>
-            </div>
-            <div v-else class="divide-y">
-              <div v-for="user in users" :key="user.id" class="flex items-center justify-between py-3">
-                <div class="flex items-center gap-3">
-                  <div class="h-8 w-8 rounded-full bg-muted flex items-center justify-center">
-                    <User class="h-4 w-4" />
-                  </div>
-                  <div>
-                    <p class="text-sm font-medium">{{ user.username }}</p>
-                    <Badge variant="outline" class="text-xs">{{ user.role === 'admin' ? t('settings.roleAdmin') : t('settings.roleUser') }}</Badge>
-                  </div>
-                </div>
-                <div class="flex gap-1">
-                  <Button size="icon" variant="ghost" class="h-8 w-8" @click="openEditUserDialog(user)"><Pencil class="h-4 w-4" /></Button>
-                  <Button size="icon" variant="ghost" class="h-8 w-8 text-destructive" :disabled="user.role === 'admin' && users.filter(u => u.role === 'admin').length === 1" @click="confirmDeleteUser(user)"><Trash2 class="h-4 w-4" /></Button>
-                </div>
-              </div>
-            </div>
+          <CardContent class="space-y-4">
+            <div class="flex items-center justify-between">
+              <div class="space-y-0.5">
+                <Label>{{ t('settings.allowMultipleSessions') }}</Label>
+                <p class="text-xs text-muted-foreground">{{ t('settings.allowMultipleSessionsDesc') }}</p>
+              </div>
+              <Switch
+                v-model="authConfig.single_user_allow_multiple_sessions"
+                :disabled="authConfigLoading"
+              />
+            </div>
+            <Separator />
+            <p class="text-xs text-muted-foreground">{{ t('settings.singleUserSessionNote') }}</p>
+            <div class="flex justify-end pt-2">
+              <Button @click="saveAuthConfig" :disabled="authConfigLoading">
+                <Save class="h-4 w-4 mr-2" />
+                {{ t('common.save') }}
+              </Button>
+            </div>
           </CardContent>
         </Card>
       </div>
       <!-- MSD Section -->
-      <div v-show="activeSection === 'msd'" class="space-y-6">
+      <div v-show="activeSection === 'msd' && config.msd_enabled" class="space-y-6">
         <Card>
           <CardHeader>
             <CardTitle>{{ t('settings.msdSettings') }}</CardTitle>
@@ -1507,19 +1674,6 @@
               <p class="font-medium">{{ t('settings.msdCh9329Warning') }}</p>
               <p class="text-xs text-amber-900/80">{{ t('settings.msdCh9329WarningDesc') }}</p>
             </div>
-            <div class="flex items-center justify-between">
-              <div class="space-y-0.5">
-                <Label for="msd-enabled">{{ t('settings.msdEnable') }}</Label>
-                <p class="text-xs text-muted-foreground">{{ t('settings.msdEnableDesc') }}</p>
-              </div>
-              <Switch
-                id="msd-enabled"
-                :disabled="isCh9329Backend"
-                :model-value="config.msd_enabled"
-                @update:model-value="onMsdEnabledChange"
-              />
-            </div>
-            <Separator />
             <div class="space-y-4">
              <div class="space-y-2">
                <Label for="msd-dir">{{ t('settings.msdDir') }}</Label>
@@ -1821,8 +1975,8 @@
         </div>
       </div>
-      <!-- gostc Section -->
-      <div v-show="activeSection === 'ext-gostc'" class="space-y-6">
+      <!-- Remote Access Section -->
+      <div v-show="activeSection === 'ext-remote-access'" class="space-y-6">
         <Card>
           <CardHeader>
             <div class="flex items-center justify-between">
@@ -1913,10 +2067,7 @@
               <Check v-if="saved" class="h-4 w-4 mr-2" /><Save v-else class="h-4 w-4 mr-2" />{{ saved ? t('common.success') : t('common.save') }}
             </Button>
           </div>
-      </div>
-      <!-- easytier Section -->
-      <div v-show="activeSection === 'ext-easytier'" class="space-y-6">
         <Card>
           <CardHeader>
             <div class="flex items-center justify-between">
@@ -2249,105 +2400,21 @@
      <!-- Save Button (sticky) -->
      <div v-if="['video', 'hid', 'msd'].includes(activeSection)" class="sticky bottom-0 pt-4 pb-2 bg-background border-t -mx-6 px-6 lg:-mx-8 lg:px-8">
        <div class="flex justify-end">
-          <Button :disabled="loading" @click="saveConfig">
+          <div class="flex items-center gap-3">
+            <p v-if="activeSection === 'hid' && !isHidFunctionSelectionValid" class="text-xs text-amber-600 dark:text-amber-400">
+              {{ t('settings.otgFunctionMinWarning') }}
+            </p>
+            <Button :disabled="loading || (activeSection === 'hid' && !isHidFunctionSelectionValid)" @click="saveConfig">
             <Check v-if="saved" class="h-4 w-4 mr-2" /><Save v-else class="h-4 w-4 mr-2" />{{ saved ? t('common.success') : t('common.save') }}
           </Button>
+          </div>
        </div>
      </div>
    </div>
  </main>
 </div>
-    <!-- Password Change Dialog -->
-    <Dialog v-model:open="showPasswordDialog">
-      <DialogContent class="sm:max-w-md">
-        <DialogHeader>
-          <DialogTitle>{{ t('settings.changePassword') }}</DialogTitle>
-        </DialogHeader>
-        <div class="space-y-4">
-          <div class="space-y-2">
-            <Label for="current-password">{{ t('settings.currentPassword') }}</Label>
-            <div class="relative">
-              <Input id="current-password" v-model="currentPassword" :type="showPasswords ? 'text' : 'password'" />
-              <button type="button" class="absolute right-3 top-1/2 -translate-y-1/2 text-muted-foreground" @click="showPasswords = !showPasswords">
-                <Eye v-if="!showPasswords" class="h-4 w-4" /><EyeOff v-else class="h-4 w-4" />
-              </button>
-            </div>
-          </div>
-          <div class="space-y-2">
-            <Label for="new-password">{{ t('settings.newPassword') }}</Label>
-            <Input id="new-password" v-model="newPassword" :type="showPasswords ? 'text' : 'password'" />
-          </div>
-          <div class="space-y-2">
-            <Label for="confirm-password">{{ t('setup.confirmPassword') }}</Label>
-            <Input id="confirm-password" v-model="confirmPassword" :type="showPasswords ? 'text' : 'password'" />
-          </div>
-          <p v-if="passwordError" class="text-sm text-destructive">{{ passwordError }}</p>
-        </div>
-        <DialogFooter>
-          <Button variant="outline" size="sm" @click="showPasswordDialog = false">{{ t('common.cancel') }}</Button>
-          <Button size="sm" @click="changePassword">{{ t('common.save') }}</Button>
-        </DialogFooter>
-      </DialogContent>
-    </Dialog>
-    <!-- Add User Dialog -->
-    <Dialog v-model:open="showAddUserDialog">
-      <DialogContent class="sm:max-w-md">
-        <DialogHeader>
-          <DialogTitle>{{ t('settings.addUser') }}</DialogTitle>
-        </DialogHeader>
-        <div class="space-y-4">
-          <div class="space-y-2">
-            <Label for="new-username">{{ t('settings.username') }}</Label>
-            <Input id="new-username" v-model="newUser.username" />
-          </div>
-          <div class="space-y-2">
-            <Label for="new-user-password">{{ t('settings.password') }}</Label>
-            <Input id="new-user-password" v-model="newUser.password" type="password" />
-          </div>
-          <div class="space-y-2">
-            <Label for="new-user-role">{{ t('settings.role') }}</Label>
-            <select id="new-user-role" v-model="newUser.role" class="w-full h-9 px-3 rounded-md border border-input bg-background text-sm">
-              <option value="user">{{ t('settings.roleUser') }}</option>
-              <option value="admin">{{ t('settings.roleAdmin') }}</option>
-            </select>
-          </div>
-        </div>
-        <DialogFooter>
-          <Button variant="outline" size="sm" @click="showAddUserDialog = false">{{ t('common.cancel') }}</Button>
-          <Button size="sm" @click="createUser">{{ t('settings.create') }}</Button>
-        </DialogFooter>
-      </DialogContent>
-    </Dialog>
-    <!-- Edit User Dialog -->
-    <Dialog v-model:open="showEditUserDialog">
-      <DialogContent class="sm:max-w-md">
-        <DialogHeader>
-          <DialogTitle>{{ t('settings.editUser') }}</DialogTitle>
-        </DialogHeader>
-        <div class="space-y-4">
-          <div class="space-y-2">
-            <Label for="edit-username">{{ t('settings.username') }}</Label>
-            <Input id="edit-username" v-model="editUserData.username" />
-          </div>
-          <div class="space-y-2">
-            <Label for="edit-user-role">{{ t('settings.role') }}</Label>
-            <select id="edit-user-role" v-model="editUserData.role" class="w-full h-9 px-3 rounded-md border border-input bg-background text-sm">
-              <option value="user">{{ t('settings.roleUser') }}</option>
-              <option value="admin">{{ t('settings.roleAdmin') }}</option>
-            </select>
-          </div>
-        </div>
-        <DialogFooter>
-          <Button variant="outline" size="sm" @click="showEditUserDialog = false">{{ t('common.cancel') }}</Button>
-          <Button size="sm" @click="updateUser">{{ t('common.save') }}</Button>
-        </DialogFooter>
-      </DialogContent>
-    </Dialog>
     <!-- Terminal Dialog -->
     <Dialog v-model:open="showTerminalDialog">
       <DialogContent class="max-w-[95vw] w-[1200px] h-[600px] p-0 flex flex-col overflow-hidden">
View File
@@ -99,9 +99,7 @@ const otgUdc = ref('')
 // Extension settings
 const ttydEnabled = ref(false)
-const rustdeskEnabled = ref(false)
 const ttydAvailable = ref(false)
-const rustdeskAvailable = ref(true) // RustDesk is built-in, always available
 // Encoder backend settings
 const encoderBackend = ref('auto')
@@ -139,7 +137,6 @@ interface DeviceInfo {
   udc: Array<{ name: string }>
   extensions: {
     ttyd_available: boolean
-    rustdesk_available: boolean
   }
 }
@@ -150,7 +147,6 @@
   udc: [],
   extensions: {
     ttyd_available: false,
-    rustdesk_available: true,
   },
 })
@@ -351,7 +347,6 @@ onMounted(async () => {
     // Set extension availability from devices API
     if (result.extensions) {
      ttydAvailable.value = result.extensions.ttyd_available
-      rustdeskAvailable.value = result.extensions.rustdesk_available
    }
  } catch {
    // Use defaults
@@ -506,7 +501,6 @@ async function handleSetup() {
   // Extension settings
   setupData.ttyd_enabled = ttydEnabled.value
-  setupData.rustdesk_enabled = rustdeskEnabled.value
   const success = await authStore.setup(setupData)
@@ -956,19 +950,6 @@ const stepIcons = [User, Video, Keyboard, Puzzle]
             <Switch v-model="ttydEnabled" :disabled="!ttydAvailable" />
           </div>
-          <!-- RustDesk -->
-          <div class="flex items-center justify-between p-4 rounded-lg border">
-            <div class="space-y-1">
-              <div class="flex items-center gap-2">
-                <Label class="text-base font-medium">{{ t('setup.rustdeskTitle') }}</Label>
-              </div>
-              <p class="text-sm text-muted-foreground">
-                {{ t('setup.rustdeskDescription') }}
-              </p>
-            </div>
-            <Switch v-model="rustdeskEnabled" />
-          </div>
           <p class="text-xs text-muted-foreground text-center pt-2">
             {{ t('setup.extensionsHint') }}
           </p>