A Node.js video streaming server that uses HTTP Live Streaming (HLS) to deliver video content. It includes a processing script to transcode video into multiple resolutions and generate subtitles using local AI.
Read the full technical deep-dive on Medium — Learn about the "Level 1 to Level 4" evolution of this project, from a naive static server to an AI-powered autonomous pipeline.
- Adaptive Streaming: Converts video to HLS (`.m3u8`) format for smooth playback.
- Multi-Resolution: Automatically transcodes video to 360p, 720p, and 1080p.
- AI-Powered Subtitles: Uses the Whisper model (via `@xenova/transformers`) to generate VTT subtitles locally.
- Thumbnail Previews: Generates a thumbnail sprite and VTT file for interactive timeline previews.
- Custom Player: Includes a simple frontend player built with `hls.js` to test the stream, with quality switching and subtitle controls.
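A minimal player page might look like the sketch below. This is an illustrative snippet, not the project's actual `public/index.html`: the CDN URL is one common way to load `hls.js`, and the playlist path assumes the `/hls/master.m3u8` endpoint described later in this README.

```html
<video id="player" controls></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@1"></script>
<script>
  const video = document.getElementById('player');
  if (Hls.isSupported()) {
    // Most browsers need hls.js to demux HLS via Media Source Extensions.
    const hls = new Hls();
    hls.loadSource('/hls/master.m3u8');
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari plays HLS natively, no hls.js required.
    video.src = '/hls/master.m3u8';
  }
</script>
```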
```mermaid
graph TD
    A[storage/video.mp4] -->|videoProcessor.js| B{Processing Services};
    B --> C[aiService.js] --> D[subtitles.vtt];
    B --> E[ffmpegService.js] --> F[HLS Segments & Playlists];
    B --> G[ffmpegService.js] --> H[thumbnails.jpg];
    B --> I[thumbnailService.js] --> J[metadata.vtt];
    subgraph "HTTP Server"
        K[Node.js]
    end
    D --> K;
    F --> K;
    H --> K;
    J --> K;
    K --> L[Client Browser];
```
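The diagram's fan-out can be expressed as an async pipeline. The sketch below uses stubbed functions standing in for the real `aiService.js`, `ffmpegService.js`, and `thumbnailService.js` (the function names here are assumptions, not the project's actual exports); it only illustrates how independent steps can run concurrently while the thumbnail VTT waits on the sprite.

```javascript
// Stubs standing in for the real services; each returns a label for its output.
const generateSubtitles = async (input) => `${input} -> subtitles.vtt`;
const transcodeToHls = async (input) => `${input} -> HLS segments`;
const buildThumbnailSprite = async (input) => `${input} -> thumbnails.jpg`;
const buildThumbnailVtt = async (input) => `${input} -> metadata.vtt`;

async function processVideo(input) {
  // Subtitles, transcoding, and the sprite are independent, so run them
  // concurrently; the VTT metadata describes the sprite, so it runs after.
  const [subtitles, hls, sprite] = await Promise.all([
    generateSubtitles(input),
    transcodeToHls(input),
    buildThumbnailSprite(input),
  ]);
  const vtt = await buildThumbnailVtt(input);
  return { subtitles, hls, sprite, vtt };
}

processVideo('storage/video.mp4').then((out) => console.log(out));
```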
- Backend: Node.js, `fluent-ffmpeg`, and `@xenova/transformers` for AI.
- Frontend: HTML5 Video, `hls.js`.
- Video Processing: FFmpeg.
Important: You must have FFmpeg installed on your system for video processing to work.
- Node.js: v18 or higher.
- FFmpeg:
  - macOS: `brew install ffmpeg`
  - Linux: `sudo apt install ffmpeg`
  - Windows: Download FFmpeg and add it to your System PATH.
Clone the repository and install the dependencies:

```bash
git clone https://github.com/Taophycc/Video-streaming-server.git
cd Video-streaming-server
pnpm install
```

Place your raw video file in the `storage` directory. By default, the script looks for `storage/video.mp4`.
Run the processing script. This will transcode the video, generate HLS segments, create AI subtitles, and produce thumbnail sprites.
Note: This process may take time as it runs an AI model locally.
```bash
node src/services/videoProcessor.js
```

Start the streaming server:

```bash
npm run dev
```

The server will start at http://localhost:3000.
- `GET /`: Serves the main video player dashboard (`public/index.html`).
- `GET /hls/master.m3u8`: The master HLS playlist.
- `GET /hls/{resolution}/playlist.m3u8`: Playlist for a specific resolution (e.g., `360p`, `720p`).
- `GET /hls/{resolution}/segment_{id}.ts`: Video segments for a specific resolution.
- `GET /hls/subtitles.vtt`: AI-generated subtitles.
- `GET /hls/thumbnails.jpg`: Thumbnail sprite image.
- `GET /hls/metadata.vtt`: Thumbnail metadata for timeline previews.
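The master playlist ties the per-resolution playlists together. A hedged illustration of what it could contain (the `BANDWIDTH` values and resolutions here are examples, not the figures this project's FFmpeg step actually emits):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/playlist.m3u8
```

The player reads this file once and then switches between the listed variants as network conditions change.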
```
.
├── public
│   ├── index.html              # Frontend player
│   └── demo.png
├── src
│   ├── config
│   │   └── cors.js             # CORS configuration
│   ├── controllers
│   │   ├── mediaController.js  # Serves HLS files
│   │   └── staticController.js # Serves the dashboard
│   ├── services
│   │   ├── aiService.js        # Whisper AI subtitle generation
│   │   ├── ffmpegService.js    # FFmpeg transcoding and thumbnail sprite generation
│   │   ├── thumbnailService.js # VTT metadata for thumbnails
│   │   └── videoProcessor.js   # Main processing pipeline
│   ├── router.js               # Main request router
│   └── server.js               # HTTP server setup
├── storage
│   ├── video.mp4               # Input source file (Git ignored)
│   └── hls                     # Output: .m3u8, .ts, .vtt files
├── package.json
└── README.md
```
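For reference, the `metadata.vtt` that `thumbnailService.js` produces maps timeline ranges to regions of the sprite image using the `#xywh` media-fragment syntax that `hls.js`-style players understand. A sketch (the 10-second interval and 160x90 tile size are assumptions for illustration):

```
WEBVTT

00:00:00.000 --> 00:00:10.000
thumbnails.jpg#xywh=0,0,160,90

00:00:10.000 --> 00:00:20.000
thumbnails.jpg#xywh=160,0,160,90
```

Each cue tells the player which crop of `thumbnails.jpg` to show when the user hovers over that part of the seek bar.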
Contributions are welcome! Please feel free to submit a pull request.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Distributed under the ISC License. See LICENCE for more information.
