How to Record a Video with Audio in the Browser with JavaScript (WebRTC)

A couple of years ago, Flash was required in the browser if you wanted to interact with the user's media devices (camera and microphone). Today, thanks to the constant development of JavaScript APIs, WebRTC has replaced the now obsolete Flash, so you can record video directly with the getUserMedia API. In this article, we'll show you two ways to record a video (and audio) from the user's webcam using JavaScript. Both of the options covered here rely on the open source library RecordRTC, written and maintained by @muaz-khan. To learn more about this library, please visit the official repository on GitHub here or check out the official RecordRTC demo here.

Both approaches end up generating a Blob in the browser that contains the recorded video and audio. We will mainly cover how to record the video on the client side, along with a small example of how you could upload the Blob to your server using PHP; we won't go deep into the server side logic in this article.

Note

Both methods produce a video in the WebM format, so if you need the video in another format you will have to convert it on the server with a tool like FFmpeg (a small sketch of that step is shown at the end of this article).

Having said that, let's get started !

A. Using VideoJS Record

Note

If you don't want to deal with a lot of code yourself, configuring audio filters, bitrates and probably many other things that aren't of interest to you, this is the easiest way to implement a video recorder with JavaScript.

The first option you have to record a video in the browser easily is the VideoJS Record library. This library, maintained by @collab-project, uses three other libraries to build an awesome and very robust video recorder while taking care of the user experience at the same time. If you want to implement camera recording with as little friction as possible, this plugin is exactly what you need.

Start by including Video.js on your page; VideoJS Record runs on top of this player instead of a plain video tag. Then include VideoJS Record, which also needs a copy of the previously mentioned RecordRTC library, and proceed with the initialization. The following snippet shows a basic example of VideoJS Record capturing video and audio simultaneously:

Note

Video.js and VideoJS Record are two different libraries. Video.js is a web video player built from the ground up for an HTML5 world. VideoJS Record is a plugin for Video.js that allows you to record the user's camera with the help of RecordRTC.

<!doctype html>
<html>
  <head>
    <meta charset="utf-8">
    <title>Audio/Video Example - Record Plugin for Video.js</title>

    <!-- Include Video.js stylesheet (https://videojs.com/) -->
    <link href="../node_modules/video.js/dist/video-js.min.css" rel="stylesheet">
    <!-- Style of VideoJS Record -->
    <link href="../dist/css/videojs.record.css" rel="stylesheet">

    <style>
      /* change player background color */
      #myVideo {
        background-color: #9ab87a;
      }
    </style>
  </head>
  <body>
    <!-- Create the preview video element -->
    <video id="myVideo"></video>

    <!-- Load video.js -->
    <script src="../node_modules/video.js/dist/video.min.js"></script>
    <!-- Load RecordRTC core and adapter -->
    <script src="../node_modules/recordrtc/RecordRTC.js"></script>
    <script src="../node_modules/webrtc-adapter/out/adapter.js"></script>
    <!-- Load VideoJS Record Extension -->
    <script src="../dist/videojs.record.js"></script>

    <script>
      var videoMaxLengthInSeconds = 120;

      // Initialize the video player
      var player = videojs("myVideo", {
        controls: true,
        width: 720,
        height: 480,
        fluid: false,
        plugins: {
          record: {
            audio: true,
            video: true,
            maxLength: videoMaxLengthInSeconds,
            debug: true,
            videoMimeType: "video/webm;codecs=H264"
          }
        }
      }, function() {
        // print version information at startup
        videojs.log(
          'Using video.js', videojs.VERSION,
          'with videojs-record', videojs.getPluginVersion('record'),
          'and recordrtc', RecordRTC.version
        );
      });

      // error handling for getUserMedia
      player.on('deviceError', function() {
        console.log('device error:', player.deviceErrorCode);
      });

      // Handle error events of the video player
      player.on('error', function(error) {
        console.log('error:', error);
      });

      // user clicked the record button and started recording !
      player.on('startRecord', function() {
        console.log('started recording! Do whatever you need to');
      });

      // user completed recording and stream is available
      // Upload the Blob to your server or download it locally !
      player.on('finishRecord', function() {
        // the blob object contains the recorded data that
        // can be downloaded by the user, stored on server etc.
        var videoBlob = player.recordedData.video;
        console.log('finished recording: ', videoBlob);
      });
    </script>
  </body>
</html>

The recorder logic is pretty simple: Video.js initializes a dynamic video player, then the VideoJS Record plugin extends that player so it can record, with the help of RecordRTC, the stream generated by the user's camera and microphone.
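If you only want to let the user download the recording locally instead of sending it anywhere, you can do that from the same finishRecord event. The following is a minimal sketch; the my-recording.webm file name is just an example:

// user completed recording and the Blob is available
player.on('finishRecord', function() {
    // In this example player.recordedData.video holds the recorded Blob
    var videoBlob = player.recordedData.video;

    // Create a temporary link that downloads the recording as a .webm file
    var downloadLink = document.createElement('a');
    downloadLink.href = URL.createObjectURL(videoBlob);
    downloadLink.download = 'my-recording.webm'; // example file name
    document.body.appendChild(downloadLink);
    downloadLink.click();
    document.body.removeChild(downloadLink);
});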

Note

If your user can't afford a decent camera, don't expect 4K videos :).

You can see a live demo of how to record a video with audio using VideoJS Record here. For more information about this library, please visit the official repository on GitHub here.

B. Using RecordRTC

If you would rather not use the first option because you find it a bit heavy (it pulls in three libraries), you are free to implement the recorder with the "raw" version of RecordRTC. The logic is the same as with the previous library: the user needs to grant access to the camera and microphone through the getUserMedia API, and with that stream RecordRTC can start recording the video. As previously mentioned, you will need the RecordRTC script and the WebRTC adapter script from their official GitHub repositories; the adapter provides cross-browser support for getUserMedia and the other browser APIs used by the plugin.
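Before initializing anything, it is worth checking that the browser actually exposes getUserMedia. A quick check could look like the following sketch (keep in mind that modern browsers only expose the API on secure origins, i.e. HTTPS or localhost):

// Rough feature detection before wiring up the recorder UI
if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
    console.log('getUserMedia is available, the recorder can be initialized');
} else {
    // Very old browsers, or pages served over plain HTTP, won't expose mediaDevices
    console.warn('getUserMedia is not supported in this browser/context');
}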

Final example

The following example shows how to implement a basic start/stop recorder using RecordRTC (the promise-based version):

<!-- 1. Include action buttons start/stop -->
<button id="btn-start-recording">Start Recording</button>
<button id="btn-stop-recording" disabled="disabled">Stop Recording</button>

<!-- 2. Include a video element that will display the current video stream
     and show the recorded video at the end. -->
<hr>
<video id="my-preview" controls autoplay></video>

<!-- 3. Include the RecordRTC library and the latest adapter.
     Note that you may want to host these scripts on your own server -->
<script src="https://cdn.webrtc-experiment.com/RecordRTC.js"></script>
<script src="https://webrtc.github.io/adapter/adapter-latest.js"></script>

<!-- 4. Initialize and prepare the video recorder logic -->
<script>
    // Store a reference of the preview video element and a global reference to the recorder instance
    var video = document.getElementById('my-preview');
    var recorder;

    // When the user clicks on start video recording
    document.getElementById('btn-start-recording').addEventListener("click", function() {
        // Disable start recording button
        this.disabled = true;

        // Request access to the media devices
        navigator.mediaDevices.getUserMedia({
            audio: true,
            video: true
        }).then(function(stream) {
            // Display a live preview on the video element of the page
            setSrcObject(stream, video);

            // Start to display the preview on the video element
            // and mute the video to avoid the echo issue !
            video.play();
            video.muted = true;

            // Initialize the recorder
            recorder = new RecordRTCPromisesHandler(stream, {
                mimeType: 'video/webm',
                bitsPerSecond: 128000
            });

            // Start recording the video
            recorder.startRecording().then(function() {
                console.info('Recording video ...');
            }).catch(function(error) {
                console.error('Cannot start video recording: ', error);
            });

            // keep a reference to release the stream on stopRecording
            recorder.stream = stream;

            // Enable stop recording button
            document.getElementById('btn-stop-recording').disabled = false;
        }).catch(function(error) {
            console.error("Cannot access media devices: ", error);
        });
    }, false);

    // When the user clicks on Stop video recording
    document.getElementById('btn-stop-recording').addEventListener("click", function() {
        this.disabled = true;

        recorder.stopRecording().then(function() {
            console.info('stopRecording success');

            // Retrieve the recorded video as a Blob (the promise-based
            // handler returns it asynchronously)
            return recorder.getBlob();
        }).then(function(videoBlob) {
            // Display the recorded video in the preview element
            video.srcObject = null;
            video.src = URL.createObjectURL(videoBlob);
            video.play();

            // Unmute video on preview
            video.muted = false;

            // Stop the device streaming by releasing every track
            recorder.stream.getTracks().forEach(function(track) {
                track.stop();
            });

            // Enable record button again !
            document.getElementById('btn-start-recording').disabled = false;
        }).catch(function(error) {
            console.error('stopRecording failure', error);
        });
    }, false);
</script>


RecordRTC is the Holy Grail when it comes to recording video in the browser with JavaScript; however, although some things are easy to configure, others can be a little complicated to understand and implement. This library is used by many others, which are basically wrappers with predefined settings that usually work in every browser (like VideoJS Record). To learn more about RecordRTC, please visit the official repository on GitHub.
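As an illustration of those extra knobs, RecordRTC also accepts options such as timeSlice and ondataavailable, which let you receive intermediate chunks while the recording is still running. The following is a rough sketch that assumes you already obtained a stream through getUserMedia as in the previous example:

// Sketch: receive intermediate chunks while recording
// (assumes `stream` was obtained through getUserMedia as shown above)
var chunkedRecorder = RecordRTC(stream, {
    type: 'video',
    mimeType: 'video/webm',
    bitsPerSecond: 128000,
    disableLogs: true,       // silence RecordRTC's own console output
    timeSlice: 1000,         // emit a chunk roughly every second
    ondataavailable: function(blob) {
        // Each intermediate chunk could, for example, be pushed to the server as it arrives
        console.log('received chunk of', blob.size, 'bytes');
    }
});

chunkedRecorder.startRecording();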

Saving the blob video in your server

Both of the mentioned solutions produce a Blob that contains the video; in our code this Blob is named videoBlob, and you will need to send it to your server in order to save it as a video file. You can easily upload a Blob via JavaScript using the FormData API. For example, with the VideoJS Record approach you could upload the Blob as follows:

// user completed recording and stream is available
player.on('finishRecord', function() {
    // the blob object contains the recorded data that
    // can be downloaded by the user, stored on server etc.
    console.log('finished recording: ', player.recordedData);

    // Create an instance of FormData and append the video parameter that
    // will be interpreted in the server as a file
    var formData = new FormData();
    formData.append('video', player.recordedData.video);

    // Execute the ajax request, in this case we have a very simple PHP script
    // that accepts and saves the uploaded "video" file
    xhr('./upload-video.php', formData, function (fName) {
        console.log("Video successfully uploaded !");
    });

    // Helper function to send the request
    function xhr(url, data, callback) {
        var request = new XMLHttpRequest();
        request.onreadystatechange = function () {
            if (request.readyState == 4 && request.status == 200) {
                callback(location.href + request.responseText);
            }
        };
        request.open('POST', url);
        request.send(data);
    }
});
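If you prefer the Fetch API over XMLHttpRequest, the same upload could be written as follows; this is just an alternative sketch that targets the same upload-video.php script:

// Alternative upload using the Fetch API instead of XMLHttpRequest
var formData = new FormData();
formData.append('video', player.recordedData.video);

fetch('./upload-video.php', {
    method: 'POST',
    body: formData
}).then(function(response) {
    if (!response.ok) {
        throw new Error('Upload failed with status ' + response.status);
    }
    console.log('Video successfully uploaded !');
}).catch(function(error) {
    console.error('Could not upload the video: ', error);
});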

The logic on the server is totally up to you; you only need to accept the file and retrieve it under the same name as the uploaded parameter. In our script we sent the Blob under the name "video", so with PHP (upload-video.php) the server logic could be as simple as:

<?php

if (isset($_FILES["video"])) {
    // Define a name for the file
    $fileName = "myvideo.webm";

    // In this case the current directory of the PHP script
    $uploadDirectory = './' . $fileName;

    // Move the file to your server
    if (!move_uploaded_file($_FILES["video"]["tmp_name"], $uploadDirectory)) {
        echo("Couldn't upload video !");
    }
} else {
    echo "No file uploaded";
}
?>

This checks whether a file was uploaded in the "video" parameter and writes it to your server, in this case to the current directory of the PHP script, creating a file named myvideo.webm with the content recorded on the client side.
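Finally, if you need the recording in a format other than WebM (as mentioned in the note at the beginning), the conversion on the server boils down to a single FFmpeg call. The following is a minimal sketch that assumes a Node.js helper and the ffmpeg binary installed on the server; the file names are only placeholders, and with the PHP setup above you could run the equivalent command through shell_exec instead:

// Sketch: convert the uploaded WebM file to MP4 with FFmpeg (Node.js assumed)
var execFile = require('child_process').execFile;

execFile('ffmpeg', ['-i', 'myvideo.webm', 'myvideo.mp4'], function (error) {
    if (error) {
        console.error('Conversion failed: ', error);
        return;
    }
    console.log('myvideo.webm was converted to myvideo.mp4');
});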
