I want to record an audio stream from my Angular web application to my ASP.NET Core API.
I think SignalR and its WebSockets are a good way to achieve that.
With this TypeScript code, I can get a MediaStream:
import { HubConnection } from '@aspnet/signalr';
[...]
private stream: MediaStream;
private connection: RTCPeerConnection;
@ViewChild('video') video;
[...]
navigator.mediaDevices.getUserMedia({ audio: true })
  .then(stream => {
    console.trace('Received local stream');
    this.video.srcObject = stream;
    this.stream = stream;
    const hubConnection = new HubConnection('[MY_API_URL]/webrtc');
    hubConnection.send('SendStream', stream);
  })
  .catch(e => {
    console.error('getUserMedia() error: ' + e.message);
  });
I handle the stream in my .NET Core API with:
public class MyHub : Hub
{
    public void SendStream(object o)
    {
    }
}
But when I cast o to System.IO.Stream, I get null.
While reading the WebRTC documentation, I came across RTCPeerConnection and IceConnection... Do I need those?
How can I stream audio from a web client to an ASP.NET Core API using SignalR? Any documentation or GitHub examples?
Thanks for your help.
I found a way to access the microphone stream and transmit it to the server; here is the code:
private audioCtx: AudioContext;
private stream: MediaStream;

convertFloat32ToInt16(buffer: Float32Array) {
  let l = buffer.length;
  const buf = new Int16Array(l);
  while (l--) {
    // Clamp to [-1, 1] before scaling, so out-of-range samples cannot overflow
    buf[l] = Math.max(-1, Math.min(1, buffer[l])) * 0x7FFF;
  }
  return buf.buffer;
}
startRecording() {
  navigator.mediaDevices.getUserMedia({ audio: true })
    .then(stream => {
      this.audioCtx = new AudioContext();
      this.audioCtx.onstatechange = (state) => { console.log(state); };
      const scriptNode = this.audioCtx.createScriptProcessor(4096, 1, 1);
      scriptNode.onaudioprocess = (audioProcessingEvent) => {
        const inputBuffer = audioProcessingEvent.inputBuffer;
        // Loop through the input channels (in this case there is only one)
        for (let channel = 0; channel < inputBuffer.numberOfChannels; channel++) {
          const chunk = inputBuffer.getChannelData(channel);
          // Convert to 16-bit PCM before sending, because endianness and sample format matter
          this.MySignalRService.send('SendStream', this.convertFloat32ToInt16(chunk));
        }
      };
      const source = this.audioCtx.createMediaStreamSource(stream);
      source.connect(scriptNode);
      scriptNode.connect(this.audioCtx.destination);
      this.stream = stream;
    })
    .catch(e => {
      console.error('getUserMedia() error: ' + e.message);
    });
}
stopRecording() {
  try {
    const stream = this.stream;
    stream.getAudioTracks().forEach(track => track.stop());
    stream.getVideoTracks().forEach(track => track.stop());
    this.audioCtx.close();
  }
  catch (error) {
    console.error('stopRecording() error: ' + error);
  }
}
The next step is to convert my Int16Array chunks into a WAV file.
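For that step, a WAV file is essentially a 44-byte RIFF header followed by the raw PCM samples. A minimal sketch for 16-bit mono PCM (`pcm16ToWav` is a hypothetical helper name, not part of the code above; the sample rate passed in must match the AudioContext's actual rate, typically 44100 or 48000):

```typescript
// Wrap raw 16-bit mono PCM samples in a standard 44-byte WAV (RIFF) header.
function pcm16ToWav(pcm: Int16Array, sampleRate: number): ArrayBuffer {
  const numChannels = 1;
  const byteRate = sampleRate * numChannels * 2;
  const dataSize = pcm.length * 2;
  const buffer = new ArrayBuffer(44 + dataSize);
  const view = new DataView(buffer);
  const writeStr = (offset: number, s: string) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };
  writeStr(0, 'RIFF');
  view.setUint32(4, 36 + dataSize, true);    // RIFF chunk size
  writeStr(8, 'WAVE');
  writeStr(12, 'fmt ');
  view.setUint32(16, 16, true);              // fmt chunk size
  view.setUint16(20, 1, true);               // audio format: PCM
  view.setUint16(22, numChannels, true);
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, byteRate, true);
  view.setUint16(32, numChannels * 2, true); // block align
  view.setUint16(34, 16, true);              // bits per sample
  writeStr(36, 'data');
  view.setUint32(40, dataSize, true);
  // WAV stores PCM little-endian, hence the explicit `true` flag throughout
  for (let i = 0; i < pcm.length; i++) view.setInt16(44 + i * 2, pcm[i], true);
  return buffer;
}
```

In the browser the resulting ArrayBuffer can be wrapped in a `Blob` with type `audio/wav` and downloaded or played; on the server side the same header can equally be prepended to the accumulated chunks instead.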
Sources that helped me:
- https://subvisual.co/blog/posts/39-tutorial-html-audio-capture-streaming-to-node-js-no-browser-extensions/
- https://medium.com/@yushulx/learning-how-to-capture-and-record-audio-in-html5-6fe68a769bf9
Note:
I did not include the SignalR configuration code, as that is not the purpose here.