Open SelimEmre opened 3 years ago
Hi @SelimEmre,
Have you found any resolution on this?
Can you try enabling echo cancellation? It's disabled by default.
this.getIntent().putExtra(EXTRA_AECDUMP_ENABLED, true);
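Something like this, assuming you set it in the Activity that starts the call, before the WebRTC client reads the intent (the onCreate() placement is just for illustration):

```java
// Hedged sketch: the extra must be set before the client parses the intent.
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    getIntent().putExtra(EXTRA_AECDUMP_ENABLED, true);
    // ... then create/initialize the WebRTC client or ConferenceManager
}
```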
Hi @mekya, thank you for the reply. I will try this and get back to you if I'm still facing the same issue.
Thanks again.
Hi @mekya
We have to make this change in WebRTCClient.java, since we are using ConferenceManager to join the conference room, correct?
peerConnectionParameters = new PeerConnectionClient.PeerConnectionParameters(
    videoCallEnabled, loopback, tracing, videoWidth, videoHeight,
    intent.getIntExtra(CallActivity.EXTRA_VIDEO_FPS, 0), videoStartBitrate, videoCodec,
    intent.getBooleanExtra(CallActivity.EXTRA_HWCODEC_ENABLED, true),
    intent.getBooleanExtra(CallActivity.EXTRA_FLEXFEC_ENABLED, false),
    audioStartBitrate, intent.getStringExtra(CallActivity.EXTRA_AUDIOCODEC),
    intent.getBooleanExtra(CallActivity.EXTRA_NOAUDIOPROCESSING_ENABLED, false),
    intent.getBooleanExtra(CallActivity.EXTRA_AECDUMP_ENABLED, false),
    intent.getBooleanExtra(CallActivity.EXTRA_SAVE_INPUT_AUDIO_TO_FILE_ENABLED, false),
    intent.getBooleanExtra(CallActivity.EXTRA_OPENSLES_ENABLED, false),
    intent.getBooleanExtra(CallActivity.EXTRA_DISABLE_BUILT_IN_AEC, false),
    intent.getBooleanExtra(CallActivity.EXTRA_DISABLE_BUILT_IN_AGC, false),
    intent.getBooleanExtra(CallActivity.EXTRA_DISABLE_BUILT_IN_NS, false),
    intent.getBooleanExtra(CallActivity.EXTRA_DISABLE_WEBRTC_AGC_AND_HPF, false),
    intent.getBooleanExtra(CallActivity.EXTRA_ENABLE_RTCEVENTLOG, false),
    dataChannelParameters, audioCallEnabled);

// changed from false to true to enable echo cancellation (testing):
intent.getBooleanExtra(CallActivity.EXTRA_AECDUMP_ENABLED, true)
Please confirm.
Hi @thinkgopal ,
Yeah please try this one and let us know if it works for you.
Hi @mekya ,
I tried this, but it's not working. We are still getting the echo.
Thank you for trying.
I understand you can reproduce the problem any time you want. In the issue report, it says:
"Sound echo issue not happening every time."
Could you provide some more information to let us reproduce the problem?
Hi @mekya
Join the conference room using the loudspeaker and test with a minimum of 3 to 4 users. All of them should use the loudspeaker instead of earphones.
https://drive.google.com/file/d/1CRmxL1oa293KEh6bCZgoANkAzbWMiGwE/view?usp=sharing
Ok. Moving to the backlog to schedule.
Hi @mekya,
We are able to reproduce the bug every time on all Samsung devices, on the Pixel 2 XL, as well as on the Xiaomi Mi 8. Please help us fix this as soon as possible; it's a very critical issue and we are not able to go live.
Earlier code snippet (not working, getting the echo issue), PeerConnectionClient.java:

return JavaAudioDeviceModule.builder(appContext)
    .setSamplesReadyCallback(saveRecordedAudioToFile)
    .setUseHardwareAcousticEchoCanceler(!peerConnectionParameters.disableBuiltInAEC)
    .setUseHardwareNoiseSuppressor(!peerConnectionParameters.disableBuiltInNS)
    .setAudioRecordErrorCallback(audioRecordErrorCallback)
    .setAudioTrackErrorCallback(audioTrackErrorCallback)
    .setAudioRecordStateCallback(audioRecordStateCallback)
    .setAudioTrackStateCallback(audioTrackStateCallback)
    .createAudioDeviceModule();

@Nullable
private AudioTrack createAudioTrack() {
  if (localAudioTrack == null) {
    //changes, 25th apr 21 - start, added to reduce noise and echo
    /* ... to a Room. */ // (comment truncated in the paste)
    audioSource = factory.createAudioSource(audioConstraints);
    localAudioTrack = factory.createAudioTrack(AUDIO_TRACK_ID, audioSource);
    localAudioTrack.setEnabled(enableAudio);
  }
  return localAudioTrack;
}
Current code snippet (not working, still getting the echo issue), PeerConnectionClient.java:

/* (file header comment truncated in the paste) */
package io.antmedia.webrtcandroidframework.apprtc;
import android.content.Context; import android.os.Environment; import android.os.ParcelFileDescriptor; import android.util.Log;
import androidx.annotation.Nullable;
import org.webrtc.AudioSource; import org.webrtc.AudioTrack; import org.webrtc.CameraVideoCapturer; import org.webrtc.CandidatePairChangeEvent; import org.webrtc.DataChannel; import org.webrtc.DefaultVideoDecoderFactory; import org.webrtc.DefaultVideoEncoderFactory; import org.webrtc.EglBase; import org.webrtc.IceCandidate; import org.webrtc.Logging; import org.webrtc.MediaConstraints; import org.webrtc.MediaStream; import org.webrtc.MediaStreamTrack; import org.webrtc.PeerConnection; import org.webrtc.PeerConnection.IceConnectionState; import org.webrtc.PeerConnection.PeerConnectionState; import org.webrtc.PeerConnectionFactory; import org.webrtc.RtpParameters; import org.webrtc.RtpReceiver; import org.webrtc.RtpSender; import org.webrtc.RtpTransceiver; import org.webrtc.SdpObserver; import org.webrtc.SessionDescription; import org.webrtc.SoftwareVideoDecoderFactory; import org.webrtc.SoftwareVideoEncoderFactory; import org.webrtc.StatsObserver; import org.webrtc.StatsReport; import org.webrtc.SurfaceTextureHelper; import org.webrtc.VideoCapturer; import org.webrtc.VideoDecoderFactory; import org.webrtc.VideoEncoderFactory; import org.webrtc.VideoSink; import org.webrtc.VideoSource; import org.webrtc.VideoTrack; import org.webrtc.audio.AudioDeviceModule; import org.webrtc.audio.JavaAudioDeviceModule; import org.webrtc.audio.JavaAudioDeviceModule.AudioRecordErrorCallback; import org.webrtc.audio.JavaAudioDeviceModule.AudioRecordStateCallback; import org.webrtc.audio.JavaAudioDeviceModule.AudioTrackErrorCallback; import org.webrtc.audio.JavaAudioDeviceModule.AudioTrackStateCallback; import org.webrtc.voiceengine.WebRtcAudioUtils;
import java.io.File; import java.io.IOException; import java.text.DateFormat; import java.text.SimpleDateFormat; import java.util.ArrayList; import java.util.Arrays; import java.util.Collections; import java.util.Date; import java.util.Iterator; import java.util.List; import java.util.Locale; import java.util.Timer; import java.util.TimerTask; import java.util.concurrent.ExecutorService; import java.util.concurrent.Executors; import java.util.regex.Matcher; import java.util.regex.Pattern;
import io.antmedia.webrtcandroidframework.IDataChannelObserver; import io.antmedia.webrtcandroidframework.apprtc.AppRTCClient.SignalingParameters;
/**
 * This class is a singleton.
 */
public class PeerConnectionClient implements IDataChannelMessageSender {
  public static final String VIDEO_TRACK_ID = "ARDAMSv0";
  public static final String AUDIO_TRACK_ID = "ARDAMSa0";
  public static final String VIDEO_TRACK_TYPE = "video";
  private static final String TAG = "PCRTCClient:::";
  private static final String VIDEO_CODEC_VP8 = "VP8";
  private static final String VIDEO_CODEC_VP9 = "VP9";
  private static final String VIDEO_CODEC_H264 = "H264";
  private static final String VIDEO_CODEC_H264_BASELINE = "H264 Baseline";
  private static final String VIDEO_CODEC_H264_HIGH = "H264 High";
  private static final String AUDIO_CODEC_OPUS = "opus";
  private static final String AUDIO_CODEC_ISAC = "ISAC";
  private static final String VIDEO_CODEC_PARAM_START_BITRATE = "x-google-start-bitrate";
  private static final String VIDEO_FLEXFEC_FIELDTRIAL = "WebRTC-FlexFEC-03-Advertised/Enabled/WebRTC-FlexFEC-03/Enabled/";
  private static final String VIDEO_VP8_INTEL_HW_ENCODER_FIELDTRIAL = "WebRTC-IntelVP8/Enabled/";
  private static final String DISABLE_WEBRTC_AGC_FIELDTRIAL = "WebRTC-Audio-MinimizeResamplingOnMobile/Enabled/";
  private static final String AUDIO_CODEC_PARAM_BITRATE = "maxaveragebitrate";
  private static final String AUDIO_ECHO_CANCELLATION_CONSTRAINT = "googEchoCancellation";
  private static final String AUDIO_AUTO_GAIN_CONTROL_CONSTRAINT = "googAutoGainControl";
  private static final String AUDIO_HIGH_PASS_FILTER_CONSTRAINT = "googHighpassFilter";
  private static final String AUDIO_NOISE_SUPPRESSION_CONSTRAINT = "googNoiseSuppression";
  private static final String DTLS_SRTP_KEY_AGREEMENT_CONSTRAINT = "DtlsSrtpKeyAgreement";
  private static final int HD_VIDEO_WIDTH = 1280;
  private static final int HD_VIDEO_HEIGHT = 720;
  // private static final int BPS_IN_KBPS = 1000; //changes, commented on 27th apr 21 to change to 800kbps
  private static final int BPS_IN_KBPS = 800;
  private static final String RTCEVENTLOG_OUTPUT_DIR_NAME = "rtc_event_log";
// Executor thread is started once in private ctor and is used for all
// peer connection API calls to ensure new peer connection factory is
// created on the same thread as previously destroyed factory.
private static final ExecutorService executor = Executors.newSingleThreadExecutor();
private final PCObserver pcObserver = new PCObserver(); private final SDPObserver sdpObserver = new SDPObserver(); private final Timer statsTimer = new Timer(); private final EglBase rootEglBase; private final Context appContext; private final PeerConnectionParameters peerConnectionParameters; private final PeerConnectionEvents events;
@Nullable
private PeerConnectionFactory factory;
@Nullable
private PeerConnection peerConnection;
@Nullable
private AudioSource audioSource;
@Nullable private SurfaceTextureHelper surfaceTextureHelper;
@Nullable private VideoSource videoSource;
private boolean preferIsac;
private boolean videoCapturerStopped;
private boolean isError;
@Nullable
private VideoSink localRender;
@Nullable
private List<VideoSink> remoteSinks; // (type argument restored from usage; further field declarations are truncated in the paste)
@Nullable IDataChannelObserver dataChannelObserver;
private IceConnectionState iceConnectionState; private boolean isWebRtcAECSupported;
//changes, added on 16th apr 21
public IceConnectionState getIceConnectionState() {
  return iceConnectionState;
}
final DataChannel.Observer dataChannelInternalObserver= new DataChannel.Observer() { @Override public void onBufferedAmountChange(long previousAmount) { if(dataChannelObserver == null) return; Log.d(TAG, "Data channel buffered amount changed: " + dataChannel.label() + ": " + dataChannel.state()); dataChannelObserver.onBufferedAmountChange(previousAmount, dataChannel.label()); }
@Override public void onStateChange() { if(dataChannelObserver == null) return; Log.d(TAG, "Data channel state changed: " + dataChannel.label() + ": " + dataChannel.state()); dataChannelObserver.onStateChange(dataChannel.state(), dataChannel.label()); }
@Override public void onMessage(final DataChannel.Buffer buffer) { if(dataChannelObserver == null) return; Log.d(TAG, "Received Message: " + dataChannel.label() + ": " + dataChannel.state()); dataChannelObserver.onMessage(buffer,dataChannel.label()); } };
@Nullable public DataChannel getDataChannel() { return dataChannel; }
@Override public void sendMessageViaDataChannel(DataChannel.Buffer buffer) { if (dataChannel != null && dataChannel.state() == DataChannel.State.OPEN) { executor.execute(() -> { try {
boolean success = dataChannel.send(buffer);
buffer.data.rewind();
if (dataChannelObserver != null) {
if (success) {
dataChannelObserver.onMessageSent(buffer, true);
} else {
dataChannelObserver.onMessageSent(buffer, false);
reportError("Failed to send the message via Data Channel ");
}
}
} catch (Exception e) {
reportError("An error occurred when sending the message via Data Channel " + e.getMessage());
if (dataChannelObserver != null) {
buffer.data.rewind();
dataChannelObserver.onMessageSent(buffer, false);
}
}
}); } else { reportError("Data Channel is not ready for usage."); } }
public void init(VideoCapturer videoCapturer, VideoSink localRender) { this.localRender = localRender; this.videoCapturer = videoCapturer; executor.execute(() -> { createMediaConstraintsInternal(); createVideoTrack(videoCapturer); createAudioTrack(); }); }
public void setLocalVideoTrack(@javax.annotation.Nullable VideoTrack localVideoTrack) { this.localVideoTrack = localVideoTrack; }
/* (Javadoc truncated in the paste) */
public static class DataChannelParameters {
  // (field declarations truncated in the paste)
  public DataChannelParameters(boolean ordered, int maxRetransmitTimeMs, int maxRetransmits,
      String protocol, boolean negotiated, int id, String label, boolean isDataChannelCreator) {
    this.ordered = ordered;
    this.maxRetransmitTimeMs = maxRetransmitTimeMs;
    this.maxRetransmits = maxRetransmits;
    this.protocol = protocol == null ? "" : protocol;
    this.negotiated = negotiated;
    this.id = id;
    this.label = label;
    this.isDataChannelCreator = isDataChannelCreator;
  }
}
/* (Javadoc truncated in the paste) */
public static class PeerConnectionParameters {
  // (field declarations truncated in the paste)
  public PeerConnectionParameters(boolean videoCallEnabled, boolean loopback, boolean tracing,
      int videoWidth, int videoHeight, int videoFps, int videoMaxBitrate, String videoCodec,
      boolean videoCodecHwAcceleration, boolean videoFlexfecEnabled, int audioStartBitrate,
      String audioCodec, boolean noAudioProcessing, boolean aecDump, boolean saveInputAudioToFile,
      boolean useOpenSLES, boolean disableBuiltInAEC, boolean disableBuiltInAGC,
      boolean disableBuiltInNS, boolean disableWebRtcAGCAndHPF, boolean enableRtcEventLog,
      DataChannelParameters dataChannelParameters, boolean audioCallEnabled) {
    this.videoCallEnabled = videoCallEnabled;
    this.loopback = loopback;
    this.tracing = tracing;
    this.videoWidth = videoWidth;
    this.videoHeight = videoHeight;
    this.videoFps = videoFps;
    this.videoMaxBitrate = videoMaxBitrate;
    this.videoCodec = videoCodec;
    this.videoFlexfecEnabled = videoFlexfecEnabled;
    this.videoCodecHwAcceleration = videoCodecHwAcceleration;
    this.audioStartBitrate = audioStartBitrate;
    this.audioCodec = audioCodec;
    this.noAudioProcessing = noAudioProcessing;
    this.aecDump = aecDump;
    this.saveInputAudioToFile = saveInputAudioToFile;
    this.useOpenSLES = useOpenSLES;
    this.disableBuiltInAEC = disableBuiltInAEC;
    this.disableBuiltInAGC = disableBuiltInAGC;
    this.disableBuiltInNS = disableBuiltInNS;
    this.disableWebRtcAGCAndHPF = disableWebRtcAGCAndHPF;
    this.enableRtcEventLog = enableRtcEventLog;
    this.dataChannelParameters = dataChannelParameters;
    this.audioCallEnabled = audioCallEnabled;
  }
}
/* (several Javadoc comments and the intervening code are truncated in the paste) */
Log.d(TAG, "Preferred video codec: " + getSdpVideoCodecName(peerConnectionParameters));
final String fieldTrials = getFieldTrials(peerConnectionParameters); executor.execute(() -> { Log.d(TAG, "Initialize WebRTC. Field trials: " + fieldTrials); PeerConnectionFactory.initialize( PeerConnectionFactory.InitializationOptions.builder(appContext) .setFieldTrials(fieldTrials) .setEnableInternalTracer(true) .createInitializationOptions()); }); }
/* (Javadoc truncated in the paste) */
public void createPeerConnection(final VideoSink localRender, final VideoSink remoteSink, final VideoCapturer videoCapturer, final SignalingParameters signalingParameters) { if (peerConnectionParameters.videoCallEnabled && videoCapturer == null) { Log.w(TAG, "Video call enabled but no video capturer provided."); } createPeerConnection( localRender, Collections.singletonList(remoteSink), videoCapturer, signalingParameters); }
public void createPeerConnection(final VideoSink localRender, final List<VideoSink> remoteSinks,
    final VideoCapturer videoCapturer, final SignalingParameters signalingParameters) {
  // (method body truncated in the paste)
}
public void close() { executor.execute(this ::closeInternal); }
private boolean isVideoCallEnabled() { return peerConnectionParameters.videoCallEnabled && videoCapturer != null; }
private boolean isAudioEnabled() { return peerConnectionParameters.audioCallEnabled; }
private void createPeerConnectionFactoryInternal(PeerConnectionFactory.Options options) { isError = false;
if (peerConnectionParameters.tracing) {
  PeerConnectionFactory.startInternalTracingCapture(
      Environment.getExternalStorageDirectory().getAbsolutePath() + File.separator
      // (trace file name truncated in the paste)
// Check if ISAC is used by default.
preferIsac = peerConnectionParameters.audioCodec != null
    && peerConnectionParameters.audioCodec.equals(AUDIO_CODEC_ISAC);
// It is possible to save a copy in raw PCM format on a file by checking
// the "Save input audio to file" checkbox in the Settings UI. A callback
// interface is set when this flag is enabled. As a result, a copy of recorded
// audio samples are provided to this client directly from the native audio
// layer in Java.
if (peerConnectionParameters.saveInputAudioToFile) {
  if (!peerConnectionParameters.useOpenSLES) {
    Log.d(TAG, "Enable recording of microphone input audio to file");
    saveRecordedAudioToFile = new RecordedAudioToFileController(executor);
  } else {
    // TODO(henrika): ensure that the UI reflects that if OpenSL ES is selected,
    // then the "Save input audio to file" option shall be grayed out.
    Log.e(TAG, "Recording of input audio is not supported for OpenSL ES");
  }
}
final AudioDeviceModule adm = createJavaAudioDevice();
// Create peer connection factory.
if (options != null) {
  Log.d(TAG, "Factory networkIgnoreMask option: " + options.networkIgnoreMask);
}
final boolean enableH264HighProfile =
    VIDEO_CODEC_H264_HIGH.equals(peerConnectionParameters.videoCodec);
final VideoEncoderFactory encoderFactory;
final VideoDecoderFactory decoderFactory;
if (peerConnectionParameters.videoCodecHwAcceleration) {
  encoderFactory = new DefaultVideoEncoderFactory(
      rootEglBase.getEglBaseContext(), true /* enableIntelVp8Encoder */, enableH264HighProfile);
  decoderFactory = new DefaultVideoDecoderFactory(rootEglBase.getEglBaseContext());
} else {
  encoderFactory = new SoftwareVideoEncoderFactory();
  decoderFactory = new SoftwareVideoDecoderFactory();
}
factory = PeerConnectionFactory.builder() .setOptions(options) .setAudioDeviceModule(adm) .setVideoEncoderFactory(encoderFactory) .setVideoDecoderFactory(decoderFactory) .createPeerConnectionFactory(); Log.d(TAG, "Peer connection factory created."); adm.release(); }
AudioDeviceModule createJavaAudioDevice() {
  // Enable/disable OpenSL ES playback.
  if (!peerConnectionParameters.useOpenSLES) {
    Log.w(TAG, "External OpenSLES ADM not implemented yet.");
    // TODO(magjed): Add support for external OpenSLES ADM.
  }
// Set audio record error callbacks.
AudioRecordErrorCallback audioRecordErrorCallback = new AudioRecordErrorCallback() {
  @Override
  public void onWebRtcAudioRecordInitError(String errorMessage) {
    Log.e(TAG, "onWebRtcAudioRecordInitError: " + errorMessage);
    reportError(errorMessage);
  }
@Override public void onWebRtcAudioRecordStartError( JavaAudioDeviceModule.AudioRecordStartErrorCode errorCode, String errorMessage) { Log.e(TAG, "onWebRtcAudioRecordStartError: " + errorCode + ". " + errorMessage); reportError(errorMessage); }
@Override public void onWebRtcAudioRecordError(String errorMessage) { Log.e(TAG, "onWebRtcAudioRecordError: " + errorMessage); reportError(errorMessage); } };
AudioTrackErrorCallback audioTrackErrorCallback = new AudioTrackErrorCallback() { @Override public void onWebRtcAudioTrackInitError(String errorMessage) { Log.e(TAG, "onWebRtcAudioTrackInitError: " + errorMessage); reportError(errorMessage); }
@Override public void onWebRtcAudioTrackStartError( JavaAudioDeviceModule.AudioTrackStartErrorCode errorCode, String errorMessage) { Log.e(TAG, "onWebRtcAudioTrackStartError: " + errorCode + ". " + errorMessage); reportError(errorMessage); }
@Override public void onWebRtcAudioTrackError(String errorMessage) { Log.e(TAG, "onWebRtcAudioTrackError: " + errorMessage); reportError(errorMessage); } };
// Set audio record state callbacks.
AudioRecordStateCallback audioRecordStateCallback = new AudioRecordStateCallback() {
  @Override
  public void onWebRtcAudioRecordStart() {
    Log.i(TAG, "Audio recording starts");
  }
@Override public void onWebRtcAudioRecordStop() { Log.i(TAG, "Audio recording stops"); } };
// Set audio track state callbacks.
AudioTrackStateCallback audioTrackStateCallback = new AudioTrackStateCallback() {
  @Override
  public void onWebRtcAudioTrackStart() {
    Log.i(TAG, "Audio playout starts");
  }
@Override public void onWebRtcAudioTrackStop() { Log.i(TAG, "Audio playout stops"); } };
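// Note (editor comment, not in the original paste): with hardware AEC/NS forced
// off in the builder below, WebRTC falls back to its software echo canceller and
// noise suppressor inside the audio-processing module, assuming audio processing
// is not disabled via the noAudioProcessing constraints elsewhere in this class.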
return JavaAudioDeviceModule.builder(appContext) .setSamplesReadyCallback(saveRecordedAudioToFile) .setUseHardwareAcousticEchoCanceler(false) .setUseHardwareNoiseSuppressor(false) .setAudioRecordErrorCallback(audioRecordErrorCallback) .setAudioTrackErrorCallback(audioTrackErrorCallback) .setAudioRecordStateCallback(audioRecordStateCallback) .setAudioTrackStateCallback(audioTrackStateCallback) .createAudioDeviceModule();
}
private void createMediaConstraintsInternal() {
  // Create video constraints if video call is enabled.
  if (isVideoCallEnabled()) {
    videoWidth = peerConnectionParameters.videoWidth;
    videoHeight = peerConnectionParameters.videoHeight;
    videoFps = peerConnectionParameters.videoFps;
// If video resolution is not specified, default to HD.
if (videoWidth == 0 || videoHeight == 0) {
  videoWidth = HD_VIDEO_WIDTH;
  videoHeight = HD_VIDEO_HEIGHT;
}
// If fps is not specified, default to 30.
if (videoFps == 0) {
  videoFps = 30;
}
Logging.d(TAG, "Capturing format: " + videoWidth + "x" + videoHeight + "@" + videoFps);
}
// Create audio constraints.
audioConstraints = new MediaConstraints();
// added for audio performance measurements
if (peerConnectionParameters.noAudioProcessing) {
  Log.d(TAG, "Disabling audio processing");
  audioConstraints.mandatory.add(
      new MediaConstraints.KeyValuePair(AUDIO_ECHO_CANCELLATION_CONSTRAINT, "false"));
  audioConstraints.mandatory.add(
      new MediaConstraints.KeyValuePair(AUDIO_AUTO_GAIN_CONTROL_CONSTRAINT, "false"));
  audioConstraints.mandatory.add(
      new MediaConstraints.KeyValuePair(AUDIO_HIGH_PASS_FILTER_CONSTRAINT, "false"));
  audioConstraints.mandatory.add(
      new MediaConstraints.KeyValuePair(AUDIO_NOISE_SUPPRESSION_CONSTRAINT, "false"));
}
// Create SDP constraints.
sdpMediaConstraints = new MediaConstraints();
sdpMediaConstraints.mandatory.add(
    new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
sdpMediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
    "OfferToReceiveVideo", Boolean.toString(isVideoCallEnabled())));
sdpMediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("IceRestart", "true")); //changes, added on 26th apr 21 to ice restart
}
private void createPeerConnectionInternal() { if (factory == null || isError) { Log.e(TAG, "Peerconnection factory is not created"); return; } Log.d(TAG, "Create peer connection.");
queuedRemoteCandidates = new ArrayList<>();
PeerConnection.RTCConfiguration rtcConfig =
    new PeerConnection.RTCConfiguration(signalingParameters.iceServers);
// TCP candidates are only useful when connecting to a server that supports
// ICE-TCP.
rtcConfig.tcpCandidatePolicy = PeerConnection.TcpCandidatePolicy.DISABLED;
rtcConfig.bundlePolicy = PeerConnection.BundlePolicy.MAXBUNDLE;
rtcConfig.rtcpMuxPolicy = PeerConnection.RtcpMuxPolicy.REQUIRE;
rtcConfig.continualGatheringPolicy = PeerConnection.ContinualGatheringPolicy.GATHER_CONTINUALLY;
// Use ECDSA encryption.
rtcConfig.keyType = PeerConnection.KeyType.ECDSA;
// Enable DTLS for normal calls and disable for loopback calls.
rtcConfig.enableDtlsSrtp = !peerConnectionParameters.loopback;
rtcConfig.sdpSemantics = PeerConnection.SdpSemantics.UNIFIED_PLAN;
peerConnection = factory.createPeerConnection(rtcConfig, pcObserver);
isInitiator = false;
// Set INFO libjingle logging.
// NOTE: this must happen while |factory| is alive!
Logging.enableLogToDebugOutput(Logging.Severity.LS_INFO);
List<String> mediaStreamLabels; // (initializer and the following lines truncated in the paste)
if (isVideoCallEnabled()) { findVideoSender(); }
if (peerConnectionParameters.aecDump) {
  try {
    ParcelFileDescriptor aecDumpFileDescriptor = ParcelFileDescriptor.open(
        new File(Environment.getExternalStorageDirectory().getPath()
        // (dump file path and the rest of the aecDump setup truncated in the paste)
if (saveRecordedAudioToFile != null) { if (saveRecordedAudioToFile.start()) { Log.d(TAG, "Recording input audio to file is activated"); } } Log.d(TAG, "Peer connection created."); }
private void initDataChannel() { if (dataChannelEnabled && peerConnectionParameters.dataChannelParameters.isDataChannelCreator) { DataChannel.Init init = new DataChannel.Init(); init.ordered = peerConnectionParameters.dataChannelParameters.ordered; init.negotiated = peerConnectionParameters.dataChannelParameters.negotiated; init.maxRetransmits = peerConnectionParameters.dataChannelParameters.maxRetransmits; init.maxRetransmitTimeMs = peerConnectionParameters.dataChannelParameters.maxRetransmitTimeMs; init.id = peerConnectionParameters.dataChannelParameters.id; init.protocol = peerConnectionParameters.dataChannelParameters.protocol; dataChannel = peerConnection.createDataChannel(peerConnectionParameters.dataChannelParameters.label, init); dataChannel.registerObserver(dataChannelInternalObserver); } }
private File createRtcEventLogOutputFile() { DateFormat dateFormat = new SimpleDateFormat("yyyyMMdd_hhmm_ss", Locale.getDefault()); Date date = new Date(); final String outputFileName = "eventlog" + dateFormat.format(date) + ".log"; return new File( appContext.getDir(RTCEVENTLOG_OUTPUT_DIR_NAME, Context.MODE_PRIVATE), outputFileName); }
private void maybeCreateAndStartRtcEventLog() { if (appContext == null || peerConnection == null) { return; } if (!peerConnectionParameters.enableRtcEventLog) { Log.d(TAG, "RtcEventLog is disabled."); return; } rtcEventLog = new RtcEventLog(peerConnection); rtcEventLog.start(createRtcEventLogOutputFile()); }
private void closeInternal() { if (factory != null && peerConnectionParameters.aecDump) { factory.stopAecDump(); } Log.d(TAG, "Closing peer connection."); statsTimer.cancel(); if (dataChannel != null) { dataChannel.dispose(); dataChannel = null; } dataChannelObserver = null;
if (rtcEventLog != null) {
  // RtcEventLog should stop before the peer connection is disposed.
  rtcEventLog.stop();
  rtcEventLog = null;
}
if (peerConnection != null) { peerConnection.dispose(); peerConnection = null; } Log.d(TAG, "Closing audio source."); if (audioSource != null) { audioSource.dispose(); audioSource = null; } Log.d(TAG, "Stopping capture."); if (videoCapturer != null) { try { videoCapturer.stopCapture(); } catch (InterruptedException e) { throw new RuntimeException(e); } videoCapturerStopped = true; videoCapturer.dispose(); videoCapturer = null; } Log.d(TAG, "Closing video source."); if (videoSource != null) { videoSource.dispose(); videoSource = null; } if (surfaceTextureHelper != null) { surfaceTextureHelper.dispose(); surfaceTextureHelper = null; } if (saveRecordedAudioToFile != null) { Log.d(TAG, "Closing audio file for recorded input audio."); saveRecordedAudioToFile.stop(); saveRecordedAudioToFile = null; } localRender = null; remoteSinks = null; Log.d(TAG, "Closing peer connection factory."); if (factory != null) { factory.dispose(); factory = null; } rootEglBase.release(); Log.d(TAG, "Closing peer connection done."); events.onPeerConnectionClosed(); PeerConnectionFactory.stopInternalTracingCapture(); PeerConnectionFactory.shutdownInternalTracer(); }
public boolean isHDVideo() { return isVideoCallEnabled() && videoWidth * videoHeight >= 1280 * 720; }
@SuppressWarnings("deprecation") // TODO(sakal): getStats is deprecated.
private void getStats() {
  if (peerConnection == null || isError) {
    return;
  }
  boolean success = peerConnection.getStats(new StatsObserver() {
    @Override
    public void onComplete(final StatsReport[] reports) {
      events.onPeerConnectionStatsReady(reports);
    }
  }, null);
  if (!success) {
    //Log.e(TAG, "getStats() returns false!");
  }
}
public void enableStatsEvents(boolean enable, int periodMs) { if (enable) { try { statsTimer.schedule(new TimerTask() { @Override public void run() { executor.execute(() -> getStats()); } }, 0, periodMs); } catch (Exception e) { Log.e(TAG, "Can not schedule statistics timer", e); } } else { statsTimer.cancel(); } }
public void setAudioEnabled(final boolean enable) { executor.execute(() -> { enableAudio = enable; if (localAudioTrack != null) { localAudioTrack.setEnabled(enableAudio); } }); }
public void setVideoEnabled(final boolean enable) { executor.execute(() -> { renderVideo = enable; if (localVideoTrack != null) { if (enable) { startVideoSourceInternal(); } else { stopVideoSourceInternal(); } localVideoTrack.setEnabled(renderVideo); } if (remoteVideoTrack != null) { remoteVideoTrack.setEnabled(renderVideo); } }); }
public void createOffer() { executor.execute(() -> { if (peerConnection != null && !isError) { Log.d(TAG, "PC Create OFFER"); isInitiator = true; initDataChannel(); peerConnection.createOffer(sdpObserver, sdpMediaConstraints); } }); }
public void createAnswer() { executor.execute(() -> { if (peerConnection != null && !isError) { Log.d(TAG, "PC create ANSWER"); isInitiator = false; peerConnection.createAnswer(sdpObserver, sdpMediaConstraints); } }); }
public void addRemoteIceCandidate(final IceCandidate candidate) { executor.execute(() -> { if (peerConnection != null && !isError) { if (queuedRemoteCandidates != null) { queuedRemoteCandidates.add(candidate); } else { peerConnection.addIceCandidate(candidate); } } }); }
public void removeRemoteIceCandidates(final IceCandidate[] candidates) {
  executor.execute(() -> {
    if (peerConnection == null || isError) {
      return;
    }
    // Drain the queued remote candidates if there is any so that
    // they are processed in the proper order.
    drainCandidates();
    peerConnection.removeIceCandidates(candidates);
  });
}
public void setRemoteDescription(final SessionDescription sdp) { executor.execute(() -> { if (peerConnection == null || isError) { return; } String sdpDescription = sdp.description; if (preferIsac) { sdpDescription = preferCodec(sdpDescription, AUDIO_CODEC_ISAC, true); } if (isVideoCallEnabled()) { sdpDescription = preferCodec(sdpDescription, getSdpVideoCodecName(peerConnectionParameters), false); } if (peerConnectionParameters.audioStartBitrate > 0) { sdpDescription = setStartBitrate( AUDIO_CODEC_OPUS, false, sdpDescription, peerConnectionParameters.audioStartBitrate); } Log.d(TAG, "Set remote SDP."); SessionDescription sdpRemote = new SessionDescription(sdp.type, sdpDescription); peerConnection.setRemoteDescription(sdpObserver, sdpRemote); }); }
public void stopVideoSource() { executor.execute(() -> { stopVideoSourceInternal(); }); }
private void stopVideoSourceInternal() { if (videoCapturer != null && !videoCapturerStopped) { Log.d(TAG, "Stop video source."); try { videoCapturer.stopCapture(); } catch (InterruptedException e) { Log.d(TAG, e.getMessage()); } videoCapturerStopped = true; } }
public void startVideoSource() { executor.execute(() -> { startVideoSourceInternal(); }); }
private void startVideoSourceInternal() { if (videoCapturer != null && videoCapturerStopped) { Log.d(TAG, "Restart video source."); videoCapturer.startCapture(videoWidth, videoHeight, videoFps); videoCapturerStopped = false; } }
public void setVideoMaxBitrate(@Nullable final Integer maxBitrateKbps) { executor.execute(() -> { if (peerConnection == null || localVideoSender == null || isError) { return; } Log.d(TAG, "Requested max video bitrate: " + maxBitrateKbps); if (localVideoSender == null) { Log.w(TAG, "Sender is not ready."); return; }
RtpParameters parameters = localVideoSender.getParameters(); if (parameters.encodings.size() == 0) { Log.w(TAG, "RtpParameters are not ready."); return; }
for (RtpParameters.Encoding encoding : parameters.encodings) {
  // Null value means no limit.
  encoding.maxBitrateBps = maxBitrateKbps == null ? null : maxBitrateKbps * BPS_IN_KBPS;
}
if (!localVideoSender.setParameters(parameters)) {
  Log.e(TAG, "RtpSender.setParameters failed.");
}
Log.d(TAG, "Configured max video bitrate to: " + maxBitrateKbps);
});
}
private void reportError(final String errorMessage) { Log.e(TAG, "Peerconnection error: " + errorMessage); executor.execute(() -> { if (!isError) { events.onPeerConnectionError(errorMessage); isError = true; } }); }
@Nullable
private AudioTrack createAudioTrack() {
  if (localAudioTrack == null) {
    //changes, 25th apr 21 - start, added to reduce noise and echo
    /* ... to a Room. */ // (comment truncated in the paste)
    // Use software AEC
    if (WebRtcAudioUtils.isAcousticEchoCancelerSupported()) {
      Log.d(TAG, "echo: createAudioTrack: setWebRtcBasedAcousticEchoCanceler");
      WebRtcAudioUtils.setWebRtcBasedAcousticEchoCanceler(true);
      WebRtcAudioUtils.useWebRtcBasedAcousticEchoCanceler();
    }
    // Use software NS
    if (WebRtcAudioUtils.isNoiseSuppressorSupported()) {
      Log.d(TAG, "echo: createAudioTrack: isNoiseSuppressorSupported");
      WebRtcAudioUtils.setWebRtcBasedNoiseSuppressor(true);
      WebRtcAudioUtils.useWebRtcBasedNoiseSuppressor();
      // Use software AGC
      WebRtcAudioUtils.setWebRtcBasedAutomaticGainControl(true);
    }
    /* ... to a Room. */ // (comment truncated in the paste)
    // Enable OpenSL ES
    org.webrtc.voiceengine.WebRtcAudioManager.setBlacklistDeviceForOpenSLESUsage(false);
    // Check if OpenSL ES is disabled
    org.webrtc.voiceengine.WebRtcAudioUtils.deviceIsBlacklistedForOpenSLESUsage();
    //25th apr 21 - end
    audioSource = factory.createAudioSource(audioConstraints);
    localAudioTrack = factory.createAudioTrack(AUDIO_TRACK_ID, audioSource);
    localAudioTrack.setEnabled(enableAudio);
  }
  return localAudioTrack;
}
@Nullable private VideoTrack createVideoTrack(VideoCapturer capturer) { if (localVideoTrack == null && capturer != null) { surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", rootEglBase.getEglBaseContext()); videoSource = factory.createVideoSource(capturer.isScreencast()); capturer.initialize(surfaceTextureHelper, appContext, videoSource.getCapturerObserver()); capturer.startCapture(videoWidth, videoHeight, videoFps);
localVideoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, videoSource); localVideoTrack.setEnabled(renderVideo); localVideoTrack.addSink(localRender); } return localVideoTrack; }
private void findVideoSender() { for (RtpSender sender : peerConnection.getSenders()) { if (sender.track() != null) { String trackType = sender.track().kind(); if (trackType.equals(VIDEO_TRACK_TYPE)) { Log.d(TAG, "Found video sender."); localVideoSender = sender; } } } }
private List<VideoTrack> getRemoteVideoTrackList() {
  // (method body truncated in the paste; used by onAddStream below)
}
// Returns the remote VideoTrack, assuming there is only one.
private @Nullable VideoTrack getRemoteVideoTrack() {
  for (RtpTransceiver transceiver : peerConnection.getTransceivers()) {
    MediaStreamTrack track = transceiver.getReceiver().track();
    if (track instanceof VideoTrack) {
      return (VideoTrack) track;
    }
  }
  return null;
}
private static String getSdpVideoCodecName(PeerConnectionParameters parameters) { switch (parameters.videoCodec) { case VIDEO_CODEC_VP8: return VIDEO_CODEC_VP8; case VIDEO_CODEC_VP9: return VIDEO_CODEC_VP9; case VIDEO_CODEC_H264_HIGH: case VIDEO_CODEC_H264_BASELINE: return VIDEO_CODEC_H264; default: return VIDEO_CODEC_VP8; } }
private static String getFieldTrials(PeerConnectionParameters peerConnectionParameters) { String fieldTrials = ""; if (peerConnectionParameters.videoFlexfecEnabled) { fieldTrials += VIDEO_FLEXFEC_FIELDTRIAL; Log.d(TAG, "Enable FlexFEC field trial."); } fieldTrials += VIDEO_VP8_INTEL_HW_ENCODER_FIELDTRIAL; if (peerConnectionParameters.disableWebRtcAGCAndHPF) { fieldTrials += DISABLE_WEBRTC_AGC_FIELDTRIAL; Log.d(TAG, "Disable WebRTC AGC field trial."); } return fieldTrials; }
@SuppressWarnings("StringSplitter")
private static String setStartBitrate(
String codec, boolean isVideoCodec, String sdpDescription, int bitrateKbps) {
String[] lines = sdpDescription.split("\r\n");
int rtpmapLineIndex = -1;
boolean sdpFormatUpdated = false;
String codecRtpMap = null;
// Search for codec rtpmap in format
// a=rtpmap:<payload type> <encoding name>/<clock rate> [/<encoding parameters>]
// (the rtpmap search loop that sets codecRtpMap is truncated in the paste)

// Check if a=fmtp string already exist in remote SDP for this codec and
// update it with new bitrate parameter.
regex = "^a=fmtp:" + codecRtpMap + " \\w+=\\d+.*[\r]?$";
codecPattern = Pattern.compile(regex);
for (int i = 0; i < lines.length; i++) {
  Matcher codecMatcher = codecPattern.matcher(lines[i]);
  if (codecMatcher.matches()) {
    Log.d(TAG, "Found " + codec + " " + lines[i]);
    if (isVideoCodec) {
      lines[i] += "; " + VIDEO_CODEC_PARAM_START_BITRATE + "=" + bitrateKbps;
    } else {
      lines[i] += "; " + AUDIO_CODEC_PARAM_BITRATE + "=" + (bitrateKbps * 1000);
    }
    Log.d(TAG, "Update remote SDP line: " + lines[i]);
    sdpFormatUpdated = true;
    break;
  }
}
StringBuilder newSdpDescription = new StringBuilder();
for (int i = 0; i < lines.length; i++) {
  newSdpDescription.append(lines[i]).append("\r\n");
  // Append new a=fmtp line if no such line exist for a codec.
  if (!sdpFormatUpdated && i == rtpmapLineIndex) {
    String bitrateSet;
    if (isVideoCodec) {
      bitrateSet = "a=fmtp:" + codecRtpMap + " " + VIDEO_CODEC_PARAM_START_BITRATE + "=" + bitrateKbps;
    } else {
      bitrateSet = "a=fmtp:" + codecRtpMap + " " + AUDIO_CODEC_PARAM_BITRATE + "=" + (bitrateKbps * 1000);
      // (the remainder of this method is truncated in the paste)
/* Returns the line number containing "m=audio|video", or -1 if no such line exists. */
private static int findMediaDescriptionLine(boolean isAudio, String[] sdpLines) { final String mediaDescription = isAudio ? "m=audio " : "m=video "; for (int i = 0; i < sdpLines.length; ++i) { if (sdpLines[i].startsWith(mediaDescription)) { return i; } } return -1; }
private static String joinString( Iterable<? extends CharSequence> s, String delimiter, boolean delimiterAtEnd) { Iterator<? extends CharSequence> iter = s.iterator(); if (!iter.hasNext()) { return ""; } StringBuilder buffer = new StringBuilder(iter.next()); while (iter.hasNext()) { buffer.append(delimiter).append(iter.next()); } if (delimiterAtEnd) { buffer.append(delimiter); } return buffer.toString(); }
private static @Nullable String movePayloadTypesToFront(
    List<String> preferredPayloadTypes, String mLine) {
  // (method body truncated in the paste)
}
private static String preferCodec(String sdpDescription, String codec, boolean isAudio) {
final String[] lines = sdpDescription.split("\r\n");
final int mLineIndex = findMediaDescriptionLine(isAudio, lines);
if (mLineIndex == -1) {
Log.w(TAG, "No mediaDescription line, so can't prefer " + codec);
return sdpDescription;
}
// A list with all the payload types with name |codec|. The payload types are integers in the
// range 96-127, but they are stored as strings here.
final List<String> codecPayloadTypes = new ArrayList<>();
// (the rtpmap matching loop that fills codecPayloadTypes is truncated in the paste)
final String newMLine = movePayloadTypesToFront(codecPayloadTypes, lines[mLineIndex]);
if (newMLine == null) {
  return sdpDescription;
}
Log.d(TAG, "Change media description from: " + lines[mLineIndex] + " to " + newMLine);
lines[mLineIndex] = newMLine;
return joinString(Arrays.asList(lines), "\r\n", true /* delimiterAtEnd */);
}
private void drainCandidates() { if (queuedRemoteCandidates != null) { Log.d(TAG, "Add " + queuedRemoteCandidates.size() + " remote candidates"); for (IceCandidate candidate : queuedRemoteCandidates) { peerConnection.addIceCandidate(candidate); } queuedRemoteCandidates = null; } }
private void switchCameraInternal() {
  if (videoCapturer instanceof CameraVideoCapturer) {
    if (!isVideoCallEnabled() || isError) {
      Log.e(TAG, "Failed to switch camera. Video: " + isVideoCallEnabled() + ". Error : " + isError);
      return; // No video is sent or only one camera is available or error happened.
    }
    Log.d(TAG, "Switch camera");
    CameraVideoCapturer cameraVideoCapturer = (CameraVideoCapturer) videoCapturer;
    cameraVideoCapturer.switchCamera(null);
  } else {
    Log.d(TAG, "Will not switch camera, video capturer is not a camera");
  }
}
public void switchCamera() { executor.execute(this ::switchCameraInternal); }
public void changeCaptureFormat(final int width, final int height, final int framerate) { executor.execute(() -> changeCaptureFormatInternal(width, height, framerate)); }
private void changeCaptureFormatInternal(int width, int height, int framerate) {
  if (!isVideoCallEnabled() || isError || videoCapturer == null) {
    Log.e(TAG, "Failed to change capture format. Video: " + isVideoCallEnabled()
    // (remainder of this method truncated in the paste)
// Implementation detail: observe ICE & stream changes and react accordingly.
private class PCObserver implements PeerConnection.Observer {
  @Override
  public void onIceCandidate(final IceCandidate candidate) {
    executor.execute(() -> events.onIceCandidate(candidate));
  }
@Override public void onIceCandidatesRemoved(final IceCandidate[] candidates) { executor.execute(() -> events.onIceCandidatesRemoved(candidates)); }
@Override public void onSignalingChange(PeerConnection.SignalingState newState) { Log.d(TAG, "SignalingState: " + newState); }
@Override
public void onIceConnectionChange(final PeerConnection.IceConnectionState newState) {
  iceConnectionState = newState; //changes, added on 16th apr 21
  executor.execute(() -> {
    Log.d(TAG, "IceConnectionState: " + newState);
    if (newState == IceConnectionState.CONNECTED) {
      events.onIceConnected();
    } else if (newState == IceConnectionState.DISCONNECTED) {
      events.onIceDisconnected();
    } else if (newState == IceConnectionState.FAILED) {
      reportError("ICE connection failed.");
    }
  });
}
@Override public void onConnectionChange(final PeerConnection.PeerConnectionState newState) { executor.execute(() -> { Log.d(TAG, "PeerConnectionState: " + newState); if (newState == PeerConnectionState.CONNECTED) { events.onConnected(); } else if (newState == PeerConnectionState.DISCONNECTED) { events.onDisconnected(); } else if (newState == PeerConnectionState.FAILED) { reportError("DTLS connection failed."); } }); }
@Override public void onIceGatheringChange(PeerConnection.IceGatheringState newState) { Log.d(TAG, "IceGatheringState: " + newState); }
@Override public void onIceConnectionReceivingChange(boolean receiving) { Log.d(TAG, "IceConnectionReceiving changed to " + receiving); }
@Override public void onSelectedCandidatePairChanged(CandidatePairChangeEvent event) { Log.d(TAG, "Selected candidate pair changed because: " + event); }
@Override
public void onAddStream(final MediaStream stream) {
  if (!isVideoCallEnabled() && !isAudioEnabled()) {
    // this is the case in play mode
    /*
    VideoTrack remoteVideoTrack = getRemoteVideoTrack();
    remoteVideoTrack.setEnabled(true);
    for (VideoSink remoteSink : remoteSinks) {
      remoteVideoTrack.addSink(remoteSink);
    }
    */
List<VideoTrack> remoteVideoTrackList = getRemoteVideoTrackList();
for (int i = 0; i < remoteVideoTrackList.size(); i++)
{
if (i < remoteSinks.size()) {
remoteVideoTrackList.get(i).addSink(remoteSinks.get(i));
} else {
Log.e(TAG, "There are not enough remote sinks to show video tracks");
}
}
} }
@Override public void onRemoveStream(final MediaStream stream) { }
/**
 * Not called for publisher mode
 */
@Override
public void onDataChannel(final DataChannel dc) {
  // if (PeerConnectionClient.this.dataChannel.state() == DataChannel.State.OPEN) {
  //   dataChannel.unregisterObserver();
  //   dataChannel.close();
  //   dataChannel.dispose();
  // }
PeerConnectionClient.this.dataChannel = dc;
Log.d(TAG, "New Data channel " + dc.label());
if (!dataChannelEnabled) return;
dc.registerObserver(dataChannelInternalObserver); }
@Override
public void onRenegotiationNeeded() {
  // No need to do anything; AppRTC follows a pre-agreed-upon
  // signaling/negotiation protocol.
}
@Override public void onAddTrack(final RtpReceiver receiver, final MediaStream[] mediaStreams) {} }
// Implementation detail: handle offer creation/signaling and answer setting,
// as well as adding remote ICE candidates once the answer SDP is set.
private class SDPObserver implements SdpObserver {
  @Override
  public void onCreateSuccess(final SessionDescription origSdp) {
    if (localSdp != null) {
      reportError("Multiple SDP create.");
      return;
    }
    String sdpDescription = origSdp.description;
    if (preferIsac) {
      sdpDescription = preferCodec(sdpDescription, AUDIO_CODEC_ISAC, true);
    }
    if (isVideoCallEnabled()) {
      sdpDescription = preferCodec(sdpDescription, getSdpVideoCodecName(peerConnectionParameters), false);
    }
    final SessionDescription sdp = new SessionDescription(origSdp.type, sdpDescription);
    localSdp = sdp;
    executor.execute(() -> {
      if (peerConnection != null && !isError) {
        Log.d(TAG, "Set local SDP from " + sdp.type);
        peerConnection.setLocalDescription(sdpObserver, sdp);
      }
    });
  }
@Override
public void onSetSuccess() {
  executor.execute(() -> {
    if (peerConnection == null || isError) {
      return;
    }
    if (isInitiator) {
      // For offering peer connection we first create offer and set
      // local SDP, then after receiving answer set remote SDP.
      if (peerConnection.getRemoteDescription() == null) {
        // We've just set our local SDP so time to send it.
        Log.d(TAG, "Local SDP set successfully");
        events.onLocalDescription(localSdp);
      } else {
        // We've just set remote description, so drain remote
        // and send local ICE candidates.
        Log.d(TAG, "Remote SDP set successfully");
        drainCandidates();
      }
    } else {
      // For answering peer connection we set remote SDP and then
      // create answer and set local SDP.
      if (peerConnection.getLocalDescription() != null) {
        // We've just set our local SDP so time to send it, drain
        // remote and send local ICE candidates.
        Log.d(TAG, "Local SDP set successfully");
        events.onLocalDescription(localSdp);
        drainCandidates();
      } else {
        // We've just set remote SDP - do nothing for now -
        // answer will be created soon.
        Log.d(TAG, "Remote SDP set successfully");
      }
    }
  });
}
@Override public void onCreateFailure(final String error) { reportError("createSDP error: " + error); }
@Override public void onSetFailure(final String error) { reportError("setSDP error: " + error); } }
public void changeVideoCaptureLandscape() {
  executor.execute(() -> {
    Log.d(TAG, "changeVideoCaptureLandscape: videoWidth: " + videoWidth);
    Log.d(TAG, "changeVideoCaptureLandscape: videoHeight: " + videoHeight);
    if (videoWidth < videoHeight) {
      //way-1
      int lVideoWidth = videoWidth;
      videoWidth = videoHeight;
      videoHeight = lVideoWidth;
      changeCaptureFormat(videoWidth, videoHeight, videoFps);
      //way-1
    }
  });
}

public void changeVideoCapturePortrait() {
  executor.execute(() -> {
    Log.d(TAG, "changeVideoCapturePortrait: videoWidth: " + videoWidth);
    Log.d(TAG, "changeVideoCapturePortrait: videoHeight: " + videoHeight);
    if (videoWidth > videoHeight) {
      //way-1
      int lVideoWidth = videoWidth;
      videoWidth = videoHeight;
      videoHeight = lVideoWidth;
      changeCaptureFormat(videoWidth, videoHeight, videoFps);
      //way-1 - end
    }
  });
}
}
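To make the two snippets easier to compare, the effective change is in the JavaAudioDeviceModule builder (plus the WebRtcAudioUtils/OpenSL ES calls added in createAudioTrack() above); both versions still produce echo for us:

```java
// earlier snippet: hardware AEC/NS follow the intent flags
.setUseHardwareAcousticEchoCanceler(!peerConnectionParameters.disableBuiltInAEC)
.setUseHardwareNoiseSuppressor(!peerConnectionParameters.disableBuiltInNS)

// current snippet: hardware AEC/NS forced off so WebRTC's software processing is used
.setUseHardwareAcousticEchoCanceler(false)
.setUseHardwareNoiseSuppressor(false)
```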
Hi @thinkgopal,
I've put this into this week's sprint. I hope we can get back to you with a solution later this week.
Thank you for your cooperation.
Hi @thinkgopal,
I have investigated this issue in detail. It seems to be a chronic issue on the latest generation of devices. People recommend calling setUseHardwareAcousticEchoCanceler and setUseHardwareNoiseSuppressor with false parameters, but as I saw, you have already used them with false. So I'm still investigating.
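In case it helps while we investigate, here is a hedged sketch of what I would try next (not a confirmed fix): gate the hardware AEC/NS on what the device actually reports instead of hard-coding false, using the JavaAudioDeviceModule support checks.

```java
// Sketch only: let devices that report a hardware AEC/NS use it, and fall
// back to WebRTC's software audio processing everywhere else.
return JavaAudioDeviceModule.builder(appContext)
    .setSamplesReadyCallback(saveRecordedAudioToFile)
    .setUseHardwareAcousticEchoCanceler(
        JavaAudioDeviceModule.isBuiltInAcousticEchoCancelerSupported())
    .setUseHardwareNoiseSuppressor(
        JavaAudioDeviceModule.isBuiltInNoiseSuppressorSupported())
    .setAudioRecordErrorCallback(audioRecordErrorCallback)
    .setAudioTrackErrorCallback(audioTrackErrorCallback)
    .setAudioRecordStateCallback(audioRecordStateCallback)
    .setAudioTrackStateCallback(audioTrackStateCallback)
    .createAudioDeviceModule();
```

One more observation on the paste: the WebRtcAudioUtils and WebRtcAudioManager calls in your createAudioTrack() belong to the legacy org.webrtc.voiceengine stack, while the factory is built with org.webrtc.audio.JavaAudioDeviceModule, and they run after the audio device module has already been created. So they may have no effect on this audio path; if they are meant to take effect, they would have to run before createJavaAudioDevice() and factory creation.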
Short description
Sound echo issue not happening every time.
Ticket ID: 38631
Steps to reproduce
Expected behavior
It shouldn't be an echo.
Actual behavior
Echo happening.