hiteshtechshslok opened 8 months ago
If you are using CameraX, you can check out the OverlayEffect API.
You want to create an OverlayEffect with a queue depth of 0, targeting both Preview and VideoCapture. Then set a listener via OverlayEffect#setOnDrawListener. For every new frame that is about to be drawn, you will get a callback in the listener, where you can use [Frame#getOverlayCanvas](https://developer.android.com/reference/androidx/camera/effects/Frame#getOverlayCanvas()) to get a Canvas for drawing the text.
Please let us know if you run into any issues.
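The steps above can be sketched as follows. This is a minimal sketch, not a definitive implementation: it assumes the `androidx.camera:camera-effects` artifact is on the classpath, and the text, coordinates, and error handling are placeholders.

```kotlin
import android.graphics.Color
import android.graphics.Paint
import android.os.Handler
import android.os.Looper
import androidx.camera.core.CameraEffect
import androidx.camera.effects.OverlayEffect

// Sketch: an OverlayEffect that stamps the current time onto every
// Preview and VideoCapture frame. Queue depth 0 means frames are drawn
// on as they arrive, without buffering.
val overlayEffect = OverlayEffect(
    CameraEffect.PREVIEW or CameraEffect.VIDEO_CAPTURE,
    /* queueDepth = */ 0,
    Handler(Looper.getMainLooper()),
) { throwable -> throwable.printStackTrace() }

val paint = Paint().apply {
    color = Color.WHITE
    textSize = 48f
}

overlayEffect.setOnDrawListener { frame ->
    // frame.overlayCanvas is a regular android.graphics.Canvas.
    frame.overlayCanvas.drawText(
        System.currentTimeMillis().toString(), 50f, 100f, paint
    )
    true // render this frame (return false to drop it)
}
// Then add the effect via UseCaseGroup.Builder#addEffect when binding use cases.
```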
Hello @xizhang
Any idea how to do this with a Camera2 SurfaceTexture? Starting the preview works as-is on a TextureView:
```java
private void startPreview() {
    SurfaceTexture surfaceTexture = mTextureView.getSurfaceTexture();
    surfaceTexture.setDefaultBufferSize(1920, 1080); // Set the desired size
    Surface surface = new Surface(surfaceTexture);
    try {
        final CaptureRequest.Builder captureRequestBuilder =
                mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        captureRequestBuilder.addTarget(surface);
        // Apply black-and-white effect
        captureRequestBuilder.set(CaptureRequest.CONTROL_EFFECT_MODE, CaptureRequest.CONTROL_EFFECT_MODE_MONO);
        mCameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(@NonNull CameraCaptureSession session) {
                try {
                    session.setRepeatingRequest(captureRequestBuilder.build(), new CameraCaptureSession.CaptureCallback() {
                        @Override
                        public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                            super.onCaptureCompleted(session, request, result);
                            Log.d(TAG, "Capturing");
                            drawTimestampOverlay();
                        }
                    }, null);
                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                Log.e(TAG, "Failed to configure camera preview");
            }
        }, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
```
Any idea how to do the same for the recording path with a SurfaceTexture?
```java
private void startRecording() {
    if (mCameraDevice == null || mTextureView.getSurfaceTexture() == null) {
        return;
    }
    mMediaRecorder = new MediaRecorder();
    // mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    if (((CheckBox) findViewById(R.id.audioCheckbox)).isChecked()) {
        Log.e(TAG, "Check box was set");
        mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        // mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    }
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
    mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault()).format(new Date());
    File videoFile = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM), "VIDEO_" + timeStamp + ".mp4");
    mMediaRecorder.setOutputFile(videoFile.getAbsolutePath());
    mMediaRecorder.setVideoEncodingBitRate(10000000);
    mMediaRecorder.setVideoFrameRate(30);
    mMediaRecorder.setVideoSize(1280, 720);
    mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    if (((CheckBox) findViewById(R.id.audioCheckbox)).isChecked()) {
        Log.e(TAG, "Check box was set");
        // mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    }
    // mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    try {
        mMediaRecorder.prepare();
    } catch (IOException e) {
        e.printStackTrace();
        return;
    }
    Log.d(TAG, "Recording");
    mLastFrameTime = System.nanoTime();
    drawTimestampOverlay();
    SurfaceTexture surfaceTexture = mTextureView.getSurfaceTexture();
    surfaceTexture.setDefaultBufferSize(1920, 1080); // Set the desired size
    Surface surface = new Surface(surfaceTexture);
    List<Surface> surfaces = new ArrayList<>();
    surfaces.add(surface);
    surfaces.add(mMediaRecorder.getSurface());
    try {
        final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
        captureBuilder.addTarget(surface);
        captureBuilder.addTarget(mMediaRecorder.getSurface());
        mCameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(@NonNull CameraCaptureSession session) {
                try {
                    mCaptureSession = session;
                    mCaptureSession.setRepeatingRequest(captureBuilder.build(), new CameraCaptureSession.CaptureCallback() {
                        @Override
                        public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                            super.onCaptureCompleted(session, request, result);
                            Log.d(TAG, "Recording2");
                            // mFrameCount++;
                            // drawTimestampOverlay2();
                        }
                    }, null);
                    mMediaRecorder.start();
                    mIsRecording = true;
                    mRecordButton.setText("Stop");
                    // drawTimestampOverlay();
                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                Log.e(TAG, "Capture session configuration failed");
            }
        }, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
```
I tried using a Canvas, but since the surface is already connected to the camera, it cannot be locked. Any idea how to use GL on that?
The Camera2 API is only recommended for lower-level controls. If you wish to go this route, you can create an OpenGL renderer between the camera and the TextureView/MediaCodec, then draw the timestamp on top of the input in the shaders. Example given by Gemini: https://g.co/gemini/share/b1c06fd6a9e8
However, I should point out that this is basically what CameraX does. It is not an easy path to implement yourself and make work across different devices. For overlaying simple text, the existing CameraX API should just work. Please let us know if the existing API does not meet your needs.
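For reference, the core of such a renderer is a shader pair that samples the camera frame (delivered as an external OES texture by SurfaceTexture) and blends an overlay texture, such as pre-rendered timestamp text, on top. A minimal sketch of the shader sources, held as Kotlin string constants; the uniform and attribute names are illustrative, not taken from any specific sample:

```kotlin
// Vertex shader: pass positions through and transform the texture
// coordinates with the matrix from SurfaceTexture#getTransformMatrix.
const val VERTEX_SHADER = """
    uniform mat4 uTexMatrix;
    attribute vec4 aPosition;
    attribute vec4 aTexCoord;
    varying vec2 vTexCoord;
    void main() {
        gl_Position = aPosition;
        vTexCoord = (uTexMatrix * aTexCoord).xy;
    }
"""

// Fragment shader: sample the camera frame (an external OES texture)
// and alpha-blend an overlay texture (e.g. rendered timestamp text) on top.
const val FRAGMENT_SHADER = """
    #extension GL_OES_EGL_image_external : require
    precision mediump float;
    varying vec2 vTexCoord;
    uniform samplerExternalOES uCameraTex;
    uniform sampler2D uOverlayTex;
    void main() {
        vec4 cam = texture2D(uCameraTex, vTexCoord);
        vec4 ovl = texture2D(uOverlayTex, vTexCoord);
        gl_FragColor = mix(cam, ovl, ovl.a);
    }
"""
```

The surrounding plumbing (EGL context, attaching the SurfaceTexture, rendering into the MediaCodec/TextureView surfaces) is the part that is hard to get right across devices, which is the reason the CameraX path is recommended.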
Good morning!
Could you give an example of how to use OverlayEffect to apply a watermark to a video shot with CameraX? However much I read the OverlayEffect API documentation, I can't figure out how to use it.
Best regards
You can take a look at this WIP change for code samples: https://android-review.git.corp.google.com/c/platform/frameworks/support/+/2797834/9/camera/camera-effects/src/main/java/androidx/camera/effects/BitmapOverlayEffect.java
Otherwise you can post your detailed question on androidx-discuss@google.com and our engineers will be able to help you.
The link you sent requires a "@google.com" account, and the group you suggest I write to does not allow me to post questions.
With what little I understood from the documentation, and taking your first answer into account, I created an OverlayEffect like this:
```kotlin
val handler = Handler(Looper.getMainLooper())
val errorListener = Consumer<Throwable> { error -> println("Error: ${error.message}") }
val overlayEffect = OverlayEffect(
    CameraEffect.VIDEO_CAPTURE,
    0,
    handler,
    errorListener
)
overlayEffect.setOnDrawListener { frame ->
    val canvas = frame.overlayCanvas
    val textPaint = Paint().apply {
        color = Color.RED
        textSize = 50f
        isFakeBoldText = true
        textAlign = Paint.Align.CENTER
    }
    canvas.drawText("WATERMARK", 200f, 200f + textPaint.textSize, textPaint)
    true
}
```
But I can't figure out where I should pass the OverlayEffect so that the watermark is applied to the video.
Thank you for your response.
Sorry about the wrong link. I was having trouble with my work laptop for the past few days. This is the right one: https://android-review.googlesource.com/c/platform/frameworks/support/+/2797834
You set the effect using the [UseCaseGroup.Builder#addEffect](https://developer.android.com/reference/androidx/camera/core/UseCaseGroup.Builder#addEffect(androidx.camera.core.CameraEffect)) API, or the CameraController#setEffects API if you are using CameraController.
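For instance (a sketch; `preview`, `videoCapture`, and `overlayEffect` are assumed to have been created already, `cameraProvider` is a `ProcessCameraProvider`, and `cameraController` a `LifecycleCameraController`):

```kotlin
// Option 1: UseCaseGroup path — add the effect when binding use cases.
val useCaseGroup = UseCaseGroup.Builder()
    .addUseCase(preview)
    .addUseCase(videoCapture)
    .addEffect(overlayEffect)
    .build()
cameraProvider.bindToLifecycle(
    lifecycleOwner, CameraSelector.DEFAULT_BACK_CAMERA, useCaseGroup
)

// Option 2: CameraController path — set the effect directly.
cameraController.setEffects(setOf(overlayEffect))
```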
I applied the overlay effect based on several comments, but when I targeted both the preview and the video capture, the overlay effect was only applied to the preview. When I set only one target, the overlay effect was applied to that target without issue. How do I apply the overlay effect to both the preview and the video capture?
Overlay effect creation code:
```kotlin
val handler = Handler(Looper.getMainLooper())
val overlayEffect = OverlayEffect(
    CameraEffect.VIDEO_CAPTURE or CameraEffect.PREVIEW,
    0,
    handler
) {
    Logger.e(it, "overlayEffect error")
}
val textPaint = Paint().apply {
    color = Color.RED
    textSize = 50f
}
overlayEffect.clearOnDrawListener()
overlayEffect.setOnDrawListener {
    it.overlayCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR)
    it.overlayCanvas.drawText(
        getTimeText(),
        30f,
        30f + textPaint.textSize,
        textPaint,
    )
    true
}
```
Code to apply the overlay effect:
```kotlin
val useCaseGroupBuilder = UseCaseGroup.Builder()
    .addUseCase(videoCapture)
    .addUseCase(preview)
    .addEffect(overlayEffect)
val camera = cameraProvider.bindToLifecycle(
    lifecycleOwner,
    CameraSelector.DEFAULT_BACK_CAMERA,
    useCaseGroupBuilder.build()
)
```
Your configuration looks good. It should apply to both preview and video capture. Things to try:
1. Upgrade CameraX to the latest version.
2. Set a [viewport](https://developer.android.com/reference/androidx/camera/core/UseCaseGroup.Builder#setViewPort(androidx.camera.core.ViewPort)). I wonder if the overlay in video capture was cropped out due to transformation issues.
3. Use the CameraController API. CameraController is a high-level API that takes care of configuration such as the viewport, which makes it less error-prone.

Otherwise, if you can upload a minimal reproducible code sample to GitHub, I am happy to take a look at it.
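The viewport suggestion can be sketched as follows, assuming the `preview`, `videoCapture`, and `overlayEffect` objects from the question above; the aspect ratio and rotation values are illustrative:

```kotlin
import android.util.Rational
import android.view.Surface
import androidx.camera.core.UseCaseGroup
import androidx.camera.core.ViewPort

// Sketch: share one ViewPort across Preview and VideoCapture so both
// streams are cropped identically and the overlay is not cut off in
// one of them.
val viewPort = ViewPort.Builder(Rational(9, 16), Surface.ROTATION_0).build()
val useCaseGroup = UseCaseGroup.Builder()
    .setViewPort(viewPort)
    .addUseCase(preview)
    .addUseCase(videoCapture)
    .addEffect(overlayEffect)
    .build()
```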
Thank you for your response. I tried moving the text's output coordinates to about the center of the screen, and it looks fine. You were right that the problem was caused by cropping.
Were you able to implement it?
Hello,
I am trying to add a text overlay over a TextureView using the Camera2 API. I have been searching for the last week and so far have found nothing. Even something like this would work.
Please let me know if it is possible; any how-tos would be helpful.
Regards, Hitesh