Feature/overlays (#502)

* Overlays (#421)

* Get overlay working

* Fix overlay drawing

* Allow disabling overlay in pictures or videos

* Fix picture snapshot colors when there is an overlay

* Bug fixes

* Update example with watermark

* Fix bug

* Fix overlay orientation in pictures

* Fix overlay orientation in videos

* Fix overlay when changing preview size

* Fix bug

* Experiment

* Refactor EglViewport

* Refactor SnapshotPictureRecorder

* Use single EglViewport

* Refactor SnapshotVideoRecorder

* Bug fix

* Fix some of the requested changes

* Clean up adding a View to OverlayLayout

* Specify where to draw the overlay

* Refactor

* Remove unnecessary variable from CameraPreview

* Use mWithOverlay in SnapshotVideoRecorder

* Use multiple OverlayLayout

* Add explanation for OverlayLayoutManager

* Override removeView

* Remove DisableOverlayFor

* Reorder to overlay package

* Address issues

* Draw selectively on preview, picture or video

* Use single Overlay with three targets

* Fix picture snapshots

* Add demo app control

* Fix video snapshot rotation for Camera2

* Fix video snapshot overlay rotation for Camera2 only

* Fix tests, improve performance

* Add animating watermark

* Add tests in CameraViewTest

* Add integration tests

* Fix race condition

* Improve README

* Remove isOverlay

* Remove isOverlay from docs

* Add documentation empty page

* Add documentation links

* Add real documentation

* Remove isOverlay from attrs

* Add doc links to main README

* Fix tests and logs

* Small changes in AudioMediaEncoder

* Add changelog line
Mattia Iavarone committed via GitHub
parent 7d87d4af61
commit ea952d1497
38 changed files (lines changed):
  86  README.md
   2  cameraview/src/androidTest/java/com/otaliastudios/cameraview/BaseTest.java
  55  cameraview/src/androidTest/java/com/otaliastudios/cameraview/CameraViewTest.java
  86  cameraview/src/androidTest/java/com/otaliastudios/cameraview/engine/CameraIntegrationTest.java
 197  cameraview/src/androidTest/java/com/otaliastudios/cameraview/overlay/OverlayLayoutTest.java
   4  cameraview/src/androidTest/res/layout/not_overlay.xml
   8  cameraview/src/androidTest/res/layout/overlay.xml
  43  cameraview/src/main/java/com/otaliastudios/cameraview/CameraView.java
  14  cameraview/src/main/java/com/otaliastudios/cameraview/engine/Camera1Engine.java
  20  cameraview/src/main/java/com/otaliastudios/cameraview/engine/Camera2Engine.java
  31  cameraview/src/main/java/com/otaliastudios/cameraview/engine/CameraEngine.java
  14  cameraview/src/main/java/com/otaliastudios/cameraview/engine/offset/Angles.java
   6  cameraview/src/main/java/com/otaliastudios/cameraview/internal/egl/EglViewport.java
  13  cameraview/src/main/java/com/otaliastudios/cameraview/internal/utils/Pool.java
  33  cameraview/src/main/java/com/otaliastudios/cameraview/overlay/Overlay.java
 211  cameraview/src/main/java/com/otaliastudios/cameraview/overlay/OverlayLayout.java
  95  cameraview/src/main/java/com/otaliastudios/cameraview/picture/SnapshotGlPictureRecorder.java
  32  cameraview/src/main/java/com/otaliastudios/cameraview/preview/CameraPreview.java
   1  cameraview/src/main/java/com/otaliastudios/cameraview/preview/GlCameraPreview.java
  79  cameraview/src/main/java/com/otaliastudios/cameraview/video/SnapshotVideoRecorder.java
 139  cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/AudioMediaEncoder.java
  34  cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/MediaEncoder.java
  72  cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/MediaEncoderEngine.java
  34  cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/TextureMediaEncoder.java
   2  cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/VideoMediaEncoder.java
  51  cameraview/src/main/res/values/attrs.xml
  62  demo/src/main/java/com/otaliastudios/cameraview/demo/CameraActivity.java
 187  demo/src/main/java/com/otaliastudios/cameraview/demo/Option.java
  22  demo/src/main/java/com/otaliastudios/cameraview/demo/OptionView.java
  19  demo/src/main/res/layout/activity_camera.xml
   4  docs/_posts/2018-12-20-changelog.md
   2  docs/_posts/2018-12-20-debugging.md
   2  docs/_posts/2018-12-20-error-handling.md
   2  docs/_posts/2018-12-20-more-features.md
   2  docs/_posts/2018-12-20-previews.md
   2  docs/_posts/2018-12-20-runtime-permissions.md
  73  docs/_posts/2019-07-14-watermarks-and-overlays.md
  19  docs/index.md

@@ -26,22 +26,22 @@ compile 'com.otaliastudios:cameraview:2.0.0-beta06'
```
- Fast & reliable
-- Gestures support
-- Camera1 or Camera2 powered engine
-- Frame processing support
-- OpenGL powered preview
-- Take high-quality content with `takePicture` and `takeVideo`
-- Take super-fast snapshots with `takePictureSnapshot` and `takeVideoSnapshot`
-- Smart sizing: create a `CameraView` of any size
-- Control HDR, flash, zoom, white balance, exposure, location, grid drawing & more
+- Gestures support [[docs]](https://natario1.github.io/CameraView/docs/gestures.html)
+- Camera1 or Camera2 powered engine [[docs]](https://natario1.github.io/CameraView/docs/previews.html)
+- Frame processing support [[docs]](https://natario1.github.io/CameraView/docs/frame-processing.html)
+- Watermarks & animated overlays [[docs]](https://natario1.github.io/CameraView/docs/watermarks-and-overlays.html)
+- OpenGL powered preview [[docs]](https://natario1.github.io/CameraView/docs/previews.html)
+- Take high-quality content with `takePicture` and `takeVideo` [[docs]](https://natario1.github.io/CameraView/docs/capturing-media.html)
+- Take super-fast snapshots with `takePictureSnapshot` and `takeVideoSnapshot` [[docs]](https://natario1.github.io/CameraView/docs/capturing-media.html)
+- Smart sizing: create a `CameraView` of any size [[docs]](https://natario1.github.io/CameraView/docs/preview-size.html)
+- Control HDR, flash, zoom, white balance, exposure, location, grid drawing & more [[docs]](https://natario1.github.io/CameraView/docs/controls.html)
- Lightweight
- Works down to API level 15
- Well tested
Read the [official website](https://natario1.github.io/CameraView) for setup instructions and documentation.
-- Coming from v1? Take a look at the [migration guide](https://natario1.github.io/CameraView/extra/v1-migration-guide.html)
-- Changelog is hosted [here](https://natario1.github.io/CameraView/about/changelog.html)
+You might also be interested in [changelog](https://natario1.github.io/CameraView/about/changelog.html)
+or in the [v1 migration guide](https://natario1.github.io/CameraView/extra/v1-migration-guide.html).
<p>
<img src="docs/static/screen1.jpg" width="250" vspace="20" hspace="5">
@@ -56,6 +56,70 @@ donation or become a sponsor, in which case your company logo will immediately s
Thank you for any contribution - it is a nice reward for what has been done until now, and a
motivation boost to push the library forward.
```xml
<com.otaliastudios.cameraview.CameraView
xmlns:app="http://schemas.android.com/apk/res-auto"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
app:cameraPictureSizeMinWidth="@integer/picture_min_width"
app:cameraPictureSizeMinHeight="@integer/picture_min_height"
app:cameraPictureSizeMaxWidth="@integer/picture_max_width"
app:cameraPictureSizeMaxHeight="@integer/picture_max_height"
app:cameraPictureSizeMinArea="@integer/picture_min_area"
app:cameraPictureSizeMaxArea="@integer/picture_max_area"
app:cameraPictureSizeSmallest="false|true"
app:cameraPictureSizeBiggest="false|true"
app:cameraPictureSizeAspectRatio="@string/video_ratio"
app:cameraVideoSizeMinWidth="@integer/video_min_width"
app:cameraVideoSizeMinHeight="@integer/video_min_height"
app:cameraVideoSizeMaxWidth="@integer/video_max_width"
app:cameraVideoSizeMaxHeight="@integer/video_max_height"
app:cameraVideoSizeMinArea="@integer/video_min_area"
app:cameraVideoSizeMaxArea="@integer/video_max_area"
app:cameraVideoSizeSmallest="false|true"
app:cameraVideoSizeBiggest="false|true"
app:cameraVideoSizeAspectRatio="@string/video_ratio"
app:cameraSnapshotMaxWidth="@integer/snapshot_max_width"
app:cameraSnapshotMaxHeight="@integer/snapshot_max_height"
app:cameraVideoBitRate="@integer/video_bit_rate"
app:cameraAudioBitRate="@integer/audio_bit_rate"
app:cameraGestureTap="none|autoFocus|takePicture"
app:cameraGestureLongTap="none|autoFocus|takePicture"
app:cameraGesturePinch="none|zoom|exposureCorrection"
app:cameraGestureScrollHorizontal="none|zoom|exposureCorrection"
app:cameraGestureScrollVertical="none|zoom|exposureCorrection"
app:cameraEngine="camera1|camera2"
app:cameraPreview="glSurface|surface|texture"
app:cameraFacing="back|front"
app:cameraHdr="on|off"
app:cameraFlash="on|auto|torch|off"
app:cameraWhiteBalance="auto|cloudy|daylight|fluorescent|incandescent"
app:cameraMode="picture|video"
app:cameraAudio="on|off"
app:cameraGrid="draw3x3|draw4x4|drawPhi|off"
app:cameraGridColor="@color/grid_color"
app:cameraPlaySounds="true|false"
app:cameraVideoMaxSize="@integer/video_max_size"
app:cameraVideoMaxDuration="@integer/video_max_duration"
app:cameraVideoCodec="deviceDefault|h264|h263"
app:cameraAutoFocusResetDelay="@integer/autofocus_delay"
app:cameraAutoFocusMarker="@string/cameraview_default_autofocus_marker"
app:cameraUseDeviceOrientation="true|false"
app:cameraExperimental="false|true">
<!-- Watermark! -->
<ImageView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="bottom|end"
android:src="@drawable/watermark"
app:layout_drawOnPreview="true|false"
app:layout_drawOnPictureSnapshot="true|false"
app:layout_drawOnVideoSnapshot="true|false"/>
</com.otaliastudios.cameraview.CameraView>
```
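The same watermark can also be added at runtime. A minimal sketch, assuming the `OverlayLayout.LayoutParams` API introduced by this PR (the `watermark` drawable and the `context`/`cameraView` references are placeholders):

```java
// Sketch only: programmatic version of the XML watermark above,
// based on the OverlayLayout.LayoutParams API added in this PR.
ImageView watermark = new ImageView(context);
watermark.setImageResource(R.drawable.watermark); // placeholder resource
OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(
        ViewGroup.LayoutParams.WRAP_CONTENT,
        ViewGroup.LayoutParams.WRAP_CONTENT);
params.drawOnPreview = true;          // show in the live preview
params.drawOnPictureSnapshot = true;  // draw into picture snapshots
params.drawOnVideoSnapshot = true;    // draw into video snapshots
// CameraView.addView routes overlay params to the internal OverlayLayout.
cameraView.addView(watermark, params);
```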
## Backers
Thanks to all backers! [Become a backer.](https://opencollective.com/cameraview#backer)

@@ -33,8 +33,6 @@ import static org.mockito.Mockito.mock;
public class BaseTest {
-public static CameraLogger LOG = CameraLogger.create("Test");
private static KeyguardManager.KeyguardLock keyguardLock;
private static PowerManager.WakeLock wakeLock;

@@ -8,7 +8,12 @@ import android.location.Location;
import androidx.annotation.NonNull;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import androidx.test.filters.MediumTest;
+import android.util.AttributeSet;
+import android.view.Gravity;
+import android.view.LayoutInflater;
import android.view.MotionEvent;
+import android.view.View;
import android.view.ViewGroup;
import com.otaliastudios.cameraview.controls.Audio;
@@ -36,6 +41,7 @@ import com.otaliastudios.cameraview.internal.utils.Op;
import com.otaliastudios.cameraview.markers.AutoFocusMarker;
import com.otaliastudios.cameraview.markers.DefaultAutoFocusMarker;
import com.otaliastudios.cameraview.markers.MarkerLayout;
+import com.otaliastudios.cameraview.overlay.OverlayLayout;
import com.otaliastudios.cameraview.preview.MockCameraPreview;
import com.otaliastudios.cameraview.preview.CameraPreview;
import com.otaliastudios.cameraview.size.Size;
@@ -48,6 +54,7 @@ import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.stubbing.Answer;
+import org.w3c.dom.Attr;
import static org.junit.Assert.*;
@@ -777,5 +784,53 @@ public class CameraViewTest extends BaseTest {
//endregion
//region Overlays
@Test
public void testOverlays_generateLayoutParams() {
cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout);
LayoutInflater inflater = LayoutInflater.from(context());
View overlay = inflater.inflate(com.otaliastudios.cameraview.test.R.layout.overlay, cameraView, false);
assertTrue(overlay.getLayoutParams() instanceof OverlayLayout.LayoutParams);
verify(cameraView.mOverlayLayout, times(1)).isOverlay(any(AttributeSet.class));
verify(cameraView.mOverlayLayout, times(1)).generateLayoutParams(any(AttributeSet.class));
}
@Test
public void testOverlays_dontGenerateLayoutParams() {
cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout);
LayoutInflater inflater = LayoutInflater.from(context());
View overlay = inflater.inflate(com.otaliastudios.cameraview.test.R.layout.not_overlay, cameraView, false);
assertFalse(overlay.getLayoutParams() instanceof OverlayLayout.LayoutParams);
verify(cameraView.mOverlayLayout, times(1)).isOverlay(any(AttributeSet.class));
verify(cameraView.mOverlayLayout, never()).generateLayoutParams(any(AttributeSet.class));
}
@Test
public void testOverlays_addOverlayView() {
cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout);
View overlay = new View(context());
OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10);
int count = cameraView.getChildCount();
cameraView.addView(overlay, 0, params);
assertEquals(count, cameraView.getChildCount()); // Not added to CameraView
verify(cameraView.mOverlayLayout, times(1)).isOverlay(params);
verify(cameraView.mOverlayLayout, times(1)).addView(overlay, params);
}
@Test
public void testOverlays_dontAddOverlayView() {
cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout);
View overlay = new View(context());
ViewGroup.LayoutParams params = new ViewGroup.LayoutParams(10, 10);
int count = cameraView.getChildCount();
cameraView.addView(overlay, 0, params);
assertEquals(count + 1, cameraView.getChildCount());
verify(cameraView.mOverlayLayout, times(1)).isOverlay(params);
verify(cameraView.mOverlayLayout, never()).addView(overlay, params);
}
//endregion
// TODO: test permissions
}

@@ -2,6 +2,7 @@ package com.otaliastudios.cameraview.engine;
import android.graphics.Bitmap;
+import android.graphics.Canvas;
import android.graphics.PointF;
import android.hardware.Camera;
import android.os.Build;
@@ -9,6 +10,7 @@ import android.os.Build;
import com.otaliastudios.cameraview.BaseTest;
import com.otaliastudios.cameraview.CameraException;
import com.otaliastudios.cameraview.CameraListener;
+import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.CameraUtils;
import com.otaliastudios.cameraview.CameraView;
@@ -25,6 +27,7 @@ import com.otaliastudios.cameraview.frame.Frame;
import com.otaliastudios.cameraview.frame.FrameProcessor;
import com.otaliastudios.cameraview.internal.utils.Op;
import com.otaliastudios.cameraview.internal.utils.WorkerHandler;
+import com.otaliastudios.cameraview.overlay.Overlay;
import com.otaliastudios.cameraview.size.Size;
import androidx.annotation.NonNull;
@@ -48,14 +51,21 @@ import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertNull;
import static org.junit.Assert.assertTrue;
import static org.mockito.ArgumentMatchers.argThat;
+import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Matchers.anyBoolean;
import static org.mockito.Mockito.any;
+import static org.mockito.Mockito.atLeastOnce;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.spy;
+import static org.mockito.Mockito.times;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.when;
public abstract class CameraIntegrationTest extends BaseTest {
-private final static long DELAY = 9000;
+private final static CameraLogger LOG = CameraLogger.create(CameraIntegrationTest.class.getSimpleName());
+private final static long DELAY = 8000;
+private final static long VIDEO_DELAY = 16000;
@Rule
public ActivityTestRule<TestActivity> rule = new ActivityTestRule<>(TestActivity.class);
@@ -75,6 +85,7 @@ public abstract class CameraIntegrationTest extends BaseTest {
@Before
public void setUp() {
+LOG.e("Test started. Setting up camera.");
WorkerHandler.destroy();
ui(new Runnable() {
@@ -113,7 +124,7 @@
@After
public void tearDown() {
-camera.stopVideo();
+LOG.e("Test ended. Tearing down camera.");
camera.destroy();
WorkerHandler.destroy();
}
@@ -133,6 +144,7 @@
if (expectSuccess) {
assertNotNull("Can open", result);
// Extra wait for the bind state.
+// TODO fix this and other while {} in this class in a more elegant way.
while (controller.getBindState() != CameraEngine.STATE_STARTED) {}
} else {
assertNull("Should not open", result);
@@ -163,7 +175,7 @@
return argument.getReason() == CameraException.REASON_VIDEO_FAILED;
}
}));
-VideoResult result = video.await(DELAY);
+VideoResult result = video.await(VIDEO_DELAY);
if (expectSuccess) {
assertNotNull("Should end video", result);
} else {
@@ -219,6 +231,7 @@
}
}
+@SuppressWarnings("unused")
private void takeVideoSnapshotSync(boolean expectSuccess) {
takeVideoSnapshotSync(expectSuccess,0);
}
@@ -431,7 +444,6 @@
@Test
public void testSetAudio() {
-// TODO: when permissions are managed, check that Audio.ON triggers the audio permission
openSync(true);
Audio[] values = Audio.values();
for (Audio value : values) {
@@ -472,7 +484,7 @@
assertEquals(oldValue, camera.getPlaySounds());
}
} else {
-// TODO do when Camera2 is completed
+assertEquals(newValue, camera.getPlaySounds());
}
}
@@ -502,6 +514,14 @@
waitForVideoResult(true);
}
+@Test
+public void testStartEndVideoSnapshot() {
+// TODO should check api level for snapshot?
+openSync(true);
+takeVideoSnapshotSync(true, 4000);
+waitForVideoResult(true);
+}
@Test
public void testEndVideo_withoutStarting() {
camera.setMode(Mode.VIDEO);
@@ -519,6 +539,15 @@
waitForVideoResult(true);
}
+@Test
+public void testEndVideoSnapshot_withMaxSize() {
+// TODO
+// camera.setVideoMaxSize(3000*1000);
+// waitForOpen(true);
+// waitForVideoStart();
+// waitForVideoEnd(true);
+}
@Test
public void testEndVideo_withMaxDuration() {
camera.setMode(Mode.VIDEO);
@@ -528,6 +557,15 @@
waitForVideoResult(true);
}
+@Test
+public void testEndVideoSnapshot_withMaxDuration() {
+// TODO
+// camera.setVideoMaxDuration(4000);
+// waitForOpen(true);
+// waitForVideoStart();
+// waitForVideoEnd(true);
+}
//endregion
//region startAutoFocus
@@ -589,18 +627,22 @@
camera.takePicture();
boolean did = latch.await(4, TimeUnit.SECONDS);
assertFalse(did);
-assertEquals(latch.getCount(), 1);
+assertEquals(1, latch.getCount());
}
+@SuppressWarnings("StatementWithEmptyBody")
@Test
public void testCapturePicture_size() throws Exception {
openSync(true);
// PictureSize can still be null after opened.
+// TODO be more elegant
while (camera.getPictureSize() == null) {}
Size size = camera.getPictureSize();
camera.takePicture();
PictureResult result = waitForPictureResult(true);
+assertNotNull(result);
Bitmap bitmap = CameraUtils.decodeBitmap(result.getData(), Integer.MAX_VALUE, Integer.MAX_VALUE);
+assertNotNull(bitmap);
assertEquals(result.getSize(), size);
assertEquals(bitmap.getWidth(), size.getWidth());
assertEquals(bitmap.getHeight(), size.getHeight());
@@ -638,16 +680,20 @@
assertEquals(1, latch.getCount());
}
+@SuppressWarnings("StatementWithEmptyBody")
@Test
public void testCaptureSnapshot_size() throws Exception {
openSync(true);
// SnapshotSize can still be null after opened.
+// TODO be more elegant
while (camera.getSnapshotSize() == null) {}
Size size = camera.getSnapshotSize();
camera.takePictureSnapshot();
PictureResult result = waitForPictureResult(true);
+assertNotNull(result);
Bitmap bitmap = CameraUtils.decodeBitmap(result.getData(), Integer.MAX_VALUE, Integer.MAX_VALUE);
+assertNotNull(bitmap);
assertEquals(result.getSize(), size);
assertEquals(bitmap.getWidth(), size.getWidth());
assertEquals(bitmap.getHeight(), size.getHeight());
@@ -735,4 +781,32 @@
}
//endregion
+//region Overlays
+@Test
+public void testOverlay_forPictureSnapshot() {
+Overlay overlay = mock(Overlay.class);
+when(overlay.drawsOn(any(Overlay.Target.class))).thenReturn(true);
+controller.setOverlay(overlay);
+openSync(true);
+camera.takePictureSnapshot();
+waitForPictureResult(true);
+verify(overlay, atLeastOnce()).drawsOn(Overlay.Target.PICTURE_SNAPSHOT);
+verify(overlay, times(1)).drawOn(eq(Overlay.Target.PICTURE_SNAPSHOT), any(Canvas.class));
+}
+@Test
+public void testOverlay_forVideoSnapshot() {
+Overlay overlay = mock(Overlay.class);
+when(overlay.drawsOn(any(Overlay.Target.class))).thenReturn(true);
+controller.setOverlay(overlay);
+openSync(true);
+takeVideoSnapshotSync(true, 4000);
+waitForVideoResult(true);
+verify(overlay, atLeastOnce()).drawsOn(Overlay.Target.VIDEO_SNAPSHOT);
+verify(overlay, atLeastOnce()).drawOn(eq(Overlay.Target.VIDEO_SNAPSHOT), any(Canvas.class));
+}
+//endregion
}

@@ -0,0 +1,197 @@
package com.otaliastudios.cameraview.overlay;
import android.content.res.Resources;
import android.content.res.TypedArray;
import android.content.res.XmlResourceParser;
import android.graphics.Canvas;
import android.util.AttributeSet;
import android.util.Xml;
import android.view.Gravity;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.FrameLayout;
import androidx.annotation.NonNull;
import androidx.test.annotation.UiThreadTest;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import androidx.test.filters.SmallTest;
import com.otaliastudios.cameraview.BaseTest;
import com.otaliastudios.cameraview.R;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.ArgumentCaptor;
import org.mockito.ArgumentMatcher;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.stubbing.Answer;
import org.w3c.dom.Attr;
import org.xmlpull.v1.XmlPullParser;
import org.xmlpull.v1.XmlPullParserException;
import java.io.IOException;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertNotEquals;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertTrue;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.anyFloat;
import static org.mockito.ArgumentMatchers.anyLong;
import static org.mockito.ArgumentMatchers.argThat;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.ArgumentMatchers.notNull;
import static org.mockito.Mockito.doNothing;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.never;
import static org.mockito.Mockito.reset;
import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;
@RunWith(AndroidJUnit4.class)
@SmallTest
public class OverlayLayoutTest extends BaseTest {
private OverlayLayout overlayLayout;
@Before
public void setUp() {
overlayLayout = spy(new OverlayLayout(context()));
}
@After
public void tearDown() {
overlayLayout = null;
}
@Test
public void testIsOverlay_LayoutParams() {
ViewGroup.LayoutParams params;
params = new ViewGroup.LayoutParams(10, 10);
assertFalse(overlayLayout.isOverlay(params));
params = new OverlayLayout.LayoutParams(10, 10);
assertTrue(overlayLayout.isOverlay(params));
}
@Test
public void testIsOverlay_attributeSet() throws Exception {
int layout1 = com.otaliastudios.cameraview.test.R.layout.overlay;
int layout2 = com.otaliastudios.cameraview.test.R.layout.not_overlay;
AttributeSet set1 = getAttributeSet(layout1);
assertTrue(overlayLayout.isOverlay(set1));
AttributeSet set2 = getAttributeSet(layout2);
assertFalse(overlayLayout.isOverlay(set2));
}
@NonNull
private AttributeSet getAttributeSet(int layout) throws Exception {
// Get the attribute set in the correct state: use a parser and move to START_TAG
XmlResourceParser parser = context().getResources().getLayout(layout);
//noinspection StatementWithEmptyBody
while (parser.next() != XmlResourceParser.START_TAG) {}
return Xml.asAttributeSet(parser);
}
@Test
public void testLayoutParams_drawsOn() {
OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10);
assertFalse(params.drawsOn(Overlay.Target.PREVIEW));
assertFalse(params.drawsOn(Overlay.Target.PICTURE_SNAPSHOT));
assertFalse(params.drawsOn(Overlay.Target.VIDEO_SNAPSHOT));
params.drawOnPreview = true;
assertTrue(params.drawsOn(Overlay.Target.PREVIEW));
params.drawOnPictureSnapshot = true;
assertTrue(params.drawsOn(Overlay.Target.PICTURE_SNAPSHOT));
params.drawOnVideoSnapshot = true;
assertTrue(params.drawsOn(Overlay.Target.VIDEO_SNAPSHOT));
}
@Test
public void testLayoutParams_toString() {
OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10);
String string = params.toString();
assertTrue(string.contains("drawOnPreview"));
assertTrue(string.contains("drawOnPictureSnapshot"));
assertTrue(string.contains("drawOnVideoSnapshot"));
}
@Test
public void testDrawChild() {
Canvas canvas = new Canvas();
OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10);
View child = new View(context());
child.setLayoutParams(params);
when(overlayLayout.doDrawChild(canvas, child, 0)).thenReturn(true);
overlayLayout.currentTarget = Overlay.Target.PREVIEW;
assertFalse(overlayLayout.drawChild(canvas, child, 0));
params.drawOnPreview = true;
assertTrue(overlayLayout.drawChild(canvas, child, 0));
overlayLayout.currentTarget = Overlay.Target.PICTURE_SNAPSHOT;
assertFalse(overlayLayout.drawChild(canvas, child, 0));
params.drawOnPictureSnapshot = true;
assertTrue(overlayLayout.drawChild(canvas, child, 0));
overlayLayout.currentTarget = Overlay.Target.VIDEO_SNAPSHOT;
assertFalse(overlayLayout.drawChild(canvas, child, 0));
params.drawOnVideoSnapshot = true;
assertTrue(overlayLayout.drawChild(canvas, child, 0));
}
@UiThreadTest
@Test
public void testDraw() {
Canvas canvas = new Canvas();
when(overlayLayout.drawsOn(Overlay.Target.PREVIEW)).thenReturn(false);
overlayLayout.draw(canvas);
verify(overlayLayout, never()).drawOn(Overlay.Target.PREVIEW, canvas);
when(overlayLayout.drawsOn(Overlay.Target.PREVIEW)).thenReturn(true);
overlayLayout.draw(canvas);
verify(overlayLayout, times(1)).drawOn(Overlay.Target.PREVIEW, canvas);
}
@UiThreadTest
@Test
public void testDrawOn() {
Canvas canvas = spy(new Canvas());
View child = new View(context());
OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10);
params.drawOnPreview = true;
params.drawOnPictureSnapshot = true;
params.drawOnVideoSnapshot = true;
overlayLayout.addView(child, params);
overlayLayout.drawOn(Overlay.Target.PREVIEW, canvas);
verify(canvas, never()).scale(anyFloat(), anyFloat());
verify(overlayLayout, times(1)).doDrawChild(eq(canvas), eq(child), anyLong());
reset(canvas);
reset(overlayLayout);
overlayLayout.drawOn(Overlay.Target.PICTURE_SNAPSHOT, canvas);
verify(canvas, times(1)).scale(anyFloat(), anyFloat());
verify(overlayLayout, times(1)).doDrawChild(eq(canvas), eq(child), anyLong());
reset(canvas);
reset(overlayLayout);
overlayLayout.drawOn(Overlay.Target.VIDEO_SNAPSHOT, canvas);
verify(canvas, times(1)).scale(anyFloat(), anyFloat());
verify(overlayLayout, times(1)).doDrawChild(eq(canvas), eq(child), anyLong());
reset(canvas);
reset(overlayLayout);
}
}

@@ -0,0 +1,4 @@
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"/>

@@ -0,0 +1,8 @@
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
app:layout_drawOnPreview="true"
app:layout_drawOnPictureSnapshot="true"
app:layout_drawOnVideoSnapshot="true"
android:layout_width="match_parent"
android:layout_height="match_parent"/>

@@ -28,6 +28,7 @@ import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import android.util.AttributeSet;
import android.view.MotionEvent;
+import android.view.View;
import android.view.ViewGroup;
import android.widget.FrameLayout;
@@ -62,6 +63,7 @@ import com.otaliastudios.cameraview.internal.utils.CropHelper;
import com.otaliastudios.cameraview.internal.utils.OrientationHelper;
import com.otaliastudios.cameraview.internal.utils.WorkerHandler;
import com.otaliastudios.cameraview.markers.MarkerParser;
+import com.otaliastudios.cameraview.overlay.OverlayLayout;
import com.otaliastudios.cameraview.preview.CameraPreview;
import com.otaliastudios.cameraview.preview.GlCameraPreview;
import com.otaliastudios.cameraview.preview.SurfaceCameraPreview;
@@ -125,12 +127,15 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
@VisibleForTesting ScrollGestureFinder mScrollGestureFinder;
// Views
-GridLinesLayout mGridLinesLayout;
-MarkerLayout mMarkerLayout;
+@VisibleForTesting GridLinesLayout mGridLinesLayout;
+@VisibleForTesting MarkerLayout mMarkerLayout;
private boolean mKeepScreenOn;
@SuppressWarnings({"FieldCanBeLocal", "unused"})
private boolean mExperimental;
+// Overlays
+@VisibleForTesting OverlayLayout mOverlayLayout;
// Threading
private Handler mUiHandler;
private WorkerHandler mFrameProcessorsHandler;
@@ -187,9 +192,11 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
// Views
mGridLinesLayout = new GridLinesLayout(context);
+mOverlayLayout = new OverlayLayout(context);
mMarkerLayout = new MarkerLayout(context);
addView(mGridLinesLayout);
addView(mMarkerLayout);
+addView(mOverlayLayout);
// Create the engine
doInstantiateEngine();
@@ -237,7 +244,10 @@
* {@link #setEngine(Engine)} is called.
*/
private void doInstantiateEngine() {
+LOG.w("doInstantiateEngine:", "instantiating. engine:", mEngine);
mCameraEngine = instantiateCameraEngine(mEngine, mCameraCallbacks);
+LOG.w("doInstantiateEngine:", "instantiated. engine:", mCameraEngine.getClass().getSimpleName());
+mCameraEngine.setOverlay(mOverlayLayout);
}
/**
@@ -247,7 +257,9 @@
*/
@VisibleForTesting
void doInstantiatePreview() {
+LOG.w("doInstantiateEngine:", "instantiating. preview:", mPreview);
mCameraPreview = instantiatePreview(mPreview, getContext(), this);
+LOG.w("doInstantiateEngine:", "instantiated. preview:", mCameraPreview.getClass().getSimpleName());
mCameraEngine.setPreview(mCameraPreview);
}
@@ -279,7 +291,6 @@
*/
@NonNull
protected CameraPreview instantiatePreview(@NonNull Preview preview, @NonNull Context context, @NonNull ViewGroup container) {
-LOG.w("preview:", "isHardwareAccelerated:", isHardwareAccelerated());
switch (preview) {
case SURFACE:
return new SurfaceCameraPreview(context, container);
@@ -300,6 +311,7 @@
protected void onAttachedToWindow() {
super.onAttachedToWindow();
if (mCameraPreview == null) {
// isHardwareAccelerated will return the real value only after we are
// attached. That's why we instantiate the preview here.
doInstantiatePreview();
@@ -384,7 +396,7 @@
// other than respect it. The preview will eventually be cropped at the sides (by PreviewImpl scaling)
// except the case in which these fixed dimensions manage to fit exactly the preview aspect ratio.
if (widthMode == EXACTLY && heightMode == EXACTLY) {
-LOG.w("onMeasure:", "both are MATCH_PARENT or fixed value. We adapt.",
+LOG.i("onMeasure:", "both are MATCH_PARENT or fixed value. We adapt.",
"This means CROP_CENTER.", "(" + widthValue + "x" + heightValue + ")");
super.onMeasure(widthMeasureSpec, heightMeasureSpec);
return;
@@ -1218,6 +1230,7 @@
mCameraEngine.setAutoFocusResetDelay(delayMillis);
}
/**
* Returns the current delay in milliseconds to reset the focus after an autofocus process.
* @return the current autofocus reset delay in milliseconds.
@@ -2078,4 +2091,26 @@
}
//endregion
//region Overlays
@Override
public LayoutParams generateLayoutParams(AttributeSet attributeSet) {
if (mOverlayLayout.isOverlay(attributeSet)) {
return mOverlayLayout.generateLayoutParams(attributeSet);
}
return super.generateLayoutParams(attributeSet);
}
// We don't support removeView on overlays for now.
@Override
public void addView(View child, int index, ViewGroup.LayoutParams params) {
if (mOverlayLayout.isOverlay(params)) {
mOverlayLayout.addView(child, params);
} else {
super.addView(child, index, params);
}
}
//endregion
}

@@ -312,7 +312,7 @@
AspectRatio outputRatio = getAngles().flip(Reference.OUTPUT, Reference.VIEW) ? viewAspectRatio.flip() : viewAspectRatio;
if (mPreview instanceof GlCameraPreview && Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
-mPictureRecorder = new SnapshotGlPictureRecorder(stub, this, (GlCameraPreview) mPreview, outputRatio);
+mPictureRecorder = new SnapshotGlPictureRecorder(stub, this, (GlCameraPreview) mPreview, outputRatio, getOverlay());
} else {
mPictureRecorder = new Snapshot1PictureRecorder(stub, this, mCamera, outputRatio);
}
@@ -359,10 +359,20 @@
Rect outputCrop = CropHelper.computeCrop(outputSize, outputRatio);
outputSize = new Size(outputCrop.width(), outputCrop.height());
stub.size = outputSize;
+// Vertical: 0 (270-0-0)
+// Left (unlocked): 0 (270-90-270)
+// Right (unlocked): 0 (270-270-90)
+// Upside down (unlocked): 0 (270-180-180)
+// Left (locked): 270 (270-0-270)
+// Right (locked): 90 (270-0-90)
+// Upside down (locked): 180 (270-0-180)
+// The correct formula seems to be deviceOrientation+displayOffset,
+// which means offset(Reference.VIEW, Reference.OUTPUT, Axis.ABSOLUTE).
stub.rotation = getAngles().offset(Reference.VIEW, Reference.OUTPUT, Axis.ABSOLUTE);
+LOG.i("onTakeVideoSnapshot", "rotation:", stub.rotation, "size:", stub.size);
// Start.
-mVideoRecorder = new SnapshotVideoRecorder(Camera1Engine.this, glPreview);
+mVideoRecorder = new SnapshotVideoRecorder(Camera1Engine.this, glPreview, getOverlay(), stub.rotation);
mVideoRecorder.start(stub);
}

@@ -615,7 +615,7 @@
stub.rotation = getAngles().offset(Reference.SENSOR, Reference.OUTPUT, Axis.RELATIVE_TO_SENSOR); // Actually it will be rotated and set to 0.
AspectRatio outputRatio = getAngles().flip(Reference.OUTPUT, Reference.VIEW) ? viewAspectRatio.flip() : viewAspectRatio;
if (mPreview instanceof GlCameraPreview) {
-mPictureRecorder = new SnapshotGlPictureRecorder(stub, this, (GlCameraPreview) mPreview, outputRatio);
+mPictureRecorder = new SnapshotGlPictureRecorder(stub, this, (GlCameraPreview) mPreview, outputRatio, getOverlay());
} else {
throw new RuntimeException("takePictureSnapshot with Camera2 is only supported with Preview.GL_SURFACE");
}
@@ -670,7 +670,7 @@
private void doTakeVideo(@NonNull final VideoResult.Stub stub) {
if (!(mVideoRecorder instanceof Full2VideoRecorder)) {
-mVideoRecorder = new Full2VideoRecorder(this, mCameraId);
+throw new IllegalStateException("doTakeVideo called, but video recorder is not a Full2VideoRecorder! " + mVideoRecorder);
}
Full2VideoRecorder recorder = (Full2VideoRecorder) mVideoRecorder;
try {
@@ -706,10 +706,22 @@
Rect outputCrop = CropHelper.computeCrop(outputSize, outputRatio);
outputSize = new Size(outputCrop.width(), outputCrop.height());
stub.size = outputSize;
-stub.rotation = getAngles().offset(Reference.VIEW, Reference.OUTPUT, Axis.ABSOLUTE);
+// Vertical: 0 (270-0-0)
+// Left (unlocked): 270 (270-90-270)
+// Right (unlocked): 90 (270-270-90)
+// Upside down (unlocked): 180 (270-180-180)
+// Left (locked): 270 (270-0-270)
+// Right (locked): 90 (270-0-90)
+// Upside down (locked): 180 (270-0-180)
+// Unlike Camera1, the correct formula seems to be deviceOrientation,
+// which means offset(Reference.BASE, Reference.OUTPUT, Axis.ABSOLUTE).
+stub.rotation = getAngles().offset(Reference.BASE, Reference.OUTPUT, Axis.ABSOLUTE);
+LOG.i("onTakeVideoSnapshot", "rotation:", stub.rotation, "size:", stub.size);
// Start.
-mVideoRecorder = new SnapshotVideoRecorder(this, glPreview);
+// The overlay rotation should always be VIEW-OUTPUT, just like Camera1Engine.
+int overlayRotation = getAngles().offset(Reference.VIEW, Reference.OUTPUT, Axis.ABSOLUTE);
+mVideoRecorder = new SnapshotVideoRecorder(this, glPreview, getOverlay(), overlayRotation);
mVideoRecorder.start(stub);
}

@@ -18,6 +18,7 @@ import com.otaliastudios.cameraview.CameraException;
import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.PictureResult;
+import com.otaliastudios.cameraview.overlay.Overlay;
import com.otaliastudios.cameraview.VideoResult;
import com.otaliastudios.cameraview.engine.offset.Angles;
import com.otaliastudios.cameraview.engine.offset.Reference;
@@ -184,6 +185,7 @@
private long mAutoFocusResetDelayMillis;
private int mSnapshotMaxWidth = Integer.MAX_VALUE; // in REF_VIEW for consistency with SizeSelectors
private int mSnapshotMaxHeight = Integer.MAX_VALUE; // in REF_VIEW for consistency with SizeSelectors
+private Overlay overlay;
// Steps
private final Step.Callback mStepCallback = new Step.Callback() {
@@ -781,6 +783,15 @@
//region Final setters and getters
+public final void setOverlay(@Nullable Overlay overlay) {
+this.overlay = overlay;
+}
+@Nullable
+public final Overlay getOverlay() {
+return overlay;
+}
@SuppressWarnings("WeakerAccess")
public final Angles getAngles() {
return mAngles;
@@ -1218,27 +1229,31 @@
@Nullable
public final Size getPictureSize(@SuppressWarnings("SameParameterValue") @NonNull Reference reference) {
-if (mCaptureSize == null || mMode == Mode.VIDEO) return null;
-return getAngles().flip(Reference.SENSOR, reference) ? mCaptureSize.flip() : mCaptureSize;
+Size size = mCaptureSize;
+if (size == null || mMode == Mode.VIDEO) return null;
+return getAngles().flip(Reference.SENSOR, reference) ? size.flip() : size;
}
@Nullable
public final Size getVideoSize(@SuppressWarnings("SameParameterValue") @NonNull Reference reference) {
-if (mCaptureSize == null || mMode == Mode.PICTURE) return null;
-return getAngles().flip(Reference.SENSOR, reference) ? mCaptureSize.flip() : mCaptureSize;
+Size size = mCaptureSize;
+if (size == null || mMode == Mode.PICTURE) return null;
+return getAngles().flip(Reference.SENSOR, reference) ? size.flip() : size;
}
@Nullable
public final Size getPreviewStreamSize(@NonNull Reference reference) {
-if (mPreviewStreamSize == null) return null;
-return getAngles().flip(Reference.SENSOR, reference) ? mPreviewStreamSize.flip() : mPreviewStreamSize;
+Size size = mPreviewStreamSize;
+if (size == null) return null;
+return getAngles().flip(Reference.SENSOR, reference) ? size.flip() : size;
}
@SuppressWarnings("SameParameterValue")
@Nullable
private Size getPreviewSurfaceSize(@NonNull Reference reference) {
-if (mPreview == null) return null;
-return getAngles().flip(Reference.VIEW, reference) ? mPreview.getSurfaceSize().flip() : mPreview.getSurfaceSize();
+CameraPreview preview = mPreview;
+if (preview == null) return null;
+return getAngles().flip(Reference.VIEW, reference) ? preview.getSurfaceSize().flip() : preview.getSurfaceSize();
}
/**

@@ -3,6 +3,7 @@ package com.otaliastudios.cameraview.engine.offset;
import androidx.annotation.NonNull;
import androidx.annotation.VisibleForTesting;
+import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.controls.Facing;
/**
@@ -21,6 +22,9 @@ import com.otaliastudios.cameraview.controls.Facing;
*/
public class Angles {
+private final static String TAG = Angles.class.getSimpleName();
+private final static CameraLogger LOG = CameraLogger.create(TAG);
private Facing mSensorFacing;
@VisibleForTesting int mSensorOffset = 0;
@VisibleForTesting int mDisplayOffset = 0;
@@ -40,6 +44,7 @@
if (mSensorFacing == Facing.FRONT) {
mSensorOffset = sanitizeOutput(360 - mSensorOffset);
}
+print();
}
/**
@@ -49,6 +54,7 @@
public void setDisplayOffset(int displayOffset) {
sanitizeInput(displayOffset);
mDisplayOffset = displayOffset;
+print();
}
/**
@@ -58,6 +64,14 @@
public void setDeviceOrientation(int deviceOrientation) {
sanitizeInput(deviceOrientation);
mDeviceOrientation = deviceOrientation;
+print();
+}
+private void print() {
+LOG.i("Angles changed:",
+"sensorOffset:", mSensorOffset,
+"displayOffset:", mDisplayOffset,
+"deviceOrientation:", mDeviceOrientation);
}
/**

@@ -145,6 +145,12 @@
GLES20.glUseProgram(mProgramHandle);
check("glUseProgram");
+// enable blending, from: http://www.learnopengles.com/android-lesson-five-an-introduction-to-blending/
+GLES20.glDisable(GLES20.GL_CULL_FACE);
+GLES20.glDisable(GLES20.GL_DEPTH_TEST);
+GLES20.glEnable(GLES20.GL_BLEND);
+GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
// Set the texture.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(mTextureTarget, textureId);

@@ -63,17 +63,17 @@
T item = mQueue.poll();
if (item != null) {
activeCount++; // poll decreases, this fixes
-LOG.v("GET: Reusing recycled item.", this);
+LOG.v("GET - Reusing recycled item.", this);
return item;
}
if (isEmpty()) {
-LOG.v("GET: Returning null. Too much items requested.", this);
+LOG.v("GET - Returning null. Too much items requested.", this);
return null;
}
activeCount++;
-LOG.v("GET: Creating a new item.", this);
+LOG.v("GET - Creating a new item.", this);
return factory.create();
}
@@ -84,7 +84,7 @@
* @param item used item
*/
public void recycle(@NonNull T item) {
-LOG.v("RECYCLE: Recycling item.", this);
+LOG.v("RECYCLE - Recycling item.", this);
if (--activeCount < 0) {
throw new IllegalStateException("Trying to recycle an item which makes activeCount < 0." +
"This means that this or some previous items being recycled were not coming from " +
@@ -112,6 +112,7 @@
*
* @return count
*/
+@SuppressWarnings("WeakerAccess")
public final int count() {
return activeCount() + recycledCount();
}
@@ -122,6 +123,7 @@
*
* @return active count
*/
+@SuppressWarnings("WeakerAccess")
public final int activeCount() {
return activeCount;
}
@@ -133,6 +135,7 @@
*
* @return recycled count
*/
+@SuppressWarnings("WeakerAccess")
public final int recycledCount() {
return mQueue.size();
}
@@ -140,6 +143,6 @@
@NonNull
@Override
public String toString() {
-return getClass().getSimpleName() + " -- count:" + count() + ", active:" + activeCount() + ", recycled:" + recycledCount();
+return getClass().getSimpleName() + " - count:" + count() + ", active:" + activeCount() + ", recycled:" + recycledCount();
}
}

@@ -0,0 +1,33 @@
package com.otaliastudios.cameraview.overlay;
import android.graphics.Canvas;
import androidx.annotation.NonNull;
/**
* Base interface for overlays.
*/
public interface Overlay {
enum Target {
PREVIEW, PICTURE_SNAPSHOT, VIDEO_SNAPSHOT
}
/**
* Called for this overlay to draw itself on the specified target and canvas.
*
* @param target target
* @param canvas target canvas
*/
void drawOn(@NonNull Target target, @NonNull Canvas canvas);
/**
* Called to understand if this overlay would like to draw onto the given
* target or not. If true is returned, {@link #drawOn(Target, Canvas)} can be
* called at a future time.
*
* @param target the target
* @return true to draw on it
*/
boolean drawsOn(@NonNull Target target);
}
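Outside of the View system, the interface can also be implemented directly. A hypothetical sketch (not part of this PR) that stamps the current date on all targets; an instance could be handed to the engine through `setOverlay()`, as the integration tests above do:

```java
// Hypothetical Overlay implementation, for illustration only.
public class TimestampOverlay implements Overlay {

    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);

    public TimestampOverlay() {
        paint.setColor(Color.WHITE);
        paint.setTextSize(48f);
    }

    @Override
    public void drawOn(@NonNull Target target, @NonNull Canvas canvas) {
        // The caller passes a canvas already matching the target surface.
        canvas.drawText(new java.util.Date().toString(), 16f, 64f, paint);
    }

    @Override
    public boolean drawsOn(@NonNull Target target) {
        return true; // draw on preview, picture snapshots and video snapshots
    }
}
```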

@@ -0,0 +1,211 @@
package com.otaliastudios.cameraview.overlay;
import android.annotation.SuppressLint;
import android.content.Context;
import android.content.res.TypedArray;
import android.graphics.Canvas;
import android.util.AttributeSet;
import android.view.View;
import android.view.ViewGroup;
import android.widget.FrameLayout;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.annotation.VisibleForTesting;
import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.R;
@SuppressLint("CustomViewStyleable")
public class OverlayLayout extends FrameLayout implements Overlay {
private static final String TAG = OverlayLayout.class.getSimpleName();
private static final CameraLogger LOG = CameraLogger.create(TAG);
@VisibleForTesting Target currentTarget = Target.PREVIEW;
/**
* We set {@link #setWillNotDraw(boolean)} to false even if we don't draw anything.
* This ensures that the View system will call {@link #draw(Canvas)} on us instead
* of short-circuiting to {@link #dispatchDraw(Canvas)}.
*
* That would be a problem for us since we use {@link #draw(Canvas)} to understand if
* we are currently drawing on the preview or not.
*
* @param context a context
*/
public OverlayLayout(@NonNull Context context) {
super(context);
setWillNotDraw(false);
}
/**
* Returns true if this {@link AttributeSet} belongs to an overlay.
* @param set an attribute set
* @return true if overlay
*/
public boolean isOverlay(@Nullable AttributeSet set) {
if (set == null) return false;
TypedArray a = getContext().obtainStyledAttributes(set, R.styleable.CameraView_Layout);
boolean isOverlay =
a.hasValue(R.styleable.CameraView_Layout_layout_drawOnPreview)
|| a.hasValue(R.styleable.CameraView_Layout_layout_drawOnPictureSnapshot)
|| a.hasValue(R.styleable.CameraView_Layout_layout_drawOnVideoSnapshot);
a.recycle();
return isOverlay;
}
/**
* Returns true if this {@link ViewGroup.LayoutParams} belongs to an overlay.
* @param params a layout params
* @return true if overlay
*/
public boolean isOverlay(@NonNull ViewGroup.LayoutParams params) {
return params instanceof LayoutParams;
}
/**
* Generates our own overlay layout params.
* @param attrs input attrs
* @return our params
*/
@Override
public OverlayLayout.LayoutParams generateLayoutParams(AttributeSet attrs) {
return new LayoutParams(getContext(), attrs);
}
/**
* This is called by the View hierarchy, so at this point we are
* likely drawing on the preview.
* @param canvas View canvas
*/
@SuppressLint("MissingSuperCall")
@Override
public void draw(Canvas canvas) {
LOG.i("normal draw called.");
if (drawsOn(Target.PREVIEW)) {
drawOn(Target.PREVIEW, canvas);
}
}
@Override
public boolean drawsOn(@NonNull Target target) {
for (int i = 0; i < getChildCount(); i++) {
LayoutParams params = (LayoutParams) getChildAt(i).getLayoutParams();
if (params.drawsOn(target)) return true;
}
return false;
}
/**
* For {@link Target#PREVIEW}, this method is called by the View hierarchy. We will
* just forward the call to super.
*
* For {@link Target#PICTURE_SNAPSHOT} and {@link Target#VIDEO_SNAPSHOT},
* this method is called by the overlay drawer. We call {@link #dispatchDraw(Canvas)}
* to draw our children only.
*
* @param target the draw target
* @param canvas the canvas
*/
@Override
public void drawOn(@NonNull Target target, @NonNull Canvas canvas) {
synchronized (this) {
currentTarget = target;
switch (target) {
case PREVIEW:
super.draw(canvas);
break;
case VIDEO_SNAPSHOT:
case PICTURE_SNAPSHOT:
canvas.save();
// The input canvas size is that of the preview stream, cropped to match
// the view aspect ratio (this op is done by picture & video recorder).
// So the aspect ratio is guaranteed to be the same, but we might have
// to apply some scale (typically > 1).
float widthScale = canvas.getWidth() / (float) getWidth();
float heightScale = canvas.getHeight() / (float) getHeight();
LOG.i("draw",
"target:", target,
"canvas:", canvas.getWidth() + "x" + canvas.getHeight(),
"view:", getWidth() + "x" + getHeight(),
"widthScale:", widthScale,
"heightScale:", heightScale
);
canvas.scale(widthScale, heightScale);
dispatchDraw(canvas);
canvas.restore();
break;
}
}
}
/**
* We end up here in all three cases, and should filter out
* views that are not meant to be drawn on that specific surface.
*/
@Override
protected boolean drawChild(Canvas canvas, View child, long drawingTime) {
LayoutParams params = (LayoutParams) child.getLayoutParams();
if (params.drawsOn(currentTarget)) {
LOG.v("Performing drawing for view:", child.getClass().getSimpleName(),
"target:", currentTarget,
"params:", params);
return doDrawChild(canvas, child, drawingTime);
} else {
LOG.v("Skipping drawing for view:", child.getClass().getSimpleName(),
"target:", currentTarget,
"params:", params);
return false;
}
}
@VisibleForTesting
boolean doDrawChild(Canvas canvas, View child, long drawingTime) {
return super.drawChild(canvas, child, drawingTime);
}
@SuppressWarnings("WeakerAccess")
public static class LayoutParams extends FrameLayout.LayoutParams {
@SuppressWarnings("unused")
public boolean drawOnPreview = false;
public boolean drawOnPictureSnapshot = false;
public boolean drawOnVideoSnapshot = false;
public LayoutParams(int width, int height) {
super(width, height);
}
public LayoutParams(@NonNull Context context, @NonNull AttributeSet attrs) {
super(context, attrs);
TypedArray a = context.obtainStyledAttributes(attrs, R.styleable.CameraView_Layout);
try {
drawOnPreview = a.getBoolean(R.styleable.CameraView_Layout_layout_drawOnPreview, false);
drawOnPictureSnapshot = a.getBoolean(R.styleable.CameraView_Layout_layout_drawOnPictureSnapshot, false);
drawOnVideoSnapshot = a.getBoolean(R.styleable.CameraView_Layout_layout_drawOnVideoSnapshot, false);
} finally {
a.recycle();
}
}
@VisibleForTesting
boolean drawsOn(@NonNull Target target) {
return ((target == Target.PREVIEW && drawOnPreview)
|| (target == Target.VIDEO_SNAPSHOT && drawOnVideoSnapshot)
|| (target == Target.PICTURE_SNAPSHOT && drawOnPictureSnapshot));
}
@NonNull
@Override
public String toString() {
return getClass().getName() + "["
+ "drawOnPreview:" + drawOnPreview
+ ",drawOnPictureSnapshot:" + drawOnPictureSnapshot
+ ",drawOnVideoSnapshot:" + drawOnVideoSnapshot
+ "]";
}
}
}
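A short usage sketch of the LayoutParams above, adding an overlay child in code (overlayLayout and context are assumed to be in scope; the ImageView is just an example, any View works):

    OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(
            ViewGroup.LayoutParams.WRAP_CONTENT,
            ViewGroup.LayoutParams.WRAP_CONTENT);
    params.drawOnPreview = true;          // visible in the live preview
    params.drawOnPictureSnapshot = true;  // stamped into picture snapshots
    params.drawOnVideoSnapshot = false;   // left out of video snapshots

    ImageView watermark = new ImageView(context);
    overlayLayout.addView(watermark, params);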

@@ -2,6 +2,9 @@ package com.otaliastudios.cameraview.picture;

import android.annotation.TargetApi;
import android.graphics.Bitmap;
+import android.graphics.Canvas;
+import android.graphics.Color;
+import android.graphics.PorterDuff;
import android.graphics.Rect;
import android.graphics.SurfaceTexture;
import android.opengl.EGL14;
@@ -11,8 +14,10 @@ import android.os.Build;

import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.PictureResult;
+import com.otaliastudios.cameraview.overlay.Overlay;
import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.engine.CameraEngine;
+import com.otaliastudios.cameraview.engine.offset.Axis;
import com.otaliastudios.cameraview.engine.offset.Reference;
import com.otaliastudios.cameraview.internal.egl.EglCore;
import com.otaliastudios.cameraview.internal.egl.EglViewport;
@@ -26,7 +31,9 @@ import com.otaliastudios.cameraview.size.AspectRatio;
import com.otaliastudios.cameraview.size.Size;

import androidx.annotation.NonNull;
+import androidx.annotation.Nullable;
+import android.view.Surface;

public class SnapshotGlPictureRecorder extends PictureRecorder {
@@ -37,15 +44,21 @@ public class SnapshotGlPictureRecorder extends PictureRecorder {
    private GlCameraPreview mPreview;
    private AspectRatio mOutputRatio;

+   private Overlay mOverlay;
+   private boolean mHasOverlay;

    public SnapshotGlPictureRecorder(
            @NonNull PictureResult.Stub stub,
            @NonNull CameraEngine engine,
            @NonNull GlCameraPreview preview,
-           @NonNull AspectRatio outputRatio) {
+           @NonNull AspectRatio outputRatio,
+           @Nullable Overlay overlay) {
        super(stub, engine);
        mEngine = engine;
        mPreview = preview;
        mOutputRatio = outputRatio;
+       mOverlay = overlay;
+       mHasOverlay = overlay != null && overlay.drawsOn(Overlay.Target.PICTURE_SNAPSHOT);
    }

    @TargetApi(Build.VERSION_CODES.KITKAT)
@@ -57,15 +70,31 @@ public class SnapshotGlPictureRecorder extends PictureRecorder {
        SurfaceTexture mSurfaceTexture;
        float[] mTransform;

+       int mOverlayTextureId = 0;
+       SurfaceTexture mOverlaySurfaceTexture;
+       Surface mOverlaySurface;
+       float[] mOverlayTransform;
+
+       EglViewport mViewport;

        @RendererThread
        public void onRendererTextureCreated(int textureId) {
            mTextureId = textureId;
+           mViewport = new EglViewport();
            mSurfaceTexture = new SurfaceTexture(mTextureId, true);
            // Need to crop the size.
            Rect crop = CropHelper.computeCrop(mResult.size, mOutputRatio);
            mResult.size = new Size(crop.width(), crop.height());
            mSurfaceTexture.setDefaultBufferSize(mResult.size.getWidth(), mResult.size.getHeight());
            mTransform = new float[16];
+           if (mHasOverlay) {
+               mOverlayTextureId = mViewport.createTexture();
+               mOverlaySurfaceTexture = new SurfaceTexture(mOverlayTextureId, true);
+               mOverlaySurfaceTexture.setDefaultBufferSize(mResult.size.getWidth(), mResult.size.getHeight());
+               mOverlaySurface = new Surface(mOverlaySurfaceTexture);
+               mOverlayTransform = new float[16];
+           }
        }

        @RendererThread
@@ -97,14 +126,14 @@ public class SnapshotGlPictureRecorder extends PictureRecorder {
        WorkerHandler.execute(new Runnable() {
            @Override
            public void run() {
+               // 1. Get latest texture
                EglWindowSurface surface = new EglWindowSurface(core, mSurfaceTexture);
                surface.makeCurrent();
-               EglViewport viewport = new EglViewport();
                mSurfaceTexture.updateTexImage();
                mSurfaceTexture.getTransformMatrix(mTransform);

-               // Apply scale and crop:
-               // NOTE: scaleX and scaleY are in REF_VIEW, while our input appears to be in REF_SENSOR.
+               // 2. Apply scale and crop:
+               // scaleX and scaleY are in REF_VIEW, while our input appears to be in REF_SENSOR.
                boolean flip = mEngine.getAngles().flip(Reference.VIEW, Reference.SENSOR);
                float realScaleX = flip ? scaleY : scaleX;
                float realScaleY = flip ? scaleX : scaleY;
@@ -113,38 +142,62 @@ public class SnapshotGlPictureRecorder extends PictureRecorder {
                Matrix.translateM(mTransform, 0, scaleTranslX, scaleTranslY, 0);
                Matrix.scaleM(mTransform, 0, realScaleX, realScaleY, 1);

-               // Fix rotation:
-               // Not sure why we need the minus here... It makes no sense to me.
-               LOG.w("Recording frame. Rotation:", mResult.rotation, "Actual:", -mResult.rotation);
-               int rotation = -mResult.rotation;
-               mResult.rotation = 0;
-
-               // Go back to 0,0 so that rotate and flip work well.
+               // 3. Go back to 0,0 so that rotate and flip work well.
                Matrix.translateM(mTransform, 0, 0.5F, 0.5F, 0);

-               // Apply rotation:
-               Matrix.rotateM(mTransform, 0, rotation, 0, 0, 1);
+               // 4. Apply rotation:
+               // Not sure why we need the minus here.
+               Matrix.rotateM(mTransform, 0, -mResult.rotation, 0, 0, 1);
+               mResult.rotation = 0;

-               // Flip horizontally for front camera:
+               // 5. Flip horizontally for front camera:
                if (mResult.facing == Facing.FRONT) {
                    Matrix.scaleM(mTransform, 0, -1, 1, 1);
                }

-               // Go back to old position.
+               // 6. Go back to old position.
                Matrix.translateM(mTransform, 0, -0.5F, -0.5F, 0);

-               // Future note: passing scale values to the viewport?
-               // They are simply realScaleX and realScaleY.
-               viewport.drawFrame(mTextureId, mTransform);
+               // 7. Do pretty much the same for overlays, though with
+               // some differences.
+               if (mHasOverlay) {
+                   // 1. First we must draw on the texture and get latest image.
+                   try {
+                       final Canvas surfaceCanvas = mOverlaySurface.lockCanvas(null);
+                       surfaceCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR);
+                       mOverlay.drawOn(Overlay.Target.PICTURE_SNAPSHOT, surfaceCanvas);
+                       mOverlaySurface.unlockCanvasAndPost(surfaceCanvas);
+                   } catch (Surface.OutOfResourcesException e) {
+                       LOG.w("Got Surface.OutOfResourcesException while drawing picture overlays", e);
+                   }
+                   mOverlaySurfaceTexture.updateTexImage();
+                   mOverlaySurfaceTexture.getTransformMatrix(mOverlayTransform);
+
+                   // 2. Then we can apply the transformations.
+                   int rotation = mEngine.getAngles().offset(Reference.VIEW, Reference.OUTPUT, Axis.ABSOLUTE);
+                   Matrix.translateM(mOverlayTransform, 0, 0.5F, 0.5F, 0);
+                   Matrix.rotateM(mOverlayTransform, 0, rotation, 0, 0, 1);
+                   // No need to flip the x axis for front camera, but need to flip the y axis always.
+                   Matrix.scaleM(mOverlayTransform, 0, 1, -1, 1);
+                   Matrix.translateM(mOverlayTransform, 0, -0.5F, -0.5F, 0);
+               }

+               // 8. Draw and save
+               mViewport.drawFrame(mTextureId, mTransform);
+               if (mHasOverlay) mViewport.drawFrame(mOverlayTextureId, mOverlayTransform);
                // don't - surface.swapBuffers();
                mResult.data = surface.saveFrameTo(Bitmap.CompressFormat.JPEG);
                mResult.format = PictureResult.FORMAT_JPEG;

-               mSurfaceTexture.releaseTexImage();
-               // EGL14.eglMakeCurrent(oldDisplay, oldSurface, oldSurface, eglContext);
+               // 9. Cleanup
+               mSurfaceTexture.releaseTexImage();
                surface.release();
-               viewport.release();
+               mViewport.release();
                mSurfaceTexture.release();
+               if (mHasOverlay) {
+                   mOverlaySurface.release();
+                   mOverlaySurfaceTexture.release();
+               }
                core.release();
                dispatchResult();
            }
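The overlay composition above follows a general Android pattern worth isolating: render View/Canvas content into a GL texture through a Surface, then draw that texture over the camera frame. A condensed sketch (width, height, eglViewport and overlay are assumed to be in scope; EglViewport is the library helper):

    int overlayTexture = eglViewport.createTexture();           // external OES texture
    SurfaceTexture overlayTextureSurface = new SurfaceTexture(overlayTexture, true);
    overlayTextureSurface.setDefaultBufferSize(width, height);  // match the output size
    Surface overlaySurface = new Surface(overlayTextureSurface);

    Canvas canvas = overlaySurface.lockCanvas(null);            // software canvas
    canvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR); // start fully transparent
    overlay.drawOn(Overlay.Target.PICTURE_SNAPSHOT, canvas);    // let overlays draw
    overlaySurface.unlockCanvasAndPost(canvas);                 // push pixels to the texture

    overlayTextureSurface.updateTexImage();                     // latch on the GL thread
    float[] transform = new float[16];
    overlayTextureSurface.getTransformMatrix(transform);
    eglViewport.drawFrame(overlayTexture, transform);           // composite over the camera frame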

@@ -1,14 +1,21 @@ package com.otaliastudios.cameraview.preview;
package com.otaliastudios.cameraview.preview;

import android.content.Context;
+import androidx.annotation.CallSuper;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
+import androidx.annotation.UiThread;
import androidx.annotation.VisibleForTesting;
+import android.os.Handler;
+import android.os.Looper;
import android.view.View;
import android.view.ViewGroup;
import android.view.ViewParent;

+import com.google.android.gms.tasks.TaskCompletionSource;
+import com.google.android.gms.tasks.Tasks;
import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.engine.CameraEngine;
import com.otaliastudios.cameraview.internal.utils.Op;
@@ -240,7 +247,32 @@ public abstract class CameraPreview<T extends View, Output> {
     * Called by the hosting {@link com.otaliastudios.cameraview.CameraView},
     * this is a lifecycle event.
     */
+   @CallSuper
    public void onDestroy() {
+       if (Thread.currentThread() == Looper.getMainLooper().getThread()) {
+           onDestroyView();
+       } else {
+           // Do this on the UI thread and wait.
+           Handler ui = new Handler(Looper.getMainLooper());
+           final TaskCompletionSource<Void> task = new TaskCompletionSource<>();
+           ui.post(new Runnable() {
+               @Override
+               public void run() {
+                   onDestroyView();
+                   task.setResult(null);
+               }
+           });
+           try { Tasks.await(task.getTask()); } catch (Exception ignore) {}
+       }
+   }
+
+   /**
+    * At this point we undo the work that was done during {@link #onCreateView(Context, ViewGroup)},
+    * which basically means removing the root view from the hierarchy.
+    */
+   @SuppressWarnings("WeakerAccess")
+   @UiThread
+   protected void onDestroyView() {
        View root = getRootView();
        ViewParent parent = root.getParent();
        if (parent instanceof ViewGroup) {

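The main-thread hop in onDestroy() above is a reusable pattern; a generic sketch using the Play Services Tasks API (the helper name is made up):

    private static void runOnUiThreadAndWait(@NonNull final Runnable action) {
        if (Thread.currentThread() == Looper.getMainLooper().getThread()) {
            action.run();
            return;
        }
        final TaskCompletionSource<Void> task = new TaskCompletionSource<>();
        new Handler(Looper.getMainLooper()).post(new Runnable() {
            @Override
            public void run() {
                action.run();
                task.setResult(null);
            }
        });
        try {
            Tasks.await(task.getTask()); // block the caller until the UI thread is done
        } catch (Exception ignore) {
            // interrupted, or the runnable threw: nothing left to do here
        }
    }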
@@ -5,7 +5,6 @@ import android.graphics.SurfaceTexture;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;
import androidx.annotation.NonNull;
-import androidx.annotation.Nullable;
import androidx.annotation.VisibleForTesting;
import android.view.LayoutInflater;

@@ -1,14 +1,19 @@ package com.otaliastudios.cameraview.video;
package com.otaliastudios.cameraview.video;

+import android.graphics.Canvas;
+import android.graphics.Color;
+import android.graphics.PorterDuff;
import android.graphics.SurfaceTexture;
import android.opengl.EGL14;
import android.os.Build;
+import android.view.Surface;

import com.otaliastudios.cameraview.CameraLogger;
+import com.otaliastudios.cameraview.overlay.Overlay;
import com.otaliastudios.cameraview.VideoResult;
import com.otaliastudios.cameraview.controls.Audio;
import com.otaliastudios.cameraview.engine.CameraEngine;
-import com.otaliastudios.cameraview.engine.offset.Reference;
+import com.otaliastudios.cameraview.internal.egl.EglViewport;
import com.otaliastudios.cameraview.preview.GlCameraPreview;
import com.otaliastudios.cameraview.preview.RendererFrameCallback;
import com.otaliastudios.cameraview.preview.RendererThread;
@@ -40,25 +45,33 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
    private static final int STATE_NOT_RECORDING = 1;

    private MediaEncoderEngine mEncoderEngine;
-   private CameraEngine mEngine;
    private GlCameraPreview mPreview;
-   private boolean mFlipped;

    private int mCurrentState = STATE_NOT_RECORDING;
    private int mDesiredState = STATE_NOT_RECORDING;
    private int mTextureId = 0;
+   private int mOverlayTextureId = 0;
+   private SurfaceTexture mOverlaySurfaceTexture;
+   private Surface mOverlaySurface;
+   private Overlay mOverlay;
+   private boolean mHasOverlay;
+   private int mOverlayRotation;

    public SnapshotVideoRecorder(@NonNull CameraEngine engine,
-                                @NonNull GlCameraPreview preview) {
+                                @NonNull GlCameraPreview preview,
+                                @Nullable Overlay overlay,
+                                int overlayRotation) {
        super(engine);
        mPreview = preview;
-       mEngine = engine;
+       mOverlay = overlay;
+       mHasOverlay = overlay != null && overlay.drawsOn(Overlay.Target.VIDEO_SNAPSHOT);
+       mOverlayRotation = overlayRotation;
    }

    @Override
    protected void onStart() {
        mPreview.addRendererFrameCallback(this);
-       mFlipped = mEngine.getAngles().flip(Reference.SENSOR, Reference.VIEW);
        mDesiredState = STATE_RECORDING;
        dispatchVideoRecordingStart();
    }
@@ -72,6 +85,13 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
    @Override
    public void onRendererTextureCreated(int textureId) {
        mTextureId = textureId;
+       if (mHasOverlay) {
+           EglViewport temp = new EglViewport();
+           mOverlayTextureId = temp.createTexture();
+           mOverlaySurfaceTexture = new SurfaceTexture(mOverlayTextureId);
+           mOverlaySurfaceTexture.setDefaultBufferSize(mResult.size.getWidth(), mResult.size.getHeight());
+           mOverlaySurface = new Surface(mOverlaySurfaceTexture);
+       }
    }

    @RendererThread
@@ -104,8 +124,9 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
                    mResult.rotation,
                    type, mTextureId,
                    scaleX, scaleY,
-                   mFlipped,
-                   EGL14.eglGetCurrentContext()
+                   EGL14.eglGetCurrentContext(),
+                   mHasOverlay ? mOverlayTextureId : TextureMediaEncoder.NO_TEXTURE,
+                   mOverlayRotation
            );
            TextureMediaEncoder videoEncoder = new TextureMediaEncoder(config);
@@ -129,6 +150,21 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
            TextureMediaEncoder.TextureFrame textureFrame = textureEncoder.acquireFrame();
            textureFrame.timestamp = surfaceTexture.getTimestamp();
            surfaceTexture.getTransformMatrix(textureFrame.transform);
+           // get overlay
+           if (mHasOverlay) {
+               try {
+                   final Canvas surfaceCanvas = mOverlaySurface.lockCanvas(null);
+                   surfaceCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR);
+                   mOverlay.drawOn(Overlay.Target.VIDEO_SNAPSHOT, surfaceCanvas);
+                   mOverlaySurface.unlockCanvasAndPost(surfaceCanvas);
+               } catch (Surface.OutOfResourcesException e) {
+                   LOG.w("Got Surface.OutOfResourcesException while drawing video overlays", e);
+               }
+               mOverlaySurfaceTexture.updateTexImage();
+               mOverlaySurfaceTexture.getTransformMatrix(textureFrame.overlayTransform);
+           }
            if (mEncoderEngine != null) {
                // can happen on teardown
                mEncoderEngine.notify(TextureMediaEncoder.FRAME_EVENT, textureFrame);
@@ -142,28 +178,41 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
            mEncoderEngine = null;
            mPreview.removeRendererFrameCallback(SnapshotVideoRecorder.this);
            mPreview = null;
+           if (mOverlaySurfaceTexture != null) {
+               mOverlaySurfaceTexture.release();
+               mOverlaySurfaceTexture = null;
+           }
+           if (mOverlaySurface != null) {
+               mOverlaySurface.release();
+               mOverlaySurface = null;
+           }
        }
    }

+   @Override
+   public void onEncodingStart() {
+       // Do nothing.
+   }
+
    @EncoderThread
    @Override
-   public void onEncoderStop(int stopReason, @Nullable Exception e) {
+   public void onEncodingEnd(int stopReason, @Nullable Exception e) {
        // If something failed, undo the result, since this is the mechanism
        // to notify Camera1Engine about this.
        if (e != null) {
-           LOG.e("Error onEncoderStop", e);
+           LOG.e("Error onEncodingEnd", e);
            mResult = null;
            mError = e;
        } else {
-           if (stopReason == MediaEncoderEngine.STOP_BY_MAX_DURATION) {
-               LOG.i("onEncoderStop because of max duration.");
+           if (stopReason == MediaEncoderEngine.END_BY_MAX_DURATION) {
+               LOG.i("onEncodingEnd because of max duration.");
                mResult.endReason = VideoResult.REASON_MAX_DURATION_REACHED;
-           } else if (stopReason == MediaEncoderEngine.STOP_BY_MAX_SIZE) {
-               LOG.i("onEncoderStop because of max size.");
+           } else if (stopReason == MediaEncoderEngine.END_BY_MAX_SIZE) {
+               LOG.i("onEncodingEnd because of max size.");
                mResult.endReason = VideoResult.REASON_MAX_SIZE_REACHED;
            } else {
-               LOG.i("onEncoderStop because of user.");
+               LOG.i("onEncodingEnd because of user.");
            }
        }
        // Cleanup

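From the public API side, this recorder is what a video snapshot ends up using; a sketch, assuming the usual CameraView entry points:

    CameraView camera = findViewById(R.id.camera);
    camera.setMode(Mode.VIDEO);
    // Overlay children configured with layout_drawOnVideoSnapshot="true"
    // are picked up automatically by SnapshotVideoRecorder.
    camera.takeVideoSnapshot(new File(getFilesDir(), "video.mp4"));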
@@ -44,17 +44,27 @@ public class AudioMediaEncoder extends MediaEncoder {
    private static final int SAMPLE_SIZE = 2; // byte/sample/channel
    private static final int BYTE_RATE_PER_CHANNEL = SAMPLING_FREQUENCY * SAMPLE_SIZE; // byte/sec/channel
    private static final int BYTE_RATE = BYTE_RATE_PER_CHANNEL * CHANNELS_COUNT; // byte/sec
-   static final int BIT_RATE = BYTE_RATE * 8; // bit/sec
+   @SuppressWarnings("unused")
+   private static final int BIT_RATE = BYTE_RATE * 8; // bit/sec

    // We call FRAME here the chunk of data that we want to read at each loop cycle
    private static final int FRAME_SIZE_PER_CHANNEL = 1024; // bytes/frame/channel [AAC constant]
    private static final int FRAME_SIZE = FRAME_SIZE_PER_CHANNEL * CHANNELS_COUNT; // bytes/frame

-   // We allocate buffers of 1KB each, which is not so much. I would say that allocating
-   // at most 200 of them is a reasonable value. With the current setup, in device tests,
-   // we manage to use 50 at most.
-   private static final int BUFFER_POOL_MAX_SIZE = 200;
+   // We allocate buffers of 1KB each, which is not so much. This value indicates the maximum
+   // number of these buffers that we can allocate at a given instant.
+   // This value is the number of runnables that the encoder thread is allowed to be 'behind'
+   // the recorder thread. It's not safe to have it very large or we can end encoding A LOT AFTER
+   // the actual recording. It's better to reduce this and skip recording at all.
+   private static final int BUFFER_POOL_MAX_SIZE = 60;
+
+   private static long bytesToUs(int bytes) {
+       return (1000000L * bytes) / BYTE_RATE;
+   }
+
+   private static long bytesToUs(long bytes) {
+       return (1000000L * bytes) / BYTE_RATE;
+   }

    private boolean mRequestStop = false;
    private AudioEncodingHandler mEncoder;
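As a sanity check of bytesToUs() above (assuming mono input, so CHANNELS_COUNT is 1 and BYTE_RATE = 44100 * 2 = 88200 byte/sec):

    long us = (1000000L * 1024) / 88200; // one full 1024-byte frame
    // us == 11609, so each frame advances the presentation time by ~11.6 ms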
@@ -157,7 +167,7 @@ public class AudioMediaEncoder extends MediaEncoder {
            while (!mRequestStop) {
                read(false);
            }
-           LOG.w("RECORDER: Stop was requested. We're out of the loop. Will post an endOfStream.");
+           LOG.w("Stop was requested. We're out of the loop. Will post an endOfStream.");
            // Last input with 0 length. This will signal the endOfStream.
            // Can't use drain(true); it is only available when writing to the codec InputSurface.
            read(true);
@@ -169,20 +179,21 @@ public class AudioMediaEncoder extends MediaEncoder {
        private void read(boolean endOfStream) {
            mCurrentBuffer = mByteBufferPool.get();
            if (mCurrentBuffer == null) {
-               LOG.e("Skipping audio frame, encoding is too slow.");
-               // TODO should fix the next presentation time here. However this is
-               // extremely unlikely based on my tests. The mByteBufferPool should be big enough.
+               LOG.e("read thread - eos:", endOfStream, "- Skipping audio frame, encoding is too slow.");
+               // Should fix the next presentation time here, but
            } else {
                mCurrentBuffer.clear();
                mReadBytes = mAudioRecord.read(mCurrentBuffer, FRAME_SIZE);
+               LOG.i("read thread - eos:", endOfStream, "- Read new audio frame. Bytes:", mReadBytes);
                if (mReadBytes > 0) { // Good read: increase PTS.
-                   increaseTime(mReadBytes);
+                   mLastTimeUs = increaseTime(mReadBytes);
+                   LOG.i("read thread - eos:", endOfStream, "- Frame PTS:", mLastTimeUs);
                    mCurrentBuffer.limit(mReadBytes);
                    onBuffer(endOfStream);
                } else if (mReadBytes == AudioRecord.ERROR_INVALID_OPERATION) {
-                   LOG.e("Got AudioRecord.ERROR_INVALID_OPERATION");
+                   LOG.e("read thread - eos:", endOfStream, "- Got AudioRecord.ERROR_INVALID_OPERATION");
                } else if (mReadBytes == AudioRecord.ERROR_BAD_VALUE) {
-                   LOG.e("Got AudioRecord.ERROR_BAD_VALUE");
+                   LOG.e("read thread - eos:", endOfStream, "- Got AudioRecord.ERROR_BAD_VALUE");
                }
            }
        }
@@ -193,12 +204,12 @@ public class AudioMediaEncoder extends MediaEncoder {
         * to the consumer.
         */
        private void onBuffer(boolean endOfStream) {
+           LOG.v("read thread - Sending buffer to encoder thread.");
            mEncoder.sendInputBuffer(mCurrentBuffer, mLastTimeUs, endOfStream);
        }

-       private void increaseTime(int readBytes) {
-           increaseTime3(readBytes);
-           LOG.v("Read", readBytes, "bytes, increasing PTS to", mLastTimeUs);
+       private long increaseTime(int readBytes) {
+           return increaseTime3(readBytes);
        }

        /**
@@ -206,49 +217,61 @@ public class AudioMediaEncoder extends MediaEncoder {
         * It will use System.nanoTime() just once, as the starting point.
         * Of course we don't as there are things going on in this thread.
         */
-       private void increaseTime1(int readBytes) {
-           mLastTimeUs += (1000000L * readBytes) / BYTE_RATE;
+       @SuppressWarnings("unused")
+       private long increaseTime1(int readBytes) {
+           return mLastTimeUs + bytesToUs(readBytes);
        }

        /**
         * Just for testing, this method will use Api 24 method to retrieve the timestamp.
         * This way we let the platform choose instead of making assumptions.
         */
+       @SuppressWarnings("unused")
        @RequiresApi(24)
-       private void increaseTime2(int readBytes) {
+       private long increaseTime2(int readBytes) {
            if (mApi24Timestamp == null) {
                mApi24Timestamp = new AudioTimestamp();
            }
            mAudioRecord.getTimestamp(mApi24Timestamp, AudioTimestamp.TIMEBASE_MONOTONIC);
-           mLastTimeUs = mApi24Timestamp.nanoTime / 1000;
+           return mApi24Timestamp.nanoTime / 1000;
        }
        private AudioTimestamp mApi24Timestamp;

        /**
         * This method looks like an improvement over {@link #increaseTime1(int)} as it
         * accounts for the current time as well. Adapted & improved. from Kickflip.
+        *
+        * This creates regular timestamps unless we accumulate a lot of delay (greater than
+        * twice the buffer duration), in which case it creates a gap and starts again trying
+        * to be regular from the new point.
         */
-       private void increaseTime3(int readBytes) {
-           long currentTime = System.nanoTime() / 1000;
-           long correctedTime;
-           long bufferDuration = (1000000 * readBytes) / BYTE_RATE;
-           long bufferTime = currentTime - bufferDuration; // delay of acquiring the audio buffer
-           if (mTotalReadBytes == 0) {
-               mStartTimeUs = bufferTime;
-           }
-           // Recompute time assuming that we are respecting the sampling frequency.
-           // However, if the correction is too big (> 2*bufferDuration), reset to this point.
-           correctedTime = mStartTimeUs + (1000000 * mTotalReadBytes) / BYTE_RATE;
-           if(bufferTime - correctedTime >= 2 * bufferDuration) {
-               mStartTimeUs = bufferTime;
-               mTotalReadBytes = 0;
-               correctedTime = mStartTimeUs;
-           }
-           mTotalReadBytes += readBytes;
-           mLastTimeUs = correctedTime;
-       }
+       private long increaseTime3(int readBytes) {
+           long bufferDurationUs = bytesToUs(readBytes);
+           long bufferEndTimeUs = System.nanoTime() / 1000; // now
+           long bufferStartTimeUs = bufferEndTimeUs - bufferDurationUs;
+
+           // If this is the first time, the base time is the buffer start time.
+           if (mBytesSinceBaseTime == 0) mBaseTimeUs = bufferStartTimeUs;
+
+           // Recompute time assuming that we are respecting the sampling frequency.
+           // This puts the time at the end of last read buffer, which means, where we
+           // should be if we had no delay / missed buffers.
+           long correctedTimeUs = mBaseTimeUs + bytesToUs(mBytesSinceBaseTime);
+           long correctionUs = bufferStartTimeUs - correctedTimeUs;
+
+           // However, if the correction is too big (> 2*bufferDurationUs), reset to this point.
+           // This is triggered if we lose buffers and are recording/encoding at a slower rate.
+           if (correctionUs >= 2L * bufferDurationUs) {
+               mBaseTimeUs = bufferStartTimeUs;
+               mBytesSinceBaseTime = readBytes;
+               return mBaseTimeUs;
+           } else {
+               mBytesSinceBaseTime += readBytes;
+               return correctedTimeUs;
+           }
+       }
-       private long mStartTimeUs;
-       private long mTotalReadBytes;
+       private long mBaseTimeUs;
+       private long mBytesSinceBaseTime;
    }
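A hypothetical trace of the rewritten increaseTime3() logic, with 11609-us buffers (numbers are illustrative, not from a real device):

    long base = 1_000_000;  // read #1: its bufferStartTimeUs becomes the base time
    long dur  = 11_609;     // bytesToUs(1024)
    long pts2 = base + dur; // read #2: corrected PTS, regardless of small jitter
    // read #3 arrives three buffers late, at base + 5 * dur:
    long correction = (base + 5 * dur) - (base + 2 * dur); // == 3 * dur
    boolean reset = correction >= 2 * dur; // true: the base time jumps forward,
                                           // leaving a gap, and PTS is regular again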
    /**
@@ -278,9 +301,11 @@ public class AudioMediaEncoder extends MediaEncoder {
            super.handleMessage(msg);
            boolean endOfStream = msg.what == 1;
            long timestamp = (((long) msg.arg1) << 32) | (((long) msg.arg2) & 0xffffffffL);
+           LOG.i("encoding thread - got buffer. timestamp:", timestamp, "eos:", endOfStream);
            ByteBuffer buffer = (ByteBuffer) msg.obj;
            int readBytes = buffer.remaining();
            InputBuffer inputBuffer = mInputBufferPool.get();
+           //noinspection ConstantConditions
            inputBuffer.source = buffer;
            inputBuffer.timestamp = timestamp;
            inputBuffer.length = readBytes;
@@ -290,7 +315,7 @@ public class AudioMediaEncoder extends MediaEncoder {
        }

        private void performPendingOps(boolean force) {
-           LOG.v("Performing", mPendingOps.size(), "Pending operations.");
+           LOG.i("encoding thread - performing", mPendingOps.size(), "pending operations. force:", force);
            InputBuffer buffer;
            while ((buffer = mPendingOps.peek()) != null) {
                if (force) {
@@ -305,17 +330,43 @@ public class AudioMediaEncoder extends MediaEncoder {
        }

        private void performPendingOp(InputBuffer buffer) {
-           buffer.data.put(buffer.source);
+           LOG.i("encoding thread - performing pending operation for timestamp:", buffer.timestamp, "- encoding.");
+           buffer.data.put(buffer.source); // TODO this copy is prob. the worst part here for performance
            mByteBufferPool.recycle(buffer.source);
            mPendingOps.remove(buffer);
            encodeInputBuffer(buffer);
            boolean eos = buffer.isEndOfStream;
            mInputBufferPool.recycle(buffer);
-           drainOutput(eos);
-           if (eos) {
-               mInputBufferPool.clear();
-               WorkerHandler.get("AudioEncodingHandler").getThread().interrupt();
-           }
+           if (eos) mInputBufferPool.clear();
+           LOG.i("encoding thread - performing pending operation for timestamp:", buffer.timestamp, "- draining.");
+           // NOTE: can consider calling this drainOutput on yet another thread, which would let us
+           // use an even smaller BUFFER_POOL_MAX_SIZE without losing audio frames. But this way
+           // we can accumulate delay on this new thread without noticing (no pool getting empty).
+           if (true) {
+               drainOutput(eos);
+               if (eos) WorkerHandler.get("AudioEncodingHandler").getThread().interrupt();
+           } else {
+               // Testing the option above.
+               WorkerHandler.get("AudioEncodingDrainer").remove(drainRunnable);
+               WorkerHandler.get("AudioEncodingDrainer").remove(drainRunnableEos);
+               WorkerHandler.get("AudioEncodingDrainer").post(eos ? drainRunnableEos : drainRunnable);
+           }
        }
+
+       private final Runnable drainRunnable = new Runnable() {
+           @Override
+           public void run() {
+               drainOutput(false);
+           }
+       };
+
+       private final Runnable drainRunnableEos = new Runnable() {
+           @Override
+           public void run() {
+               drainOutput(true);
+               WorkerHandler.get("AudioEncodingHandler").getThread().interrupt();
+               WorkerHandler.get("AudioEncodingDrainer").getThread().interrupt();
+           }
+       };
    }
}

@@ -88,11 +88,11 @@ abstract class MediaEncoder {
     * NOTE: it's important to call {@link WorkerHandler#post(Runnable)} instead of run()!
     */
    final void start() {
-       LOG.i(getName(), "Start was called. Posting.");
+       LOG.w(getName(), "Start was called. Posting.");
        mWorker.post(new Runnable() {
            @Override
            public void run() {
-               LOG.i(getName(), "Start was called. Executing.");
+               LOG.w(getName(), "Start was called. Executing.");
                onStart();
            }
        });
@@ -108,11 +108,11 @@ abstract class MediaEncoder {
     * @param data object
     */
    final void notify(final @NonNull String event, final @Nullable Object data) {
-       LOG.i(getName(), "Notify was called. Posting.");
+       LOG.v(getName(), "Notify was called. Posting.");
        mWorker.post(new Runnable() {
            @Override
            public void run() {
-               LOG.i(getName(), "Notify was called. Executing.");
+               LOG.v(getName(), "Notify was called. Executing.");
                onEvent(event, data);
            }
        });
@@ -124,11 +124,11 @@ abstract class MediaEncoder {
     * NOTE: it's important to call {@link WorkerHandler#post(Runnable)} instead of run()!
     */
    final void stop() {
-       LOG.i(getName(), "Stop was called. Posting.");
+       LOG.w(getName(), "Stop was called. Posting.");
        mWorker.post(new Runnable() {
            @Override
            public void run() {
-               LOG.i(getName(), "Stop was called. Executing.");
+               LOG.w(getName(), "Stop was called. Executing.");
                onStop();
            }
        });
@@ -175,8 +175,9 @@ abstract class MediaEncoder {
     * parameters, might also be through an input buffer flag).
     */
    private void release() {
-       LOG.w("Subclass", getName(), "Notified that it is released.");
-       mController.requestRelease(mTrackIndex);
+       LOG.w(getName(), "is being released. Notifying controller and releasing codecs.");
+       // TODO should we notify after this method?
+       mController.notifyReleased(mTrackIndex);
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
@@ -217,7 +218,7 @@ abstract class MediaEncoder {
    /**
     * Returns a new input buffer and index, waiting indefinitely if none is available.
-    * The buffer should be written into, then the index should be passed to {@link #encodeInputBuffer(InputBuffer)}.
+    * The buffer should be written into, then be passed to {@link #encodeInputBuffer(InputBuffer)}.
     *
     * @param holder the input buffer holder
     */
@@ -233,7 +234,7 @@ abstract class MediaEncoder {
     */
    @SuppressWarnings("WeakerAccess")
    protected void encodeInputBuffer(InputBuffer buffer) {
-       LOG.w("ENCODING:", getName(), "Buffer:", buffer.index, "Bytes:", buffer.length, "Presentation:", buffer.timestamp);
+       LOG.v(getName(), "ENCODING - Buffer:", buffer.index, "Bytes:", buffer.length, "Presentation:", buffer.timestamp);
        if (buffer.isEndOfStream) { // send EOS
            mMediaCodec.queueInputBuffer(buffer.index, 0, 0,
                    buffer.timestamp, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
@@ -266,7 +267,7 @@ abstract class MediaEncoder {
    @SuppressLint("LogNotTimber")
    @SuppressWarnings("WeakerAccess")
    protected void drainOutput(boolean drainAll) {
-       LOG.w("DRAINING:", getName(), "EOS:", drainAll);
+       LOG.v(getName(), "DRAINING - EOS:", drainAll);
        if (mMediaCodec == null) {
            LOG.e("drain() was called before prepare() or after releasing.");
            return;
@@ -315,14 +316,12 @@ abstract class MediaEncoder {
                // and should be used for offsets only.
                // TODO find a better way, this causes sync issues. (+ note: this sends pts=0 at first)
                // mBufferInfo.presentationTimeUs = mLastPresentationTimeUs - mStartPresentationTimeUs;
-               LOG.i("DRAINING:", getName(), "Dispatching write(). Presentation:", mBufferInfo.presentationTimeUs);
+               LOG.v(getName(), "DRAINING - About to write(). Presentation:", mBufferInfo.presentationTimeUs);
                // TODO fix the mBufferInfo being the same, then implement delayed writing in Controller
                // and remove the isStarted() check here.
                OutputBuffer buffer = mOutputBufferPool.get();
-               if (buffer == null) {
-                   throw new IllegalStateException("buffer is null!");
-               }
+               //noinspection ConstantConditions
                buffer.info = mBufferInfo;
                buffer.trackIndex = mTrackIndex;
                buffer.data = encodedData;
@@ -336,17 +335,18 @@ abstract class MediaEncoder {
                        && !mMaxLengthReached
                        && mStartPresentationTimeUs != Long.MIN_VALUE
                        && mLastPresentationTimeUs - mStartPresentationTimeUs > mMaxLengthMillis * 1000) {
-                   LOG.w("DRAINING: Reached maxLength! mLastPresentationTimeUs:", mLastPresentationTimeUs,
+                   LOG.w(getName(), "DRAINING - Reached maxLength! mLastPresentationTimeUs:", mLastPresentationTimeUs,
                            "mStartPresentationTimeUs:", mStartPresentationTimeUs,
                            "mMaxLengthUs:", mMaxLengthMillis * 1000);
                    mMaxLengthReached = true;
+                   LOG.w(getName(), "DRAINING - Requesting a stop.");
                    mController.requestStop(mTrackIndex);
                    break;
                }

                // Check for the EOS flag so we can release the encoder.
                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
-                   LOG.w("DRAINING:", getName(), "Dispatching release().");
+                   LOG.w(getName(), "DRAINING - Got EOS. Releasing the codec.");
                    release();
                    break;
                }

@@ -26,34 +26,40 @@ public class MediaEncoderEngine {
     */
    public interface Listener {

+       /**
+        * Called when encoding started.
+        */
+       @EncoderThread
+       void onEncodingStart();
+
        /**
         * Called when encoding stopped for some reason.
         * If there's an exception, it failed.
-        * @param stopReason the reason
+        * @param reason the reason
         * @param e the error, if present
         */
        @EncoderThread
-       void onEncoderStop(int stopReason, @Nullable Exception e);
+       void onEncodingEnd(int reason, @Nullable Exception e);
    }

    private final static String TAG = MediaEncoderEngine.class.getSimpleName();
    private final static CameraLogger LOG = CameraLogger.create(TAG);

    @SuppressWarnings("WeakerAccess")
-   public final static int STOP_BY_USER = 0;
-   public final static int STOP_BY_MAX_DURATION = 1;
-   public final static int STOP_BY_MAX_SIZE = 2;
+   public final static int END_BY_USER = 0;
+   public final static int END_BY_MAX_DURATION = 1;
+   public final static int END_BY_MAX_SIZE = 2;

    private ArrayList<MediaEncoder> mEncoders;
    private MediaMuxer mMediaMuxer;
    private int mStartedEncodersCount;
-   private int mStoppedEncodersCount;
+   private int mReleasedEncodersCount;
    private boolean mMediaMuxerStarted;
    @SuppressWarnings("FieldCanBeLocal")
    private Controller mController;
    private Listener mListener;
-   private int mStopReason = STOP_BY_USER;
-   private int mPossibleStopReason;
+   private int mEndReason = END_BY_USER;
+   private int mPossibleEndReason;
    private final Object mControllerLock = new Object();

    /**
@@ -87,7 +93,7 @@ public class MediaEncoderEngine {
        }
        mStartedEncodersCount = 0;
        mMediaMuxerStarted = false;
-       mStoppedEncodersCount = 0;
+       mReleasedEncodersCount = 0;

        // Trying to convert the size constraints to duration constraints,
        // because they are super easy to check.
@@ -101,13 +107,13 @@ public class MediaEncoderEngine {
        long finalMaxDuration = Long.MAX_VALUE;
        if (maxSize > 0 && maxDuration > 0) {
-           mPossibleStopReason = sizeMaxDuration < maxDuration ? STOP_BY_MAX_SIZE : STOP_BY_MAX_DURATION;
+           mPossibleEndReason = sizeMaxDuration < maxDuration ? END_BY_MAX_SIZE : END_BY_MAX_DURATION;
            finalMaxDuration = Math.min(sizeMaxDuration, maxDuration);
        } else if (maxSize > 0) {
-           mPossibleStopReason = STOP_BY_MAX_SIZE;
+           mPossibleEndReason = END_BY_MAX_SIZE;
            finalMaxDuration = sizeMaxDuration;
        } else if (maxDuration > 0) {
-           mPossibleStopReason = STOP_BY_MAX_DURATION;
+           mPossibleEndReason = END_BY_MAX_DURATION;
            finalMaxDuration = maxDuration;
        }
        LOG.w("Computed a max duration of", (finalMaxDuration / 1000F));
@@ -120,6 +126,7 @@ public class MediaEncoderEngine {
     * Asks encoders to start (each one on its own track).
     */
    public final void start() {
+       LOG.i("Passing event to encoders:", "START");
        for (MediaEncoder encoder : mEncoders) {
            encoder.start();
        }
@@ -133,6 +140,7 @@ public class MediaEncoderEngine {
     */
    @SuppressWarnings("SameParameterValue")
    public final void notify(final String event, final Object data) {
+       LOG.i("Passing event to encoders:", event);
        for (MediaEncoder encoder : mEncoders) {
            encoder.notify(event, data);
        }
@@ -140,21 +148,23 @@ public class MediaEncoderEngine {

    /**
     * Asks encoders to stop. This is not sync, of course we will ask for encoders
-    * to call {@link Controller#requestRelease(int)} before actually stop the muxer.
-    * When all encoders request a release, {@link #release()} is called to do cleanup
+    * to call {@link Controller#notifyReleased(int)} before actually stop the muxer.
+    * When all encoders request a release, {@link #end()} is called to do cleanup
     * and notify the listener.
     */
    public final void stop() {
+       LOG.i("Passing event to encoders:", "STOP");
        for (MediaEncoder encoder : mEncoders) {
            encoder.stop();
        }
    }

    /**
-    * Called after all encoders have requested a release using {@link Controller#requestRelease(int)}.
+    * Called after all encoders have requested a release using {@link Controller#notifyReleased(int)}.
     * At this point we will do cleanup and notify the listener.
     */
-   private void release() {
+   private void end() {
+       LOG.i("end:", "Releasing muxer after all encoders have been released.");
        Exception error = null;
        if (mMediaMuxer != null) {
            // stop() throws an exception if you haven't fed it any data.
@@ -168,14 +178,16 @@ public class MediaEncoderEngine {
            }
            mMediaMuxer = null;
        }
+       LOG.w("end:", "Dispatching end to listener - reason:", mEndReason, "error:", error);
        if (mListener != null) {
-           mListener.onEncoderStop(mStopReason, error);
+           mListener.onEncodingEnd(mEndReason, error);
            mListener = null;
        }
-       mStopReason = STOP_BY_USER;
+       mEndReason = END_BY_USER;
        mStartedEncodersCount = 0;
-       mStoppedEncodersCount = 0;
+       mReleasedEncodersCount = 0;
        mMediaMuxerStarted = false;
+       LOG.i("end:", "Completed.");
    }

    /**
@@ -219,10 +231,14 @@ public class MediaEncoderEngine {
                throw new IllegalStateException("Trying to start but muxer started already");
            }
            int track = mMediaMuxer.addTrack(format);
-           LOG.w("Controller:", "Assigned track", track, "to format", format.getString(MediaFormat.KEY_MIME));
+           LOG.w("requestStart:", "Assigned track", track, "to format", format.getString(MediaFormat.KEY_MIME));
            if (++mStartedEncodersCount == mEncoders.size()) {
+               LOG.w("requestStart:", "All encoders have started. Starting muxer and dispatching onEncodingStart().");
                mMediaMuxer.start();
                mMediaMuxerStarted = true;
+               if (mListener != null) {
+                   mListener.onEncodingStart();
+               }
            }
            return track;
        }
@@ -251,7 +267,7 @@ public class MediaEncoderEngine {
            // This is a bad idea and causes crashes.
            // if (info.presentationTimeUs < mLastTimestampUs) info.presentationTimeUs = mLastTimestampUs;
            // mLastTimestampUs = info.presentationTimeUs;
-           LOG.v("Writing for track", buffer.trackIndex, ". Presentation:", buffer.info.presentationTimeUs);
+           LOG.v("write:", "Writing OutputBuffer - track:", buffer.trackIndex, "presentation:", buffer.info.presentationTimeUs);
            mMediaMuxer.writeSampleData(buffer.trackIndex, buffer.data, buffer.info);
            pool.recycle(buffer);
        }
@@ -264,10 +280,11 @@ public class MediaEncoderEngine {
         * When this succeeds, {@link MediaEncoder#stop()} is called.
         */
        void requestStop(int track) {
-           LOG.i("RequestStop was called for track", track);
            synchronized (mControllerLock) {
+               LOG.w("requestStop:", "Called for track", track);
                if (--mStartedEncodersCount == 0) {
-                   mStopReason = mPossibleStopReason;
+                   LOG.w("requestStop:", "All encoders have requested a stop. Stopping them.");
+                   mEndReason = mPossibleEndReason;
                    stop();
                }
            }
@@ -277,11 +294,12 @@ public class MediaEncoderEngine {
         * Notifies that the encoder was stopped. After this is called by all encoders,
         * we will actually stop the muxer.
         */
-       void requestRelease(int track) {
-           LOG.i("requestRelease was called for track", track);
+       void notifyReleased(int track) {
            synchronized (mControllerLock) {
-               if (++mStoppedEncodersCount == mEncoders.size()) {
-                   release();
+               LOG.w("notifyReleased:", "Called for track", track);
+               if (++mReleasedEncodersCount == mEncoders.size()) {
+                   LOG.w("requestStop:", "All encoders have been released. Stopping the muxer.");
+                   end();
                }
            }
        }
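For callers, the renamed Listener now looks like this; a sketch, with the END_BY_* constants above as the possible reasons:

    MediaEncoderEngine.Listener listener = new MediaEncoderEngine.Listener() {
        @Override
        public void onEncodingStart() {
            // All encoders got a track and the muxer has started.
        }

        @Override
        public void onEncodingEnd(int reason, @Nullable Exception e) {
            if (e != null) {
                // Encoding failed.
            } else if (reason == MediaEncoderEngine.END_BY_MAX_DURATION) {
                // Stopped by the duration constraint.
            } else if (reason == MediaEncoderEngine.END_BY_MAX_SIZE) {
                // Stopped by the size constraint.
            } else {
                // MediaEncoderEngine.END_BY_USER.
            }
        }
    };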

@ -24,17 +24,24 @@ public class TextureMediaEncoder extends VideoMediaEncoder<TextureMediaEncoder.C
private static final CameraLogger LOG = CameraLogger.create(TAG); private static final CameraLogger LOG = CameraLogger.create(TAG);
public final static String FRAME_EVENT = "frame"; public final static String FRAME_EVENT = "frame";
public final static int NO_TEXTURE = Integer.MIN_VALUE;
public static class Config extends VideoMediaEncoder.Config { public static class Config extends VideoMediaEncoder.Config {
int textureId; int textureId;
int overlayTextureId;
float scaleX; float scaleX;
float scaleY; float scaleY;
boolean scaleFlipped;
EGLContext eglContext; EGLContext eglContext;
int transformRotation; int transformRotation;
int overlayTransformRotation;
public Config(int width, int height, int bitRate, int frameRate, int rotation, String mimeType,
int textureId, float scaleX, float scaleY, boolean scaleFlipped, EGLContext eglContext) { public Config(int width, int height,
int bitRate, int frameRate,
int rotation, @NonNull String mimeType,
int textureId,
float scaleX, float scaleY,
@NonNull EGLContext eglContext,
int overlayTextureId, int overlayRotation) {
// We rotate the texture using transformRotation. Pass rotation=0 to super so that // We rotate the texture using transformRotation. Pass rotation=0 to super so that
// no rotation metadata is written into the output file. // no rotation metadata is written into the output file.
super(width, height, bitRate, frameRate, 0, mimeType); super(width, height, bitRate, frameRate, 0, mimeType);
@ -42,8 +49,9 @@ public class TextureMediaEncoder extends VideoMediaEncoder<TextureMediaEncoder.C
this.textureId = textureId; this.textureId = textureId;
this.scaleX = scaleX; this.scaleX = scaleX;
this.scaleY = scaleY; this.scaleY = scaleY;
this.scaleFlipped = scaleFlipped;
this.eglContext = eglContext; this.eglContext = eglContext;
this.overlayTextureId = overlayTextureId;
this.overlayTransformRotation = overlayRotation;
} }
} }
@@ -67,6 +75,7 @@ public class TextureMediaEncoder extends VideoMediaEncoder<TextureMediaEncoder.C
         // Typically coming from SurfaceTexture.getTimestamp().
         public long timestamp;
         public float[] transform = new float[16];
+        public float[] overlayTransform = new float[16];
     }
 
     @NonNull

@@ -117,6 +126,7 @@ public class TextureMediaEncoder extends VideoMediaEncoder<TextureMediaEncoder.C
         // We must scale this matrix like GlCameraPreview does, because it might have some cropping.
         // Scaling takes place with respect to the (0, 0, 0) point, so we must apply a Translation to compensate.
         float[] transform = frame.transform;
+        float[] overlayTransform = frame.overlayTransform;
         float scaleX = mConfig.scaleX;
         float scaleY = mConfig.scaleY;
         float scaleTranslX = (1F - scaleX) / 2F;

@@ -128,17 +138,25 @@ public class TextureMediaEncoder extends VideoMediaEncoder<TextureMediaEncoder.C
         // stream, but the output video, must be correctly rotated based on the device rotation at the moment.
         // Rotation also takes place with respect to the origin (the Z axis), so we must
         // translate to origin, rotate, then back to where we were.
         Matrix.translateM(transform, 0, 0.5F, 0.5F, 0);
         Matrix.rotateM(transform, 0, mConfig.transformRotation, 0, 0, 1);
         Matrix.translateM(transform, 0, -0.5F, -0.5F, 0);
+        boolean hasOverlay = mConfig.overlayTextureId != NO_TEXTURE;
+        if (hasOverlay) {
+            Matrix.translateM(overlayTransform, 0, 0.5F, 0.5F, 0);
+            Matrix.rotateM(overlayTransform, 0, mConfig.overlayTransformRotation, 0, 0, 1);
+            Matrix.translateM(overlayTransform, 0, -0.5F, -0.5F, 0);
+        }
 
         LOG.v("onEvent", "frameNum:", thisFrameNum, "realFrameNum:", mFrameNum, "calling drainOutput.");
         drainOutput(false);
-        // Future note: passing scale values to the viewport? They are scaleX and scaleY,
-        // but flipped based on the mConfig.scaleFlipped boolean.
         LOG.v("onEvent", "frameNum:", thisFrameNum, "realFrameNum:", mFrameNum, "calling drawFrame.");
         mViewport.drawFrame(mConfig.textureId, transform);
+        if (hasOverlay) {
+            mViewport.drawFrame(mConfig.overlayTextureId, overlayTransform);
+        }
         mWindow.setPresentationTime(frame.timestamp);
         mWindow.swapBuffers();
         mFramePool.recycle(frame);
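The translate-rotate-translate trick in the comments above is worth isolating: `android.opengl.Matrix` rotates around the origin, so rotating a texture transform about its center (0.5, 0.5) requires bracketing the rotation with two translations. A minimal helper sketch (not part of the library) showing the same steps in isolation:

```java
import android.opengl.Matrix;

// Rotates a 4x4 texture transform about the texture center (0.5, 0.5)
// instead of the origin - the same trick applied to both the camera
// frame transform and the overlay transform above.
public class CenterRotation {
    public static void rotateAroundCenter(float[] matrix, float degrees) {
        Matrix.translateM(matrix, 0, 0.5F, 0.5F, 0);   // move center to origin
        Matrix.rotateM(matrix, 0, degrees, 0, 0, 1);   // rotate around the Z axis
        Matrix.translateM(matrix, 0, -0.5F, -0.5F, 0); // move back
    }
}
```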

@@ -42,7 +42,7 @@ abstract class VideoMediaEncoder<C extends VideoMediaEncoder.Config> extends Med
         int rotation;
         String mimeType;
 
-        Config(int width, int height, int bitRate, int frameRate, int rotation, String mimeType) {
+        Config(int width, int height, int bitRate, int frameRate, int rotation, @NonNull String mimeType) {
             this.width = width;
             this.height = height;
             this.bitRate = bitRate;

@@ -22,12 +22,12 @@
         <attr name="cameraVideoSizeBiggest" format="boolean"/>
         <attr name="cameraVideoSizeAspectRatio" format="string|reference"/>
 
-        <attr name="cameraVideoBitRate" format="integer|reference" />
-        <attr name="cameraAudioBitRate" format="integer|reference" />
         <attr name="cameraSnapshotMaxWidth" format="integer|reference" />
         <attr name="cameraSnapshotMaxHeight" format="integer|reference" />
+        <attr name="cameraVideoBitRate" format="integer|reference" />
+        <attr name="cameraAudioBitRate" format="integer|reference" />
 
         <attr name="cameraGestureTap" format="enum">
             <enum name="none" value="0" />
             <enum name="autoFocus" value="1" />

@@ -58,6 +59,17 @@
             <enum name="exposureCorrection" value="4" />
         </attr>
 
+        <attr name="cameraEngine" format="enum">
+            <enum name="camera1" value="0" />
+            <enum name="camera2" value="1" />
+        </attr>
+
+        <attr name="cameraPreview" format="enum">
+            <enum name="surface" value="0" />
+            <enum name="texture" value="1" />
+            <enum name="glSurface" value="2" />
+        </attr>
+
         <attr name="cameraFacing" format="enum">
             <enum name="back" value="0" />
             <enum name="front" value="1" />

@@ -83,15 +94,6 @@
             <enum name="cloudy" value="4" />
         </attr>
 
-        <attr name="cameraGrid" format="enum">
-            <enum name="off" value="0" />
-            <enum name="draw3x3" value="1" />
-            <enum name="draw4x4" value="2" />
-            <enum name="drawPhi" value="3" />
-        </attr>
-        <attr name="cameraGridColor" format="color|reference"/>
-
         <attr name="cameraMode" format="enum">
             <enum name="picture" value="0" />
             <enum name="video" value="1" />

@@ -102,16 +104,14 @@
             <enum name="on" value="1" />
         </attr>
 
-        <attr name="cameraPreview" format="enum">
-            <enum name="surface" value="0" />
-            <enum name="texture" value="1" />
-            <enum name="glSurface" value="2" />
-        </attr>
-
-        <attr name="cameraEngine" format="enum">
-            <enum name="camera1" value="0" />
-            <enum name="camera2" value="1" />
-        </attr>
+        <attr name="cameraGrid" format="enum">
+            <enum name="off" value="0" />
+            <enum name="draw3x3" value="1" />
+            <enum name="draw4x4" value="2" />
+            <enum name="drawPhi" value="3" />
+        </attr>
+        <attr name="cameraGridColor" format="color|reference"/>
 
         <attr name="cameraPlaySounds" format="boolean" />

@@ -125,13 +125,20 @@
             <enum name="h264" value="2" />
         </attr>
 
-        <attr name="cameraExperimental" format="boolean" />
-
         <attr name="cameraAutoFocusResetDelay" format="integer|reference"/>
         <attr name="cameraAutoFocusMarker" format="string|reference"/>
 
         <attr name="cameraUseDeviceOrientation" format="boolean"/>
+
+        <attr name="cameraExperimental" format="boolean" />
+    </declare-styleable>
+
+    <declare-styleable name="CameraView_Layout">
+        <attr name="layout_drawOnPreview" format="boolean"/>
+        <attr name="layout_drawOnPictureSnapshot" format="boolean"/>
+        <attr name="layout_drawOnVideoSnapshot" format="boolean"/>
     </declare-styleable>
 
 </resources>
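For readers unfamiliar with `declare-styleable` layout attributes: a container reads them when inflating its children's layout params. The following is a hedged sketch of how an OverlayLayout-style container could consume the `CameraView_Layout` attributes above - the class is hypothetical, and only the `R.styleable` names (which follow AAPT's generated naming) come from the diff; the library's actual implementation may differ:

```java
import android.content.Context;
import android.content.res.TypedArray;
import android.util.AttributeSet;
import android.widget.FrameLayout;

// Hypothetical sketch, not the library's OverlayLayout: turns the
// CameraView_Layout XML attributes into per-child boolean flags.
class OverlaySketchLayout extends FrameLayout {
    OverlaySketchLayout(Context context) { super(context); }

    static class LayoutParams extends FrameLayout.LayoutParams {
        boolean drawOnPreview;
        boolean drawOnPictureSnapshot;
        boolean drawOnVideoSnapshot;

        LayoutParams(Context context, AttributeSet attrs) {
            super(context, attrs);
            TypedArray a = context.obtainStyledAttributes(attrs, R.styleable.CameraView_Layout);
            drawOnPreview = a.getBoolean(R.styleable.CameraView_Layout_layout_drawOnPreview, false);
            drawOnPictureSnapshot = a.getBoolean(R.styleable.CameraView_Layout_layout_drawOnPictureSnapshot, false);
            drawOnVideoSnapshot = a.getBoolean(R.styleable.CameraView_Layout_layout_drawOnVideoSnapshot, false);
            a.recycle();
        }
    }

    @Override
    public LayoutParams generateLayoutParams(AttributeSet attrs) {
        // Called by the framework for each child inflated from XML.
        return new LayoutParams(getContext(), attrs);
    }
}
```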

@@ -1,16 +1,21 @@
 package com.otaliastudios.cameraview.demo;
 
+import android.animation.Animator;
+import android.animation.ValueAnimator;
+import android.annotation.SuppressLint;
 import android.content.Intent;
 import android.content.pm.PackageManager;
 import android.os.Bundle;
 import androidx.annotation.NonNull;
 import com.google.android.material.bottomsheet.BottomSheetBehavior;
 import androidx.appcompat.app.AppCompatActivity;
+import androidx.interpolator.view.animation.FastOutSlowInInterpolator;
 import android.util.Log;
 import android.view.View;
 import android.view.ViewGroup;
 import android.view.ViewTreeObserver;
+import android.widget.TextView;
 import android.widget.Toast;
 
 import com.otaliastudios.cameraview.CameraException;

@@ -24,6 +29,7 @@ import com.otaliastudios.cameraview.VideoResult;
 import com.otaliastudios.cameraview.controls.Preview;
 
 import java.io.File;
+import java.util.Arrays;
 import java.util.List;

@@ -31,8 +37,6 @@ public class CameraActivity extends AppCompatActivity implements View.OnClickLis
     private CameraView camera;
     private ViewGroup controlPanel;
 
-    // To show stuff in the callback
     private long mCaptureTime;
 
     @Override

@@ -65,9 +69,41 @@ public class CameraActivity extends AppCompatActivity implements View.OnClickLis
         controlPanel = findViewById(R.id.controls);
         ViewGroup group = (ViewGroup) controlPanel.getChildAt(0);
+        final View watermark = findViewById(R.id.watermark);
 
-        List<Option<?>> options = Option.getAll();
-        for (Option option : options) {
-            OptionView view = new OptionView(this, option, this);
+        List<Option<?>> options = Arrays.asList(
+                // Layout
+                new Option.Width(), new Option.Height(),
+                // Engine and preview
+                new Option.Mode(), new Option.Engine(), new Option.Preview(),
+                // Some controls
+                new Option.Flash(), new Option.WhiteBalance(), new Option.Hdr(),
+                // Video recording
+                new Option.VideoCodec(), new Option.Audio(),
+                // Gestures
+                new Option.Pinch(), new Option.HorizontalScroll(), new Option.VerticalScroll(),
+                new Option.Tap(), new Option.LongTap(),
+                // Watermarks
+                new Option.OverlayInPreview(watermark),
+                new Option.OverlayInPictureSnapshot(watermark),
+                new Option.OverlayInVideoSnapshot(watermark),
+                // Other
+                new Option.Grid(), new Option.GridColor(), new Option.UseDeviceOrientation()
+        );
+        List<Boolean> dividers = Arrays.asList(
+                false, true,
+                false, false, true,
+                false, false, true,
+                false, true,
+                false, false, false, false, true,
+                false, false, true,
+                false, false, true
+        );
+        for (int i = 0; i < options.size(); i++) {
+            OptionView view = new OptionView(this);
+            //noinspection unchecked
+            view.setOption(options.get(i), this);
+            view.setHasDivider(dividers.get(i));
             group.addView(view,
                     ViewGroup.LayoutParams.MATCH_PARENT,
                     ViewGroup.LayoutParams.WRAP_CONTENT);

@@ -80,6 +116,22 @@ public class CameraActivity extends AppCompatActivity implements View.OnClickLis
                 b.setState(BottomSheetBehavior.STATE_HIDDEN);
             }
         });
+
+        // Animate the watermark just to show we record the animation in video snapshots
+        ValueAnimator animator = ValueAnimator.ofFloat(1F, 0.8F);
+        animator.setDuration(300);
+        animator.setRepeatCount(ValueAnimator.INFINITE);
+        animator.setRepeatMode(ValueAnimator.REVERSE);
+        animator.addUpdateListener(new ValueAnimator.AnimatorUpdateListener() {
+            @Override
+            public void onAnimationUpdate(ValueAnimator animation) {
+                float scale = (float) animation.getAnimatedValue();
+                watermark.setScaleX(scale);
+                watermark.setScaleY(scale);
+                watermark.setRotation(watermark.getRotation() + 2);
+            }
+        });
+        animator.start();
     }
 
     private void message(String content, boolean important) {

@@ -12,6 +12,8 @@ import com.otaliastudios.cameraview.CameraOptions;
 import com.otaliastudios.cameraview.CameraView;
 import com.otaliastudios.cameraview.gesture.Gesture;
 import com.otaliastudios.cameraview.gesture.GestureAction;
+import com.otaliastudios.cameraview.overlay.Overlay;
+import com.otaliastudios.cameraview.overlay.OverlayLayout;
 
 import java.util.ArrayList;
 import java.util.Arrays;
@@ -21,51 +23,13 @@ import java.util.List;
 /**
  * Controls that we want to display in a ControlView.
  */
+@SuppressWarnings("WeakerAccess")
 public abstract class Option<T> {
 
-    public static List<Option<?>> getAll() {
-        return Arrays.asList(
-                // Layout
-                new Width(false),
-                new Height(true),
-                // Engine and preview
-                new Mode(false),
-                new Engine(false),
-                new Preview(true),
-                // Some controls
-                new Flash(false),
-                new WhiteBalance(false),
-                new Hdr(true),
-                // Video recording
-                new VideoCodec(false),
-                new Audio(true),
-                // TODO audio bitRate
-                // TODO video bitRate
-                // They are a bit annoying because it's not clear what the default should be.
-                // Gestures
-                new Pinch(false),
-                new HorizontalScroll(false),
-                new VerticalScroll(false),
-                new Tap(false),
-                new LongTap(true),
-                // Other
-                new Grid(false),
-                new GridColor(false),
-                new UseDeviceOrientation(true)
-        );
-    }
-
     private String name;
-    private boolean hasDividerBelow;
 
-    private Option(@NonNull String name, boolean hasDividerBelow) {
+    private Option(@NonNull String name) {
         this.name = name;
-        this.hasDividerBelow = hasDividerBelow;
     }
 
     @SuppressWarnings("WeakerAccess")

@@ -74,11 +38,6 @@ public abstract class Option<T> {
         return name;
     }
 
-    @SuppressWarnings("WeakerAccess")
-    public final boolean hasDividerBelow() {
-        return hasDividerBelow;
-    }
-
     @NonNull
     public abstract T get(@NonNull CameraView view);
@@ -93,8 +52,8 @@
     }
 
     public static class Width extends Option<Integer> {
-        Width(boolean hasDividerBelow) {
-            super("Width", hasDividerBelow);
+        public Width() {
+            super("Width");
         }
 
         @NonNull

@@ -135,8 +94,8 @@
     }
 
     public static class Height extends Option<Integer> {
-        Height(boolean hasDividerBelow) {
-            super("Height", hasDividerBelow);
+        public Height() {
+            super("Height");
         }
 
         @NonNull

@@ -179,8 +138,8 @@
     private static abstract class ControlOption<T extends com.otaliastudios.cameraview.controls.Control> extends Option<T> {
         private final Class<T> controlClass;
 
-        ControlOption(@NonNull Class<T> controlClass, String name, boolean hasDividerBelow) {
-            super(name, hasDividerBelow);
+        ControlOption(@NonNull Class<T> controlClass, String name) {
+            super(name);
             this.controlClass = controlClass;
         }

@@ -203,14 +162,14 @@
     }
 
     public static class Mode extends ControlOption<com.otaliastudios.cameraview.controls.Mode> {
-        Mode(boolean hasDividerBelow) {
-            super(com.otaliastudios.cameraview.controls.Mode.class, "Mode", hasDividerBelow);
+        public Mode() {
+            super(com.otaliastudios.cameraview.controls.Mode.class, "Mode");
         }
     }
 
     public static class Engine extends ControlOption<com.otaliastudios.cameraview.controls.Engine> {
-        Engine(boolean hasDividerBelow) {
-            super(com.otaliastudios.cameraview.controls.Engine.class, "Engine", hasDividerBelow);
+        public Engine() {
+            super(com.otaliastudios.cameraview.controls.Engine.class, "Engine");
         }
 
         @Override

@@ -234,8 +193,8 @@
     }
 
     public static class Preview extends ControlOption<com.otaliastudios.cameraview.controls.Preview> {
-        Preview(boolean hasDividerBelow) {
-            super(com.otaliastudios.cameraview.controls.Preview.class, "Preview Surface", hasDividerBelow);
+        public Preview() {
+            super(com.otaliastudios.cameraview.controls.Preview.class, "Preview Surface");
         }
 
         @Override

@@ -276,32 +235,32 @@
     }
 
     public static class Flash extends ControlOption<com.otaliastudios.cameraview.controls.Flash> {
-        Flash(boolean hasDividerBelow) {
-            super(com.otaliastudios.cameraview.controls.Flash.class, "Flash", hasDividerBelow);
+        public Flash() {
+            super(com.otaliastudios.cameraview.controls.Flash.class, "Flash");
         }
     }
 
     public static class WhiteBalance extends ControlOption<com.otaliastudios.cameraview.controls.WhiteBalance> {
-        WhiteBalance(boolean hasDividerBelow) {
-            super(com.otaliastudios.cameraview.controls.WhiteBalance.class, "White Balance", hasDividerBelow);
+        public WhiteBalance() {
+            super(com.otaliastudios.cameraview.controls.WhiteBalance.class, "White Balance");
         }
     }
 
     public static class Hdr extends ControlOption<com.otaliastudios.cameraview.controls.Hdr> {
-        Hdr(boolean hasDividerBelow) {
-            super(com.otaliastudios.cameraview.controls.Hdr.class, "HDR", hasDividerBelow);
+        public Hdr() {
+            super(com.otaliastudios.cameraview.controls.Hdr.class, "HDR");
         }
     }
 
     public static class VideoCodec extends ControlOption<com.otaliastudios.cameraview.controls.VideoCodec> {
-        VideoCodec(boolean hasDividerBelow) {
-            super(com.otaliastudios.cameraview.controls.VideoCodec.class, "Video Codec", hasDividerBelow);
+        public VideoCodec() {
+            super(com.otaliastudios.cameraview.controls.VideoCodec.class, "Video Codec");
         }
     }
 
     public static class Audio extends ControlOption<com.otaliastudios.cameraview.controls.Audio> {
-        Audio(boolean hasDividerBelow) {
-            super(com.otaliastudios.cameraview.controls.Audio.class, "Audio", hasDividerBelow);
+        public Audio() {
+            super(com.otaliastudios.cameraview.controls.Audio.class, "Audio");
         }
     }

@@ -309,8 +268,8 @@
     private final Gesture gesture;
     private final GestureAction[] allActions = GestureAction.values();
 
-    GestureOption(@NonNull Gesture gesture, String name, boolean hasDividerBelow) {
-        super(name, hasDividerBelow);
+    GestureOption(@NonNull Gesture gesture, String name) {
+        super(name);
         this.gesture = gesture;
     }
@@ -339,45 +298,103 @@
     }
 
     public static class Pinch extends GestureOption {
-        Pinch(boolean hasDividerBelow) {
-            super(Gesture.PINCH, "Pinch", hasDividerBelow);
+        public Pinch() {
+            super(Gesture.PINCH, "Pinch");
         }
     }
 
     public static class HorizontalScroll extends GestureOption {
-        HorizontalScroll(boolean hasDividerBelow) {
-            super(Gesture.SCROLL_HORIZONTAL, "Horizontal Scroll", hasDividerBelow);
+        public HorizontalScroll() {
+            super(Gesture.SCROLL_HORIZONTAL, "Horizontal Scroll");
         }
     }
 
     public static class VerticalScroll extends GestureOption {
-        VerticalScroll(boolean hasDividerBelow) {
-            super(Gesture.SCROLL_VERTICAL, "Vertical Scroll", hasDividerBelow);
+        public VerticalScroll() {
+            super(Gesture.SCROLL_VERTICAL, "Vertical Scroll");
         }
     }
 
     public static class Tap extends GestureOption {
-        Tap(boolean hasDividerBelow) {
-            super(Gesture.TAP, "Tap", hasDividerBelow);
+        public Tap() {
+            super(Gesture.TAP, "Tap");
        }
    }
 
     public static class LongTap extends GestureOption {
-        LongTap(boolean hasDividerBelow) {
-            super(Gesture.LONG_TAP, "Long Tap", hasDividerBelow);
+        public LongTap() {
+            super(Gesture.LONG_TAP, "Long Tap");
+        }
+    }
+
+    private static abstract class OverlayOption extends Option<Boolean> {
+        private View overlay;
+        private Overlay.Target target;
+
+        OverlayOption(@NonNull Overlay.Target target, @NonNull String name, @NonNull View overlay) {
+            super(name);
+            this.overlay = overlay;
+            this.target = target;
+        }
+
+        @NonNull
+        @Override
+        public Collection<Boolean> getAll(@NonNull CameraView view, @NonNull CameraOptions options) {
+            return Arrays.asList(true, false);
+        }
+
+        @NonNull
+        @Override
+        public Boolean get(@NonNull CameraView view) {
+            OverlayLayout.LayoutParams params = (OverlayLayout.LayoutParams) overlay.getLayoutParams();
+            switch (target) {
+                case PREVIEW: return params.drawOnPreview;
+                case PICTURE_SNAPSHOT: return params.drawOnPictureSnapshot;
+                case VIDEO_SNAPSHOT: return params.drawOnVideoSnapshot;
+            }
+            return false;
+        }
+
+        @Override
+        public void set(@NonNull CameraView view, @NonNull Boolean value) {
+            OverlayLayout.LayoutParams params = (OverlayLayout.LayoutParams) overlay.getLayoutParams();
+            switch (target) {
+                case PREVIEW: params.drawOnPreview = value; break;
+                case PICTURE_SNAPSHOT: params.drawOnPictureSnapshot = value; break;
+                case VIDEO_SNAPSHOT: params.drawOnVideoSnapshot = value; break;
+            }
+            overlay.setLayoutParams(params);
+        }
+    }
+
+    public static class OverlayInPreview extends OverlayOption {
+        public OverlayInPreview(@NonNull View overlay) {
+            super(Overlay.Target.PREVIEW, "Overlay in Preview", overlay);
+        }
+    }
+
+    public static class OverlayInPictureSnapshot extends OverlayOption {
+        public OverlayInPictureSnapshot(@NonNull View overlay) {
+            super(Overlay.Target.PICTURE_SNAPSHOT, "Overlay in Picture Snapshot", overlay);
+        }
+    }
+
+    public static class OverlayInVideoSnapshot extends OverlayOption {
+        public OverlayInVideoSnapshot(@NonNull View overlay) {
+            super(Overlay.Target.VIDEO_SNAPSHOT, "Overlay in Video Snapshot", overlay);
         }
     }
 
     public static class Grid extends ControlOption<com.otaliastudios.cameraview.controls.Grid> {
-        Grid(boolean hasDividerBelow) {
-            super(com.otaliastudios.cameraview.controls.Grid.class, "Grid Lines", hasDividerBelow);
+        public Grid() {
+            super(com.otaliastudios.cameraview.controls.Grid.class, "Grid Lines");
         }
     }
 
     public static class GridColor extends Option<Pair<Integer, String>> {
-        GridColor(boolean hasDividerBelow) {
-            super("Grid Color", hasDividerBelow);
+        public GridColor() {
+            super("Grid Color");
         }
 
         private static final List<Pair<Integer, String>> ALL = Arrays.asList(
@@ -420,8 +437,8 @@
     }
 
     public static class UseDeviceOrientation extends Option<Boolean> {
-        UseDeviceOrientation(boolean hasDividerBelow) {
-            super("Use Device Orientation", hasDividerBelow);
+        public UseDeviceOrientation() {
+            super("Use Device Orientation");
         }
 
         @NonNull

@@ -33,23 +33,27 @@ public class OptionView<Value> extends LinearLayout implements Spinner.OnItemSel
     private Callback callback;
     private Spinner spinner;
 
-    public OptionView(Context context, Option option, Callback callback) {
+    public OptionView(@NonNull Context context) {
         super(context);
-        this.option = option;
-        this.callback = callback;
         setOrientation(VERTICAL);
         inflate(context, R.layout.option_view, this);
-        TextView title = findViewById(R.id.title);
-        title.setText(option.getName());
-        View divider = findViewById(R.id.divider);
-        divider.setVisibility(option.hasDividerBelow() ? View.VISIBLE : View.GONE);
         ViewGroup content = findViewById(R.id.content);
         spinner = new Spinner(context, Spinner.MODE_DROPDOWN);
         content.addView(spinner);
     }
 
+    public void setHasDivider(boolean hasDivider) {
+        View divider = findViewById(R.id.divider);
+        divider.setVisibility(hasDivider ? View.VISIBLE : View.GONE);
+    }
+
+    public void setOption(@NonNull Option<Value> option, @NonNull Callback callback) {
+        this.option = option;
+        this.callback = callback;
+        TextView title = findViewById(R.id.title);
+        title.setText(option.getName());
+    }
+
     @SuppressWarnings("all")
     public void onCameraOpened(CameraView view, CameraOptions options) {
         values = new ArrayList(option.getAll(view, options));
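The split into a no-arg constructor plus `setOption`/`setHasDivider` means an `OptionView` can now be created first and configured later. A minimal usage sketch mirroring the `CameraActivity` changes above (`context`, `watermark`, `callback` and `group` are assumed to exist in the caller's scope):

```java
// Two-step construction: create the view, then bind an Option and a
// divider flag, exactly as CameraActivity does in this commit.
OptionView<Boolean> view = new OptionView<>(context);
view.setOption(new Option.OverlayInPreview(watermark), callback);
view.setHasDivider(true);
group.addView(view,
        ViewGroup.LayoutParams.MATCH_PARENT,
        ViewGroup.LayoutParams.WRAP_CONTENT);
```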

@@ -17,7 +17,7 @@
         android:layout_marginBottom="88dp"
         android:keepScreenOn="true"
         app:cameraExperimental="true"
-        app:cameraEngine="camera2"
+        app:cameraEngine="camera1"
         app:cameraPreview="glSurface"
         app:cameraPlaySounds="true"
         app:cameraGrid="off"

@@ -29,8 +29,21 @@
         app:cameraGestureScrollHorizontal="exposureCorrection"
         app:cameraGestureScrollVertical="none"
         app:cameraMode="picture"
-        app:cameraAutoFocusMarker="@string/cameraview_default_autofocus_marker"/>
+        app:cameraAutoFocusMarker="@string/cameraview_default_autofocus_marker">
+
+        <!-- Watermark -->
+        <ImageView
+            android:id="@+id/watermark"
+            android:layout_width="wrap_content"
+            android:layout_height="wrap_content"
+            android:layout_gravity="bottom|end"
+            app:layout_drawOnPreview="true"
+            app:layout_drawOnVideoSnapshot="true"
+            app:layout_drawOnPictureSnapshot="true"
+            android:src="@mipmap/cameraview"
+            android:padding="8dp"/>
+    </com.otaliastudios.cameraview.CameraView>
 
     <ImageButton
         android:id="@+id/toggleCamera"
@@ -10,6 +10,7 @@ New versions are released through GitHub, so the reference page is the [GitHub R
 ### v2.0.0 (to be released)
 
+- New: support for watermarks and animated overlays ([docs](../docs/watermarks-and-overlays.html)), thanks to [@RAN3000][RAN3000] ([#502][502], [#421][421])
 - New: added `onVideoRecordingStart()` to be notified when video recording starts, thanks to [@agrawalsuneet][agrawalsuneet] ([#498][498])
 - New: added `cameraUseDeviceOrientation` to choose whether picture and video outputs should consider the device orientation or not ([#497][497])
 - Improvement: improved Camera2 stability and various bugs fixed (e.g. [#501][501])

@@ -58,6 +59,7 @@ This is the first beta release. For changes with respect to v1, please take a lo
 [cneuwirt]: https://github.com/cneuwirt
 [agrawalsuneet]: https://github.com/agrawalsuneet
+[RAN3000]: https://github.com/RAN3000
 
 [356]: https://github.com/natario1/CameraView/pull/356
 [360]: https://github.com/natario1/CameraView/pull/360

@@ -67,6 +69,7 @@ This is the first beta release. For changes with respect to v1, please take a lo
 [471]: https://github.com/natario1/CameraView/pull/471
 [431]: https://github.com/natario1/CameraView/pull/431
 [403]: https://github.com/natario1/CameraView/pull/403
+[421]: https://github.com/natario1/CameraView/pull/421
 [435]: https://github.com/natario1/CameraView/pull/435
 [477]: https://github.com/natario1/CameraView/pull/477
 [482]: https://github.com/natario1/CameraView/pull/482

@@ -75,3 +78,4 @@ This is the first beta release. For changes with respect to v1, please take a lo
 [497]: https://github.com/natario1/CameraView/pull/497
 [498]: https://github.com/natario1/CameraView/pull/498
 [501]: https://github.com/natario1/CameraView/pull/501
+[502]: https://github.com/natario1/CameraView/pull/502

@@ -2,7 +2,7 @@
 layout: page
 title: "Debugging"
 category: docs
-order: 12
+order: 13
 date: 2018-12-20 20:02:38
 disqus: 1
 ---

@@ -2,7 +2,7 @@
 layout: page
 title: "Error Handling"
 category: docs
-order: 11
+order: 12
 date: 2018-12-20 20:02:31
 disqus: 1
 ---

@@ -4,7 +4,7 @@ title: "More features"
 subtitle: "Undocumented features & more"
 description: "Undocumented features & more"
 category: docs
-order: 13
+order: 14
 date: 2018-12-20 20:41:20
 disqus: 1
 ---

@@ -1,6 +1,6 @@
 ---
 layout: page
-title: "Engine and previews"
+title: "Engine and Previews"
 subtitle: "Camera engine and preview implementations"
 description: "Camera engine and preview implementations"
 category: docs

@@ -4,7 +4,7 @@ title: "Runtime Permissions"
 subtitle: "Permissions and Manifest setup"
 description: "Permissions and Manifest setup"
 category: docs
-order: 10
+order: 11
 date: 2018-12-20 20:03:03
 disqus: 1
 ---

@@ -0,0 +1,73 @@
---
layout: page
title: "Watermarks and Overlays"
subtitle: "Static and animated overlays"
description: "Static and animated overlays"
category: docs
order: 10
date: 2019-07-14 20:14:31
disqus: 1
---
CameraView offers a simple yet powerful framework for watermarks and overlays of any kind.
These overlays can be shown on the live camera preview, plus they appear on the media results
taken with `takePictureSnapshot()` or `takeVideoSnapshot()`.
### Simple Usage
```xml
<com.otaliastudios.cameraview.CameraView
android:layout_width="wrap_content"
android:layout_height="wrap_content">
<!-- Watermark in bottom/end corner -->
<ImageView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="bottom|end"
android:src="@drawable/watermark"
app:layout_drawOnPreview="true|false"
app:layout_drawOnPictureSnapshot="true|false"
app:layout_drawOnVideoSnapshot="true|false"/>
<!-- More overlays here... -->
</com.otaliastudios.cameraview.CameraView>
```
As you can see, the overlay system is View-based - each overlay is just a real `View` attached
to the view hierarchy. This is a powerful and creative tool. You can, for instance, retrieve the
overlay with `findViewById` and:

- Animate it!
- Change its visibility
- Change its position or appearance
- Do all of this while a video is being recorded
Any changes in the overlay appearance will be recorded in real-time in the picture snapshot
or video snapshot that you are capturing.
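For instance, a sketch along the lines of the demo app's watermark animation - here `R.id.watermark` is assumed to be the id you gave your overlay:

```java
// Pulse the watermark forever; since overlays are drawn in real-time,
// the animation is also captured in picture and video snapshots.
final View watermark = findViewById(R.id.watermark);
ValueAnimator animator = ValueAnimator.ofFloat(1F, 0.8F);
animator.setDuration(300);
animator.setRepeatCount(ValueAnimator.INFINITE);
animator.setRepeatMode(ValueAnimator.REVERSE);
animator.addUpdateListener(new ValueAnimator.AnimatorUpdateListener() {
    @Override
    public void onAnimationUpdate(ValueAnimator animation) {
        float scale = (float) animation.getAnimatedValue();
        watermark.setScaleX(scale);
        watermark.setScaleY(scale);
    }
});
animator.start();
```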
As you can see in the example, you can also selectively choose, for each overlay, whether it
should be drawn on the preview (`layout_drawOnPreview`), on picture snapshots
(`layout_drawOnPictureSnapshot`), or on video snapshots (`layout_drawOnVideoSnapshot`).
### Advanced Usage
If you need to change these flags at runtime, you should cast the overlay `LayoutParams` as follows:
```java
// Cast to OverlayLayout.LayoutParams
View overlay = findViewById(R.id.watermark);
OverlayLayout.LayoutParams params = (OverlayLayout.LayoutParams) overlay.getLayoutParams();
// Perform changes
params.drawOnPreview = true; // draw on preview
params.drawOnPreview = false; // do not draw on preview
params.drawOnPictureSnapshot = true; // draw on picture snapshots
params.drawOnPictureSnapshot = false; // do not draw on picture snapshots
params.drawOnVideoSnapshot = true; // draw on video snapshots
params.drawOnVideoSnapshot = false; // do not draw on video snapshots
// When done, apply
overlay.setLayoutParams(params);
```

@@ -9,14 +9,15 @@ CameraView is a well documented, high-level library that makes capturing picture
 addressing most of the common issues and needs, and still leaving you with flexibility where needed.
 
 - Fast & reliable
-- Gestures support
-- Camera1 or Camera2 powered engine
-- Frame processing support
-- OpenGL powered preview
-- Take high-quality content with `takePicture` and `takeVideo`
-- Take super-fast snapshots with `takePictureSnapshot` and `takeVideoSnapshot`
-- Smart sizing: create a `CameraView` of any size
-- Control HDR, flash, zoom, white balance, exposure, location, grid drawing & more
+- Gestures support [[docs]](docs/gestures.html)
+- Camera1 or Camera2 powered engine [[docs]](docs/previews.html)
+- Frame processing support [[docs]](docs/frame-processing.html)
+- Watermarks & animated overlays [[docs]](docs/watermarks-and-overlays.html)
+- OpenGL powered preview [[docs]](docs/previews.html)
+- Take high-quality content with `takePicture` and `takeVideo` [[docs]](docs/capturing-media.html)
+- Take super-fast snapshots with `takePictureSnapshot` and `takeVideoSnapshot` [[docs]](docs/capturing-media.html)
+- Smart sizing: create a `CameraView` of any size [[docs]](docs/preview-size.html)
+- Control HDR, flash, zoom, white balance, exposure, location, grid drawing & more [[docs]](docs/controls.html)
 - Lightweight
 - Works down to API level 15
 - Well tested

@@ -28,7 +29,7 @@ addressing most of the common issues and needs, and still leaving you with flexi
 ### Get started
 
 Get started with [install info](about/install.html), [quick setup](about/getting-started.html), or
-read the in-depth [documentation](docs/camera-events.html).
+start reading the in-depth [documentation](docs/camera-events.html).
 
 ### Older versions
