Improve video encoding (#506)

* Reorder code and add long comments

* Simplify encoders Config

* Fix Audio recording bugs

* Anticipate max length detection

* Anticipate even more

* Estimate video bit rate instead of ugly default

* Fix bugs, better logs and comments

* Fix long-standing sync bug

* Make inner classes public

* Remove performance logging code

* Add Audio.MONO and Audio.STEREO (usage sketch after this list)

* Add mono and stereo in attrs

* Write zeros when we have gaps

* Improve comments

* Add performance flags

* Move configs to separate classes

* Fix stereo bug

* Add onVideoRecordingEnd

* Add changelog notes

* Address some TODOs

* Refactor tests, add PoolTest
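
Among the user-facing additions in this commit are the Audio.MONO / Audio.STEREO channel controls and the onVideoRecordingEnd() callback. Below is a minimal, hedged usage sketch: the wrapper class, method and output file are illustrative and not part of this commit; the Audio values, the CameraListener callbacks and takeVideoSnapshot() are the APIs touched by the diff that follows.

import java.io.File;

import androidx.annotation.NonNull;

import com.otaliastudios.cameraview.CameraListener;
import com.otaliastudios.cameraview.CameraView;
import com.otaliastudios.cameraview.VideoResult;
import com.otaliastudios.cameraview.controls.Audio;

// Illustrative only: assumes a CameraView instance already set up elsewhere.
public class RecordingExample {

    void recordStereoClip(CameraView camera, File output) {
        camera.set(Audio.STEREO); // or Audio.MONO, Audio.ON, Audio.OFF

        camera.addCameraListener(new CameraListener() {
            @Override
            public void onVideoRecordingStart() {
                // Frames are starting to be written.
            }

            @Override
            public void onVideoRecordingEnd() {
                // New in this commit: recording has stopped; the result arrives in onVideoTaken().
            }

            @Override
            public void onVideoTaken(@NonNull VideoResult result) {
                // The video file is ready.
            }
        });

        camera.takeVideoSnapshot(output);
    }
}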
Ref: pull/513/head
Author: Mattia Iavarone (committed via GitHub)
Commit: 6962744d4f (parent: ea952d1497)
54 changed files (changed lines in parentheses):

  1. README.md (2)
  2. cameraview/src/androidTest/java/com/otaliastudios/cameraview/BaseTest.java (79)
  3. cameraview/src/androidTest/java/com/otaliastudios/cameraview/CameraLoggerTest.java (16)
  4. cameraview/src/androidTest/java/com/otaliastudios/cameraview/CameraUtilsTest.java (4)
  5. cameraview/src/androidTest/java/com/otaliastudios/cameraview/CameraViewCallbacksTest.java (107)
  6. cameraview/src/androidTest/java/com/otaliastudios/cameraview/CameraViewTest.java (42)
  7. cameraview/src/androidTest/java/com/otaliastudios/cameraview/engine/CameraIntegrationTest.java (65)
  8. cameraview/src/androidTest/java/com/otaliastudios/cameraview/engine/MockCameraEngine.java (9)
  9. cameraview/src/androidTest/java/com/otaliastudios/cameraview/gesture/GestureFinderTest.java (2)
  10. cameraview/src/androidTest/java/com/otaliastudios/cameraview/gesture/PinchGestureFinderTest.java (4)
  11. cameraview/src/androidTest/java/com/otaliastudios/cameraview/gesture/ScrollGestureFinderTest.java (7)
  12. cameraview/src/androidTest/java/com/otaliastudios/cameraview/gesture/TapGestureFinderTest.java (4)
  13. cameraview/src/androidTest/java/com/otaliastudios/cameraview/internal/GridLinesLayoutTest.java (2)
  14. cameraview/src/androidTest/java/com/otaliastudios/cameraview/internal/utils/CropHelperTest.java (4)
  15. cameraview/src/androidTest/java/com/otaliastudios/cameraview/internal/utils/OrientationHelperTest.java (11)
  16. cameraview/src/androidTest/java/com/otaliastudios/cameraview/markers/MarkerLayoutTest.java (3)
  17. cameraview/src/androidTest/java/com/otaliastudios/cameraview/overlay/OverlayLayoutTest.java (31)
  18. cameraview/src/androidTest/java/com/otaliastudios/cameraview/picture/PictureRecorderTest.java (1)
  19. cameraview/src/androidTest/java/com/otaliastudios/cameraview/preview/CameraPreviewTest.java (5)
  20. cameraview/src/androidTest/java/com/otaliastudios/cameraview/preview/SurfaceCameraPreviewTest.java (4)
  21. cameraview/src/androidTest/java/com/otaliastudios/cameraview/preview/TextureCameraPreviewTest.java (4)
  22. cameraview/src/androidTest/java/com/otaliastudios/cameraview/video/VideoRecorderTest.java (10)
  23. cameraview/src/main/java/com/otaliastudios/cameraview/CameraListener.java (21)
  24. cameraview/src/main/java/com/otaliastudios/cameraview/CameraLogger.java (27)
  25. cameraview/src/main/java/com/otaliastudios/cameraview/CameraOptions.java (15)
  26. cameraview/src/main/java/com/otaliastudios/cameraview/CameraView.java (21)
  27. cameraview/src/main/java/com/otaliastudios/cameraview/controls/Audio.java (18)
  28. cameraview/src/main/java/com/otaliastudios/cameraview/engine/CameraEngine.java (5)
  29. cameraview/src/main/java/com/otaliastudios/cameraview/gesture/GestureFinder.java (4)
  30. cameraview/src/main/java/com/otaliastudios/cameraview/internal/utils/Pool.java (79)
  31. cameraview/src/main/java/com/otaliastudios/cameraview/preview/GlCameraPreview.java (19)
  32. cameraview/src/main/java/com/otaliastudios/cameraview/video/FullVideoRecorder.java (17)
  33. cameraview/src/main/java/com/otaliastudios/cameraview/video/SnapshotVideoRecorder.java (66)
  34. cameraview/src/main/java/com/otaliastudios/cameraview/video/VideoRecorder.java (23)
  35. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/AudioConfig.java (96)
  36. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/AudioMediaEncoder.java (448)
  37. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/AudioTimestamp.java (105)
  38. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/InputBuffer.java (15)
  39. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/MediaEncoder.java (305)
  40. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/MediaEncoderEngine.java (109)
  41. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/OutputBuffer.java (9)
  42. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/TextureConfig.java (38)
  43. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/TextureMediaEncoder.java (146)
  44. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/VideoConfig.java (25)
  45. cameraview/src/main/java/com/otaliastudios/cameraview/video/encoding/VideoMediaEncoder.java (61)
  46. cameraview/src/main/res/values/attrs.xml (2)
  47. cameraview/src/test/java/com/otaliastudios/cameraview/internal/utils/PoolTest.java (168)
  48. demo/src/main/java/com/otaliastudios/cameraview/demo/CameraActivity.java (21)
  49. demo/src/main/res/layout/activity_camera.xml (2)
  50. docs/_posts/2018-12-20-camera-events.md (2)
  51. docs/_posts/2018-12-20-capturing-media.md (31)
  52. docs/_posts/2018-12-20-changelog.md (6)
  53. docs/_posts/2018-12-20-controls.md (46)
  54. docs/_posts/2018-12-20-more-features.md (32)

@ -95,7 +95,7 @@ motivation boost to push the library forward.
app:cameraFlash="on|auto|torch|off" app:cameraFlash="on|auto|torch|off"
app:cameraWhiteBalance="auto|cloudy|daylight|fluorescent|incandescent" app:cameraWhiteBalance="auto|cloudy|daylight|fluorescent|incandescent"
app:cameraMode="picture|video" app:cameraMode="picture|video"
app:cameraAudio="on|off" app:cameraAudio="on|off|mono|stereo"
app:cameraGrid="draw3x3|draw4x4|drawPhi|off" app:cameraGrid="draw3x3|draw4x4|drawPhi|off"
app:cameraGridColor="@color/grid_color" app:cameraGridColor="@color/grid_color"
app:cameraPlaySounds="true|false" app:cameraPlaySounds="true|false"

@ -8,11 +8,9 @@ import android.os.Handler;
import android.os.Looper; import android.os.Looper;
import android.os.PowerManager; import android.os.PowerManager;
import androidx.annotation.NonNull;
import androidx.test.platform.app.InstrumentationRegistry; import androidx.test.platform.app.InstrumentationRegistry;
import android.util.Log;
import android.view.View;
import com.otaliastudios.cameraview.internal.utils.Op; import com.otaliastudios.cameraview.internal.utils.Op;
import org.junit.After; import org.junit.After;
@ -27,9 +25,7 @@ import java.util.concurrent.CountDownLatch;
import static android.content.Context.KEYGUARD_SERVICE; import static android.content.Context.KEYGUARD_SERVICE;
import static android.content.Context.POWER_SERVICE; import static android.content.Context.POWER_SERVICE;
import static org.mockito.Matchers.any;
import static org.mockito.Mockito.doAnswer; import static org.mockito.Mockito.doAnswer;
import static org.mockito.Mockito.mock;
public class BaseTest { public class BaseTest {
@ -38,17 +34,16 @@ public class BaseTest {
// https://github.com/linkedin/test-butler/blob/bc2bb4df13d0a554d2e2b0ea710795017717e710/test-butler-app/src/main/java/com/linkedin/android/testbutler/ButlerService.java#L121 // https://github.com/linkedin/test-butler/blob/bc2bb4df13d0a554d2e2b0ea710795017717e710/test-butler-app/src/main/java/com/linkedin/android/testbutler/ButlerService.java#L121
@BeforeClass @BeforeClass
@SuppressWarnings("MissingPermission") public static void beforeClass_wakeUp() {
public static void wakeUp() {
CameraLogger.setLogLevel(CameraLogger.LEVEL_VERBOSE); CameraLogger.setLogLevel(CameraLogger.LEVEL_VERBOSE);
// Acquire a keyguard lock to prevent the lock screen from randomly appearing and breaking tests // Acquire a keyguard lock to prevent the lock screen from randomly appearing and breaking tests
KeyguardManager keyguardManager = (KeyguardManager) context().getSystemService(KEYGUARD_SERVICE); KeyguardManager keyguardManager = (KeyguardManager) getContext().getSystemService(KEYGUARD_SERVICE);
keyguardLock = keyguardManager.newKeyguardLock("CameraViewLock"); keyguardLock = keyguardManager.newKeyguardLock("CameraViewLock");
keyguardLock.disableKeyguard(); keyguardLock.disableKeyguard();
// Acquire a wake lock to prevent the cpu from going to sleep and breaking tests // Acquire a wake lock to prevent the cpu from going to sleep and breaking tests
PowerManager powerManager = (PowerManager) context().getSystemService(POWER_SERVICE); PowerManager powerManager = (PowerManager) getContext().getSystemService(POWER_SERVICE);
wakeLock = powerManager.newWakeLock(PowerManager.FULL_WAKE_LOCK wakeLock = powerManager.newWakeLock(PowerManager.FULL_WAKE_LOCK
| PowerManager.ACQUIRE_CAUSES_WAKEUP | PowerManager.ACQUIRE_CAUSES_WAKEUP
| PowerManager.ON_AFTER_RELEASE, "CameraViewLock"); | PowerManager.ON_AFTER_RELEASE, "CameraViewLock");
@ -56,8 +51,9 @@ public class BaseTest {
} }
@AfterClass @AfterClass
@SuppressWarnings("MissingPermission") public static void afterClass_releaseWakeUp() {
public static void releaseWakeUp() { CameraLogger.setLogLevel(CameraLogger.LEVEL_ERROR);
wakeLock.release(); wakeLock.release();
keyguardLock.reenableKeyguard(); keyguardLock.reenableKeyguard();
} }
@ -66,61 +62,48 @@ public class BaseTest {
* This will make mockito report the error when it should. * This will make mockito report the error when it should.
* Mockito reports failure on the next mockito invocation, which is terrible * Mockito reports failure on the next mockito invocation, which is terrible
* since it might be on the next test or even never happen. * since it might be on the next test or even never happen.
*
* Calling this
*/ */
@After @After
public void syncMockito() { public void after_checkMockito() {
Object object = Mockito.mock(Object.class); Object object = Mockito.mock(Object.class);
//noinspection ResultOfMethodCallIgnored
object.toString(); object.toString();
} }
public static void ui(Runnable runnable) { @NonNull
InstrumentationRegistry.getInstrumentation().runOnMainSync(runnable); protected static Context getContext() {
}
public static void uiAsync(Runnable runnable) {
new Handler(Looper.getMainLooper()).post(runnable);
}
public static Context context() {
return InstrumentationRegistry.getInstrumentation().getContext(); return InstrumentationRegistry.getInstrumentation().getContext();
} }
public static void uiRequestLayout(final View view) { protected static void uiSync(Runnable runnable) {
ui(new Runnable() { InstrumentationRegistry.getInstrumentation().runOnMainSync(runnable);
@Override
public void run() {
view.requestLayout();
}
});
} }
public static void idle() { @SuppressWarnings("unused")
InstrumentationRegistry.getInstrumentation().waitForIdleSync(); protected static void uiAsync(Runnable runnable) {
new Handler(Looper.getMainLooper()).post(runnable);
} }
public static void sleep(long time) { @SuppressWarnings("unused")
try { protected static void waitUiIdle() {
Thread.sleep(time); InstrumentationRegistry.getInstrumentation().waitForIdleSync();
} catch (InterruptedException e) {
e.printStackTrace();
}
} }
public static void grantPermissions() { protected static void grantAllPermissions() {
grantPermission("android.permission.CAMERA"); grantPermission("android.permission.CAMERA");
grantPermission("android.permission.RECORD_AUDIO"); grantPermission("android.permission.RECORD_AUDIO");
grantPermission("android.permission.WRITE_EXTERNAL_STORAGE"); grantPermission("android.permission.WRITE_EXTERNAL_STORAGE");
} }
public static void grantPermission(String permission) { @SuppressWarnings("WeakerAccess")
protected static void grantPermission(@NonNull String permission) {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.M) return; if (Build.VERSION.SDK_INT < Build.VERSION_CODES.M) return;
String command = "pm grant " + context().getPackageName() + " " + permission; String command = "pm grant " + getContext().getPackageName() + " " + permission;
InstrumentationRegistry.getInstrumentation().getUiAutomation().executeShellCommand(command); InstrumentationRegistry.getInstrumentation().getUiAutomation().executeShellCommand(command);
} }
public static Stubber doCountDown(final CountDownLatch latch) { @NonNull
protected static Stubber doCountDown(final CountDownLatch latch) {
return doAnswer(new Answer<Object>() { return doAnswer(new Answer<Object>() {
@Override @Override
public Object answer(InvocationOnMock invocation) { public Object answer(InvocationOnMock invocation) {
@ -130,22 +113,24 @@ public class BaseTest {
}); });
} }
public static <T> Stubber doEndTask(final Op<T> op, final T response) { @NonNull
protected static <T> Stubber doEndOp(final Op<T> op, final T response) {
return doAnswer(new Answer<Object>() { return doAnswer(new Answer<Object>() {
@Override @Override
public Object answer(InvocationOnMock invocation) throws Throwable { public Object answer(InvocationOnMock invocation) {
op.end(response); op.end(response);
return null; return null;
} }
}); });
} }
public static Stubber doEndTask(final Op op, final int withReturnArgument) { @NonNull
protected static <T> Stubber doEndOp(final Op<T> op, final int withReturnArgument) {
return doAnswer(new Answer<Object>() { return doAnswer(new Answer<Object>() {
@Override @Override
public Object answer(InvocationOnMock invocation) throws Throwable { public Object answer(InvocationOnMock invocation) {
Object o = invocation.getArguments()[withReturnArgument];
//noinspection unchecked //noinspection unchecked
T o = (T) invocation.getArguments()[withReturnArgument];
op.end(o); op.end(o);
return null; return null;
} }

@ -10,8 +10,6 @@ import org.junit.After;
import org.junit.Before; import org.junit.Before;
import org.junit.Test; import org.junit.Test;
import org.junit.runner.RunWith; import org.junit.runner.RunWith;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.stubbing.Answer;
import static org.junit.Assert.*; import static org.junit.Assert.*;
import static org.mockito.Mockito.*; import static org.mockito.Mockito.*;
@ -26,11 +24,13 @@ public class CameraLoggerTest extends BaseTest {
@Before @Before
public void setUp() { public void setUp() {
CameraLogger.setLogLevel(CameraLogger.LEVEL_VERBOSE); CameraLogger.setLogLevel(CameraLogger.LEVEL_VERBOSE);
CameraLogger.unregisterLogger(CameraLogger.sAndroidLogger); // Avoid writing into Logs during these tests
logger = CameraLogger.create(loggerTag); logger = CameraLogger.create(loggerTag);
} }
@After @After
public void tearDown() { public void tearDown() {
CameraLogger.registerLogger(CameraLogger.sAndroidLogger);
logger = null; logger = null;
} }
@ -110,15 +110,9 @@ public class CameraLoggerTest extends BaseTest {
CameraLogger.registerLogger(mock); CameraLogger.registerLogger(mock);
final Op<Throwable> op = new Op<>(); final Op<Throwable> op = new Op<>();
doAnswer(new Answer() { doEndOp(op, 3)
@Override .when(mock)
public Object answer(InvocationOnMock invocation) throws Throwable { .log(anyInt(), anyString(), anyString(), any(Throwable.class));
Object[] args = invocation.getArguments();
Throwable throwable = (Throwable) args[3];
op.end(throwable);
return null;
}
}).when(mock).log(anyInt(), anyString(), anyString(), any(Throwable.class));
op.listen(); op.listen();
logger.e("Got no error."); logger.e("Got no error.");

@ -56,7 +56,7 @@ public class CameraUtilsTest extends BaseTest {
}; };
// Run on ui because it involves handlers. // Run on ui because it involves handlers.
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
if (maxWidth > 0 && maxHeight > 0) { if (maxWidth > 0 && maxHeight > 0) {
@ -84,8 +84,6 @@ public class CameraUtilsTest extends BaseTest {
assertEquals(0, other.getPixel(0, h-1)); assertEquals(0, other.getPixel(0, h-1));
assertEquals(0, other.getPixel(w-1, 0)); assertEquals(0, other.getPixel(w-1, 0));
assertEquals(0, other.getPixel(w-1, h-1)); assertEquals(0, other.getPixel(w-1, h-1));
// TODO: improve when we add EXIF writing to byte arrays
} }

@ -29,19 +29,14 @@ import org.junit.After;
import org.junit.Before; import org.junit.Before;
import org.junit.Test; import org.junit.Test;
import org.junit.runner.RunWith; import org.junit.runner.RunWith;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.stubbing.Answer;
import org.mockito.stubbing.Stubber;
import static junit.framework.Assert.assertNotNull; import static junit.framework.Assert.assertNotNull;
import static org.junit.Assert.assertEquals; import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull; import static org.junit.Assert.assertNull;
import static org.mockito.ArgumentMatchers.nullable; import static org.mockito.ArgumentMatchers.nullable;
import static org.mockito.Matchers.any; import static org.mockito.Matchers.any;
import static org.mockito.Matchers.anyFloat;
import static org.mockito.Matchers.anyInt; import static org.mockito.Matchers.anyInt;
import static org.mockito.Matchers.eq; import static org.mockito.Matchers.eq;
import static org.mockito.Mockito.doAnswer;
import static org.mockito.Mockito.mock; import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.never; import static org.mockito.Mockito.never;
import static org.mockito.Mockito.times; import static org.mockito.Mockito.times;
@ -54,6 +49,8 @@ import static org.mockito.Mockito.verify;
@MediumTest @MediumTest
public class CameraViewCallbacksTest extends BaseTest { public class CameraViewCallbacksTest extends BaseTest {
private final static long DELAY = 500;
private CameraView camera; private CameraView camera;
private CameraListener listener; private CameraListener listener;
private FrameProcessor processor; private FrameProcessor processor;
@ -63,10 +60,10 @@ public class CameraViewCallbacksTest extends BaseTest {
@Before @Before
public void setUp() { public void setUp() {
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
Context context = context(); Context context = getContext();
listener = mock(CameraListener.class); listener = mock(CameraListener.class);
processor = mock(FrameProcessor.class); processor = mock(FrameProcessor.class);
camera = new CameraView(context) { camera = new CameraView(context) {
@ -106,99 +103,101 @@ public class CameraViewCallbacksTest extends BaseTest {
listener = null; listener = null;
} }
// Completes our op.
private Stubber completeTask() {
return doAnswer(new Answer() {
@Override
public Object answer(InvocationOnMock invocation) throws Throwable {
op.end(true);
return null;
}
});
}
@Test @Test
public void testDontDispatchIfRemoved() { public void testDontDispatchIfRemoved() {
camera.removeCameraListener(listener); camera.removeCameraListener(listener);
completeTask().when(listener).onCameraOpened(null); CameraOptions options = mock(CameraOptions.class);
camera.mCameraCallbacks.dispatchOnCameraOpened(null); doEndOp(op, true).when(listener).onCameraOpened(options);
camera.mCameraCallbacks.dispatchOnCameraOpened(options);
assertNull(op.await(500)); assertNull(op.await(DELAY));
verify(listener, never()).onCameraOpened(null); verify(listener, never()).onCameraOpened(options);
} }
@Test @Test
public void testDontDispatchIfCleared() { public void testDontDispatchIfCleared() {
camera.clearCameraListeners(); camera.clearCameraListeners();
completeTask().when(listener).onCameraOpened(null); CameraOptions options = mock(CameraOptions.class);
camera.mCameraCallbacks.dispatchOnCameraOpened(null); doEndOp(op, true).when(listener).onCameraOpened(options);
camera.mCameraCallbacks.dispatchOnCameraOpened(options);
assertNull(op.await(500)); assertNull(op.await(DELAY));
verify(listener, never()).onCameraOpened(null); verify(listener, never()).onCameraOpened(options);
} }
@Test @Test
public void testDispatchOnCameraOpened() { public void testDispatchOnCameraOpened() {
completeTask().when(listener).onCameraOpened(null); CameraOptions options = mock(CameraOptions.class);
camera.mCameraCallbacks.dispatchOnCameraOpened(null); doEndOp(op, true).when(listener).onCameraOpened(options);
camera.mCameraCallbacks.dispatchOnCameraOpened(options);
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onCameraOpened(null); verify(listener, times(1)).onCameraOpened(options);
} }
@Test @Test
public void testDispatchOnCameraClosed() { public void testDispatchOnCameraClosed() {
completeTask().when(listener).onCameraClosed(); doEndOp(op, true).when(listener).onCameraClosed();
camera.mCameraCallbacks.dispatchOnCameraClosed(); camera.mCameraCallbacks.dispatchOnCameraClosed();
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onCameraClosed(); verify(listener, times(1)).onCameraClosed();
} }
@Test @Test
public void testDispatchOnVideoRecordingStart() { public void testDispatchOnVideoRecordingStart() {
completeTask().when(listener).onVideoRecordingStart(); doEndOp(op, true).when(listener).onVideoRecordingStart();
camera.mCameraCallbacks.dispatchOnVideoRecordingStart(); camera.mCameraCallbacks.dispatchOnVideoRecordingStart();
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onVideoRecordingStart(); verify(listener, times(1)).onVideoRecordingStart();
} }
@Test
public void testDispatchOnVideoRecordingEnd() {
doEndOp(op, true).when(listener).onVideoRecordingEnd();
camera.mCameraCallbacks.dispatchOnVideoRecordingEnd();
assertNotNull(op.await(DELAY));
verify(listener, times(1)).onVideoRecordingEnd();
}
@Test @Test
public void testDispatchOnVideoTaken() { public void testDispatchOnVideoTaken() {
VideoResult.Stub stub = new VideoResult.Stub(); VideoResult.Stub stub = new VideoResult.Stub();
completeTask().when(listener).onVideoTaken(any(VideoResult.class)); doEndOp(op, true).when(listener).onVideoTaken(any(VideoResult.class));
camera.mCameraCallbacks.dispatchOnVideoTaken(stub); camera.mCameraCallbacks.dispatchOnVideoTaken(stub);
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onVideoTaken(any(VideoResult.class)); verify(listener, times(1)).onVideoTaken(any(VideoResult.class));
} }
@Test @Test
public void testDispatchOnPictureTaken() { public void testDispatchOnPictureTaken() {
PictureResult.Stub stub = new PictureResult.Stub(); PictureResult.Stub stub = new PictureResult.Stub();
completeTask().when(listener).onPictureTaken(any(PictureResult.class)); doEndOp(op, true).when(listener).onPictureTaken(any(PictureResult.class));
camera.mCameraCallbacks.dispatchOnPictureTaken(stub); camera.mCameraCallbacks.dispatchOnPictureTaken(stub);
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onPictureTaken(any(PictureResult.class)); verify(listener, times(1)).onPictureTaken(any(PictureResult.class));
} }
@Test @Test
public void testDispatchOnZoomChanged() { public void testDispatchOnZoomChanged() {
completeTask().when(listener).onZoomChanged(eq(0f), eq(new float[]{0, 1}), nullable(PointF[].class)); doEndOp(op, true).when(listener).onZoomChanged(eq(0f), eq(new float[]{0, 1}), nullable(PointF[].class));
camera.mCameraCallbacks.dispatchOnZoomChanged(0f, null); camera.mCameraCallbacks.dispatchOnZoomChanged(0f, null);
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onZoomChanged(eq(0f), eq(new float[]{0, 1}), nullable(PointF[].class)); verify(listener, times(1)).onZoomChanged(eq(0f), eq(new float[]{0, 1}), nullable(PointF[].class));
} }
@Test @Test
public void testDispatchOnExposureCorrectionChanged() { public void testDispatchOnExposureCorrectionChanged() {
completeTask().when(listener).onExposureCorrectionChanged(0f, null, null); float[] bounds = new float[]{};
camera.mCameraCallbacks.dispatchOnExposureCorrectionChanged(0f, null, null); doEndOp(op, true).when(listener).onExposureCorrectionChanged(0f, bounds, null);
camera.mCameraCallbacks.dispatchOnExposureCorrectionChanged(0f, bounds, null);
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onExposureCorrectionChanged(0f, null, null); verify(listener, times(1)).onExposureCorrectionChanged(0f, bounds, null);
} }
@Test @Test
@ -212,10 +211,10 @@ public class CameraViewCallbacksTest extends BaseTest {
camera.mMarkerLayout = markerLayout; camera.mMarkerLayout = markerLayout;
PointF point = new PointF(); PointF point = new PointF();
completeTask().when(listener).onAutoFocusStart(point); doEndOp(op, true).when(listener).onAutoFocusStart(point);
camera.mCameraCallbacks.dispatchOnFocusStart(Gesture.TAP, point); camera.mCameraCallbacks.dispatchOnFocusStart(Gesture.TAP, point);
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onAutoFocusStart(point); verify(listener, times(1)).onAutoFocusStart(point);
verify(marker, times(1)).onAutoFocusStart(AutoFocusTrigger.GESTURE, point); verify(marker, times(1)).onAutoFocusStart(AutoFocusTrigger.GESTURE, point);
verify(markerLayout, times(1)).onEvent(eq(MarkerLayout.TYPE_AUTOFOCUS), any(PointF[].class)); verify(markerLayout, times(1)).onEvent(eq(MarkerLayout.TYPE_AUTOFOCUS), any(PointF[].class));
@ -231,10 +230,10 @@ public class CameraViewCallbacksTest extends BaseTest {
PointF point = new PointF(); PointF point = new PointF();
boolean success = true; boolean success = true;
completeTask().when(listener).onAutoFocusEnd(success, point); doEndOp(op, true).when(listener).onAutoFocusEnd(success, point);
camera.mCameraCallbacks.dispatchOnFocusEnd(Gesture.TAP, success, point); camera.mCameraCallbacks.dispatchOnFocusEnd(Gesture.TAP, success, point);
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onAutoFocusEnd(success, point); verify(listener, times(1)).onAutoFocusEnd(success, point);
verify(marker, times(1)).onAutoFocusEnd(AutoFocusTrigger.GESTURE, success, point); verify(marker, times(1)).onAutoFocusEnd(AutoFocusTrigger.GESTURE, success, point);
@ -243,9 +242,9 @@ public class CameraViewCallbacksTest extends BaseTest {
@Test @Test
public void testOrientationCallbacks() { public void testOrientationCallbacks() {
completeTask().when(listener).onOrientationChanged(anyInt()); doEndOp(op, true).when(listener).onOrientationChanged(anyInt());
camera.mCameraCallbacks.onDeviceOrientationChanged(90); camera.mCameraCallbacks.onDeviceOrientationChanged(90);
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onOrientationChanged(anyInt()); verify(listener, times(1)).onOrientationChanged(anyInt());
} }
@ -254,20 +253,20 @@ public class CameraViewCallbacksTest extends BaseTest {
@Test @Test
public void testCameraError() { public void testCameraError() {
CameraException error = new CameraException(new RuntimeException("Error")); CameraException error = new CameraException(new RuntimeException("Error"));
completeTask().when(listener).onCameraError(error); doEndOp(op, true).when(listener).onCameraError(error);
camera.mCameraCallbacks.dispatchError(error); camera.mCameraCallbacks.dispatchError(error);
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(listener, times(1)).onCameraError(error); verify(listener, times(1)).onCameraError(error);
} }
@Test @Test
public void testProcessFrame() { public void testProcessFrame() {
Frame mock = mock(Frame.class); Frame mock = mock(Frame.class);
completeTask().when(processor).process(mock); doEndOp(op, true).when(processor).process(mock);
camera.mCameraCallbacks.dispatchFrame(mock); camera.mCameraCallbacks.dispatchFrame(mock);
assertNotNull(op.await(500)); assertNotNull(op.await(DELAY));
verify(processor, times(1)).process(mock); verify(processor, times(1)).process(mock);
} }
} }

@ -10,7 +10,6 @@ import androidx.test.ext.junit.runners.AndroidJUnit4;
import androidx.test.filters.MediumTest; import androidx.test.filters.MediumTest;
import android.util.AttributeSet; import android.util.AttributeSet;
import android.view.Gravity;
import android.view.LayoutInflater; import android.view.LayoutInflater;
import android.view.MotionEvent; import android.view.MotionEvent;
import android.view.View; import android.view.View;
@ -52,9 +51,6 @@ import org.junit.After;
import org.junit.Before; import org.junit.Before;
import org.junit.Test; import org.junit.Test;
import org.junit.runner.RunWith; import org.junit.runner.RunWith;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.stubbing.Answer;
import org.w3c.dom.Attr;
import static org.junit.Assert.*; import static org.junit.Assert.*;
@ -74,10 +70,10 @@ public class CameraViewTest extends BaseTest {
@Before @Before
public void setUp() { public void setUp() {
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
Context context = context(); Context context = getContext();
cameraView = new CameraView(context) { cameraView = new CameraView(context) {
@NonNull @NonNull
@ -151,8 +147,8 @@ public class CameraViewTest extends BaseTest {
@Test @Test
public void testDefaults() { public void testDefaults() {
// CameraEngine // CameraEngine
TypedArray empty = context().obtainStyledAttributes(new int[]{}); TypedArray empty = getContext().obtainStyledAttributes(new int[]{});
ControlParser controls = new ControlParser(context(), empty); ControlParser controls = new ControlParser(getContext(), empty);
assertEquals(cameraView.getFlash(), controls.getFlash()); assertEquals(cameraView.getFlash(), controls.getFlash());
assertEquals(cameraView.getFacing(), controls.getFacing()); assertEquals(cameraView.getFacing(), controls.getFacing());
assertEquals(cameraView.getGrid(), controls.getGrid()); assertEquals(cameraView.getGrid(), controls.getGrid());
@ -236,7 +232,7 @@ public class CameraViewTest extends BaseTest {
mockController.setMockCameraOptions(o); mockController.setMockCameraOptions(o);
mockController.setMockEngineState(true); mockController.setMockEngineState(true);
MotionEvent event = MotionEvent.obtain(0L, 0L, 0, 0f, 0f, 0); MotionEvent event = MotionEvent.obtain(0L, 0L, 0, 0f, 0f, 0);
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
cameraView.mTapGestureFinder = new TapGestureFinder(cameraView.mCameraCallbacks) { cameraView.mTapGestureFinder = new TapGestureFinder(cameraView.mCameraCallbacks) {
@ -258,7 +254,7 @@ public class CameraViewTest extends BaseTest {
mockController.setMockCameraOptions(o); mockController.setMockCameraOptions(o);
mockController.setMockEngineState(true); mockController.setMockEngineState(true);
MotionEvent event = MotionEvent.obtain(0L, 0L, 0, 0f, 0f, 0); MotionEvent event = MotionEvent.obtain(0L, 0L, 0, 0f, 0f, 0);
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
cameraView.mTapGestureFinder = new TapGestureFinder(cameraView.mCameraCallbacks) { cameraView.mTapGestureFinder = new TapGestureFinder(cameraView.mCameraCallbacks) {
@ -285,7 +281,7 @@ public class CameraViewTest extends BaseTest {
mockController.mZoomChanged = false; mockController.mZoomChanged = false;
MotionEvent event = MotionEvent.obtain(0L, 0L, 0, 0f, 0f, 0); MotionEvent event = MotionEvent.obtain(0L, 0L, 0, 0f, 0f, 0);
final FactorHolder factor = new FactorHolder(); final FactorHolder factor = new FactorHolder();
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
cameraView.mPinchGestureFinder = new PinchGestureFinder(cameraView.mCameraCallbacks) { cameraView.mPinchGestureFinder = new PinchGestureFinder(cameraView.mCameraCallbacks) {
@ -326,7 +322,7 @@ public class CameraViewTest extends BaseTest {
mockController.mExposureCorrectionChanged = false; mockController.mExposureCorrectionChanged = false;
MotionEvent event = MotionEvent.obtain(0L, 0L, 0, 0f, 0f, 0); MotionEvent event = MotionEvent.obtain(0L, 0L, 0, 0f, 0f, 0);
final FactorHolder factor = new FactorHolder(); final FactorHolder factor = new FactorHolder();
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
cameraView.mScrollGestureFinder = new ScrollGestureFinder(cameraView.mCameraCallbacks) { cameraView.mScrollGestureFinder = new ScrollGestureFinder(cameraView.mCameraCallbacks) {
@ -628,6 +624,10 @@ public class CameraViewTest extends BaseTest {
assertEquals(cameraView.get(Audio.class), Audio.ON); assertEquals(cameraView.get(Audio.class), Audio.ON);
cameraView.set(Audio.OFF); cameraView.set(Audio.OFF);
assertEquals(cameraView.get(Audio.class), Audio.OFF); assertEquals(cameraView.get(Audio.class), Audio.OFF);
cameraView.set(Audio.MONO);
assertEquals(cameraView.get(Audio.class), Audio.MONO);
cameraView.set(Audio.STEREO);
assertEquals(cameraView.get(Audio.class), Audio.STEREO);
} }
@Test @Test
@ -681,7 +681,6 @@ public class CameraViewTest extends BaseTest {
//region Lists of listeners and processors //region Lists of listeners and processors
@SuppressWarnings("UseBulkOperation")
@Test @Test
public void testCameraListenerList() { public void testCameraListenerList() {
assertTrue(cameraView.mListeners.isEmpty()); assertTrue(cameraView.mListeners.isEmpty());
@ -709,7 +708,6 @@ public class CameraViewTest extends BaseTest {
} }
} }
@SuppressWarnings({"NullableProblems", "UseBulkOperation"})
@Test @Test
public void testFrameProcessorsList() { public void testFrameProcessorsList() {
assertTrue(cameraView.mFrameProcessors.isEmpty()); assertTrue(cameraView.mFrameProcessors.isEmpty());
@ -771,13 +769,7 @@ public class CameraViewTest extends BaseTest {
final PointF point = new PointF(0, 0); final PointF point = new PointF(0, 0);
final PointF[] points = new PointF[]{ point }; final PointF[] points = new PointF[]{ point };
final Op<Boolean> op = new Op<>(true); final Op<Boolean> op = new Op<>(true);
doAnswer(new Answer() { doEndOp(op, true).when(markerLayout).onEvent(MarkerLayout.TYPE_AUTOFOCUS, points);
@Override
public Object answer(InvocationOnMock invocation) throws Throwable {
op.end(true);
return null;
}
}).when(markerLayout).onEvent(MarkerLayout.TYPE_AUTOFOCUS, points);
cameraView.mCameraCallbacks.dispatchOnFocusStart(Gesture.TAP, point); cameraView.mCameraCallbacks.dispatchOnFocusStart(Gesture.TAP, point);
assertNotNull(op.await(100)); assertNotNull(op.await(100));
} }
@ -789,7 +781,7 @@ public class CameraViewTest extends BaseTest {
@Test @Test
public void testOverlays_generateLayoutParams() { public void testOverlays_generateLayoutParams() {
cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout); cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout);
LayoutInflater inflater = LayoutInflater.from(context()); LayoutInflater inflater = LayoutInflater.from(getContext());
View overlay = inflater.inflate(com.otaliastudios.cameraview.test.R.layout.overlay, cameraView, false); View overlay = inflater.inflate(com.otaliastudios.cameraview.test.R.layout.overlay, cameraView, false);
assertTrue(overlay.getLayoutParams() instanceof OverlayLayout.LayoutParams); assertTrue(overlay.getLayoutParams() instanceof OverlayLayout.LayoutParams);
verify(cameraView.mOverlayLayout, times(1)).isOverlay(any(AttributeSet.class)); verify(cameraView.mOverlayLayout, times(1)).isOverlay(any(AttributeSet.class));
@ -800,7 +792,7 @@ public class CameraViewTest extends BaseTest {
@Test @Test
public void testOverlays_dontGenerateLayoutParams() { public void testOverlays_dontGenerateLayoutParams() {
cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout); cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout);
LayoutInflater inflater = LayoutInflater.from(context()); LayoutInflater inflater = LayoutInflater.from(getContext());
View overlay = inflater.inflate(com.otaliastudios.cameraview.test.R.layout.not_overlay, cameraView, false); View overlay = inflater.inflate(com.otaliastudios.cameraview.test.R.layout.not_overlay, cameraView, false);
assertFalse(overlay.getLayoutParams() instanceof OverlayLayout.LayoutParams); assertFalse(overlay.getLayoutParams() instanceof OverlayLayout.LayoutParams);
verify(cameraView.mOverlayLayout, times(1)).isOverlay(any(AttributeSet.class)); verify(cameraView.mOverlayLayout, times(1)).isOverlay(any(AttributeSet.class));
@ -810,7 +802,7 @@ public class CameraViewTest extends BaseTest {
@Test @Test
public void testOverlays_addOverlayView() { public void testOverlays_addOverlayView() {
cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout); cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout);
View overlay = new View(context()); View overlay = new View(getContext());
OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10); OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10);
int count = cameraView.getChildCount(); int count = cameraView.getChildCount();
cameraView.addView(overlay, 0, params); cameraView.addView(overlay, 0, params);
@ -822,7 +814,7 @@ public class CameraViewTest extends BaseTest {
@Test @Test
public void testOverlays_dontAddOverlayView() { public void testOverlays_dontAddOverlayView() {
cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout); cameraView.mOverlayLayout = spy(cameraView.mOverlayLayout);
View overlay = new View(context()); View overlay = new View(getContext());
ViewGroup.LayoutParams params = new ViewGroup.LayoutParams(10, 10); ViewGroup.LayoutParams params = new ViewGroup.LayoutParams(10, 10);
int count = cameraView.getChildCount(); int count = cameraView.getChildCount();
cameraView.addView(overlay, 0, params); cameraView.addView(overlay, 0, params);

@ -77,7 +77,7 @@ public abstract class CameraIntegrationTest extends BaseTest {
@BeforeClass @BeforeClass
public static void grant() { public static void grant() {
grantPermissions(); grantAllPermissions();
} }
@NonNull @NonNull
@ -88,7 +88,7 @@ public abstract class CameraIntegrationTest extends BaseTest {
LOG.e("Test started. Setting up camera."); LOG.e("Test started. Setting up camera.");
WorkerHandler.destroy(); WorkerHandler.destroy();
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
camera = new CameraView(rule.getActivity()) { camera = new CameraView(rule.getActivity()) {
@ -139,12 +139,13 @@ public abstract class CameraIntegrationTest extends BaseTest {
private CameraOptions openSync(boolean expectSuccess) { private CameraOptions openSync(boolean expectSuccess) {
camera.open(); camera.open();
final Op<CameraOptions> open = new Op<>(true); final Op<CameraOptions> open = new Op<>(true);
doEndTask(open, 0).when(listener).onCameraOpened(any(CameraOptions.class)); doEndOp(open, 0).when(listener).onCameraOpened(any(CameraOptions.class));
CameraOptions result = open.await(DELAY); CameraOptions result = open.await(DELAY);
if (expectSuccess) { if (expectSuccess) {
assertNotNull("Can open", result); assertNotNull("Can open", result);
// Extra wait for the bind state. // Extra wait for the bind state.
// TODO fix this and other while {} in this class in a more elegant way. // TODO fix this and other while {} in this class in a more elegant way.
//noinspection StatementWithEmptyBody
while (controller.getBindState() != CameraEngine.STATE_STARTED) {} while (controller.getBindState() != CameraEngine.STATE_STARTED) {}
} else { } else {
assertNull("Should not open", result); assertNull("Should not open", result);
@ -155,7 +156,7 @@ public abstract class CameraIntegrationTest extends BaseTest {
private void closeSync(boolean expectSuccess) { private void closeSync(boolean expectSuccess) {
camera.close(); camera.close();
final Op<Boolean> close = new Op<>(true); final Op<Boolean> close = new Op<>(true);
doEndTask(close, true).when(listener).onCameraClosed(); doEndOp(close, true).when(listener).onCameraClosed();
Boolean result = close.await(DELAY); Boolean result = close.await(DELAY);
if (expectSuccess) { if (expectSuccess) {
assertNotNull("Can close", result); assertNotNull("Can close", result);
@ -167,16 +168,24 @@ public abstract class CameraIntegrationTest extends BaseTest {
@SuppressWarnings("UnusedReturnValue") @SuppressWarnings("UnusedReturnValue")
@Nullable @Nullable
private VideoResult waitForVideoResult(boolean expectSuccess) { private VideoResult waitForVideoResult(boolean expectSuccess) {
// CountDownLatch for onVideoRecordingEnd.
CountDownLatch onVideoRecordingEnd = new CountDownLatch(1);
doCountDown(onVideoRecordingEnd).when(listener).onVideoRecordingEnd();
// Op for onVideoTaken.
final Op<VideoResult> video = new Op<>(true); final Op<VideoResult> video = new Op<>(true);
doEndTask(video, 0).when(listener).onVideoTaken(any(VideoResult.class)); doEndOp(video, 0).when(listener).onVideoTaken(any(VideoResult.class));
doEndTask(video, null).when(listener).onCameraError(argThat(new ArgumentMatcher<CameraException>() { doEndOp(video, null).when(listener).onCameraError(argThat(new ArgumentMatcher<CameraException>() {
@Override @Override
public boolean matches(CameraException argument) { public boolean matches(CameraException argument) {
return argument.getReason() == CameraException.REASON_VIDEO_FAILED; return argument.getReason() == CameraException.REASON_VIDEO_FAILED;
} }
})); }));
// Wait for onVideoTaken and check.
VideoResult result = video.await(VIDEO_DELAY); VideoResult result = video.await(VIDEO_DELAY);
if (expectSuccess) { if (expectSuccess) {
assertEquals("Should call onVideoRecordingEnd", 0, onVideoRecordingEnd.getCount());
assertNotNull("Should end video", result); assertNotNull("Should end video", result);
} else { } else {
assertNull("Should not end video", result); assertNull("Should not end video", result);
@ -187,8 +196,8 @@ public abstract class CameraIntegrationTest extends BaseTest {
@Nullable @Nullable
private PictureResult waitForPictureResult(boolean expectSuccess) { private PictureResult waitForPictureResult(boolean expectSuccess) {
final Op<PictureResult> pic = new Op<>(true); final Op<PictureResult> pic = new Op<>(true);
doEndTask(pic, 0).when(listener).onPictureTaken(any(PictureResult.class)); doEndOp(pic, 0).when(listener).onPictureTaken(any(PictureResult.class));
doEndTask(pic, null).when(listener).onCameraError(argThat(new ArgumentMatcher<CameraException>() { doEndOp(pic, null).when(listener).onCameraError(argThat(new ArgumentMatcher<CameraException>() {
@Override @Override
public boolean matches(CameraException argument) { public boolean matches(CameraException argument) {
return argument.getReason() == CameraException.REASON_PICTURE_FAILED; return argument.getReason() == CameraException.REASON_PICTURE_FAILED;
@ -209,14 +218,14 @@ public abstract class CameraIntegrationTest extends BaseTest {
private void takeVideoSync(boolean expectSuccess, int duration) { private void takeVideoSync(boolean expectSuccess, int duration) {
final Op<Boolean> op = new Op<>(true); final Op<Boolean> op = new Op<>(true);
doEndTask(op, true).when(listener).onVideoRecordingStart(); doEndOp(op, true).when(listener).onVideoRecordingStart();
doEndTask(op, false).when(listener).onCameraError(argThat(new ArgumentMatcher<CameraException>() { doEndOp(op, false).when(listener).onCameraError(argThat(new ArgumentMatcher<CameraException>() {
@Override @Override
public boolean matches(CameraException argument) { public boolean matches(CameraException argument) {
return argument.getReason() == CameraException.REASON_VIDEO_FAILED; return argument.getReason() == CameraException.REASON_VIDEO_FAILED;
} }
})); }));
File file = new File(context().getFilesDir(), "video.mp4"); File file = new File(getContext().getFilesDir(), "video.mp4");
if (duration > 0) { if (duration > 0) {
camera.takeVideo(file, duration); camera.takeVideo(file, duration);
} else { } else {
@ -231,21 +240,21 @@ public abstract class CameraIntegrationTest extends BaseTest {
} }
} }
@SuppressWarnings("unused") @SuppressWarnings({"unused", "SameParameterValue"})
private void takeVideoSnapshotSync(boolean expectSuccess) { private void takeVideoSnapshotSync(boolean expectSuccess) {
takeVideoSnapshotSync(expectSuccess,0); takeVideoSnapshotSync(expectSuccess,0);
} }
private void takeVideoSnapshotSync(boolean expectSuccess, int duration) { private void takeVideoSnapshotSync(boolean expectSuccess, int duration) {
final Op<Boolean> op = new Op<>(true); final Op<Boolean> op = new Op<>(true);
doEndTask(op, true).when(listener).onVideoRecordingStart(); doEndOp(op, true).when(listener).onVideoRecordingStart();
doEndTask(op, false).when(listener).onCameraError(argThat(new ArgumentMatcher<CameraException>() { doEndOp(op, false).when(listener).onCameraError(argThat(new ArgumentMatcher<CameraException>() {
@Override @Override
public boolean matches(CameraException argument) { public boolean matches(CameraException argument) {
return argument.getReason() == CameraException.REASON_VIDEO_FAILED; return argument.getReason() == CameraException.REASON_VIDEO_FAILED;
} }
})); }));
File file = new File(context().getFilesDir(), "video.mp4"); File file = new File(getContext().getFilesDir(), "video.mp4");
if (duration > 0) { if (duration > 0) {
camera.takeVideoSnapshot(file, duration); camera.takeVideoSnapshot(file, duration);
} else { } else {
@ -541,11 +550,10 @@ public abstract class CameraIntegrationTest extends BaseTest {
@Test @Test
public void testEndVideoSnapshot_withMaxSize() { public void testEndVideoSnapshot_withMaxSize() {
// TODO camera.setVideoMaxSize(3000*1000);
// camera.setVideoMaxSize(3000*1000); openSync(true);
// waitForOpen(true); takeVideoSnapshotSync(true);
// waitForVideoStart(); waitForVideoResult(true);
// waitForVideoEnd(true);
} }
@Test @Test
@ -559,11 +567,10 @@ public abstract class CameraIntegrationTest extends BaseTest {
@Test @Test
public void testEndVideoSnapshot_withMaxDuration() { public void testEndVideoSnapshot_withMaxDuration() {
// TODO camera.setVideoMaxDuration(4000);
// camera.setVideoMaxDuration(4000); openSync(true);
// waitForOpen(true); takeVideoSnapshotSync(true);
// waitForVideoStart(); waitForVideoResult(true);
// waitForVideoEnd(true);
} }
//endregion //endregion
@ -575,7 +582,7 @@ public abstract class CameraIntegrationTest extends BaseTest {
CameraOptions o = openSync(true); CameraOptions o = openSync(true);
final Op<PointF> focus = new Op<>(true); final Op<PointF> focus = new Op<>(true);
doEndTask(focus, 0).when(listener).onAutoFocusStart(any(PointF.class)); doEndOp(focus, 0).when(listener).onAutoFocusStart(any(PointF.class));
camera.startAutoFocus(1, 1); camera.startAutoFocus(1, 1);
PointF point = focus.await(300); PointF point = focus.await(300);
@ -592,7 +599,7 @@ public abstract class CameraIntegrationTest extends BaseTest {
CameraOptions o = openSync(true); CameraOptions o = openSync(true);
final Op<PointF> focus = new Op<>(true); final Op<PointF> focus = new Op<>(true);
doEndTask(focus, 1).when(listener).onAutoFocusEnd(anyBoolean(), any(PointF.class)); doEndOp(focus, 1).when(listener).onAutoFocusEnd(anyBoolean(), any(PointF.class));
camera.startAutoFocus(1, 1); camera.startAutoFocus(1, 1);
// Stop is not guaranteed to be called, we use a delay. So wait at least the delay time. // Stop is not guaranteed to be called, we use a delay. So wait at least the delay time.
@ -632,7 +639,7 @@ public abstract class CameraIntegrationTest extends BaseTest {
@SuppressWarnings("StatementWithEmptyBody") @SuppressWarnings("StatementWithEmptyBody")
@Test @Test
public void testCapturePicture_size() throws Exception { public void testCapturePicture_size() {
openSync(true); openSync(true);
// PictureSize can still be null after opened. // PictureSize can still be null after opened.
// TODO be more elegant // TODO be more elegant
@ -682,7 +689,7 @@ public abstract class CameraIntegrationTest extends BaseTest {
@SuppressWarnings("StatementWithEmptyBody") @SuppressWarnings("StatementWithEmptyBody")
@Test @Test
public void testCaptureSnapshot_size() throws Exception { public void testCaptureSnapshot_size() {
openSync(true); openSync(true);
// SnapshotSize can still be null after opened. // SnapshotSize can still be null after opened.
// TODO be more elegant // TODO be more elegant

@ -9,22 +9,18 @@ import com.google.android.gms.tasks.Tasks;
import com.otaliastudios.cameraview.CameraOptions; import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.PictureResult; import com.otaliastudios.cameraview.PictureResult;
import com.otaliastudios.cameraview.VideoResult; import com.otaliastudios.cameraview.VideoResult;
import com.otaliastudios.cameraview.controls.Audio;
import com.otaliastudios.cameraview.controls.Facing; import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.controls.Flash; import com.otaliastudios.cameraview.controls.Flash;
import com.otaliastudios.cameraview.frame.FrameManager; import com.otaliastudios.cameraview.frame.FrameManager;
import com.otaliastudios.cameraview.gesture.Gesture; import com.otaliastudios.cameraview.gesture.Gesture;
import com.otaliastudios.cameraview.controls.Hdr; import com.otaliastudios.cameraview.controls.Hdr;
import com.otaliastudios.cameraview.controls.Mode;
import com.otaliastudios.cameraview.controls.WhiteBalance; import com.otaliastudios.cameraview.controls.WhiteBalance;
import com.otaliastudios.cameraview.size.AspectRatio; import com.otaliastudios.cameraview.size.AspectRatio;
import com.otaliastudios.cameraview.size.Size; import com.otaliastudios.cameraview.size.Size;
import com.otaliastudios.cameraview.size.SizeSelector;
import androidx.annotation.NonNull; import androidx.annotation.NonNull;
import androidx.annotation.Nullable; import androidx.annotation.Nullable;
import java.io.File;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.List; import java.util.List;
@ -177,9 +173,4 @@ public class MockCameraEngine extends CameraEngine {
protected boolean collectCameraInfo(@NonNull Facing facing) { protected boolean collectCameraInfo(@NonNull Facing facing) {
return true; return true;
} }
@Override
public void onVideoRecordingStart() {
}
} }

@ -40,7 +40,7 @@ public abstract class GestureFinderTest<T extends GestureFinder> extends BaseTes
@Before @Before
public void setUp() { public void setUp() {
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
TestActivity a = rule.getActivity(); TestActivity a = rule.getActivity();

@ -1,8 +1,6 @@
package com.otaliastudios.cameraview.gesture; package com.otaliastudios.cameraview.gesture;
import android.content.Context;
import androidx.annotation.NonNull; import androidx.annotation.NonNull;
import androidx.test.espresso.ViewAction; import androidx.test.espresso.ViewAction;
import androidx.test.ext.junit.runners.AndroidJUnit4; import androidx.test.ext.junit.runners.AndroidJUnit4;
@ -11,10 +9,8 @@ import androidx.test.filters.SmallTest;
import org.junit.Test; import org.junit.Test;
import org.junit.runner.RunWith; import org.junit.runner.RunWith;
import static androidx.test.espresso.matcher.ViewMatchers.withId;
import static org.junit.Assert.assertEquals; import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNotNull; import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertNull;
import static org.junit.Assert.assertTrue; import static org.junit.Assert.assertTrue;
@RunWith(AndroidJUnit4.class) @RunWith(AndroidJUnit4.class)

@ -1,8 +1,6 @@
package com.otaliastudios.cameraview.gesture; package com.otaliastudios.cameraview.gesture;
import android.content.Context;
import androidx.annotation.NonNull; import androidx.annotation.NonNull;
import androidx.test.espresso.ViewAction; import androidx.test.espresso.ViewAction;
import androidx.test.ext.junit.runners.AndroidJUnit4; import androidx.test.ext.junit.runners.AndroidJUnit4;
@ -11,13 +9,10 @@ import androidx.test.filters.SmallTest;
import org.junit.Test; import org.junit.Test;
import org.junit.runner.RunWith; import org.junit.runner.RunWith;
import static androidx.test.espresso.action.ViewActions.click;
import static androidx.test.espresso.action.ViewActions.swipeDown; import static androidx.test.espresso.action.ViewActions.swipeDown;
import static androidx.test.espresso.action.ViewActions.swipeLeft; import static androidx.test.espresso.action.ViewActions.swipeLeft;
import static androidx.test.espresso.action.ViewActions.swipeRight; import static androidx.test.espresso.action.ViewActions.swipeRight;
import static androidx.test.espresso.action.ViewActions.swipeUp; import static androidx.test.espresso.action.ViewActions.swipeUp;
import static androidx.test.espresso.matcher.ViewMatchers.withId;
import static junit.framework.Assert.assertNotNull;
import static org.junit.Assert.assertEquals; import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull; import static org.junit.Assert.assertNull;
import static org.junit.Assert.assertTrue; import static org.junit.Assert.assertTrue;
@ -33,7 +28,7 @@ public class ScrollGestureFinderTest extends GestureFinderTest<ScrollGestureFind
@Test @Test
public void testDefaults() { public void testDefaults() {
assertNull(finder.getGesture()); assertNull(finder.mType);
assertEquals(finder.getPoints().length, 2); assertEquals(finder.getPoints().length, 2);
assertEquals(finder.getPoints()[0].x, 0, 0); assertEquals(finder.getPoints()[0].x, 0, 0);
assertEquals(finder.getPoints()[0].y, 0, 0); assertEquals(finder.getPoints()[0].y, 0, 0);

@ -1,8 +1,6 @@
package com.otaliastudios.cameraview.gesture; package com.otaliastudios.cameraview.gesture;
import android.content.Context;
import androidx.annotation.NonNull; import androidx.annotation.NonNull;
import androidx.test.espresso.action.GeneralClickAction; import androidx.test.espresso.action.GeneralClickAction;
import androidx.test.espresso.action.GeneralLocation; import androidx.test.espresso.action.GeneralLocation;
@ -32,7 +30,7 @@ public class TapGestureFinderTest extends GestureFinderTest<TapGestureFinder> {
@Test @Test
public void testDefaults() { public void testDefaults() {
assertNull(finder.getGesture()); assertNull(finder.mType);
assertEquals(finder.getPoints().length, 1); assertEquals(finder.getPoints().length, 1);
assertEquals(finder.getPoints()[0].x, 0, 0); assertEquals(finder.getPoints()[0].x, 0, 0);
assertEquals(finder.getPoints()[0].y, 0, 0); assertEquals(finder.getPoints()[0].y, 0, 0);

@ -27,7 +27,7 @@ public class GridLinesLayoutTest extends BaseTest {
@Before @Before
public void setUp() { public void setUp() {
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
TestActivity a = rule.getActivity(); TestActivity a = rule.getActivity();

@ -4,7 +4,6 @@ package com.otaliastudios.cameraview.internal.utils;
import android.graphics.Rect; import android.graphics.Rect;
import com.otaliastudios.cameraview.BaseTest; import com.otaliastudios.cameraview.BaseTest;
import com.otaliastudios.cameraview.internal.utils.CropHelper;
import com.otaliastudios.cameraview.size.AspectRatio; import com.otaliastudios.cameraview.size.AspectRatio;
import com.otaliastudios.cameraview.size.Size; import com.otaliastudios.cameraview.size.Size;
@ -15,11 +14,8 @@ import org.junit.Test;
import org.junit.runner.RunWith; import org.junit.runner.RunWith;
import static org.junit.Assert.assertEquals; import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertNotEquals; import static org.junit.Assert.assertNotEquals;
import static org.junit.Assert.assertTrue; import static org.junit.Assert.assertTrue;
import static org.mockito.Matchers.any;
import static org.mockito.Mockito.mock;
@RunWith(AndroidJUnit4.class) @RunWith(AndroidJUnit4.class)
@SmallTest @SmallTest

@ -6,7 +6,6 @@ import androidx.test.filters.SmallTest;
import android.view.OrientationEventListener; import android.view.OrientationEventListener;
import com.otaliastudios.cameraview.BaseTest; import com.otaliastudios.cameraview.BaseTest;
import com.otaliastudios.cameraview.internal.utils.OrientationHelper;
import org.junit.After; import org.junit.After;
import org.junit.Before; import org.junit.Before;
@ -25,11 +24,11 @@ public class OrientationHelperTest extends BaseTest {
@Before @Before
public void setUp() { public void setUp() {
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
callback = mock(OrientationHelper.Callback.class); callback = mock(OrientationHelper.Callback.class);
helper = new OrientationHelper(context(), callback); helper = new OrientationHelper(getContext(), callback);
} }
}); });
} }
@ -46,12 +45,12 @@ public class OrientationHelperTest extends BaseTest {
assertEquals(helper.getDisplayOffset(), -1); assertEquals(helper.getDisplayOffset(), -1);
assertEquals(helper.getDeviceOrientation(), -1); assertEquals(helper.getDeviceOrientation(), -1);
helper.enable(context()); helper.enable(getContext());
assertNotNull(helper.mListener); assertNotNull(helper.mListener);
assertNotEquals(helper.getDisplayOffset(), -1); // Don't know about device orientation. assertNotEquals(helper.getDisplayOffset(), -1); // Don't know about device orientation.
// Ensure nothing bad if called twice. // Ensure nothing bad if called twice.
helper.enable(context()); helper.enable(getContext());
assertNotNull(helper.mListener); assertNotNull(helper.mListener);
assertNotEquals(helper.getDisplayOffset(), -1); assertNotEquals(helper.getDisplayOffset(), -1);
@ -66,7 +65,7 @@ public class OrientationHelperTest extends BaseTest {
// Sometimes (on some APIs) the helper will trigger an update to 0 // Sometimes (on some APIs) the helper will trigger an update to 0
// right after enabling. But that's fine for us, times(1) will be OK either way. // right after enabling. But that's fine for us, times(1) will be OK either way.
helper.enable(context()); helper.enable(getContext());
helper.mListener.onOrientationChanged(OrientationEventListener.ORIENTATION_UNKNOWN); helper.mListener.onOrientationChanged(OrientationEventListener.ORIENTATION_UNKNOWN);
assertEquals(helper.getDeviceOrientation(), 0); assertEquals(helper.getDeviceOrientation(), 0);
helper.mListener.onOrientationChanged(10); helper.mListener.onOrientationChanged(10);

@ -11,7 +11,6 @@ import org.junit.Assert;
import org.junit.Before; import org.junit.Before;
import org.junit.Rule; import org.junit.Rule;
import org.junit.Test; import org.junit.Test;
import org.junit.runner.manipulation.Filter;
import org.mockito.Mockito; import org.mockito.Mockito;
import androidx.test.annotation.UiThreadTest; import androidx.test.annotation.UiThreadTest;
@ -29,7 +28,7 @@ public class MarkerLayoutTest extends BaseTest {
@Before @Before
public void setUp() { public void setUp() {
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
TestActivity a = rule.getActivity(); TestActivity a = rule.getActivity();

@ -1,17 +1,12 @@
package com.otaliastudios.cameraview.overlay; package com.otaliastudios.cameraview.overlay;
import android.content.res.Resources;
import android.content.res.TypedArray;
import android.content.res.XmlResourceParser; import android.content.res.XmlResourceParser;
import android.graphics.Canvas; import android.graphics.Canvas;
import android.util.AttributeSet; import android.util.AttributeSet;
import android.util.Xml; import android.util.Xml;
import android.view.Gravity;
import android.view.LayoutInflater;
import android.view.View; import android.view.View;
import android.view.ViewGroup; import android.view.ViewGroup;
import android.widget.FrameLayout;
import androidx.annotation.NonNull; import androidx.annotation.NonNull;
import androidx.test.annotation.UiThreadTest; import androidx.test.annotation.UiThreadTest;
@ -19,35 +14,17 @@ import androidx.test.ext.junit.runners.AndroidJUnit4;
import androidx.test.filters.SmallTest; import androidx.test.filters.SmallTest;
import com.otaliastudios.cameraview.BaseTest; import com.otaliastudios.cameraview.BaseTest;
import com.otaliastudios.cameraview.R;
import org.junit.After; import org.junit.After;
import org.junit.Before; import org.junit.Before;
import org.junit.Test; import org.junit.Test;
import org.junit.runner.RunWith; import org.junit.runner.RunWith;
import org.mockito.ArgumentCaptor;
import org.mockito.ArgumentMatcher;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.stubbing.Answer;
import org.w3c.dom.Attr;
import org.xmlpull.v1.XmlPullParser;
import org.xmlpull.v1.XmlPullParserException;
import java.io.IOException;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse; import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertNotEquals;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertTrue; import static org.junit.Assert.assertTrue;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.anyFloat; import static org.mockito.ArgumentMatchers.anyFloat;
import static org.mockito.ArgumentMatchers.anyLong; import static org.mockito.ArgumentMatchers.anyLong;
import static org.mockito.ArgumentMatchers.argThat;
import static org.mockito.ArgumentMatchers.eq; import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.ArgumentMatchers.notNull;
import static org.mockito.Mockito.doNothing;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.never; import static org.mockito.Mockito.never;
import static org.mockito.Mockito.reset; import static org.mockito.Mockito.reset;
import static org.mockito.Mockito.spy; import static org.mockito.Mockito.spy;
@ -63,7 +40,7 @@ public class OverlayLayoutTest extends BaseTest {
@Before @Before
public void setUp() { public void setUp() {
overlayLayout = spy(new OverlayLayout(context())); overlayLayout = spy(new OverlayLayout(getContext()));
} }
@After @After
@ -97,7 +74,7 @@ public class OverlayLayoutTest extends BaseTest {
@NonNull @NonNull
private AttributeSet getAttributeSet(int layout) throws Exception { private AttributeSet getAttributeSet(int layout) throws Exception {
// Get the attribute set in the correct state: use a parser and move to START_TAG // Get the attribute set in the correct state: use a parser and move to START_TAG
XmlResourceParser parser = context().getResources().getLayout(layout); XmlResourceParser parser = getContext().getResources().getLayout(layout);
//noinspection StatementWithEmptyBody //noinspection StatementWithEmptyBody
while (parser.next() != XmlResourceParser.START_TAG) {} while (parser.next() != XmlResourceParser.START_TAG) {}
return Xml.asAttributeSet(parser); return Xml.asAttributeSet(parser);
@ -132,7 +109,7 @@ public class OverlayLayoutTest extends BaseTest {
public void testDrawChild() { public void testDrawChild() {
Canvas canvas = new Canvas(); Canvas canvas = new Canvas();
OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10); OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10);
View child = new View(context()); View child = new View(getContext());
child.setLayoutParams(params); child.setLayoutParams(params);
when(overlayLayout.doDrawChild(canvas, child, 0)).thenReturn(true); when(overlayLayout.doDrawChild(canvas, child, 0)).thenReturn(true);
@ -169,7 +146,7 @@ public class OverlayLayoutTest extends BaseTest {
@Test @Test
public void testDrawOn() { public void testDrawOn() {
Canvas canvas = spy(new Canvas()); Canvas canvas = spy(new Canvas());
View child = new View(context()); View child = new View(getContext());
OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10); OverlayLayout.LayoutParams params = new OverlayLayout.LayoutParams(10, 10);
params.drawOnPreview = true; params.drawOnPreview = true;
params.drawOnPictureSnapshot = true; params.drawOnPictureSnapshot = true;

@ -3,7 +3,6 @@ package com.otaliastudios.cameraview.picture;
import com.otaliastudios.cameraview.BaseTest; import com.otaliastudios.cameraview.BaseTest;
import com.otaliastudios.cameraview.PictureResult; import com.otaliastudios.cameraview.PictureResult;
import com.otaliastudios.cameraview.picture.PictureRecorder;
import androidx.test.ext.junit.runners.AndroidJUnit4; import androidx.test.ext.junit.runners.AndroidJUnit4;
import androidx.test.filters.SmallTest; import androidx.test.filters.SmallTest;

@ -32,7 +32,6 @@ public abstract class CameraPreviewTest extends BaseTest {
@Rule @Rule
public ActivityTestRule<TestActivity> rule = new ActivityTestRule<>(TestActivity.class); public ActivityTestRule<TestActivity> rule = new ActivityTestRule<>(TestActivity.class);
@SuppressWarnings("WeakerAccess")
protected CameraPreview preview; protected CameraPreview preview;
@SuppressWarnings("WeakerAccess") @SuppressWarnings("WeakerAccess")
protected Size surfaceSize; protected Size surfaceSize;
@ -46,7 +45,7 @@ public abstract class CameraPreviewTest extends BaseTest {
available = new Op<>(true); available = new Op<>(true);
destroyed = new Op<>(true); destroyed = new Op<>(true);
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
TestActivity a = rule.getActivity(); TestActivity a = rule.getActivity();
@ -82,7 +81,7 @@ public abstract class CameraPreviewTest extends BaseTest {
// Trigger a destroy. // Trigger a destroy.
protected void ensureDestroyed() { protected void ensureDestroyed() {
ui(new Runnable() { uiSync(new Runnable() {
@Override @Override
public void run() { public void run() {
rule.getActivity().getContentView().removeView(preview.getRootView()); rule.getActivity().getContentView().removeView(preview.getRootView());

@ -7,10 +7,6 @@ import androidx.test.ext.junit.runners.AndroidJUnit4;
import androidx.test.filters.SmallTest; import androidx.test.filters.SmallTest;
import android.view.ViewGroup; import android.view.ViewGroup;
import com.otaliastudios.cameraview.preview.CameraPreview;
import com.otaliastudios.cameraview.preview.CameraPreviewTest;
import com.otaliastudios.cameraview.preview.SurfaceCameraPreview;
import org.junit.runner.RunWith; import org.junit.runner.RunWith;
@RunWith(AndroidJUnit4.class) @RunWith(AndroidJUnit4.class)

@ -7,10 +7,6 @@ import androidx.test.ext.junit.runners.AndroidJUnit4;
import androidx.test.filters.SmallTest; import androidx.test.filters.SmallTest;
import android.view.ViewGroup; import android.view.ViewGroup;
import com.otaliastudios.cameraview.preview.CameraPreview;
import com.otaliastudios.cameraview.preview.CameraPreviewTest;
import com.otaliastudios.cameraview.preview.TextureCameraPreview;
import org.junit.runner.RunWith; import org.junit.runner.RunWith;
@RunWith(AndroidJUnit4.class) @RunWith(AndroidJUnit4.class)

@ -13,9 +13,6 @@ import org.mockito.Mockito;
import java.lang.reflect.Constructor; import java.lang.reflect.Constructor;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull;
@RunWith(AndroidJUnit4.class) @RunWith(AndroidJUnit4.class)
@SmallTest @SmallTest
@ -27,10 +24,13 @@ public class VideoRecorderTest extends BaseTest {
VideoRecorder.VideoResultListener listener = Mockito.mock(VideoRecorder.VideoResultListener.class); VideoRecorder.VideoResultListener listener = Mockito.mock(VideoRecorder.VideoResultListener.class);
VideoRecorder recorder = new VideoRecorder(listener) { VideoRecorder recorder = new VideoRecorder(listener) {
@Override @Override
protected void onStart() { dispatchVideoRecordingStart(); } protected void onStart() {
dispatchVideoRecordingStart();
}
@Override @Override
protected void onStop() { protected void onStop() {
dispatchVideoRecordingEnd();
dispatchResult(); dispatchResult();
} }
}; };
@ -38,6 +38,8 @@ public class VideoRecorderTest extends BaseTest {
Mockito.verify(listener,Mockito.times(1) ) Mockito.verify(listener,Mockito.times(1) )
.onVideoRecordingStart(); .onVideoRecordingStart();
recorder.stop(); recorder.stop();
Mockito.verify(listener, Mockito.times(1))
.onVideoRecordingEnd();
Mockito.verify(listener, Mockito.times(1)) Mockito.verify(listener, Mockito.times(1))
.onVideoResult(result, null); .onVideoResult(result, null);
} }

@ -129,13 +129,30 @@ public abstract class CameraListener {
/** /**
* Notifies that the actual video recording has started * Notifies that the actual video recording has started.
* This is the time when actual frames recording starts. * This is the time when actual frames recording starts.
* This can be used to show some indicator while the actual video recording. *
* This can be used to show some UI indicator for video recording or counting time.
*
* @see #onVideoRecordingEnd()
*/ */
@UiThread @UiThread
public void onVideoRecordingStart() { public void onVideoRecordingStart() {
} }
/**
* Notifies that the actual video recording has ended.
* At this point recording has ended, though the file might still be processed.
* The {@link #onVideoTaken(VideoResult)} callback will be called soon.
*
* This can be used to remove UI indicators for video recording.
*
* @see #onVideoRecordingStart()
*/
@UiThread
public void onVideoRecordingEnd() {
}
} }
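As a usage sketch (hypothetical app-side code, not part of this diff), the new onVideoRecordingEnd callback pairs with onVideoRecordingStart to drive a recording indicator; recordingIndicator and playVideo are placeholders:

cameraView.addCameraListener(new CameraListener() {
    @Override
    public void onVideoRecordingStart() {
        // Actual frame recording has started: show the indicator.
        recordingIndicator.setVisibility(View.VISIBLE);
    }

    @Override
    public void onVideoRecordingEnd() {
        // Recording has ended, but the file may still be processed:
        // hide the indicator and wait for onVideoTaken().
        recordingIndicator.setVisibility(View.GONE);
    }

    @Override
    public void onVideoTaken(@NonNull VideoResult result) {
        playVideo(result.getFile());
    }
});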

@ -56,22 +56,23 @@ public final class CameraLogger {
@VisibleForTesting static String lastTag; @VisibleForTesting static String lastTag;
private static int sLevel; private static int sLevel;
private static List<Logger> sLoggers; private static List<Logger> sLoggers = new ArrayList<>();
@VisibleForTesting static Logger sAndroidLogger = new Logger() {
@Override
public void log(int level, @NonNull String tag, @NonNull String message, @Nullable Throwable throwable) {
switch (level) {
case LEVEL_VERBOSE: Log.v(tag, message, throwable); break;
case LEVEL_INFO: Log.i(tag, message, throwable); break;
case LEVEL_WARNING: Log.w(tag, message, throwable); break;
case LEVEL_ERROR: Log.e(tag, message, throwable); break;
}
}
};
static { static {
setLogLevel(LEVEL_ERROR); setLogLevel(LEVEL_ERROR);
sLoggers = new ArrayList<>(); sLoggers.add(sAndroidLogger);
sLoggers.add(new Logger() {
@Override
public void log(int level, @NonNull String tag, @NonNull String message, @Nullable Throwable throwable) {
switch (level) {
case LEVEL_VERBOSE: Log.v(tag, message, throwable); break;
case LEVEL_INFO: Log.i(tag, message, throwable); break;
case LEVEL_WARNING: Log.w(tag, message, throwable); break;
case LEVEL_ERROR: Log.e(tag, message, throwable); break;
}
}
});
} }
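A sketch of adding a second Logger next to the default Android logger above. The registration method name is an assumption (the class keeps a list of loggers, but the add method is not shown in this hunk), and remoteLogger is a placeholder:

// Assumed registration entry point; check CameraLogger for the actual method name.
CameraLogger.registerLogger(new CameraLogger.Logger() {
    @Override
    public void log(int level, @NonNull String tag, @NonNull String message, @Nullable Throwable throwable) {
        if (level >= CameraLogger.LEVEL_WARNING) {
            // Forward warnings and errors to some remote sink (placeholder).
            remoteLogger.log(tag + ": " + message, throwable);
        }
    }
});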
/** /**

@ -6,10 +6,8 @@ import android.hardware.Camera;
import android.hardware.camera2.CameraAccessException; import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics; import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager; import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.params.StreamConfigurationMap; import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.CamcorderProfile; import android.media.CamcorderProfile;
import android.media.ImageReader;
import android.media.MediaRecorder; import android.media.MediaRecorder;
import android.os.Build; import android.os.Build;
import android.util.Range; import android.util.Range;
@ -63,8 +61,6 @@ public class CameraOptions {
private boolean autoFocusSupported; private boolean autoFocusSupported;
// Camera1Engine constructor.
@SuppressWarnings("deprecation")
public CameraOptions(@NonNull Camera.Parameters params, boolean flipSizes) { public CameraOptions(@NonNull Camera.Parameters params, boolean flipSizes) {
List<String> strings; List<String> strings;
Mapper mapper = Mapper.get(Engine.CAMERA1); Mapper mapper = Mapper.get(Engine.CAMERA1);
@ -151,7 +147,6 @@ public class CameraOptions {
// Camera2Engine constructor. // Camera2Engine constructor.
@RequiresApi(Build.VERSION_CODES.LOLLIPOP) @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
@SuppressWarnings("deprecation")
public CameraOptions(@NonNull CameraManager manager, @NonNull String cameraId, boolean flipSizes) throws CameraAccessException { public CameraOptions(@NonNull CameraManager manager, @NonNull String cameraId, boolean flipSizes) throws CameraAccessException {
Mapper mapper = Mapper.get(Engine.CAMERA2); Mapper mapper = Mapper.get(Engine.CAMERA2);
CameraCharacteristics cameraCharacteristics = manager.getCameraCharacteristics(cameraId); CameraCharacteristics cameraCharacteristics = manager.getCameraCharacteristics(cameraId);
@ -323,7 +318,6 @@ public class CameraOptions {
* *
* @return a collection of supported values. * @return a collection of supported values.
*/ */
@SuppressWarnings("WeakerAccess")
@NonNull @NonNull
public Collection<Size> getSupportedPictureSizes() { public Collection<Size> getSupportedPictureSizes() {
return Collections.unmodifiableSet(supportedPictureSizes); return Collections.unmodifiableSet(supportedPictureSizes);
@ -347,7 +341,6 @@ public class CameraOptions {
* *
* @return a collection of supported values. * @return a collection of supported values.
*/ */
@SuppressWarnings("WeakerAccess")
@NonNull @NonNull
public Collection<Size> getSupportedVideoSizes() { public Collection<Size> getSupportedVideoSizes() {
return Collections.unmodifiableSet(supportedVideoSizes); return Collections.unmodifiableSet(supportedVideoSizes);
@ -373,7 +366,6 @@ public class CameraOptions {
* @see Facing#FRONT * @see Facing#FRONT
* @return a collection of supported values. * @return a collection of supported values.
*/ */
@SuppressWarnings("WeakerAccess")
@NonNull @NonNull
public Collection<Facing> getSupportedFacing() { public Collection<Facing> getSupportedFacing() {
return Collections.unmodifiableSet(supportedFacing); return Collections.unmodifiableSet(supportedFacing);
@ -389,7 +381,6 @@ public class CameraOptions {
* @see Flash#TORCH * @see Flash#TORCH
* @return a collection of supported values. * @return a collection of supported values.
*/ */
@SuppressWarnings("WeakerAccess")
@NonNull @NonNull
public Collection<Flash> getSupportedFlash() { public Collection<Flash> getSupportedFlash() {
return Collections.unmodifiableSet(supportedFlash); return Collections.unmodifiableSet(supportedFlash);
@ -406,7 +397,6 @@ public class CameraOptions {
* @see WhiteBalance#CLOUDY * @see WhiteBalance#CLOUDY
* @return a collection of supported values. * @return a collection of supported values.
*/ */
@SuppressWarnings("WeakerAccess")
@NonNull @NonNull
public Collection<WhiteBalance> getSupportedWhiteBalance() { public Collection<WhiteBalance> getSupportedWhiteBalance() {
return Collections.unmodifiableSet(supportedWhiteBalance); return Collections.unmodifiableSet(supportedWhiteBalance);
@ -432,7 +422,6 @@ public class CameraOptions {
* *
* @return whether zoom is supported. * @return whether zoom is supported.
*/ */
@SuppressWarnings("WeakerAccess")
public boolean isZoomSupported() { public boolean isZoomSupported() {
return zoomSupported; return zoomSupported;
} }
@ -444,7 +433,6 @@ public class CameraOptions {
* *
* @return whether auto focus is supported. * @return whether auto focus is supported.
*/ */
@SuppressWarnings("WeakerAccess")
public boolean isAutoFocusSupported() { public boolean isAutoFocusSupported() {
return autoFocusSupported; return autoFocusSupported;
} }
@ -458,7 +446,6 @@ public class CameraOptions {
* @see #getExposureCorrectionMaxValue() * @see #getExposureCorrectionMaxValue()
* @return whether exposure correction is supported. * @return whether exposure correction is supported.
*/ */
@SuppressWarnings("WeakerAccess")
public boolean isExposureCorrectionSupported() { public boolean isExposureCorrectionSupported() {
return exposureCorrectionSupported; return exposureCorrectionSupported;
} }
@ -470,7 +457,6 @@ public class CameraOptions {
* *
* @return min EV value * @return min EV value
*/ */
@SuppressWarnings("WeakerAccess")
public float getExposureCorrectionMinValue() { public float getExposureCorrectionMinValue() {
return exposureCorrectionMinValue; return exposureCorrectionMinValue;
} }
@ -482,7 +468,6 @@ public class CameraOptions {
* *
* @return max EV value * @return max EV value
*/ */
@SuppressWarnings("WeakerAccess")
public float getExposureCorrectionMaxValue() { public float getExposureCorrectionMaxValue() {
return exposureCorrectionMaxValue; return exposureCorrectionMaxValue;
} }

@ -678,7 +678,7 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
Context c = getContext(); Context c = getContext();
boolean needsCamera = true; boolean needsCamera = true;
boolean needsAudio = audio == Audio.ON; boolean needsAudio = audio == Audio.ON || audio == Audio.MONO || audio == Audio.STEREO;
needsCamera = needsCamera && c.checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED; needsCamera = needsCamera && c.checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED;
needsAudio = needsAudio && c.checkSelfPermission(Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED; needsAudio = needsAudio && c.checkSelfPermission(Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED;
@ -696,7 +696,7 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
* If the developer did not add this to its manifest, throw and fire warnings. * If the developer did not add this to its manifest, throw and fire warnings.
*/ */
private void checkPermissionsManifestOrThrow(@NonNull Audio audio) { private void checkPermissionsManifestOrThrow(@NonNull Audio audio) {
if (audio == Audio.ON) { if (audio == Audio.ON || audio == Audio.MONO || audio == Audio.STEREO) {
try { try {
PackageManager manager = getContext().getPackageManager(); PackageManager manager = getContext().getPackageManager();
PackageInfo info = manager.getPackageInfo(getContext().getPackageName(), PackageManager.GET_PERMISSIONS); PackageInfo info = manager.getPackageInfo(getContext().getPackageName(), PackageManager.GET_PERMISSIONS);
@ -1174,6 +1174,8 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
* *
* @see Audio#OFF * @see Audio#OFF
* @see Audio#ON * @see Audio#ON
* @see Audio#MONO
* @see Audio#STEREO
* *
* @param audio desired audio value * @param audio desired audio value
*/ */
@ -2078,7 +2080,7 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
@Override @Override
public void dispatchOnVideoRecordingStart() { public void dispatchOnVideoRecordingStart() {
mLogger.i("dispatchOnVideoRecordingStart", "dispatchOnVideoRecordingStart"); mLogger.i("dispatchOnVideoRecordingStart");
mUiHandler.post(new Runnable() { mUiHandler.post(new Runnable() {
@Override @Override
public void run() { public void run() {
@ -2088,6 +2090,19 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
} }
}); });
} }
@Override
public void dispatchOnVideoRecordingEnd() {
mLogger.i("dispatchOnVideoRecordingEnd");
mUiHandler.post(new Runnable() {
@Override
public void run() {
for (CameraListener listener : mListeners) {
listener.onVideoRecordingEnd();
}
}
});
}
} }
//endregion //endregion

@ -14,14 +14,26 @@ import androidx.annotation.Nullable;
public enum Audio implements Control { public enum Audio implements Control {
/** /**
* No Audio. * No audio.
*/ */
OFF(0), OFF(0),
/** /**
* With Audio. * Audio on. The number of channels depends on the video configuration,
* on the device capabilities and on the video type (e.g. we default to
* mono for snapshots).
*/ */
ON(1); ON(1),
/**
* Force mono channel audio.
*/
MONO(2),
/**
* Force stereo audio.
*/
STEREO(3);
final static Audio DEFAULT = ON; final static Audio DEFAULT = ON;
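App-side sketch of the new controls (the view id is a placeholder; a matching XML attribute is also added per the "Add mono and stereo in attrs" commit note, not shown in this hunk):

CameraView camera = findViewById(R.id.camera); // placeholder id
camera.setAudio(Audio.STEREO); // force 2 audio channels
// camera.setAudio(Audio.MONO); // force 1 audio channel
// camera.setAudio(Audio.ON);   // let the engine choose the channel count
// camera.setAudio(Audio.OFF);  // no audio track, RECORD_AUDIO permission not required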

@ -136,6 +136,7 @@ public abstract class CameraEngine implements
void dispatchFrame(Frame frame); void dispatchFrame(Frame frame);
void dispatchError(CameraException exception); void dispatchError(CameraException exception);
void dispatchOnVideoRecordingStart(); void dispatchOnVideoRecordingStart();
void dispatchOnVideoRecordingEnd();
} }
private static final String TAG = CameraEngine.class.getSimpleName(); private static final String TAG = CameraEngine.class.getSimpleName();
@ -1210,6 +1211,10 @@ public abstract class CameraEngine implements
mCallback.dispatchOnVideoRecordingStart(); mCallback.dispatchOnVideoRecordingStart();
} }
@Override
public void onVideoRecordingEnd() {
mCallback.dispatchOnVideoRecordingEnd();
}
@WorkerThread @WorkerThread
protected abstract void onTakePicture(@NonNull PictureResult.Stub stub); protected abstract void onTakePicture(@NonNull PictureResult.Stub stub);

@ -3,6 +3,8 @@ package com.otaliastudios.cameraview.gesture;
import android.content.Context; import android.content.Context;
import android.graphics.PointF; import android.graphics.PointF;
import androidx.annotation.NonNull; import androidx.annotation.NonNull;
import androidx.annotation.VisibleForTesting;
import android.view.MotionEvent; import android.view.MotionEvent;
/** /**
@ -22,7 +24,7 @@ public abstract class GestureFinder {
private final static int GRANULARITY = 50; private final static int GRANULARITY = 50;
private boolean mActive; private boolean mActive;
private Gesture mType; @VisibleForTesting Gesture mType;
private PointF[] mPoints; private PointF[] mPoints;
private Controller mController; private Controller mController;

@ -9,7 +9,7 @@ import androidx.annotation.NonNull;
import androidx.annotation.Nullable; import androidx.annotation.Nullable;
/** /**
* Base class for pools of recycleable objects. * Base class for thread-safe pools of recycleable objects.
* @param <T> the object type * @param <T> the object type
*/ */
public class Pool<T> { public class Pool<T> {
@ -19,8 +19,9 @@ public class Pool<T> {
private int maxPoolSize; private int maxPoolSize;
private int activeCount; private int activeCount;
private LinkedBlockingQueue<T> mQueue; private LinkedBlockingQueue<T> queue;
private Factory<T> factory; private Factory<T> factory;
private final Object lock = new Object();
/** /**
* Used to create new instances of objects when needed. * Used to create new instances of objects when needed.
@ -37,7 +38,7 @@ public class Pool<T> {
*/ */
public Pool(int maxPoolSize, @NonNull Factory<T> factory) { public Pool(int maxPoolSize, @NonNull Factory<T> factory) {
this.maxPoolSize = maxPoolSize; this.maxPoolSize = maxPoolSize;
this.mQueue = new LinkedBlockingQueue<>(maxPoolSize); this.queue = new LinkedBlockingQueue<>(maxPoolSize);
this.factory = factory; this.factory = factory;
} }
@ -48,7 +49,9 @@ public class Pool<T> {
* @return whether the pool is empty * @return whether the pool is empty
*/ */
public boolean isEmpty() { public boolean isEmpty() {
return count() >= maxPoolSize; synchronized (lock) {
return count() >= maxPoolSize;
}
} }
/** /**
@ -60,21 +63,23 @@ public class Pool<T> {
*/ */
@Nullable @Nullable
public T get() { public T get() {
T item = mQueue.poll(); synchronized (lock) {
if (item != null) { T item = queue.poll();
activeCount++; // poll decreases, this fixes if (item != null) {
LOG.v("GET - Reusing recycled item.", this); activeCount++; // poll decreases, this fixes
return item; LOG.v("GET - Reusing recycled item.", this);
} return item;
}
if (isEmpty()) {
LOG.v("GET - Returning null. Too much items requested.", this); if (isEmpty()) {
return null; LOG.v("GET - Returning null. Too much items requested.", this);
return null;
}
activeCount++;
LOG.v("GET - Creating a new item.", this);
return factory.create();
} }
activeCount++;
LOG.v("GET - Creating a new item.", this);
return factory.create();
} }
/** /**
@ -84,16 +89,18 @@ public class Pool<T> {
* @param item used item * @param item used item
*/ */
public void recycle(@NonNull T item) { public void recycle(@NonNull T item) {
LOG.v("RECYCLE - Recycling item.", this); synchronized (lock) {
if (--activeCount < 0) { LOG.v("RECYCLE - Recycling item.", this);
throw new IllegalStateException("Trying to recycle an item which makes activeCount < 0." + if (--activeCount < 0) {
"This means that this or some previous items being recycled were not coming from " + throw new IllegalStateException("Trying to recycle an item which makes activeCount < 0." +
"this pool, or some item was recycled more than once. " + this); "This means that this or some previous items being recycled were not coming from " +
} "this pool, or some item was recycled more than once. " + this);
if (!mQueue.offer(item)) { }
throw new IllegalStateException("Trying to recycle an item while the queue is full. " + if (!queue.offer(item)) {
"This means that this or some previous items being recycled were not coming from " + throw new IllegalStateException("Trying to recycle an item while the queue is full. " +
"this pool, or some item was recycled more than once. " + this); "This means that this or some previous items being recycled were not coming from " +
"this pool, or some item was recycled more than once. " + this);
}
} }
} }
@ -102,7 +109,9 @@ public class Pool<T> {
*/ */
@CallSuper @CallSuper
public void clear() { public void clear() {
mQueue.clear(); synchronized (lock) {
queue.clear();
}
} }
/** /**
@ -114,7 +123,9 @@ public class Pool<T> {
*/ */
@SuppressWarnings("WeakerAccess") @SuppressWarnings("WeakerAccess")
public final int count() { public final int count() {
return activeCount() + recycledCount(); synchronized (lock) {
return activeCount() + recycledCount();
}
} }
/** /**
@ -125,7 +136,9 @@ public class Pool<T> {
*/ */
@SuppressWarnings("WeakerAccess") @SuppressWarnings("WeakerAccess")
public final int activeCount() { public final int activeCount() {
return activeCount; synchronized (lock) {
return activeCount;
}
} }
/** /**
@ -137,7 +150,9 @@ public class Pool<T> {
*/ */
@SuppressWarnings("WeakerAccess") @SuppressWarnings("WeakerAccess")
public final int recycledCount() { public final int recycledCount() {
return mQueue.size(); synchronized (lock) {
return queue.size();
}
} }
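A minimal usage sketch of the now thread-safe Pool. The buffer size and pool capacity are arbitrary, and it assumes Factory is the nested interface whose create() is invoked above:

Pool<ByteBuffer> pool = new Pool<>(16, new Pool.Factory<ByteBuffer>() {
    @Override
    public ByteBuffer create() {
        return ByteBuffer.allocateDirect(1024);
    }
});

ByteBuffer buffer = pool.get(); // returns null once 16 items are already active
if (buffer != null) {
    try {
        // ... fill and consume the buffer ...
    } finally {
        pool.recycle(buffer); // safe from any thread now that Pool synchronizes internally
    }
}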
@NonNull @NonNull

@ -63,7 +63,7 @@ public class GlCameraPreview extends CameraPreview<GLSurfaceView, SurfaceTexture
private int mOutputTextureId = 0; private int mOutputTextureId = 0;
private SurfaceTexture mInputSurfaceTexture; private SurfaceTexture mInputSurfaceTexture;
private EglViewport mOutputViewport; private EglViewport mOutputViewport;
private Set<RendererFrameCallback> mRendererFrameCallbacks = Collections.synchronizedSet(new HashSet<RendererFrameCallback>()); private final Set<RendererFrameCallback> mRendererFrameCallbacks = Collections.synchronizedSet(new HashSet<RendererFrameCallback>());
@VisibleForTesting float mCropScaleX = 1F; @VisibleForTesting float mCropScaleX = 1F;
@VisibleForTesting float mCropScaleY = 1F; @VisibleForTesting float mCropScaleY = 1F;
private View mRootView; private View mRootView;
@ -144,8 +144,11 @@ public class GlCameraPreview extends CameraPreview<GLSurfaceView, SurfaceTexture
getView().queueEvent(new Runnable() { getView().queueEvent(new Runnable() {
@Override @Override
public void run() { public void run() {
for (RendererFrameCallback callback : mRendererFrameCallbacks) { // Need to synchronize when iterating the Collections.synchronizedSet
callback.onRendererTextureCreated(mOutputTextureId); synchronized (mRendererFrameCallbacks) {
for (RendererFrameCallback callback : mRendererFrameCallbacks) {
callback.onRendererTextureCreated(mOutputTextureId);
}
} }
} }
}); });
@ -202,11 +205,12 @@ public class GlCameraPreview extends CameraPreview<GLSurfaceView, SurfaceTexture
Matrix.translateM(mTransformMatrix, 0, translX, translY, 0); Matrix.translateM(mTransformMatrix, 0, translX, translY, 0);
Matrix.scaleM(mTransformMatrix, 0, mCropScaleX, mCropScaleY, 1); Matrix.scaleM(mTransformMatrix, 0, mCropScaleX, mCropScaleY, 1);
} }
// Future note: passing scale to the viewport?
// They are scaleX an scaleY, but flipped based on mInputFlipped.
mOutputViewport.drawFrame(mOutputTextureId, mTransformMatrix); mOutputViewport.drawFrame(mOutputTextureId, mTransformMatrix);
for (RendererFrameCallback callback : mRendererFrameCallbacks) { synchronized (mRendererFrameCallbacks) {
callback.onRendererFrame(mInputSurfaceTexture, mCropScaleX, mCropScaleY); // Need to synchronize when iterating the Collections.synchronizedSet
for (RendererFrameCallback callback : mRendererFrameCallbacks) {
callback.onRendererFrame(mInputSurfaceTexture, mCropScaleX, mCropScaleY);
}
} }
} }
} }
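For reference, this matches the contract documented on Collections.synchronizedSet: individual operations are synchronized, but iteration is not atomic and must hold the set's own monitor, which is what the blocks above do. A standalone sketch (callback and surfaceTexture are placeholders):

Set<RendererFrameCallback> callbacks =
        Collections.synchronizedSet(new HashSet<RendererFrameCallback>());
callbacks.add(callback);       // individually synchronized, no extra locking needed
synchronized (callbacks) {     // iterating requires explicit synchronization
    for (RendererFrameCallback c : callbacks) {
        c.onRendererFrame(surfaceTexture, 1F, 1F);
    }
}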
@ -299,6 +303,7 @@ public class GlCameraPreview extends CameraPreview<GLSurfaceView, SurfaceTexture
* Creates the renderer for this GL surface. * Creates the renderer for this GL surface.
* @return the renderer for this GL surface * @return the renderer for this GL surface
*/ */
@SuppressWarnings("WeakerAccess")
@NonNull @NonNull
protected Renderer instantiateRenderer() { protected Renderer instantiateRenderer() {
return new Renderer(); return new Renderer();

@ -37,17 +37,16 @@ public abstract class FullVideoRecorder extends VideoRecorder {
super(listener); super(listener);
} }
@SuppressWarnings({"WeakerAccess", "UnusedReturnValue", "BooleanMethodIsAlwaysInverted"}) @SuppressWarnings({"WeakerAccess", "UnusedReturnValue"})
protected boolean prepareMediaRecorder(@NonNull VideoResult.Stub stub) { protected boolean prepareMediaRecorder(@NonNull VideoResult.Stub stub) {
if (mMediaRecorderPrepared) return true; if (mMediaRecorderPrepared) return true;
return onPrepareMediaRecorder(stub, new MediaRecorder()); return onPrepareMediaRecorder(stub, new MediaRecorder());
} }
@SuppressWarnings("WeakerAccess")
protected boolean onPrepareMediaRecorder(@NonNull VideoResult.Stub stub, @NonNull MediaRecorder mediaRecorder) { protected boolean onPrepareMediaRecorder(@NonNull VideoResult.Stub stub, @NonNull MediaRecorder mediaRecorder) {
mMediaRecorder = mediaRecorder; mMediaRecorder = mediaRecorder;
Size size = stub.rotation % 180 != 0 ? stub.size.flip() : stub.size; Size size = stub.rotation % 180 != 0 ? stub.size.flip() : stub.size;
if (stub.audio == Audio.ON) { if (stub.audio == Audio.ON || stub.audio == Audio.MONO || stub.audio == Audio.STEREO) {
// Must be called before setOutputFormat. // Must be called before setOutputFormat.
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT); mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
} }
@ -71,8 +70,15 @@ public abstract class FullVideoRecorder extends VideoRecorder {
} else { } else {
mMediaRecorder.setVideoEncodingBitRate(stub.videoBitRate); mMediaRecorder.setVideoEncodingBitRate(stub.videoBitRate);
} }
if (stub.audio == Audio.ON) { if (stub.audio == Audio.ON || stub.audio == Audio.MONO || stub.audio == Audio.STEREO) {
mMediaRecorder.setAudioChannels(mProfile.audioChannels); if (stub.audio == Audio.ON) {
mMediaRecorder.setAudioChannels(mProfile.audioChannels);
} else if (stub.audio == Audio.MONO) {
mMediaRecorder.setAudioChannels(1);
} else //noinspection ConstantConditions
if (stub.audio == Audio.STEREO) {
mMediaRecorder.setAudioChannels(2);
}
mMediaRecorder.setAudioSamplingRate(mProfile.audioSampleRate); mMediaRecorder.setAudioSamplingRate(mProfile.audioSampleRate);
mMediaRecorder.setAudioEncoder(mProfile.audioCodec); mMediaRecorder.setAudioEncoder(mProfile.audioCodec);
if (stub.audioBitRate <= 0) { if (stub.audioBitRate <= 0) {
@ -142,6 +148,7 @@ public abstract class FullVideoRecorder extends VideoRecorder {
@Override @Override
protected void onStop() { protected void onStop() {
if (mMediaRecorder != null) { if (mMediaRecorder != null) {
dispatchVideoRecordingEnd();
try { try {
mMediaRecorder.stop(); mMediaRecorder.stop();
} catch (Exception e) { } catch (Exception e) {

@ -18,9 +18,11 @@ import com.otaliastudios.cameraview.preview.GlCameraPreview;
import com.otaliastudios.cameraview.preview.RendererFrameCallback; import com.otaliastudios.cameraview.preview.RendererFrameCallback;
import com.otaliastudios.cameraview.preview.RendererThread; import com.otaliastudios.cameraview.preview.RendererThread;
import com.otaliastudios.cameraview.size.Size; import com.otaliastudios.cameraview.size.Size;
import com.otaliastudios.cameraview.video.encoding.AudioConfig;
import com.otaliastudios.cameraview.video.encoding.AudioMediaEncoder; import com.otaliastudios.cameraview.video.encoding.AudioMediaEncoder;
import com.otaliastudios.cameraview.video.encoding.EncoderThread; import com.otaliastudios.cameraview.video.encoding.EncoderThread;
import com.otaliastudios.cameraview.video.encoding.MediaEncoderEngine; import com.otaliastudios.cameraview.video.encoding.MediaEncoderEngine;
import com.otaliastudios.cameraview.video.encoding.TextureConfig;
import com.otaliastudios.cameraview.video.encoding.TextureMediaEncoder; import com.otaliastudios.cameraview.video.encoding.TextureMediaEncoder;
import androidx.annotation.NonNull; import androidx.annotation.NonNull;
@ -38,9 +40,15 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
private static final CameraLogger LOG = CameraLogger.create(TAG); private static final CameraLogger LOG = CameraLogger.create(TAG);
private static final int DEFAULT_VIDEO_FRAMERATE = 30; private static final int DEFAULT_VIDEO_FRAMERATE = 30;
private static final int DEFAULT_VIDEO_BITRATE = 1000000;
private static final int DEFAULT_AUDIO_BITRATE = 64000; private static final int DEFAULT_AUDIO_BITRATE = 64000;
// https://stackoverflow.com/a/5220554/4288782
// Assuming low motion, we don't want to put this too high for default usage,
// advanced users are still free to change this for each video.
private static int estimateVideoBitRate(@NonNull Size size, int frameRate) {
return (int) (0.07F * 1F * size.getWidth() * size.getHeight() * frameRate);
}
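To make the estimate concrete (illustration only, not values used by the library):

// 1920x1080 snapshot video at 30 fps:
// 0.07 * 1920 * 1080 * 30 = 4,354,560 bit/s, i.e. roughly 4.4 Mbps.
int example = (int) (0.07F * 1F * 1920 * 1080 * 30);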
private static final int STATE_RECORDING = 0; private static final int STATE_RECORDING = 0;
private static final int STATE_NOT_RECORDING = 1; private static final int STATE_NOT_RECORDING = 1;
@ -73,7 +81,6 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
protected void onStart() { protected void onStart() {
mPreview.addRendererFrameCallback(this); mPreview.addRendererFrameCallback(this);
mDesiredState = STATE_RECORDING; mDesiredState = STATE_RECORDING;
dispatchVideoRecordingStart();
} }
@Override @Override
@ -101,8 +108,8 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
LOG.i("Starting the encoder engine."); LOG.i("Starting the encoder engine.");
// Set default options // Set default options
if (mResult.videoBitRate <= 0) mResult.videoBitRate = DEFAULT_VIDEO_BITRATE;
if (mResult.videoFrameRate <= 0) mResult.videoFrameRate = DEFAULT_VIDEO_FRAMERATE; if (mResult.videoFrameRate <= 0) mResult.videoFrameRate = DEFAULT_VIDEO_FRAMERATE;
if (mResult.videoBitRate <= 0) mResult.videoBitRate = estimateVideoBitRate(mResult.size, mResult.videoFrameRate);
if (mResult.audioBitRate <= 0) mResult.audioBitRate = DEFAULT_AUDIO_BITRATE; if (mResult.audioBitRate <= 0) mResult.audioBitRate = DEFAULT_AUDIO_BITRATE;
// Video. Ensure width and height are divisible by 2, as I have read somewhere. // Video. Ensure width and height are divisible by 2, as I have read somewhere.
@ -118,22 +125,31 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
case DEVICE_DEFAULT: type = "video/avc"; break; case DEVICE_DEFAULT: type = "video/avc"; break;
} }
LOG.w("Creating frame encoder. Rotation:", mResult.rotation); LOG.w("Creating frame encoder. Rotation:", mResult.rotation);
TextureMediaEncoder.Config config = new TextureMediaEncoder.Config(width, height, TextureConfig videoConfig = new TextureConfig();
mResult.videoBitRate, videoConfig.width = width;
mResult.videoFrameRate, videoConfig.height = height;
mResult.rotation, videoConfig.bitRate = mResult.videoBitRate;
type, mTextureId, videoConfig.frameRate = mResult.videoFrameRate;
scaleX, scaleY, videoConfig.rotation = mResult.rotation;
EGL14.eglGetCurrentContext(), videoConfig.mimeType = type;
mHasOverlay ? mOverlayTextureId : TextureMediaEncoder.NO_TEXTURE, videoConfig.textureId = mTextureId;
mOverlayRotation videoConfig.scaleX = scaleX;
); videoConfig.scaleY = scaleY;
TextureMediaEncoder videoEncoder = new TextureMediaEncoder(config); videoConfig.eglContext = EGL14.eglGetCurrentContext();
if (mHasOverlay) {
videoConfig.overlayTextureId = mOverlayTextureId;
videoConfig.overlayRotation = mOverlayRotation;
}
TextureMediaEncoder videoEncoder = new TextureMediaEncoder(videoConfig);
// Audio // Audio
AudioMediaEncoder audioEncoder = null; AudioMediaEncoder audioEncoder = null;
if (mResult.audio == Audio.ON) { if (mResult.audio == Audio.ON || mResult.audio == Audio.MONO || mResult.audio == Audio.STEREO) {
audioEncoder = new AudioMediaEncoder(new AudioMediaEncoder.Config(mResult.audioBitRate)); AudioConfig audioConfig = new AudioConfig();
audioConfig.bitRate = mResult.audioBitRate;
if (mResult.audio == Audio.MONO) audioConfig.channels = 1;
if (mResult.audio == Audio.STEREO) audioConfig.channels = 2;
audioEncoder = new AudioMediaEncoder(audioConfig);
} }
// Engine // Engine
@ -147,9 +163,10 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
if (mCurrentState == STATE_RECORDING) { if (mCurrentState == STATE_RECORDING) {
LOG.v("dispatching frame."); LOG.v("dispatching frame.");
TextureMediaEncoder textureEncoder = (TextureMediaEncoder) mEncoderEngine.getVideoEncoder(); TextureMediaEncoder textureEncoder = (TextureMediaEncoder) mEncoderEngine.getVideoEncoder();
TextureMediaEncoder.TextureFrame textureFrame = textureEncoder.acquireFrame(); TextureMediaEncoder.Frame frame = textureEncoder.acquireFrame();
textureFrame.timestamp = surfaceTexture.getTimestamp(); frame.timestamp = surfaceTexture.getTimestamp();
surfaceTexture.getTransformMatrix(textureFrame.transform); frame.timestampMillis = System.currentTimeMillis(); // NOTE: this is an approximation but it seems to work.
surfaceTexture.getTransformMatrix(frame.transform);
// get overlay // get overlay
if (mHasOverlay) { if (mHasOverlay) {
@ -162,12 +179,12 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
LOG.w("Got Surface.OutOfResourcesException while drawing video overlays", e); LOG.w("Got Surface.OutOfResourcesException while drawing video overlays", e);
} }
mOverlaySurfaceTexture.updateTexImage(); mOverlaySurfaceTexture.updateTexImage();
mOverlaySurfaceTexture.getTransformMatrix(textureFrame.overlayTransform); mOverlaySurfaceTexture.getTransformMatrix(frame.overlayTransform);
} }
if (mEncoderEngine != null) { if (mEncoderEngine != null) {
// can happen on teardown // can happen on teardown
mEncoderEngine.notify(TextureMediaEncoder.FRAME_EVENT, textureFrame); mEncoderEngine.notify(TextureMediaEncoder.FRAME_EVENT, frame);
} }
} }
@ -192,7 +209,12 @@ public class SnapshotVideoRecorder extends VideoRecorder implements RendererFram
@Override @Override
public void onEncodingStart() { public void onEncodingStart() {
// Do nothing. dispatchVideoRecordingStart();
}
@Override
public void onEncodingStop() {
dispatchVideoRecordingEnd();
} }
@EncoderThread @EncoderThread

@ -29,10 +29,16 @@ public abstract class VideoRecorder {
* The callback for the actual video recording starting. * The callback for the actual video recording starting.
*/ */
void onVideoRecordingStart(); void onVideoRecordingStart();
/**
* Video recording has ended. We will finish processing the file
* and soon {@link #onVideoResult(VideoResult.Stub, Exception)} will be called.
*/
void onVideoRecordingEnd();
} }
@VisibleForTesting(otherwise = VisibleForTesting.PROTECTED) VideoResult.Stub mResult; @VisibleForTesting(otherwise = VisibleForTesting.PROTECTED) VideoResult.Stub mResult;
@VisibleForTesting final VideoResultListener mListener; private final VideoResultListener mListener;
@SuppressWarnings("WeakerAccess") @SuppressWarnings("WeakerAccess")
protected Exception mError; protected Exception mError;
private boolean mIsRecording; private boolean mIsRecording;
@ -96,9 +102,20 @@ public abstract class VideoRecorder {
*/ */
@SuppressWarnings("WeakerAccess") @SuppressWarnings("WeakerAccess")
@CallSuper @CallSuper
protected void dispatchVideoRecordingStart(){ protected void dispatchVideoRecordingStart() {
if(mListener != null){ if (mListener != null) {
mListener.onVideoRecordingStart(); mListener.onVideoRecordingStart();
} }
} }
/**
* Subclasses can call this to notify that the video recording has ended,
* although the video result might still be processed.
*/
@CallSuper
protected void dispatchVideoRecordingEnd() {
if (mListener != null) {
mListener.onVideoRecordingEnd();
}
}
} }

@ -0,0 +1,96 @@
package com.otaliastudios.cameraview.video.encoding;
import android.media.AudioFormat;
import androidx.annotation.NonNull;
/**
* Audio configuration to be passed as input to the constructor
* of an {@link AudioMediaEncoder}.
*/
@SuppressWarnings("WeakerAccess")
public class AudioConfig {
// Configurable options
public int bitRate; // ENCODED bit rate
public int channels = 1;
// Not configurable options (for now)
final String mimeType = "audio/mp4a-latm";
final int encoding = AudioFormat.ENCODING_PCM_16BIT; // Determines the sampleSizePerChannel
// The 44.1KHz frequency is the only setting guaranteed to be available on all devices.
final int samplingFrequency = 44100; // samples/sec
final int sampleSizePerChannel = 2; // byte/sample/channel [16bit]
final int byteRatePerChannel = samplingFrequency * sampleSizePerChannel; // byte/sec/channel
@NonNull
AudioConfig copy() {
AudioConfig config = new AudioConfig();
config.bitRate = this.bitRate;
config.channels = this.channels;
return config;
}
int byteRate() { // RAW byte rate
return byteRatePerChannel * channels; // byte/sec
}
@SuppressWarnings("unused")
int bitRate() { // RAW bit rate
return byteRate() * 8; // bit/sec
}
int audioFormatChannels() {
if (channels == 1) {
return AudioFormat.CHANNEL_IN_MONO;
} else if (channels == 2) {
return AudioFormat.CHANNEL_IN_STEREO;
}
throw new RuntimeException("Invalid number of channels: " + channels);
}
/**
* We call FRAME here the chunk of data that we want to read at each loop cycle.
*
* When this number is HIGH, the AudioRecord might be unable to keep a good pace and
* we might end up skipping some frames.
*
* When this number is LOW, we pull a bigger number of frames and this might end up
* delaying the recorder/encoder balance (more frames means more encoding operations).
* In the end, this means that the recorder will skip some frames to restore the balance.
*
* @return the frame size
*/
int frameSize() {
return 1024 * channels;
}
/**
* Number of frames contained in the {@link android.media.AudioRecord} buffer.
* In theory, the higher this value is, the safer it is to delay reading as the
* audioRecord will hold the recorded samples anyway and return to us next time we read.
*
* Should be coordinated with {@link #frameSize()}.
*
* @return the number of frames
*/
int audioRecordBufferFrames() {
return 25;
}
/**
* We allocate buffers of {@link #frameSize()} each, which is not much.
*
* This value indicates the maximum number of these buffers that we can allocate at a given instant.
* In other words, it is the number of buffers that the encoder thread is allowed to be 'behind'
* the recorder thread. It's not safe to have it very large, or we can end up encoding long after
* the actual recording has stopped. It's better to keep this small and skip frames at recording time instead.
*
* Should be coordinated with {@link #frameSize()}.
*
* @return the buffer pool max size
*/
int bufferPoolMaxSize() {
return 80;
}
}
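Putting the constants above together for the two supported channel counts (simple arithmetic, shown as comments because these accessors are package-private):

// Mono   (channels = 1): byteRate() = 44100 * 2 * 1 = 88,200  byte/s
//                         frameSize() = 1024 * 1 = 1024 bytes   (~11.6 ms of audio)
//                         AudioRecord buffer = 1024 * 25 = 25,600 bytes
// Stereo (channels = 2): byteRate() = 44100 * 2 * 2 = 176,400 byte/s
//                         frameSize() = 1024 * 2 = 2048 bytes   (~11.6 ms of audio)
//                         AudioRecord buffer = 2048 * 25 = 51,200 bytes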

@ -1,133 +1,101 @@
package com.otaliastudios.cameraview.video.encoding; package com.otaliastudios.cameraview.video.encoding;
import android.annotation.SuppressLint;
import android.media.AudioFormat; import android.media.AudioFormat;
import android.media.AudioRecord; import android.media.AudioRecord;
import android.media.AudioTimestamp;
import android.media.MediaCodec; import android.media.MediaCodec;
import android.media.MediaCodecInfo; import android.media.MediaCodecInfo;
import android.media.MediaFormat; import android.media.MediaFormat;
import android.media.MediaRecorder; import android.media.MediaRecorder;
import android.os.Build; import android.os.Build;
import android.os.Handler;
import android.os.Message;
import com.otaliastudios.cameraview.CameraLogger; import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.internal.utils.WorkerHandler;
import androidx.annotation.NonNull; import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi; import androidx.annotation.RequiresApi;
import java.io.IOException; import java.io.IOException;
import java.nio.ByteBuffer; import java.nio.ByteBuffer;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.LinkedBlockingQueue; import java.util.concurrent.LinkedBlockingQueue;
/** /**
* Default implementation for audio encoding. * Default implementation for audio encoding.
*/ */
// TODO create onVideoRecordingStart/onVideoRecordingEnd callbacks
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2) @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2)
public class AudioMediaEncoder extends MediaEncoder { public class AudioMediaEncoder extends MediaEncoder {
private static final String TAG = AudioMediaEncoder.class.getSimpleName(); private static final String TAG = AudioMediaEncoder.class.getSimpleName();
private static final CameraLogger LOG = CameraLogger.create(TAG); private static final CameraLogger LOG = CameraLogger.create(TAG);
private static final String MIME_TYPE = "audio/mp4a-latm"; private static final boolean PERFORMANCE_DEBUG = false;
private static final int ENCODING = AudioFormat.ENCODING_PCM_16BIT; // Determines the SAMPLE_SIZE private static final boolean PERFORMANCE_FILL_GAPS = true;
private static final int CHANNELS = AudioFormat.CHANNEL_IN_MONO; // AudioFormat.CHANNEL_IN_STEREO;
// The 44.1KHz frequency is the only setting guaranteed to be available on all devices.
private static final int SAMPLING_FREQUENCY = 44100; // samples/sec
private static final int CHANNELS_COUNT = 1; // 2;
private static final int SAMPLE_SIZE = 2; // byte/sample/channel
private static final int BYTE_RATE_PER_CHANNEL = SAMPLING_FREQUENCY * SAMPLE_SIZE; // byte/sec/channel
private static final int BYTE_RATE = BYTE_RATE_PER_CHANNEL * CHANNELS_COUNT; // byte/sec
@SuppressWarnings("unused")
private static final int BIT_RATE = BYTE_RATE * 8; // bit/sec
// We call FRAME here the chunk of data that we want to read at each loop cycle
private static final int FRAME_SIZE_PER_CHANNEL = 1024; // bytes/frame/channel [AAC constant]
private static final int FRAME_SIZE = FRAME_SIZE_PER_CHANNEL * CHANNELS_COUNT; // bytes/frame
// We allocate buffers of 1KB each, which is not so much. This value indicates the maximum
// number of these buffers that we can allocate at a given instant.
// This value is the number of runnables that the encoder thread is allowed to be 'behind'
// the recorder thread. It's not safe to have it very large or we can end encoding A LOT AFTER
// the actual recording. It's better to reduce this and skip recording at all.
private static final int BUFFER_POOL_MAX_SIZE = 60;
private static long bytesToUs(int bytes) {
return (1000000L * bytes) / BYTE_RATE;
}
private static long bytesToUs(long bytes) {
return (1000000L * bytes) / BYTE_RATE;
}
private boolean mRequestStop = false; private boolean mRequestStop = false;
private AudioEncodingHandler mEncoder; private AudioEncodingThread mEncoder;
private AudioRecordingThread mRecorder; private AudioRecordingThread mRecorder;
private ByteBufferPool mByteBufferPool; private ByteBufferPool mByteBufferPool;
private Config mConfig; private ByteBuffer mZeroBuffer;
private final AudioTimestamp mTimestamp;
public static class Config { private AudioConfig mConfig;
int bitRate; private InputBufferPool mInputBufferPool = new InputBufferPool();
public Config(int bitRate) { private final LinkedBlockingQueue<InputBuffer> mInputBufferQueue = new LinkedBlockingQueue<>();
this.bitRate = bitRate;
} // Just to debug performance.
} private int mSendCount = 0;
private int mExecuteCount = 0;
public AudioMediaEncoder(@NonNull Config config) { private long mAvgSendDelay = 0;
mConfig = config; private long mAvgExecuteDelay = 0;
} private Map<Long, Long> mSendStartMap = new HashMap<>();
@NonNull public AudioMediaEncoder(@NonNull AudioConfig config) {
@Override super("AudioEncoder");
String getName() { mConfig = config.copy();
return "AudioEncoder"; mTimestamp = new AudioTimestamp(mConfig.byteRate());
// These two were in onPrepare() but it's better to do warm-up here
// since thread and looper creation is expensive.
mEncoder = new AudioEncodingThread();
mRecorder = new AudioRecordingThread();
} }
@EncoderThread @EncoderThread
@Override @Override
void onPrepare(@NonNull MediaEncoderEngine.Controller controller, long maxLengthMillis) { protected void onPrepare(@NonNull MediaEncoderEngine.Controller controller, long maxLengthMillis) {
final MediaFormat audioFormat = MediaFormat.createAudioFormat(MIME_TYPE, SAMPLING_FREQUENCY, CHANNELS_COUNT); final MediaFormat audioFormat = MediaFormat.createAudioFormat(
mConfig.mimeType,
mConfig.samplingFrequency,
mConfig.channels);
audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC); audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
audioFormat.setInteger(MediaFormat.KEY_CHANNEL_MASK, CHANNELS); audioFormat.setInteger(MediaFormat.KEY_CHANNEL_MASK, mConfig.audioFormatChannels());
audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, mConfig.bitRate); audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, mConfig.bitRate); // TODO multiply by channels?
audioFormat.setInteger(MediaFormat.KEY_CHANNEL_COUNT, CHANNELS_COUNT);
try { try {
mMediaCodec = MediaCodec.createEncoderByType(MIME_TYPE); mMediaCodec = MediaCodec.createEncoderByType(mConfig.mimeType);
} catch (IOException e) { } catch (IOException e) {
throw new RuntimeException(e); throw new RuntimeException(e);
} }
mMediaCodec.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE); mMediaCodec.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mMediaCodec.start(); mMediaCodec.start();
mByteBufferPool = new ByteBufferPool(FRAME_SIZE, BUFFER_POOL_MAX_SIZE); mByteBufferPool = new ByteBufferPool(mConfig.frameSize(), mConfig.bufferPoolMaxSize());
mEncoder = new AudioEncodingHandler(); mZeroBuffer = ByteBuffer.allocateDirect(mConfig.frameSize());
mRecorder = new AudioRecordingThread();
} }
@EncoderThread @EncoderThread
@Override @Override
void onStart() { protected void onStart() {
mRequestStop = false; mRequestStop = false;
mRecorder.start(); mRecorder.start();
mEncoder.start();
} }
@EncoderThread @EncoderThread
@Override @Override
void onEvent(@NonNull String event, @Nullable Object data) { } protected void onStop() {
@EncoderThread
@Override
void onStop() {
mRequestStop = true; mRequestStop = true;
} }
@Override @Override
void onRelease() { protected void onStopped() {
super.onStopped();
mRequestStop = false; mRequestStop = false;
mEncoder = null; mEncoder = null;
mRecorder = null; mRecorder = null;
@ -138,25 +106,52 @@ public class AudioMediaEncoder extends MediaEncoder {
} }
@Override @Override
int getEncodedBitRate() { protected int getEncodedBitRate() {
return mConfig.bitRate; return mConfig.bitRate;
} }
class AudioRecordingThread extends Thread { /**
* Sleeps for the duration of the given number of frames, effectively skipping them.
* This can be used to slow down the recording operation to keep it balanced with encoding.
*/
private void skipFrames(int frames) {
try {
Thread.sleep(AudioTimestamp.bytesToMillis(
mConfig.frameSize() * frames,
mConfig.byteRate()));
} catch (InterruptedException ignore) {}
}
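Assuming AudioTimestamp.bytesToMillis is the plain conversion millis = 1000 * bytes / byteRate (its body is not part of this diff), the sleep corresponds to the real-time duration of the skipped frames:

// Mono example: skipFrames(6) sleeps for
// bytesToMillis(1024 * 6, 88200) = 1000 * 6144 / 88200 ≈ 70 ms,
// roughly the playback time the 6 skipped frames would have covered.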
/**
* A thread recording from microphone using {@link AudioRecord} class.
* Communicates with {@link AudioEncodingThread} using {@link #mInputBufferQueue}.
*/
private class AudioRecordingThread extends Thread {
private AudioRecord mAudioRecord; private AudioRecord mAudioRecord;
private ByteBuffer mCurrentBuffer; private ByteBuffer mCurrentBuffer;
private int mReadBytes; private int mReadBytes;
private long mLastTimeUs; private long mLastTimeUs;
private long mFirstTimeUs = Long.MIN_VALUE;
AudioRecordingThread() {
final int minBufferSize = AudioRecord.getMinBufferSize(SAMPLING_FREQUENCY, CHANNELS, ENCODING); private AudioRecordingThread() {
int bufferSize = FRAME_SIZE * 25; // Make this bigger so we don't skip frames. final int minBufferSize = AudioRecord.getMinBufferSize(
mConfig.samplingFrequency,
mConfig.audioFormatChannels(),
mConfig.encoding);
// Make this larger than a single frame so we don't skip frames.
// With 25 frames the buffer is 51200 bytes in stereo and 25600 bytes in mono.
// 25 is already quite large: making it bigger to absorb the read() delay
// just makes things worse (it ruins MONO as well), and making it smaller
// changes the behavior too, so 25 is kept as a compromise.
int bufferSize = mConfig.frameSize() * mConfig.audioRecordBufferFrames();
while (bufferSize < minBufferSize) { while (bufferSize < minBufferSize) {
bufferSize += FRAME_SIZE; // Unlikely I think. bufferSize += mConfig.frameSize(); // Unlikely.
} }
mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.CAMCORDER, mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.CAMCORDER,
SAMPLING_FREQUENCY, CHANNELS, ENCODING, bufferSize); mConfig.samplingFrequency,
mConfig.audioFormatChannels(),
mConfig.encoding,
bufferSize);
setPriority(Thread.MAX_PRIORITY); setPriority(Thread.MAX_PRIORITY);
} }
@ -179,17 +174,41 @@ public class AudioMediaEncoder extends MediaEncoder {
private void read(boolean endOfStream) { private void read(boolean endOfStream) {
mCurrentBuffer = mByteBufferPool.get(); mCurrentBuffer = mByteBufferPool.get();
if (mCurrentBuffer == null) { if (mCurrentBuffer == null) {
LOG.e("read thread - eos:", endOfStream, "- Skipping audio frame, encoding is too slow."); // This can happen and it means that encoding is slow with respect to recording.
// Should fix the next presentation time here, but // One might be tempted to fix precisely the next frame presentation time when this happens,
// but this is not needed because the current increaseTime() algorithm will consider delays
// when they get large.
// Sleeping before returning is a good way of balancing the two operations.
// However, if endOfStream, we CAN'T lose this frame!
if (endOfStream) {
LOG.v("read thread - eos: true - No buffer, retrying.");
read(true); // try again
} else {
LOG.w("read thread - eos: false - Skipping audio frame, encoding is too slow.");
skipFrames(6); // sleep a bit
}
} else { } else {
mCurrentBuffer.clear(); mCurrentBuffer.clear();
mReadBytes = mAudioRecord.read(mCurrentBuffer, FRAME_SIZE); // When stereo, we read twice the data here and AudioRecord will fill the buffer
// with left and right bytes. https://stackoverflow.com/q/20594750/4288782
if (PERFORMANCE_DEBUG) {
long before = System.nanoTime();
mReadBytes = mAudioRecord.read(mCurrentBuffer, mConfig.frameSize());
long after = System.nanoTime();
float delayMillis = (after - before) / 1000000F;
float durationMillis = AudioTimestamp.bytesToMillis(mReadBytes, mConfig.byteRate());
LOG.v("read thread - reading took:", delayMillis,
"should be:", durationMillis,
"delay:", delayMillis - durationMillis);
} else {
mReadBytes = mAudioRecord.read(mCurrentBuffer, mConfig.frameSize());
}
LOG.i("read thread - eos:", endOfStream, "- Read new audio frame. Bytes:", mReadBytes); LOG.i("read thread - eos:", endOfStream, "- Read new audio frame. Bytes:", mReadBytes);
if (mReadBytes > 0) { // Good read: increase PTS. if (mReadBytes > 0) { // Good read: increase PTS.
mLastTimeUs = increaseTime(mReadBytes); increaseTime(mReadBytes, endOfStream);
LOG.i("read thread - eos:", endOfStream, "- Frame PTS:", mLastTimeUs); LOG.i("read thread - eos:", endOfStream, "- mLastTimeUs:", mLastTimeUs);
mCurrentBuffer.limit(mReadBytes); mCurrentBuffer.limit(mReadBytes);
onBuffer(endOfStream); enqueue(mCurrentBuffer, mLastTimeUs, endOfStream);
} else if (mReadBytes == AudioRecord.ERROR_INVALID_OPERATION) { } else if (mReadBytes == AudioRecord.ERROR_INVALID_OPERATION) {
LOG.e("read thread - eos:", endOfStream, "- Got AudioRecord.ERROR_INVALID_OPERATION"); LOG.e("read thread - eos:", endOfStream, "- Got AudioRecord.ERROR_INVALID_OPERATION");
} else if (mReadBytes == AudioRecord.ERROR_BAD_VALUE) { } else if (mReadBytes == AudioRecord.ERROR_BAD_VALUE) {
@ -199,174 +218,153 @@ public class AudioMediaEncoder extends MediaEncoder {
} }
/** /**
* New data at position buffer.position() of size buffer.remaining() * Increases presentation time and checks for max length constraint. This is much faster
* has been written into this buffer. This method should pass the data * than waiting for the encoder to check it during {@link #drainOutput(boolean)}. We
* to the consumer. * want to catch this as soon as possible so we can stop recording useless frames
* and stop bothering all the threads involved.
* @param readBytes bytes read in last reading
* @param endOfStream end of stream?
*/ */
private void onBuffer(boolean endOfStream) { private void increaseTime(int readBytes, boolean endOfStream) {
LOG.v("read thread - Sending buffer to encoder thread."); // Get the latest frame timestamp.
mEncoder.sendInputBuffer(mCurrentBuffer, mLastTimeUs, endOfStream); mLastTimeUs = mTimestamp.increaseUs(readBytes);
} if (mFirstTimeUs == Long.MIN_VALUE) {
mFirstTimeUs = mLastTimeUs;
private long increaseTime(int readBytes) { // Compute the first frame milliseconds as well.
return increaseTime3(readBytes); notifyFirstFrameMillis(System.currentTimeMillis()
} - AudioTimestamp.bytesToMillis(readBytes, mConfig.byteRate()));
}
/** // See if we reached the max length value.
* This method simply assumes that we read everything without losing a single US. boolean didReachMaxLength = (mLastTimeUs - mFirstTimeUs) > getMaxLengthMillis() * 1000L;
* It will use System.nanoTime() just once, as the starting point. if (didReachMaxLength && !endOfStream) {
* Of course we don't as there are things going on in this thread. LOG.w("read thread - this frame reached the maxLength! deltaUs:", mLastTimeUs - mFirstTimeUs);
*/ notifyMaxLengthReached();
@SuppressWarnings("unused") }
private long increaseTime1(int readBytes) {
return mLastTimeUs + bytesToUs(readBytes);
}
/** // Add zeroes if we have huge gaps. Even if timestamps are correct, if we have gaps between
* Just for testing, this method will use Api 24 method to retrieve the timestamp. // them, the encoder might shrink all timestamps to produce continuous audio. This results
* This way we let the platform choose instead of making assumptions. // in a video that is fast-forwarded.
*/ // Adding zeroes does not solve the gaps issue - audio will still be distorted. But at
@SuppressWarnings("unused") // least we get a video that has the correct playback speed.
@RequiresApi(24) if (PERFORMANCE_FILL_GAPS) {
private long increaseTime2(int readBytes) { int gaps = mTimestamp.getGapCount(mConfig.frameSize());
if (mApi24Timestamp == null) { if (gaps > 0) {
mApi24Timestamp = new AudioTimestamp(); long gapStart = mTimestamp.getGapStartUs(mLastTimeUs);
long frameUs = AudioTimestamp.bytesToUs(mConfig.frameSize(), mConfig.byteRate());
LOG.w("read thread - GAPS: trying to add", gaps, "zeroed buffers");
for (int i = 0; i < gaps; i++) {
ByteBuffer zeroBuffer = mByteBufferPool.get();
if (zeroBuffer == null) {
LOG.e("read thread - GAPS: aborting because we have no free buffer.");
break;
}
zeroBuffer.position(0);
zeroBuffer.put(mZeroBuffer);
zeroBuffer.clear();
enqueue(zeroBuffer, gapStart, false);
gapStart += frameUs;
}
}
} }
mAudioRecord.getTimestamp(mApi24Timestamp, AudioTimestamp.TIMEBASE_MONOTONIC);
return mApi24Timestamp.nanoTime / 1000;
} }
private AudioTimestamp mApi24Timestamp;
/** private void enqueue(@NonNull ByteBuffer byteBuffer, long timestamp, boolean isEndOfStream) {
* This method looks like an improvement over {@link #increaseTime1(int)} as it if (PERFORMANCE_DEBUG) {
* accounts for the current time as well. Adapted & improved. from Kickflip. mSendStartMap.put(timestamp, System.nanoTime() / 1000000);
*
* This creates regular timestamps unless we accumulate a lot of delay (greater than
* twice the buffer duration), in which case it creates a gap and starts again trying
* to be regular from the new point.
*/
private long increaseTime3(int readBytes) {
long bufferDurationUs = bytesToUs(readBytes);
long bufferEndTimeUs = System.nanoTime() / 1000; // now
long bufferStartTimeUs = bufferEndTimeUs - bufferDurationUs;
// If this is the first time, the base time is the buffer start time.
if (mBytesSinceBaseTime == 0) mBaseTimeUs = bufferStartTimeUs;
// Recompute time assuming that we are respecting the sampling frequency.
// This puts the time at the end of last read buffer, which means, where we
// should be if we had no delay / missed buffers.
long correctedTimeUs = mBaseTimeUs + bytesToUs(mBytesSinceBaseTime);
long correctionUs = bufferStartTimeUs - correctedTimeUs;
// However, if the correction is too big (> 2*bufferDurationUs), reset to this point.
// This is triggered if we lose buffers and are recording/encoding at a slower rate.
if (correctionUs >= 2L * bufferDurationUs) {
mBaseTimeUs = bufferStartTimeUs;
mBytesSinceBaseTime = readBytes;
return mBaseTimeUs;
} else {
mBytesSinceBaseTime += readBytes;
return correctedTimeUs;
} }
int readBytes = byteBuffer.remaining();
InputBuffer inputBuffer = mInputBufferPool.get();
//noinspection ConstantConditions
inputBuffer.source = byteBuffer;
inputBuffer.timestamp = timestamp;
inputBuffer.length = readBytes;
inputBuffer.isEndOfStream = isEndOfStream;
mInputBufferQueue.add(inputBuffer);
} }
private long mBaseTimeUs;
private long mBytesSinceBaseTime;
} }
/** /**
* This will be a super busy thread. It's important for it to be: * A thread encoding the microphone data using the media encoder APIs.
* - different than the recording thread: or we would miss a lot of audio * Communicates with {@link AudioRecordingThread} using {@link #mInputBufferQueue}.
* - different than the 'encoder' thread: we want that to be reactive. *
* For example, a stop() must become onStop() soon, can't wait for all this draining. * We want to do this operation on a different thread than the recording one (to avoid
* losing frames while we're working here), and different from the {@link MediaEncoder}'s
* own thread (we want that to be reactive - stop() must become onStop() soon).
*/ */
@SuppressLint("HandlerLeak") private class AudioEncodingThread extends Thread {
class AudioEncodingHandler extends Handler { private AudioEncodingThread() {
setPriority(Thread.MAX_PRIORITY);
InputBufferPool mInputBufferPool = new InputBufferPool();
LinkedBlockingQueue<InputBuffer> mPendingOps = new LinkedBlockingQueue<>();
AudioEncodingHandler() {
super(WorkerHandler.get("AudioEncodingHandler").getLooper());
}
void sendInputBuffer(ByteBuffer buffer, long presentationTimeUs, boolean endOfStream) {
int presentation1 = (int) (presentationTimeUs >> 32);
int presentation2 = (int) (presentationTimeUs);
sendMessage(obtainMessage(endOfStream ? 1 : 0, presentation1, presentation2, buffer));
} }
@Override @Override
public void handleMessage(Message msg) { public void run() {
super.handleMessage(msg); encoding: while (true) {
boolean endOfStream = msg.what == 1; if (mInputBufferQueue.isEmpty()) {
long timestamp = (((long) msg.arg1) << 32) | (((long) msg.arg2) & 0xffffffffL); skipFrames(2);
LOG.i("encoding thread - got buffer. timestamp:", timestamp, "eos:", endOfStream);
ByteBuffer buffer = (ByteBuffer) msg.obj;
int readBytes = buffer.remaining();
InputBuffer inputBuffer = mInputBufferPool.get();
//noinspection ConstantConditions
inputBuffer.source = buffer;
inputBuffer.timestamp = timestamp;
inputBuffer.length = readBytes;
inputBuffer.isEndOfStream = endOfStream;
mPendingOps.add(inputBuffer);
performPendingOps(endOfStream);
}
private void performPendingOps(boolean force) {
LOG.i("encoding thread - performing", mPendingOps.size(), "pending operations. force:", force);
InputBuffer buffer;
while ((buffer = mPendingOps.peek()) != null) {
if (force) {
acquireInputBuffer(buffer);
performPendingOp(buffer);
} else if (tryAcquireInputBuffer(buffer)) {
performPendingOp(buffer);
} else { } else {
break; // Will try later. LOG.i("encoding thread - performing", mInputBufferQueue.size(), "pending operations.");
InputBuffer inputBuffer;
while ((inputBuffer = mInputBufferQueue.peek()) != null) {
// Performance logging
if (PERFORMANCE_DEBUG) {
long sendEnd = System.nanoTime() / 1000000;
Long sendStart = mSendStartMap.remove(inputBuffer.timestamp);
if (sendStart != null) {
mAvgSendDelay = ((mAvgSendDelay * mSendCount) + (sendEnd - sendStart)) / (++mSendCount);
LOG.v("send delay millis:", sendEnd - sendStart, "average:", mAvgSendDelay);
} else {
// This input buffer was already processed (but tryAcquire failed for now).
}
}
// Actual work
if (inputBuffer.isEndOfStream) {
acquireInputBuffer(inputBuffer);
encode(inputBuffer);
break encoding;
} else if (tryAcquireInputBuffer(inputBuffer)) {
encode(inputBuffer);
} else {
skipFrames(1);
}
}
} }
} }
// We got an end of stream.
mInputBufferPool.clear();
if (PERFORMANCE_DEBUG) {
// After latest changes, the count here is not so different between MONO and STEREO.
// We get about 400 frames in both cases (430 for MONO, but doesn't seem like a big issue).
LOG.e("EXECUTE DELAY MILLIS:", mAvgExecuteDelay, "COUNT:", mExecuteCount);
LOG.e("SEND DELAY MILLIS:", mAvgSendDelay, "COUNT:", mSendCount);
}
} }
private void performPendingOp(InputBuffer buffer) { private void encode(@NonNull InputBuffer buffer) {
long executeStart = System.nanoTime() / 1000000;
LOG.i("encoding thread - performing pending operation for timestamp:", buffer.timestamp, "- encoding."); LOG.i("encoding thread - performing pending operation for timestamp:", buffer.timestamp, "- encoding.");
buffer.data.put(buffer.source); // TODO this copy is prob. the worst part here for performance buffer.data.put(buffer.source); // NOTE: this copy is probably the worst part here for performance
mByteBufferPool.recycle(buffer.source); mByteBufferPool.recycle(buffer.source);
mPendingOps.remove(buffer); mInputBufferQueue.remove(buffer);
encodeInputBuffer(buffer); encodeInputBuffer(buffer);
boolean eos = buffer.isEndOfStream; boolean eos = buffer.isEndOfStream;
mInputBufferPool.recycle(buffer); mInputBufferPool.recycle(buffer);
if (eos) mInputBufferPool.clear();
LOG.i("encoding thread - performing pending operation for timestamp:", buffer.timestamp, "- draining."); LOG.i("encoding thread - performing pending operation for timestamp:", buffer.timestamp, "- draining.");
// NOTE: can consider calling this drainOutput on yet another thread, which would let us // NOTE: can consider calling this drainOutput on yet another thread, which would let us
// use an even smaller BUFFER_POOL_MAX_SIZE without losing audio frames. But this way // use an even smaller BUFFER_POOL_MAX_SIZE without losing audio frames. But this way
// we can accumulate delay on this new thread without noticing (no pool getting empty). // we can accumulate delay on this new thread without noticing (no pool getting empty).
if (true) { drainOutput(buffer.isEndOfStream);
drainOutput(eos);
if (eos) WorkerHandler.get("AudioEncodingHandler").getThread().interrupt();
} else {
// Testing the option above.
WorkerHandler.get("AudioEncodingDrainer").remove(drainRunnable);
WorkerHandler.get("AudioEncodingDrainer").remove(drainRunnableEos);
WorkerHandler.get("AudioEncodingDrainer").post(eos ? drainRunnableEos : drainRunnable);
}
}
private final Runnable drainRunnable = new Runnable() { if (PERFORMANCE_DEBUG) {
@Override long executeEnd = System.nanoTime() / 1000000;
public void run() { mAvgExecuteDelay = ((mAvgExecuteDelay * mExecuteCount) + (executeEnd - executeStart)) / (++mExecuteCount);
drainOutput(false); LOG.v("execute delay millis:", executeEnd - executeStart, "average:", mAvgExecuteDelay);
} }
}; }
private final Runnable drainRunnableEos = new Runnable() {
@Override
public void run() {
drainOutput(true);
WorkerHandler.get("AudioEncodingHandler").getThread().interrupt();
WorkerHandler.get("AudioEncodingDrainer").getThread().interrupt();
}
};
} }
} }
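
The recording/encoding split described in the comments above is a producer/consumer handoff over a queue. Below is a minimal, self-contained sketch of that pattern with made-up names; the real classes additionally use a ByteBufferPool/InputBufferPool and, on the consumer side, poll with peek() plus skipFrames() instead of blocking.

import java.nio.ByteBuffer;
import java.util.concurrent.LinkedBlockingQueue;

public class ProducerConsumerSketch {
    private static final LinkedBlockingQueue<ByteBuffer> QUEUE = new LinkedBlockingQueue<>();
    private static final ByteBuffer EOS = ByteBuffer.allocate(0); // sentinel marking the end of stream

    public static void main(String[] args) throws InterruptedException {
        Thread recorder = new Thread(new Runnable() {
            @Override
            public void run() {
                for (int i = 0; i < 5; i++) {
                    QUEUE.add(ByteBuffer.allocate(1024)); // stands in for one AudioRecord.read()
                }
                QUEUE.add(EOS);
            }
        }, "recorder");

        Thread encoder = new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    while (true) {
                        ByteBuffer frame = QUEUE.take(); // blocks until the producer has data
                        if (frame == EOS) break;         // stop draining on the sentinel
                        // A real consumer would copy this into a MediaCodec input buffer and drain output.
                        System.out.println("Encoded " + frame.capacity() + " bytes");
                    }
                } catch (InterruptedException ignore) { }
            }
        }, "encoder");

        recorder.start();
        encoder.start();
        recorder.join();
        encoder.join();
    }
}

Keeping the reader and the encoder on separate threads is what prevents a slow drainOutput() from making AudioRecord.read() miss data.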

@ -0,0 +1,105 @@
package com.otaliastudios.cameraview.video.encoding;
import android.util.Log;
/**
* Computes timestamps for audio frames.
* Video frames do not need this since the timestamp comes from
* the surface texture.
*
* This is independent from the channels count, as long as the read bytes include
* all channels and the byte rate accounts for this as well.
* If channels is 2, both values will be doubled and we behave the same.
*
* This class keeps track of gaps between frames.
* This can be used, for example, to write zeros instead of nothing.
*/
class AudioTimestamp {
static long bytesToUs(long bytes, int byteRate) {
return (1000000L * bytes) / byteRate;
}
static long bytesToMillis(long bytes, int byteRate) {
return (1000L * bytes) / byteRate;
}
private int mByteRate;
private long mBaseTimeUs;
private long mBytesSinceBaseTime;
private long mGapUs;
AudioTimestamp(int byteRate) {
mByteRate = byteRate;
}
/**
* This method accounts for the current time and proved to be the most reliable among
* the ones tested.
*
* This creates regular timestamps unless we accumulate a lot of delay (greater than
* twice the buffer duration), in which case it creates a gap and starts again trying
* to be regular from the new point.
*
* Returns timestamps in the {@link System#nanoTime()} reference.
*/
@SuppressWarnings("SameParameterValue")
long increaseUs(int readBytes) {
long bufferDurationUs = bytesToUs((long) readBytes, mByteRate);
long bufferEndTimeUs = System.nanoTime() / 1000; // now
long bufferStartTimeUs = bufferEndTimeUs - bufferDurationUs;
// If this is the first time, the base time is the buffer start time.
if (mBytesSinceBaseTime == 0) mBaseTimeUs = bufferStartTimeUs;
// Recompute time assuming that we are respecting the sampling frequency.
// This puts the time at the end of last read buffer, which means, where we
// should be if we had no delay / missed buffers.
long correctedTimeUs = mBaseTimeUs + bytesToUs(mBytesSinceBaseTime, mByteRate);
long correctionUs = bufferStartTimeUs - correctedTimeUs;
if (correctionUs >= 2L * bufferDurationUs) {
// However, if the correction is too big (> 2*bufferDurationUs), reset to this point.
// This is triggered if we lose buffers and are recording/encoding at a slower rate.
mBaseTimeUs = bufferStartTimeUs;
mBytesSinceBaseTime = readBytes;
mGapUs = correctionUs;
return mBaseTimeUs;
} else {
//noinspection StatementWithEmptyBody
if (correctionUs < 0) {
// This means that this method is being called too often, so that the expected start
// time for this buffer is BEFORE the last buffer end. So, respect the last buffer end
// instead.
}
mGapUs = 0;
mBytesSinceBaseTime += readBytes;
return correctedTimeUs;
}
}
/**
* Returns the number of gaps (meaning, missing frames) assuming that each
* frame has frameBytes size. Possibly 0.
*
* @param frameBytes size of standard frame
* @return number of gaps
*/
int getGapCount(int frameBytes) {
if (mGapUs == 0) return 0;
long durationUs = bytesToUs((long) frameBytes, mByteRate);
return (int) (mGapUs / durationUs);
}
/**
* Returns the timestamp of the first missing frame.
* Should be called only after {@link #getGapCount(int)} returns something
* greater than zero.
*
* @param lastTimeUs the last real frame timestamp
* @return the first missing frame timestamp
*/
long getGapStartUs(long lastTimeUs) {
return lastTimeUs - mGapUs;
}
}
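
A usage sketch for the class above, showing how the recording side can combine increaseUs(), getGapCount() and getGapStartUs() to enqueue zero-filled frames for missed audio. The byte rate and frame size are illustrative, the class name is made up, and enqueue() is a stand-in for the real AudioRecordingThread#enqueue; the sketch sits in the same package because AudioTimestamp is package-private.

package com.otaliastudios.cameraview.video.encoding;

import java.nio.ByteBuffer;

class AudioTimestampUsageSketch {
    // Illustrative values: 44100 Hz, 16-bit PCM, mono -> 88200 bytes per second.
    private static final int BYTE_RATE = 44100 * 2;
    private static final int FRAME_SIZE = 1024;

    private final AudioTimestamp mTimestamp = new AudioTimestamp(BYTE_RATE);

    // Called after each successful AudioRecord.read() of readBytes bytes.
    void onAudioRead(int readBytes) {
        long lastTimeUs = mTimestamp.increaseUs(readBytes);

        // If recording fell behind, increaseUs() registered a gap: fill it with silent frames
        // so the encoder does not shrink timestamps and speed the track up.
        int gaps = mTimestamp.getGapCount(FRAME_SIZE);
        if (gaps > 0) {
            long gapStartUs = mTimestamp.getGapStartUs(lastTimeUs);
            long frameUs = AudioTimestamp.bytesToUs(FRAME_SIZE, BYTE_RATE);
            for (int i = 0; i < gaps; i++) {
                ByteBuffer silence = ByteBuffer.allocateDirect(FRAME_SIZE); // zero-filled by default
                enqueue(silence, gapStartUs, false);
                gapStartUs += frameUs;
            }
        }
    }

    // Stand-in for AudioRecordingThread#enqueue(ByteBuffer, long, boolean).
    private void enqueue(ByteBuffer buffer, long timestampUs, boolean endOfStream) {
        // The real implementation wraps the buffer into an InputBuffer and adds it to mInputBufferQueue.
    }
}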

@ -6,11 +6,12 @@ import java.nio.ByteBuffer;
* Represents an input buffer, which means, * Represents an input buffer, which means,
* raw data that should be encoded by MediaCodec. * raw data that should be encoded by MediaCodec.
*/ */
class InputBuffer { @SuppressWarnings("WeakerAccess")
ByteBuffer data; public class InputBuffer {
ByteBuffer source; public ByteBuffer data;
int index; public ByteBuffer source;
int length; public int index;
long timestamp; public int length;
boolean isEndOfStream; public long timestamp;
public boolean isEndOfStream;
} }

@ -5,6 +5,7 @@ import android.media.MediaCodec;
import android.media.MediaFormat; import android.media.MediaFormat;
import android.os.Build; import android.os.Build;
import androidx.annotation.CallSuper;
import androidx.annotation.NonNull; import androidx.annotation.NonNull;
import androidx.annotation.Nullable; import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi; import androidx.annotation.RequiresApi;
@ -16,10 +17,65 @@ import java.nio.ByteBuffer;
/** /**
* Base class for single-track encoders, coordinated by a {@link MediaEncoderEngine}. * Base class for single-track encoders, coordinated by a {@link MediaEncoderEngine}.
* For the lifecycle of this class, read comments in the engine class.
*
* This class manages a background thread and streamlines events on this thread
* which we call the {@link EncoderThread}:
*
* 1. When {@link #prepare(MediaEncoderEngine.Controller, long)} is called, we call
* {@link #onPrepare(MediaEncoderEngine.Controller, long)} on the encoder thread.
*
* 2. When {@link #start()} is called, we call {@link #onStart()} on the encoder thread.
*
* 3. When {@link #notify(String, Object)} is called, we call {@link #onEvent(String, Object)}
* on the encoder thread.
*
* 4. After starting, encoders are free to acquire an input buffer with
* {@link #tryAcquireInputBuffer(InputBuffer)} or {@link #acquireInputBuffer(InputBuffer)}.
*
* 5. After getting the input buffer, they are free to fill it with data.
*
* 6. After filling it with data, they are required to call {@link #encodeInputBuffer(InputBuffer)}
* for encoding to take place.
*
* 7. After this happens, or at regular intervals, or whenever they want, encoders can then
* call {@link #drainOutput(boolean)} with a false parameter to fetch the encoded data
* and pass it to the engine (so it can be written to the muxer).
*
* 8. When {@link #stop()} is called - either by the engine user, or as a consequence of having
* called {@link MediaEncoderEngine.Controller#requestStop(int)} - we call
* {@link #onStop()} on the encoder thread.
*
* 9. The {@link #onStop()} implementation should, as fast as possible, stop reading, signal the
* end of input stream (there are two ways to do so), and finally call
* {@link #drainOutput(boolean)} for the last time, with a true parameter.
*
* 10. Once everything is drained, we will call {@link #onStopped()}, on an unspecified thread.
* There, subclasses can perform extra cleanup of their own resources.
*
* For VIDEO encoders, things are much easier because we skip the whole input part.
* See description in {@link VideoMediaEncoder}.
*
* MAX LENGTH CONSTRAINT
*
* The max length constraint is checked automatically during {@link #drainOutput(boolean)},
* OR subclasses can provide a hint to this encoder using {@link #notifyMaxLengthReached()}.
* In this second case, we can request a stop at reading time, so we avoid useless readings
* in certain setups (where drain is called a lot after reading).
*
* TIMING
*
* Subclasses can use timestamps (in microseconds) in any reference system they prefer. For
* instance, it might be the {@link System#nanoTime()} reference, or some reference provided
* by SurfaceTextures.
*
* However, they are required to call {@link #notifyFirstFrameMillis(long)} and pass the
* milliseconds of the first frame in the {@link System#currentTimeMillis()} reference, so
* something that we can coordinate on.
*/ */
// https://github.com/saki4510t/AudioVideoRecordingSample/blob/master/app/src/main/java/com/serenegiant/encoder/MediaEncoder.java // https://github.com/saki4510t/AudioVideoRecordingSample/blob/master/app/src/main/java/com/serenegiant/encoder/MediaEncoder.java
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2) @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2)
abstract class MediaEncoder { public abstract class MediaEncoder {
private final static String TAG = MediaEncoder.class.getSimpleName(); private final static String TAG = MediaEncoder.class.getSimpleName();
private final static CameraLogger LOG = CameraLogger.create(TAG); private final static CameraLogger LOG = CameraLogger.create(TAG);
@ -36,6 +92,20 @@ abstract class MediaEncoder {
// Can't go too high or this is a bottleneck for the audio encoder. // Can't go too high or this is a bottleneck for the audio encoder.
private final static int OUTPUT_TIMEOUT_US = 0; private final static int OUTPUT_TIMEOUT_US = 0;
private final static int STATE_NONE = 0;
private final static int STATE_PREPARING = 1;
private final static int STATE_PREPARED = 2;
private final static int STATE_STARTING = 3;
private final static int STATE_STARTED = 4;
// max timestamp was reached. we will keep draining but have asked the engine to stop us.
// this step can be skipped in case stop() is called from outside before a limit is reached.
private final static int STATE_LIMIT_REACHED = 5;
private final static int STATE_STOPPING = 6;
private final static int STATE_STOPPED = 7;
private int mState = STATE_NONE;
private final String mName;
@SuppressWarnings("WeakerAccess") @SuppressWarnings("WeakerAccess")
protected MediaCodec mMediaCodec; protected MediaCodec mMediaCodec;
@ -47,35 +117,65 @@ abstract class MediaEncoder {
private OutputBufferPool mOutputBufferPool; private OutputBufferPool mOutputBufferPool;
private MediaCodec.BufferInfo mBufferInfo; private MediaCodec.BufferInfo mBufferInfo;
private MediaCodecBuffers mBuffers; private MediaCodecBuffers mBuffers;
private long mMaxLengthMillis; private long mMaxLengthMillis;
private boolean mMaxLengthReached; private boolean mMaxLengthReached;
private long mStartTimeMillis = 0; // In System.currentTimeMillis()
private long mStartTimeUs = Long.MIN_VALUE; // In unknown reference
private long mLastTimeUs = 0;
/** /**
* A readable name for the thread. * Needs a readable name for the thread and for logging.
* @param name a name
*/ */
@NonNull @SuppressWarnings("WeakerAccess")
abstract String getName(); protected MediaEncoder(@NonNull String name) {
mName = name;
}
private void setState(int newState) {
String newStateName = null;
switch (newState) {
case STATE_NONE: newStateName = "NONE"; break;
case STATE_PREPARING: newStateName = "PREPARING"; break;
case STATE_PREPARED: newStateName = "PREPARED"; break;
case STATE_STARTING: newStateName = "STARTING"; break;
case STATE_STARTED: newStateName = "STARTED"; break;
case STATE_LIMIT_REACHED: newStateName = "LIMIT_REACHED"; break;
case STATE_STOPPING: newStateName = "STOPPING"; break;
case STATE_STOPPED: newStateName = "STOPPED"; break;
}
LOG.w(mName, "setState:", newStateName);
mState = newState;
}
/** /**
* This encoder was attached to the engine. Keep the controller * This encoder was attached to the engine. Keep the controller
* and run the internal thread. * and run the internal thread.
* *
* NOTE: it's important to call {@link WorkerHandler#post(Runnable)} instead of run()! * NOTE: it's important to call {@link WorkerHandler#post(Runnable)} instead of run()!
* The internal actions can cause a stop/release, and due to how {@link WorkerHandler#run(Runnable)} * The internal actions can cause a stop, and due to how {@link WorkerHandler#run(Runnable)}
* works, we might have {@link #onStop()} or {@link #onRelease()} to be executed before * works, we might have {@link #onStop()} or {@link #onStopped()} to be executed before
* the previous step has completed. * the previous step has completed.
*/ */
final void prepare(@NonNull final MediaEncoderEngine.Controller controller, final long maxLengthMillis) { final void prepare(@NonNull final MediaEncoderEngine.Controller controller, final long maxLengthMillis) {
if (mState >= STATE_PREPARING) {
LOG.e(mName, "Wrong state while preparing. Aborting.", mState);
return;
}
mController = controller; mController = controller;
mBufferInfo = new MediaCodec.BufferInfo(); mBufferInfo = new MediaCodec.BufferInfo();
mMaxLengthMillis = maxLengthMillis; mMaxLengthMillis = maxLengthMillis;
mWorker = WorkerHandler.get(getName()); mWorker = WorkerHandler.get(mName);
LOG.i(getName(), "Prepare was called. Posting."); LOG.i(mName, "Prepare was called. Posting.");
mWorker.post(new Runnable() { mWorker.post(new Runnable() {
@Override @Override
public void run() { public void run() {
LOG.i(getName(), "Prepare was called. Executing."); LOG.i(mName, "Prepare was called. Executing.");
setState(STATE_PREPARING);
onPrepare(controller, maxLengthMillis); onPrepare(controller, maxLengthMillis);
setState(STATE_PREPARED);
} }
}); });
} }
@ -85,14 +185,22 @@ abstract class MediaEncoder {
* in case the encoder needs to wait for a certain event * in case the encoder needs to wait for a certain event
* like a "frame available". * like a "frame available".
* *
* The {@link #STATE_STARTED} state will be set when draining for the
* first time (not when onStart ends).
*
* NOTE: it's important to call {@link WorkerHandler#post(Runnable)} instead of run()! * NOTE: it's important to call {@link WorkerHandler#post(Runnable)} instead of run()!
*/ */
final void start() { final void start() {
LOG.w(getName(), "Start was called. Posting."); LOG.w(mName, "Start was called. Posting.");
mWorker.post(new Runnable() { mWorker.post(new Runnable() {
@Override @Override
public void run() { public void run() {
LOG.w(getName(), "Start was called. Executing."); if (mState < STATE_PREPARED || mState >= STATE_STARTING) {
LOG.e(mName, "Wrong state while starting. Aborting.", mState);
return;
}
setState(STATE_STARTING);
LOG.w(mName, "Start was called. Executing.");
onStart(); onStart();
} }
}); });
@ -108,27 +216,36 @@ abstract class MediaEncoder {
* @param data object * @param data object
*/ */
final void notify(final @NonNull String event, final @Nullable Object data) { final void notify(final @NonNull String event, final @Nullable Object data) {
LOG.v(getName(), "Notify was called. Posting."); LOG.v(mName, "Notify was called. Posting.");
mWorker.post(new Runnable() { mWorker.post(new Runnable() {
@Override @Override
public void run() { public void run() {
LOG.v(getName(), "Notify was called. Executing."); LOG.v(mName, "Notify was called. Executing.");
onEvent(event, data); onEvent(event, data);
} }
}); });
} }
/** /**
* Stop recording. * Stop recording. This involves signaling the end of stream and draining
* all output left.
*
* The {@link #STATE_STOPPED} state will be set when draining for the
* last time (not when onStop ends).
* *
* NOTE: it's important to call {@link WorkerHandler#post(Runnable)} instead of run()! * NOTE: it's important to call {@link WorkerHandler#post(Runnable)} instead of run()!
*/ */
final void stop() { final void stop() {
LOG.w(getName(), "Stop was called. Posting."); if (mState >= STATE_STOPPING) {
LOG.e(mName, "Wrong state while stopping. Aborting.", mState);
return;
}
setState(STATE_STOPPING);
LOG.w(mName, "Stop was called. Posting.");
mWorker.post(new Runnable() { mWorker.post(new Runnable() {
@Override @Override
public void run() { public void run() {
LOG.w(getName(), "Stop was called. Executing."); LOG.w(mName, "Stop was called. Executing.");
onStop(); onStop();
} }
}); });
@ -145,7 +262,7 @@ abstract class MediaEncoder {
* @param maxLengthMillis the maxLength in millis * @param maxLengthMillis the maxLength in millis
*/ */
@EncoderThread @EncoderThread
abstract void onPrepare(@NonNull final MediaEncoderEngine.Controller controller, final long maxLengthMillis); protected abstract void onPrepare(@NonNull final MediaEncoderEngine.Controller controller, final long maxLengthMillis);
/** /**
* Start recording. This might be a lightweight operation * Start recording. This might be a lightweight operation
@ -153,7 +270,7 @@ abstract class MediaEncoder {
* like a "frame available". * like a "frame available".
*/ */
@EncoderThread @EncoderThread
abstract void onStart(); protected abstract void onStart();
/** /**
* The caller notifying of a certain event occurring. * The caller notifying of a certain event occurring.
@ -162,38 +279,36 @@ abstract class MediaEncoder {
* @param data object * @param data object
*/ */
@EncoderThread @EncoderThread
abstract void onEvent(@NonNull String event, @Nullable Object data); protected void onEvent(@NonNull String event, @Nullable Object data) {}
/** /**
* Stop recording. * Stop recording. This involves signaling the end of stream and draining
* all output left.
*/ */
@EncoderThread @EncoderThread
abstract void onStop(); protected abstract void onStop();
/** /**
* Called by {@link #drainOutput(boolean)} when we get an EOS signal (not necessarily in the * Called by {@link #drainOutput(boolean)} when we get an EOS signal (not necessarily in the
* parameters, might also be through an input buffer flag). * parameters, might also be through an input buffer flag).
*
* This is a good moment to release all resources, although the muxer might still
* be alive (we wait for the other Encoder, see MediaEncoderEngine.Controller).
*/ */
private void release() { @CallSuper
LOG.w(getName(), "is being released. Notifying controller and releasing codecs."); protected void onStopped() {
// TODO should we notify after this method? LOG.w(mName, "is being released. Notifying controller and releasing codecs.");
mController.notifyReleased(mTrackIndex); // TODO should we call notifyStopped after this method ends?
mController.notifyStopped(mTrackIndex);
mMediaCodec.stop(); mMediaCodec.stop();
mMediaCodec.release(); mMediaCodec.release();
mMediaCodec = null; mMediaCodec = null;
mOutputBufferPool.clear(); mOutputBufferPool.clear();
mOutputBufferPool = null; mOutputBufferPool = null;
mBuffers = null; mBuffers = null;
onRelease(); setState(STATE_STOPPED);
} }
/**
* This is called when we are stopped.
* It is a good moment to release all resources, although the muxer
* might still be alive (we wait for the other Encoder, see Controller).
*/
abstract void onRelease();
/** /**
* Returns a new input buffer and index, waiting at most {@link #INPUT_TIMEOUT_US} if none is available. * Returns a new input buffer and index, waiting at most {@link #INPUT_TIMEOUT_US} if none is available.
* Callers should check the boolean result - true if the buffer was filled. * Callers should check the boolean result - true if the buffer was filled.
@ -234,7 +349,7 @@ abstract class MediaEncoder {
*/ */
@SuppressWarnings("WeakerAccess") @SuppressWarnings("WeakerAccess")
protected void encodeInputBuffer(InputBuffer buffer) { protected void encodeInputBuffer(InputBuffer buffer) {
LOG.v(getName(), "ENCODING - Buffer:", buffer.index, "Bytes:", buffer.length, "Presentation:", buffer.timestamp); LOG.v(mName, "ENCODING - Buffer:", buffer.index, "Bytes:", buffer.length, "Presentation:", buffer.timestamp);
if (buffer.isEndOfStream) { // send EOS if (buffer.isEndOfStream) { // send EOS
mMediaCodec.queueInputBuffer(buffer.index, 0, 0, mMediaCodec.queueInputBuffer(buffer.index, 0, 0,
buffer.timestamp, MediaCodec.BUFFER_FLAG_END_OF_STREAM); buffer.timestamp, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
@ -244,16 +359,6 @@ abstract class MediaEncoder {
} }
} }
/**
* Signals the end of input stream. This is a Video only API, as in the normal case,
* we use input buffers to signal the end. In the video case, we don't have input buffers
* because we use an input surface instead.
*/
@SuppressWarnings("WeakerAccess")
protected void signalEndOfInputStream() {
mMediaCodec.signalEndOfInputStream();
}
/** /**
* Extracts all pending data that was written and encoded into {@link #mMediaCodec}, * Extracts all pending data that was written and encoded into {@link #mMediaCodec},
* and forwards it to the muxer. * and forwards it to the muxer.
@ -267,7 +372,7 @@ abstract class MediaEncoder {
@SuppressLint("LogNotTimber") @SuppressLint("LogNotTimber")
@SuppressWarnings("WeakerAccess") @SuppressWarnings("WeakerAccess")
protected void drainOutput(boolean drainAll) { protected void drainOutput(boolean drainAll) {
LOG.v(getName(), "DRAINING - EOS:", drainAll); LOG.v(mName, "DRAINING - EOS:", drainAll);
if (mMediaCodec == null) { if (mMediaCodec == null) {
LOG.e("drain() was called before prepare() or after releasing."); LOG.e("drain() was called before prepare() or after releasing.");
return; return;
@ -289,7 +394,8 @@ abstract class MediaEncoder {
// should happen before receiving buffers, and should only happen once // should happen before receiving buffers, and should only happen once
if (mController.isStarted()) throw new RuntimeException("MediaFormat changed twice."); if (mController.isStarted()) throw new RuntimeException("MediaFormat changed twice.");
MediaFormat newFormat = mMediaCodec.getOutputFormat(); MediaFormat newFormat = mMediaCodec.getOutputFormat();
mTrackIndex = mController.requestStart(newFormat); mTrackIndex = mController.notifyStarted(newFormat);
setState(STATE_STARTED);
mOutputBufferPool = new OutputBufferPool(mTrackIndex); mOutputBufferPool = new OutputBufferPool(mTrackIndex);
} else if (encoderStatus < 0) { } else if (encoderStatus < 0) {
LOG.e("Unexpected result from dequeueOutputBuffer: " + encoderStatus); LOG.e("Unexpected result from dequeueOutputBuffer: " + encoderStatus);
@ -301,25 +407,29 @@ abstract class MediaEncoder {
// the INFO_OUTPUT_FORMAT_CHANGED status. Ignore it. // the INFO_OUTPUT_FORMAT_CHANGED status. Ignore it.
boolean isCodecConfig = (mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0; boolean isCodecConfig = (mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
if (!isCodecConfig && mController.isStarted() && mBufferInfo.size != 0) { if (!isCodecConfig && mController.isStarted() && mBufferInfo.size != 0) {
// adjust the ByteBuffer values to match BufferInfo (not needed?) // adjust the ByteBuffer values to match BufferInfo (not needed?)
encodedData.position(mBufferInfo.offset); encodedData.position(mBufferInfo.offset);
encodedData.limit(mBufferInfo.offset + mBufferInfo.size); encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
// Store startPresentationTime and lastPresentationTime, useful for example to
// detect the mMaxLengthReached and stop recording. // Store mStartTimeUs and mLastTimeUs, useful to detect the max length
if (mStartPresentationTimeUs == Long.MIN_VALUE) { // reached and stop recording when needed.
mStartPresentationTimeUs = mBufferInfo.presentationTimeUs; if (mStartTimeUs == Long.MIN_VALUE) {
mStartTimeUs = mBufferInfo.presentationTimeUs;
LOG.w(mName, "DRAINING - Got the first presentation time:", mStartTimeUs);
} }
mLastPresentationTimeUs = mBufferInfo.presentationTimeUs; mLastTimeUs = mBufferInfo.presentationTimeUs;
// Pass presentation times as offets with respect to the mStartPresentationTimeUs.
// This ensures consistency between audio pts (coming from System.nanoTime()) and // Adjust the presentation times. Subclasses can pass a presentation time in any
// video pts (coming from SurfaceTexture) both of which have no meaningful time-base // reference system - possibly some that has no real meaning, and frequently,
// and should be used for offsets only. // presentation times from different encoders have a different time-base.
// TODO find a better way, this causes sync issues. (+ note: this sends pts=0 at first) // To address this, encoders are required to call notifyFirstFrameMillis
// mBufferInfo.presentationTimeUs = mLastPresentationTimeUs - mStartPresentationTimeUs; // so we can adjust here - moving to 1970 reference.
LOG.v(getName(), "DRAINING - About to write(). Presentation:", mBufferInfo.presentationTimeUs); // Extra benefit: we never pass a pts equal to 0, which some encoders refuse.
mBufferInfo.presentationTimeUs = (mStartTimeMillis * 1000) + mLastTimeUs - mStartTimeUs;
// TODO fix the mBufferInfo being the same, then implement delayed writing in Controller
// and remove the isStarted() check here. // Write.
LOG.v(mName, "DRAINING - About to write(). Adjusted presentation:", mBufferInfo.presentationTimeUs);
OutputBuffer buffer = mOutputBufferPool.get(); OutputBuffer buffer = mOutputBufferPool.get();
//noinspection ConstantConditions //noinspection ConstantConditions
buffer.info = mBufferInfo; buffer.info = mBufferInfo;
@ -333,29 +443,76 @@ abstract class MediaEncoder {
// Not needed if drainAll because we already were asked to stop // Not needed if drainAll because we already were asked to stop
if (!drainAll if (!drainAll
&& !mMaxLengthReached && !mMaxLengthReached
&& mStartPresentationTimeUs != Long.MIN_VALUE && mStartTimeUs != Long.MIN_VALUE
&& mLastPresentationTimeUs - mStartPresentationTimeUs > mMaxLengthMillis * 1000) { && mLastTimeUs - mStartTimeUs > mMaxLengthMillis * 1000) {
LOG.w(getName(), "DRAINING - Reached maxLength! mLastPresentationTimeUs:", mLastPresentationTimeUs, LOG.w(mName, "DRAINING - Reached maxLength! mLastTimeUs:", mLastTimeUs,
"mStartPresentationTimeUs:", mStartPresentationTimeUs, "mStartTimeUs:", mStartTimeUs,
"mMaxLengthUs:", mMaxLengthMillis * 1000); "mMaxLengthUs:", mMaxLengthMillis * 1000);
mMaxLengthReached = true; onMaxLengthReached();
LOG.w(getName(), "DRAINING - Requesting a stop.");
mController.requestStop(mTrackIndex);
break; break;
} }
// Check for the EOS flag so we can release the encoder. // Check for the EOS flag so we can call onStopped.
if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) { if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
LOG.w(getName(), "DRAINING - Got EOS. Releasing the codec."); LOG.w(mName, "DRAINING - Got EOS. Releasing the codec.");
release(); onStopped();
break; break;
} }
} }
} }
} }
private long mStartPresentationTimeUs = Long.MIN_VALUE; protected abstract int getEncodedBitRate();
private long mLastPresentationTimeUs = 0;
/**
* Returns the max length setting, in milliseconds, which can be used
* to compute the current state and eventually call {@link #notifyMaxLengthReached()}.
* This is not a requirement for subclasses - we do this check anyway when draining,
* but doing so might be better.
*
* @return the max length setting
*/
@SuppressWarnings("WeakerAccess")
protected long getMaxLengthMillis() {
return mMaxLengthMillis;
}
/**
* Called by subclasses to notify that the max length was reached.
* We will move to {@link #STATE_LIMIT_REACHED} and request a stop.
*/
@SuppressWarnings("WeakerAccess")
protected void notifyMaxLengthReached() {
onMaxLengthReached();
}
/**
* Called by us (during {@link #drainOutput(boolean)}) or by subclasses
* (through {@link #notifyMaxLengthReached()}) to notify that we reached the
* max length allowed. We will move to {@link #STATE_LIMIT_REACHED} and request a stop.
*/
private void onMaxLengthReached() {
if (mMaxLengthReached) return;
mMaxLengthReached = true;
if (mState >= STATE_LIMIT_REACHED) {
LOG.w(mName, "onMaxLengthReached: Reached in wrong state. Aborting.", mState);
} else {
LOG.w(mName, "onMaxLengthReached: Requesting a stop.");
setState(STATE_LIMIT_REACHED);
mController.requestStop(mTrackIndex);
}
}
abstract int getEncodedBitRate(); /**
* Should be called by subclasses to pass the milliseconds of the first frame - as soon
* as this information is available. The milliseconds should be in the
* {@link System#currentTimeMillis()} reference system, so we can coordinate between different
* encoders.
*
* @param firstFrameMillis the milliseconds of the first frame presentation
*/
@SuppressWarnings("WeakerAccess")
protected void notifyFirstFrameMillis(long firstFrameMillis) {
mStartTimeMillis = firstFrameMillis;
}
} }
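
To make the lifecycle documented at the top of this class easier to follow, here is a do-nothing subclass sketch that only marks where each step belongs. It is not a working encoder (nothing configures a MediaCodec), the class name is made up, and only the protected method signatures are taken from this diff.

package com.otaliastudios.cameraview.video.encoding;

import androidx.annotation.NonNull;

// A skeleton showing the order of the lifecycle callbacks, not a usable encoder.
public class SkeletonMediaEncoder extends MediaEncoder {

    SkeletonMediaEncoder() {
        super("SkeletonEncoder"); // readable name, used for the worker thread and for logs
    }

    @EncoderThread
    @Override
    protected void onPrepare(@NonNull MediaEncoderEngine.Controller controller, long maxLengthMillis) {
        // 1. Create mMediaCodec, configure() it with CONFIGURE_FLAG_ENCODE and start() it.
    }

    @EncoderThread
    @Override
    protected void onStart() {
        // 2. Start producing input. For each chunk: acquireInputBuffer() or tryAcquireInputBuffer(),
        //    fill it, encodeInputBuffer(), then drainOutput(false) to feed the muxer.
        // 3. As soon as the first frame time is known, pass it in the currentTimeMillis reference,
        //    so drainOutput() can shift all presentation times onto the same 1970-based timeline.
        notifyFirstFrameMillis(System.currentTimeMillis());
    }

    @EncoderThread
    @Override
    protected void onStop() {
        // 4. Stop reading, queue a buffer flagged isEndOfStream, then drain whatever is left.
        drainOutput(true);
    }

    @Override
    protected void onStopped() {
        super.onStopped(); // base implementation notifies the controller and releases the codec
        // 5. Release any extra resources owned by the subclass.
    }

    @Override
    protected int getEncodedBitRate() {
        return 64_000; // arbitrary value for the sketch
    }
}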

@ -1,8 +1,10 @@
package com.otaliastudios.cameraview.video.encoding; package com.otaliastudios.cameraview.video.encoding;
import android.annotation.SuppressLint;
import android.media.MediaFormat; import android.media.MediaFormat;
import android.media.MediaMuxer; import android.media.MediaMuxer;
import android.os.Build; import android.os.Build;
import android.text.format.DateFormat;
import com.otaliastudios.cameraview.CameraLogger; import com.otaliastudios.cameraview.CameraLogger;
@ -13,9 +15,42 @@ import androidx.annotation.RequiresApi;
import java.io.File; import java.io.File;
import java.io.IOException; import java.io.IOException;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.Calendar;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
/** /**
* The entry point for encoding video files. * The entry point for encoding video files.
*
* The external API is simple but the internal mechanism is not easy. Basically the engine
* controls a {@link MediaEncoder} instance for each track (e.g. one for video, one for audio).
*
* 1. We prepare the MediaEncoders: {@link MediaEncoder#prepare(Controller, long)}
* MediaEncoders can be prepared synchronously or not.
*
* 2. Someone calls {@link #start()} from any thread.
* As a consequence, we start the MediaEncoders: {@link MediaEncoder#start()}.
*
* 3. MediaEncoders do not start synchronously. Instead, they call
* {@link Controller#notifyStarted(MediaFormat)} when they have a legit format,
* and we keep track of who has started.
*
* 4. When all MediaEncoders have started, we actually start the muxer.
*
* 5. Someone calls {@link #stop()} from any thread.
* As a consequence, we stop the MediaEncoders: {@link MediaEncoder#stop()}.
*
* 6. MediaEncoders do not stop synchronously. Instead, they will stop reading but
* keep draining the codec until there's no data left. At that point, they can
* call {@link Controller#notifyStopped(int)}.
*
* 7. When all MediaEncoders have been released, we actually stop the muxer and notify.
*
* There is another possibility where MediaEncoders themselves want to stop, for example
* because they reach some limit or constraint (e.g. max duration). For this, they should
* call {@link Controller#requestStop(int)}. Once all MediaEncoders have stopped, we will
* actually call {@link #stop()} on ourselves.
*/ */
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2) @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2)
public class MediaEncoderEngine { public class MediaEncoderEngine {
@ -33,7 +68,17 @@ public class MediaEncoderEngine {
void onEncodingStart(); void onEncodingStart();
/** /**
* Called when encoding stopped for some reason. * Called when encoding stopped. At this point the muxer might still be processing,
* but we have stopped receiving input (recording video and audio frames).
*
* The {@link #onEncodingEnd(int, Exception)} callback will soon be called
* with the results.
*/
@EncoderThread
void onEncodingStop();
/**
* Called when encoding ended for some reason.
* If there's an exception, it failed. * If there's an exception, it failed.
* @param reason the reason * @param reason the reason
* @param e the error, if present * @param e the error, if present
@ -44,13 +89,14 @@ public class MediaEncoderEngine {
private final static String TAG = MediaEncoderEngine.class.getSimpleName(); private final static String TAG = MediaEncoderEngine.class.getSimpleName();
private final static CameraLogger LOG = CameraLogger.create(TAG); private final static CameraLogger LOG = CameraLogger.create(TAG);
private static final boolean DEBUG_PERFORMANCE = true;
@SuppressWarnings("WeakerAccess") @SuppressWarnings("WeakerAccess")
public final static int END_BY_USER = 0; public final static int END_BY_USER = 0;
public final static int END_BY_MAX_DURATION = 1; public final static int END_BY_MAX_DURATION = 1;
public final static int END_BY_MAX_SIZE = 2; public final static int END_BY_MAX_SIZE = 2;
private ArrayList<MediaEncoder> mEncoders; private List<MediaEncoder> mEncoders;
private MediaMuxer mMediaMuxer; private MediaMuxer mMediaMuxer;
private int mStartedEncodersCount; private int mStartedEncodersCount;
private int mReleasedEncodersCount; private int mReleasedEncodersCount;
@ -148,7 +194,7 @@ public class MediaEncoderEngine {
/** /**
* Asks encoders to stop. This is not sync, of course we will ask for encoders * Asks encoders to stop. This is not synchronous: we wait for encoders
* to call {@link Controller#notifyReleased(int)} before actually stop the muxer. * to call {@link Controller#notifyStopped(int)} before actually stopping the muxer.
* When all encoders request a release, {@link #end()} is called to do cleanup * When all encoders request a release, {@link #end()} is called to do cleanup
* and notify the listener. * and notify the listener.
*/ */
@ -160,7 +206,7 @@ public class MediaEncoderEngine {
} }
/** /**
* Called after all encoders have requested a release using {@link Controller#notifyReleased(int)}. * Called after all encoders have requested a release using {@link Controller#notifyStopped(int)}.
* At this point we will do cleanup and notify the listener. * At this point we will do cleanup and notify the listener.
*/ */
private void end() { private void end() {
@ -217,7 +263,8 @@ public class MediaEncoderEngine {
* A handle for {@link MediaEncoder}s to pass information to this engine. * A handle for {@link MediaEncoder}s to pass information to this engine.
* All methods here can be called for multiple threads. * All methods here can be called from multiple threads.
*/ */
class Controller { @SuppressWarnings("WeakerAccess")
public class Controller {
/** /**
* Request that the muxer should start. This is not guaranteed to be executed: * Request that the muxer should start. This is not guaranteed to be executed:
@ -225,15 +272,15 @@ public class MediaEncoderEngine {
* @param format the media format * @param format the media format
* @return the encoder track index * @return the encoder track index
*/ */
int requestStart(@NonNull MediaFormat format) { public int notifyStarted(@NonNull MediaFormat format) {
synchronized (mControllerLock) { synchronized (mControllerLock) {
if (mMediaMuxerStarted) { if (mMediaMuxerStarted) {
throw new IllegalStateException("Trying to start but muxer started already"); throw new IllegalStateException("Trying to start but muxer started already");
} }
int track = mMediaMuxer.addTrack(format); int track = mMediaMuxer.addTrack(format);
LOG.w("requestStart:", "Assigned track", track, "to format", format.getString(MediaFormat.KEY_MIME)); LOG.w("notifyStarted:", "Assigned track", track, "to format", format.getString(MediaFormat.KEY_MIME));
if (++mStartedEncodersCount == mEncoders.size()) { if (++mStartedEncodersCount == mEncoders.size()) {
LOG.w("requestStart:", "All encoders have started. Starting muxer and dispatching onEncodingStart()."); LOG.w("notifyStarted:", "All encoders have started. Starting muxer and dispatching onEncodingStart().");
mMediaMuxer.start(); mMediaMuxer.start();
mMediaMuxerStarted = true; mMediaMuxerStarted = true;
if (mListener != null) { if (mListener != null) {
@ -245,41 +292,58 @@ public class MediaEncoderEngine {
} }
/** /**
* Whether the muxer is started. * Whether the muxer is started. MediaEncoders are required to avoid
* calling {@link #write(OutputBufferPool, OutputBuffer)} until this method returns true.
*
* @return true if muxer was started * @return true if muxer was started
*/ */
boolean isStarted() { public boolean isStarted() {
synchronized (mControllerLock) { synchronized (mControllerLock) {
return mMediaMuxerStarted; return mMediaMuxerStarted;
} }
} }
@SuppressLint("UseSparseArrays")
private Map<Integer, Integer> mDebugCount = new HashMap<>();
/** /**
* Writes the given data to the muxer. Should be called after {@link #isStarted()} * Writes the given data to the muxer. Should be called after {@link #isStarted()}
* returns true. Note: this seems to be thread safe, no lock. * returns true. Note: this seems to be thread safe, no lock.
* TODO cache values if not started yet, then apply later. Read comments in drain().
* Currently they are recycled instantly.
*/ */
void write(@NonNull OutputBufferPool pool, @NonNull OutputBuffer buffer) { public void write(@NonNull OutputBufferPool pool, @NonNull OutputBuffer buffer) {
if (!mMediaMuxerStarted) { if (!mMediaMuxerStarted) {
throw new IllegalStateException("Trying to write before muxer started"); throw new IllegalStateException("Trying to write before muxer started");
} }
// This is a bad idea and causes crashes.
// if (info.presentationTimeUs < mLastTimestampUs) info.presentationTimeUs = mLastTimestampUs; if (DEBUG_PERFORMANCE) {
// mLastTimestampUs = info.presentationTimeUs; // When AUDIO = mono, this is called about twice the time. (200 vs 100 for 5 sec).
LOG.v("write:", "Writing OutputBuffer - track:", buffer.trackIndex, "presentation:", buffer.info.presentationTimeUs); Integer count = mDebugCount.get(buffer.trackIndex);
mDebugCount.put(buffer.trackIndex, count == null ? 1 : ++count);
Calendar calendar = Calendar.getInstance();
calendar.setTimeInMillis(buffer.info.presentationTimeUs / 1000);
LOG.v("write:", "Writing into muxer -",
"track:", buffer.trackIndex,
"presentation:", buffer.info.presentationTimeUs,
"readable:", calendar.get(Calendar.SECOND) + ":" + calendar.get(Calendar.MILLISECOND),
"count:", count);
} else {
LOG.v("write:", "Writing into muxer -",
"track:", buffer.trackIndex,
"presentation:", buffer.info.presentationTimeUs);
}
mMediaMuxer.writeSampleData(buffer.trackIndex, buffer.data, buffer.info); mMediaMuxer.writeSampleData(buffer.trackIndex, buffer.data, buffer.info);
pool.recycle(buffer); pool.recycle(buffer);
} }
/** /**
* Requests that the engine stops. This is not executed until all encoders call * Requests that the engine stops. This is not executed until all encoders call
* this method, so it is a kind of soft request, just like {@link #requestStart(MediaFormat)}. * this method, so it is a kind of soft request, just like {@link #notifyStarted(MediaFormat)}.
* To be used when maxLength / maxSize constraints are reached, for example. * To be used when maxLength / maxSize constraints are reached, for example.
* *
* When this succeeds, {@link MediaEncoder#stop()} is called. * When this succeeds, {@link MediaEncoder#stop()} is called.
*/ */
void requestStop(int track) { public void requestStop(int track) {
synchronized (mControllerLock) { synchronized (mControllerLock) {
LOG.w("requestStop:", "Called for track", track); LOG.w("requestStop:", "Called for track", track);
if (--mStartedEncodersCount == 0) { if (--mStartedEncodersCount == 0) {
@ -294,11 +358,14 @@ public class MediaEncoderEngine {
* Notifies that the encoder was stopped. After this is called by all encoders, * Notifies that the encoder was stopped. After this is called by all encoders,
* we will actually stop the muxer. * we will actually stop the muxer.
*/ */
void notifyReleased(int track) { public void notifyStopped(int track) {
synchronized (mControllerLock) { synchronized (mControllerLock) {
LOG.w("notifyReleased:", "Called for track", track); LOG.w("notifyStopped:", "Called for track", track);
if (++mReleasedEncodersCount == mEncoders.size()) { if (++mReleasedEncodersCount == mEncoders.size()) {
LOG.w("requestStop:", "All encoders have been released. Stopping the muxer."); LOG.w("requestStop:", "All encoders have been released. Stopping the muxer.");
if (mListener != null) {
mListener.onEncodingStop();
}
end(); end();
} }
} }
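
A minimal listener sketch against the callbacks documented above. The nested interface name (Listener), its visibility and the sketch class name are assumptions, as they are not shown in this diff; only onEncodingStart(), onEncodingStop(), onEncodingEnd(int, Exception) and the END_BY_* constants appear here.

package com.otaliastudios.cameraview.video.encoding;

class LoggingEncoderListener implements MediaEncoderEngine.Listener {

    @Override
    public void onEncodingStart() {
        // All encoders called notifyStarted() and the muxer has started writing.
    }

    @Override
    public void onEncodingStop() {
        // Input has stopped (no more audio/video frames), but the muxer may still be draining.
    }

    @Override
    public void onEncodingEnd(int reason, Exception e) {
        // Everything has been released at this point.
        if (e != null) {
            // The recording failed; the output file is probably unusable.
        } else if (reason == MediaEncoderEngine.END_BY_MAX_DURATION) {
            // Stopped automatically: the max length constraint was reached.
        } else if (reason == MediaEncoderEngine.END_BY_MAX_SIZE) {
            // Stopped automatically: the max file size was reached.
        } else {
            // END_BY_USER: stop() was requested from outside.
        }
    }
}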

@ -9,8 +9,9 @@ import java.nio.ByteBuffer;
* an encoded buffer of data that should be passed * an encoded buffer of data that should be passed
* to the muxer. * to the muxer.
*/ */
class OutputBuffer { @SuppressWarnings("WeakerAccess")
MediaCodec.BufferInfo info; public class OutputBuffer {
int trackIndex; public MediaCodec.BufferInfo info;
ByteBuffer data; public int trackIndex;
public ByteBuffer data;
} }

@ -0,0 +1,38 @@
package com.otaliastudios.cameraview.video.encoding;
import android.opengl.EGLContext;
import androidx.annotation.NonNull;
/**
* Video configuration to be passed as input to the constructor
* of a {@link TextureMediaEncoder}.
*/
public class TextureConfig extends VideoConfig {
private final static int NO_TEXTURE = Integer.MIN_VALUE;
public int textureId = NO_TEXTURE;
public int overlayTextureId = NO_TEXTURE;
public int overlayRotation;
public float scaleX;
public float scaleY;
public EGLContext eglContext;
@NonNull
TextureConfig copy() {
TextureConfig copy = new TextureConfig();
copy(copy);
copy.textureId = this.textureId;
copy.overlayTextureId = this.overlayTextureId;
copy.overlayRotation = this.overlayRotation;
copy.scaleX = this.scaleX;
copy.scaleY = this.scaleY;
copy.eglContext = this.eglContext;
return copy;
}
boolean hasOverlay() {
return overlayTextureId != NO_TEXTURE;
}
}
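
A hypothetical setup of the new config object that replaces the old multi-argument constructor. The inherited VideoConfig fields used here (width, height, bitRate, frameRate, rotation, mimeType) are assumed from the old constructor parameters shown in the TextureMediaEncoder diff below, the class name is made up, and all values are illustrative.

package com.otaliastudios.cameraview.video.encoding;

import android.opengl.EGL14;

class TextureConfigSketch {
    static TextureMediaEncoder createVideoEncoder(int cameraTextureId) {
        TextureConfig config = new TextureConfig();
        config.width = 1280;                  // assumed inherited VideoConfig field
        config.height = 720;                  // assumed inherited VideoConfig field
        config.bitRate = 2_000_000;           // assumed inherited VideoConfig field
        config.frameRate = 30;                // assumed inherited VideoConfig field
        config.rotation = 90;                 // applied as a GL transform; no rotation metadata is written
        config.mimeType = "video/avc";        // assumed inherited VideoConfig field
        config.textureId = cameraTextureId;   // the texture the camera frames are rendered into
        config.scaleX = 1F;
        config.scaleY = 1F;
        config.eglContext = EGL14.eglGetCurrentContext(); // context that owns the texture
        // overlayTextureId keeps its NO_TEXTURE default, so no overlay is drawn.
        return new TextureMediaEncoder(config);
    }
}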

@ -1,6 +1,6 @@
package com.otaliastudios.cameraview.video.encoding; package com.otaliastudios.cameraview.video.encoding;
import android.opengl.EGLContext; import android.graphics.SurfaceTexture;
import android.opengl.Matrix; import android.opengl.Matrix;
import android.os.Build; import android.os.Build;
@@ -18,68 +18,64 @@ import androidx.annotation.RequiresApi;
  * Default implementation for video encoding.
  */
 @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2)
-public class TextureMediaEncoder extends VideoMediaEncoder<TextureMediaEncoder.Config> {
+public class TextureMediaEncoder extends VideoMediaEncoder<TextureConfig> {
 
     private static final String TAG = TextureMediaEncoder.class.getSimpleName();
     private static final CameraLogger LOG = CameraLogger.create(TAG);
 
     public final static String FRAME_EVENT = "frame";
-    public final static int NO_TEXTURE = Integer.MIN_VALUE;
-
-    public static class Config extends VideoMediaEncoder.Config {
-        int textureId;
-        int overlayTextureId;
-        float scaleX;
-        float scaleY;
-        EGLContext eglContext;
-        int transformRotation;
-        int overlayTransformRotation;
-
-        public Config(int width, int height,
-                      int bitRate, int frameRate,
-                      int rotation, @NonNull String mimeType,
-                      int textureId,
-                      float scaleX, float scaleY,
-                      @NonNull EGLContext eglContext,
-                      int overlayTextureId, int overlayRotation) {
-            // We rotate the texture using transformRotation. Pass rotation=0 to super so that
-            // no rotation metadata is written into the output file.
-            super(width, height, bitRate, frameRate, 0, mimeType);
-            this.transformRotation = rotation;
-            this.textureId = textureId;
-            this.scaleX = scaleX;
-            this.scaleY = scaleY;
-            this.eglContext = eglContext;
-            this.overlayTextureId = overlayTextureId;
-            this.overlayTransformRotation = overlayRotation;
-        }
-    }
 
+    private int mTransformRotation;
     private EglCore mEglCore;
     private EglWindowSurface mWindow;
     private EglViewport mViewport;
-    private Pool<TextureFrame> mFramePool = new Pool<>(100, new Pool.Factory<TextureFrame>() {
+    private Pool<Frame> mFramePool = new Pool<>(100, new Pool.Factory<Frame>() {
         @Override
-        public TextureFrame create() {
-            return new TextureFrame();
+        public Frame create() {
+            return new Frame();
         }
     });
 
-    public TextureMediaEncoder(@NonNull Config config) {
-        super(config);
+    public TextureMediaEncoder(@NonNull TextureConfig config) {
+        super(config.copy());
     }
 
-    public static class TextureFrame {
-        private TextureFrame() {}
-        // Nanoseconds, in no meaningful time-base. Should be for offsets only.
-        // Typically coming from SurfaceTexture.getTimestamp().
+    /**
+     * Should be acquired with {@link #acquireFrame()}, filled and then passed
+     * to {@link MediaEncoderEngine#notify(String, Object)} with {@link #FRAME_EVENT}.
+     */
+    public static class Frame {
+        private Frame() {}
+
+        /**
+         * Nanoseconds, in no meaningful time-base. Will be used for offsets only.
+         * Typically this comes from {@link SurfaceTexture#getTimestamp()}.
+         */
         public long timestamp;
+
+        /**
+         * Milliseconds in the {@link System#currentTimeMillis()} reference.
+         * This is actually needed/read only for the first frame.
+         */
+        public long timestampMillis;
+
+        /**
+         * The transformation matrix for the base texture.
+         */
         public float[] transform = new float[16];
+
+        /**
+         * The transformation matrix for the overlay texture, if any.
+         */
         public float[] overlayTransform = new float[16];
     }
 
+    /**
+     * Returns a new frame to be filled. See {@link Frame} for details.
+     * @return a new frame
+     */
     @NonNull
-    public TextureFrame acquireFrame() {
+    public Frame acquireFrame() {
         if (mFramePool.isEmpty()) {
             throw new RuntimeException("Need more frames than this! Please increase the pool size.");
         } else {
@@ -88,10 +84,13 @@ public class TextureMediaEncoder extends VideoMediaEncoder<TextureMediaEncoder.C
         }
     }
 
     @EncoderThread
     @Override
-    void onPrepare(@NonNull MediaEncoderEngine.Controller controller, long maxLengthMillis) {
+    protected void onPrepare(@NonNull MediaEncoderEngine.Controller controller, long maxLengthMillis) {
+        // We rotate the texture using transformRotation. Pass rotation=0 to super so that
+        // no rotation metadata is written into the output file.
+        mTransformRotation = mConfig.rotation;
+        mConfig.rotation = 0;
         super.onPrepare(controller, maxLengthMillis);
         mEglCore = new EglCore(mConfig.eglContext, EglCore.FLAG_RECORDABLE);
         mWindow = new EglWindowSurface(mEglCore, mSurface, true);
@@ -102,28 +101,33 @@ public class TextureMediaEncoder extends VideoMediaEncoder<TextureMediaEncoder.C
     @EncoderThread
     @Override
-    void onStart() {
-        super.onStart();
-        // Nothing to do here. Waiting for the first frame.
-    }
-
-    @EncoderThread
-    @Override
-    void onEvent(@NonNull String event, @Nullable Object data) {
+    protected void onEvent(@NonNull String event, @Nullable Object data) {
         if (!event.equals(FRAME_EVENT)) return;
-        TextureFrame frame = (TextureFrame) data;
-        if (frame == null) return; // Should not happen
-        if (frame.timestamp == 0 || mFrameNum < 0) {
-            // The first condition comes from grafika.
-            // The second condition means we were asked to stop.
+        Frame frame = (Frame) data;
+        if (frame == null) {
+            throw new IllegalArgumentException("Got null frame for FRAME_EVENT.");
+        }
+        if (frame.timestamp == 0) { // grafika
             mFramePool.recycle(frame);
             return;
         }
-        mFrameNum++;
-        int thisFrameNum = mFrameNum;
-        LOG.v("onEvent", "frameNum:", thisFrameNum, "realFrameNum:", mFrameNum, "timestamp:", frame.timestamp);
-        // We must scale this matrix like GlCameraPreview does, because it might have some cropping.
+        if (mFrameNumber < 0) { // We were asked to stop.
+            mFramePool.recycle(frame);
+            return;
+        }
+        mFrameNumber++;
+        if (mFrameNumber == 1) {
+            notifyFirstFrameMillis(frame.timestampMillis);
+        }
+
+        // First, drain any previous data.
+        LOG.i("onEvent", "frameNumber:", mFrameNumber, "timestamp:", frame.timestamp, "- draining.");
+        drainOutput(false);
+
+        // Then draw on the surface.
+        LOG.i("onEvent", "frameNumber:", mFrameNumber, "timestamp:", frame.timestamp, "- drawing.");
+        // 1. We must scale this matrix like GlCameraPreview does, because it might have some cropping.
         // Scaling takes place with respect to the (0, 0, 0) point, so we must apply a Translation to compensate.
         float[] transform = frame.transform;
         float[] overlayTransform = frame.overlayTransform;
@@ -134,36 +138,32 @@ public class TextureMediaEncoder extends VideoMediaEncoder<TextureMediaEncoder.C
         Matrix.translateM(transform, 0, scaleTranslX, scaleTranslY, 0);
         Matrix.scaleM(transform, 0, scaleX, scaleY, 1);
 
-        // We also must rotate this matrix. In GlCameraPreview it is not needed because it is a live
+        // 2. We also must rotate this matrix. In GlCameraPreview it is not needed because it is a live
         // stream, but the output video, must be correctly rotated based on the device rotation at the moment.
         // Rotation also takes place with respect to the origin (the Z axis), so we must
         // translate to origin, rotate, then back to where we were.
         Matrix.translateM(transform, 0, 0.5F, 0.5F, 0);
-        Matrix.rotateM(transform, 0, mConfig.transformRotation, 0, 0, 1);
+        Matrix.rotateM(transform, 0, mTransformRotation, 0, 0, 1);
         Matrix.translateM(transform, 0, -0.5F, -0.5F, 0);
 
-        boolean hasOverlay = mConfig.overlayTextureId != NO_TEXTURE;
-        if (hasOverlay) {
+        // 3. Do the same for overlays with their own rotation.
+        if (mConfig.hasOverlay()) {
             Matrix.translateM(overlayTransform, 0, 0.5F, 0.5F, 0);
-            Matrix.rotateM(overlayTransform, 0, mConfig.overlayTransformRotation, 0, 0, 1);
+            Matrix.rotateM(overlayTransform, 0, mConfig.overlayRotation, 0, 0, 1);
             Matrix.translateM(overlayTransform, 0, -0.5F, -0.5F, 0);
         }
 
-        LOG.v("onEvent", "frameNum:", thisFrameNum, "realFrameNum:", mFrameNum, "calling drainOutput.");
-        drainOutput(false);
-        LOG.v("onEvent", "frameNum:", thisFrameNum, "realFrameNum:", mFrameNum, "calling drawFrame.");
         mViewport.drawFrame(mConfig.textureId, transform);
-        if (hasOverlay) {
+        if (mConfig.hasOverlay()) {
             mViewport.drawFrame(mConfig.overlayTextureId, overlayTransform);
         }
         mWindow.setPresentationTime(frame.timestamp);
         mWindow.swapBuffers();
         mFramePool.recycle(frame);
     }
 
     @Override
-    void onRelease() {
+    protected void onStopped() {
+        super.onStopped();
         mFramePool.clear();
         if (mWindow != null) {
             mWindow.release();
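To show how the `Frame` pool and `FRAME_EVENT` fit together, here is a hedged sketch of the producer side; `engine`, `videoEncoder` and `surfaceTexture` are assumed to exist, and this is not code taken from the patch:

```java
// Illustrative sketch: assumes a running MediaEncoderEngine (engine), its TextureMediaEncoder
// (videoEncoder), and a SurfaceTexture (surfaceTexture) that just produced a new camera frame.
TextureMediaEncoder.Frame frame = videoEncoder.acquireFrame();
frame.timestamp = surfaceTexture.getTimestamp();      // nanoseconds, used for offsets only
frame.timestampMillis = System.currentTimeMillis();   // only read for the first frame
surfaceTexture.getTransformMatrix(frame.transform);   // base texture transform
// frame.overlayTransform would be filled here when an overlay texture is drawn as well.
engine.notify(TextureMediaEncoder.FRAME_EVENT, frame);
```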

@@ -0,0 +1,25 @@
package com.otaliastudios.cameraview.video.encoding;

import androidx.annotation.NonNull;

/**
 * Base video configuration to be passed as input to the constructor
 * of a {@link VideoMediaEncoder}.
 */
public class VideoConfig {

    public int width;
    public int height;
    public int bitRate;
    public int frameRate;
    public int rotation;
    public String mimeType;

    protected <C extends VideoConfig> void copy(@NonNull C output) {
        output.width = this.width;
        output.height = this.height;
        output.bitRate = this.bitRate;
        output.frameRate = this.frameRate;
        output.rotation = this.rotation;
        output.mimeType = this.mimeType;
    }
}

@@ -14,13 +14,22 @@ import com.otaliastudios.cameraview.CameraLogger;
 import java.io.IOException;
 
 /**
- * This alone does nothing.
- * Subclasses must make sure they write each frame onto the given Surface {@link #mSurface}.
+ * Base class for video encoding.
+ *
+ * This uses {@link MediaCodec#createInputSurface()} to create an input {@link Surface}
+ * into which we can write and that MediaCodec itself can read.
+ *
+ * This makes everything easier with respect to the process explained in {@link MediaEncoder}
+ * docs. We can skip the whole input part of acquiring an InputBuffer, filling it with data
+ * and returning it to the encoder with {@link #encodeInputBuffer(InputBuffer)}.
+ *
+ * All of this is automatically done by MediaCodec as long as we keep writing data into the
+ * given {@link Surface}. This class alone does not do this - subclasses are required to do so.
  *
  * @param <C> the config object.
  */
 @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2)
-abstract class VideoMediaEncoder<C extends VideoMediaEncoder.Config> extends MediaEncoder {
+abstract class VideoMediaEncoder<C extends VideoConfig> extends MediaEncoder {
 
     private static final String TAG = VideoMediaEncoder.class.getSimpleName();
     private static final CameraLogger LOG = CameraLogger.create(TAG);
@@ -32,39 +41,16 @@ abstract class VideoMediaEncoder<C extends VideoMediaEncoder.Config> extends Med
     protected Surface mSurface;
 
     @SuppressWarnings("WeakerAccess")
-    protected int mFrameNum = -1;
-
-    static class Config {
-        int width;
-        int height;
-        int bitRate;
-        int frameRate;
-        int rotation;
-        String mimeType;
-
-        Config(int width, int height, int bitRate, int frameRate, int rotation, @NonNull String mimeType) {
-            this.width = width;
-            this.height = height;
-            this.bitRate = bitRate;
-            this.frameRate = frameRate;
-            this.rotation = rotation;
-            this.mimeType = mimeType;
-        }
-    }
+    protected int mFrameNumber = -1;
 
     VideoMediaEncoder(@NonNull C config) {
+        super("VideoEncoder");
         mConfig = config;
     }
 
-    @NonNull
-    @Override
-    String getName() {
-        return "VideoEncoder";
-    }
-
     @EncoderThread
     @Override
-    void onPrepare(@NonNull MediaEncoderEngine.Controller controller, long maxLengthMillis) {
+    protected void onPrepare(@NonNull MediaEncoderEngine.Controller controller, long maxLengthMillis) {
         MediaFormat format = MediaFormat.createVideoFormat(mConfig.mimeType, mConfig.width, mConfig.height);
 
         // Set some properties. Failing to specify some of these can cause the MediaCodec
@@ -89,22 +75,25 @@ abstract class VideoMediaEncoder<C extends VideoMediaEncoder.Config> extends Med
     @EncoderThread
     @Override
-    void onStart() {
+    protected void onStart() {
         // Nothing to do here. Waiting for the first frame.
-        mFrameNum = 0;
+        mFrameNumber = 0;
     }
 
     @EncoderThread
     @Override
-    void onStop() {
-        LOG.i("onStop", "setting mFrameNum to 1 and signaling the end of input stream.");
-        mFrameNum = -1;
-        signalEndOfInputStream();
+    protected void onStop() {
+        LOG.i("onStop", "setting mFrameNumber to 1 and signaling the end of input stream.");
+        mFrameNumber = -1;
+        // Signals the end of input stream. This is a Video only API, as in the normal case,
+        // we use input buffers to signal the end. In the video case, we don't have input buffers
+        // because we use an input surface instead.
+        mMediaCodec.signalEndOfInputStream();
         drainOutput(true);
     }
 
     @Override
-    int getEncodedBitRate() {
+    protected int getEncodedBitRate() {
         return mConfig.bitRate;
     }
 }
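The class comment above refers to MediaCodec's Surface-input mode. As a standalone reference, this is a minimal sketch of that mode using plain `android.media` APIs; the format values are illustrative and this is not the exact setup performed by `VideoMediaEncoder`:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;

class SurfaceInputSketch {
    // Minimal sketch of MediaCodec's Surface-input mode; values are illustrative.
    static Surface createEncoderInputSurface() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec codec = MediaCodec.createEncoderByType("video/avc");
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // No input buffers are needed: anything drawn into this Surface becomes encoder input.
        Surface surface = codec.createInputSurface();
        codec.start();
        return surface;
    }
}
```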

@@ -102,6 +102,8 @@
     <attr name="cameraAudio" format="enum">
         <enum name="off" value="0" />
         <enum name="on" value="1" />
+        <enum name="mono" value="2" />
+        <enum name="stereo" value="3" />
     </attr>
 
     <attr name="cameraGrid" format="enum">

@@ -0,0 +1,168 @@
package com.otaliastudios.cameraview.internal.utils;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import java.util.ArrayList;
import java.util.List;

import static junit.framework.Assert.assertNotNull;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import static org.junit.Assert.assertNull;

public class PoolTest {

    private final static int MAX_SIZE = 20;

    private class Item {}

    private Pool<Item> pool;
    private int instances = 0;

    @Before
    public void setUp() {
        pool = new Pool<>(MAX_SIZE, new Pool.Factory<Item>() {
            @Override
            public Item create() {
                instances++;
                return new Item();
            }
        });
    }

    @After
    public void tearDown() {
        instances = 0;
        pool = null;
    }

    @Test
    public void testInstances() {
        for (int i = 0; i < MAX_SIZE; i++) {
            assertEquals(instances, i);
            pool.get();
        }
    }

    @Test
    public void testIsEmtpy() {
        assertFalse(pool.isEmpty());
        // Get all items without recycling.
        Item item = null;
        for (int i = 0; i < MAX_SIZE; i++) {
            item = pool.get();
        }
        assertTrue(pool.isEmpty());
    }

    @Test
    public void testClear() {
        // Take one and recycle it
        Item item = pool.get();
        assertNotNull(item);
        pool.recycle(item);

        // Ensure it is recycled.
        assertEquals(pool.recycledCount(), 1);
        assertEquals(pool.activeCount(), 0);
        assertEquals(pool.count(), 1);

        // Now clear and ensure pool is empty.
        pool.clear();
        assertEquals(pool.recycledCount(), 0);
        assertEquals(pool.activeCount(), 0);
        assertEquals(pool.count(), 0);
    }

    @Test
    public void testCounts() {
        assertEquals(pool.recycledCount(), 0);
        assertEquals(pool.activeCount(), 0);
        assertEquals(pool.count(), 0);

        // Take all
        List<Item> items = new ArrayList<>();
        for (int i = 0; i < MAX_SIZE; i++) {
            items.add(pool.get());
            assertEquals(pool.recycledCount(), 0);
            assertEquals(pool.activeCount(), items.size());
            assertEquals(pool.count(), items.size());
        }

        // Recycle all
        int recycled = 0;
        for (Item item : items) {
            pool.recycle(item);
            recycled++;
            assertEquals(pool.recycledCount(), recycled);
            assertEquals(pool.activeCount(), MAX_SIZE - recycled);
            assertEquals(pool.count(), MAX_SIZE);
        }
    }

    @Test
    public void testToString() {
        String string = pool.toString();
        assertTrue(string.contains("count"));
        assertTrue(string.contains("active"));
        assertTrue(string.contains("recycled"));
        assertTrue(string.contains(Pool.class.getSimpleName()));
    }

    @Test(expected = IllegalStateException.class)
    public void testRecycle_notActive() {
        Item item = new Item();
        pool.recycle(item);
    }

    @Test(expected = IllegalStateException.class)
    public void testRecycle_twice() {
        Item item = pool.get();
        assertNotNull(item);
        pool.recycle(item);
        pool.recycle(item);
    }

    @Test(expected = IllegalStateException.class)
    public void testRecycle_whileFull() {
        // Take all and recycle all
        List<Item> items = new ArrayList<>();
        for (int i = 0; i < MAX_SIZE; i++) {
            items.add(pool.get());
        }
        for (Item item : items) {
            pool.recycle(item);
        }
        // Take one and recycle again
        pool.recycle(items.get(0));
    }

    @Test
    public void testGet_fromFactory() {
        pool.get();
        assertEquals(1, instances);
    }

    @Test
    public void testGet_whenFull() {
        for (int i = 0; i < MAX_SIZE; i++) {
            pool.get();
        }
        assertNull(pool.get());
    }

    @Test
    public void testGet_recycled() {
        Item item = pool.get();
        assertNotNull(item);
        pool.recycle(item);
        Item newItem = pool.get();
        assertEquals(item, newItem);
        assertEquals(1, instances);
    }
}
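The tests above pin down the `Pool` contract: `get()` reuses a recycled instance or asks the factory for a new one, returns `null` once the maximum number of items is active, and `recycle()` may only be called once per active item. A small usage sketch, with an assumed element type and pool size:

```java
// Illustrative sketch of the Pool contract; byte[] buffers and the size 4 are assumed examples.
Pool<byte[]> bufferPool = new Pool<>(4, new Pool.Factory<byte[]>() {
    @Override
    public byte[] create() {
        return new byte[1024]; // only called when no recycled instance is available
    }
});

byte[] buffer = bufferPool.get(); // null if 4 buffers are already active
if (buffer != null) {
    // ... use the buffer ...
    bufferPool.recycle(buffer);   // recycling the same instance twice throws IllegalStateException
}
```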

@@ -35,6 +35,8 @@ import java.util.List;
 public class CameraActivity extends AppCompatActivity implements View.OnClickListener, OptionView.Callback {
 
+    private final static CameraLogger LOG = CameraLogger.create("DemoApp");
+
     private CameraView camera;
     private ViewGroup controlPanel;
     private long mCaptureTime;

@@ -134,9 +136,14 @@ public class CameraActivity extends AppCompatActivity implements View.OnClickLis
         animator.start();
     }
 
-    private void message(String content, boolean important) {
-        int length = important ? Toast.LENGTH_LONG : Toast.LENGTH_SHORT;
-        Toast.makeText(this, content, length).show();
+    private void message(@NonNull String content, boolean important) {
+        if (important) {
+            LOG.w(content);
+            Toast.makeText(this, content, Toast.LENGTH_LONG).show();
+        } else {
+            LOG.i(content);
+            Toast.makeText(this, content, Toast.LENGTH_SHORT).show();
+        }
     }
 
     private class Listener extends CameraListener {

@@ -170,7 +177,7 @@ public class CameraActivity extends AppCompatActivity implements View.OnClickLis
             PicturePreviewActivity.setPictureResult(result);
             Intent intent = new Intent(CameraActivity.this, PicturePreviewActivity.class);
             intent.putExtra("delay", callbackTime - mCaptureTime);
-            Log.e("CameraActivity", "Picture delay: " + (callbackTime - mCaptureTime));
+            LOG.w("Picture delay:", callbackTime - mCaptureTime);
             startActivity(intent);
             mCaptureTime = 0;
         }

@@ -182,6 +189,12 @@ public class CameraActivity extends AppCompatActivity implements View.OnClickLis
             Intent intent = new Intent(CameraActivity.this, VideoPreviewActivity.class);
             startActivity(intent);
         }
+
+        @Override
+        public void onVideoRecordingStart() {
+            super.onVideoRecordingStart();
+            LOG.w("onVideoRecordingStart!");
+        }
     }
 
     @Override

@@ -17,7 +17,7 @@
         android:layout_marginBottom="88dp"
         android:keepScreenOn="true"
         app:cameraExperimental="true"
-        app:cameraEngine="camera1"
+        app:cameraEngine="camera2"
         app:cameraPreview="glSurface"
         app:cameraPlaySounds="true"
         app:cameraGrid="off"

@@ -39,6 +39,8 @@ camera.addCameraListener(new CameraListener() {
     public void onExposureCorrectionChanged(float newValue, float[] bounds, PointF[] fingers) {}
 
     public void onVideoRecordingStart() {}
+
+    public void onVideoRecordingEnd() {}
 });
 ```

@@ -59,7 +59,7 @@ Please note that the video snapshot feature requires:
 
 This is allowed at the following conditions:
 
 - `takePictureSnapshot()` is used (no HQ pictures)
-- the OpenGL preview is used (see [previews](previews.html))
+- the `GL_SURFACE` preview is used (see [previews](previews.html))
 
 ### Related XML attributes

@@ -68,6 +68,35 @@ This is allowed at the following conditions:
     app:cameraMode="picture|video"/>
 ```
 
+### Related callbacks
+
+```java
+camera.addCameraListener(new CameraListener() {
+
+    @Override
+    public void onPictureTaken(@NonNull PictureResult result) {
+        // A Picture was taken!
+    }
+
+    @Override
+    public void onVideoTaken(@NonNull VideoResult result) {
+        // A Video was taken!
+    }
+
+    @Override
+    public void onVideoRecordingStart() {
+        // Notifies that the actual video recording has started.
+        // Can be used to show some UI indicator for video recording or counting time.
+    }
+
+    @Override
+    public void onVideoRecordingEnd() {
+        // Notifies that the actual video recording has ended.
+        // Can be used to remove UI indicators added in onVideoRecordingStart.
+    }
+});
+```
+
 ### Related APIs
 
 |Method|Description|

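Putting the conditions and callbacks above together, here is a hedged sketch of recording a video snapshot; it assumes `camera` is a `CameraView` using the `GL_SURFACE` preview, the call happens inside an Activity, and the output file name is an arbitrary example:

```java
// Illustrative sketch only; the output location is an arbitrary example.
camera.addCameraListener(new CameraListener() {
    @Override
    public void onVideoRecordingStart() {
        // Show a recording indicator.
    }

    @Override
    public void onVideoRecordingEnd() {
        // Hide the indicator; onVideoTaken() will follow with the result.
    }

    @Override
    public void onVideoTaken(@NonNull VideoResult result) {
        // result.getFile() points to the recorded snapshot.
    }
});
camera.takeVideoSnapshot(new File(getFilesDir(), "video_snapshot.mp4"));
```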
@@ -12,8 +12,11 @@ New versions are released through GitHub, so the reference page is the [GitHub R
 
 - New: support for watermarks and animated overlays ([docs](../docs/watermarks-and-overlays.html)), thanks to [@RAN3000][RAN3000] ([#502][502], [#421][421])
 - New: added `onVideoRecordingStart()` to be notified when video recording starts, thanks to [@agrawalsuneet][agrawalsuneet] ([#498][498])
+- New: added `onVideoRecordingEnd()` to be notified when video recording ends ([#506][506])
+- New: added `Audio.MONO` and `Audio.STEREO` to control the channel count for videos and video snapshots ([#506][506])
 - New: added `cameraUseDeviceOrientation` to choose whether picture and video outputs should consider the device orientation or not ([#497][497])
 - Improvement: improved Camera2 stability and various bugs fixed (e.g. [#501][501])
+- Improvement: improved video snapshots speed, quality and stability ([#506][506])
 
 ### v2.0.0-beta06

@@ -27,7 +30,7 @@ New versions are released through GitHub, so the reference page is the [GitHub R
 
 If you were using `focus`, just switch to `autoFocus`.
-If you were using `focusWithMarker`, you can [add back the old marker](../docs/more-features.html#cameraautofocusmarker).
+If you were using `focusWithMarker`, you can [add back the old marker](../docs/controls.html#cameraautofocusmarker).
 
 ### v2.0.0-beta05

@@ -79,3 +82,4 @@ This is the first beta release. For changes with respect to v1, please take a lo
 [498]: https://github.com/natario1/CameraView/pull/498
 [501]: https://github.com/natario1/CameraView/pull/501
 [502]: https://github.com/natario1/CameraView/pull/502
+[506]: https://github.com/natario1/CameraView/pull/506

@@ -25,7 +25,7 @@ or `CameraOptions.supports(Control)` to see if it is supported.
     app:cameraFlash="off|on|auto|torch"
     app:cameraWhiteBalance="auto|incandescent|fluorescent|daylight|cloudy"
     app:cameraHdr="off|on"
-    app:cameraAudio="on|off"
+    app:cameraAudio="on|off|mono|stereo"
     app:cameraAudioBitRate="0"
     app:cameraVideoCodec="deviceDefault|h263|h264"
     app:cameraVideoMaxSize="0"

@@ -96,7 +96,9 @@ Defaults to `ON`.
 
 ```java
 cameraView.setAudio(Audio.OFF);
-cameraView.setAudio(Audio.ON);
+cameraView.setAudio(Audio.ON); // on but depends on video config
+cameraView.setAudio(Audio.MONO); // force mono
+cameraView.setAudio(Audio.STEREO); // force stereo
 ```
 
 ##### cameraAudioBitRate
@@ -141,7 +143,7 @@ cameraView.setVideoBitRate(0);
 cameraView.setVideoBitRate(4000000);
 ```
 
-### Manual Focus
+### Auto Focus
 
 There are many ways to focus a CameraView engine:

@@ -173,6 +175,44 @@ cameraView.addCameraListener(new CameraListener() {
 
 Auto focus is not guaranteed to be supported: check the `CameraOptions` to be sure.
 
+```xml
+<com.otaliastudios.cameraview.CameraView
+    app:cameraAutoFocusMarker="@string/cameraview_default_autofocus_marker"
+    app:cameraAutoFocusResetDelay="3000"/>
+```
+
+##### cameraAutoFocusMarker
+
+Lets you set a marker for drawing on screen in response to auto focus events.
+In XML, you should pass the qualified class name of your marker.
+
+```java
+cameraView.setAutoFocusMarker(null);
+cameraView.setAutoFocusMarker(marker);
+```
+
+We offer a default marker (similar to the old `focusWithMarker` attribute in v1),
+which you can set in XML using the `@string/cameraview_default_autofocus_marker` resource,
+or programmatically:
+
+```java
+cameraView.setAutoFocusMarker(new DefaultAutoFocusMarker());
+```
+
+##### cameraAutoFocusResetDelay
+
+Lets you control how an auto-focus operation is reset after it completes.
+Setting a value <= 0 or == Long.MAX_VALUE will not reset the auto-focus.
+This is useful for low end devices that have slow auto-focus capabilities.
+Defaults to 3 seconds.
+
+```java
+cameraView.setCameraAutoFocusResetDelay(1000);           // 1 second
+cameraView.setCameraAutoFocusResetDelay(0);              // NO reset
+cameraView.setCameraAutoFocusResetDelay(-1);             // NO reset
+cameraView.setCameraAutoFocusResetDelay(Long.MAX_VALUE); // NO reset
+```
+
 ### Zoom
 
 There are two ways to control the zoom value:

@@ -58,38 +58,6 @@ cameraView.setGridColor(Color.WHITE);
 cameraView.setGridColor(Color.BLACK);
 ```
 
-##### cameraAutoFocusMarker
-
-Lets you set a marker for drawing on screen in response to auto focus events.
-In XML, you should pass the qualified class name of your marker.
-
-```java
-cameraView.setAutoFocusMarker(null);
-cameraView.setAutoFocusMarker(marker);
-```
-
-We offer a default marker (similar to the old `focusWithMarker` attribute in v1),
-which you can set in XML using the `@string/cameraview_default_autofocus_marker` resource,
-or programmatically:
-
-```java
-cameraView.setAutoFocusMarker(new DefaultAutoFocusMarker());
-```
-
-##### cameraAutoFocusResetDelay
-
-Lets you control how an auto-focus operation is reset after completed.
-Setting a value <= 0 or == Long.MAX_VALUE will not reset the auto-focus.
-This is useful for low end devices that have slow auto-focus capabilities.
-Defaults to 3 seconds.
-
-```java
-cameraView.setCameraAutoFocusResetDelay(1000);           // 1 second
-cameraView.setCameraAutoFocusResetDelay(0);              // NO reset
-cameraView.setCameraAutoFocusResetDelay(-1);             // NO reset
-cameraView.setCameraAutoFocusResetDelay(Long.MAX_VALUE); // NO reset
-```
-
 ##### cameraUseDeviceOrientation
 
 Controls whether we should consider the device orientation for picture and video outputs.
