Frame processors (#82)

* Create the Frame class
* Implement callback dispatcher
* Dispatch actual preview frames
* Add docs
* Readme nits
* Don't leak processors
* Rename clear() to release()
* Add preview format and Size
* Readme nits again
pull/83/head
Mattia Iavarone 7 years ago committed by GitHub
parent 399e0e4d76
commit e40f93acfb
 1. README.md (97)
 2. cameraview/src/androidTest/java/com/otaliastudios/cameraview/CameraCallbacksTest.java (12)
 3. cameraview/src/androidTest/java/com/otaliastudios/cameraview/CameraViewTest.java (45)
 4. cameraview/src/androidTest/java/com/otaliastudios/cameraview/IntegrationTest.java (20)
 5. cameraview/src/main/java/com/otaliastudios/cameraview/Camera1.java (19)
 6. cameraview/src/main/java/com/otaliastudios/cameraview/CameraController.java (1)
 7. cameraview/src/main/java/com/otaliastudios/cameraview/CameraView.java (126)
 8. cameraview/src/main/java/com/otaliastudios/cameraview/Frame.java (106)
 9. cameraview/src/main/java/com/otaliastudios/cameraview/FrameProcessor.java (24)
10. cameraview/src/main/java/com/otaliastudios/cameraview/Size.java (2)
11. cameraview/src/test/java/com/otaliastudios/cameraview/FrameTest.java (78)

@ -39,11 +39,12 @@ See below for a [list of what was done](#roadmap) and [licensing info](#contribu
- Take high-resolution pictures with `capturePicture`
- Take quick snapshots as a freeze frame of the preview with `captureSnapshot` (similar to Snapchat and Instagram)
- Control HDR, flash, zoom, white balance, exposure correction and more
- **Frame processing** support
- **Metadata** support for pictures and videos
  - Automatically detected orientation tags
  - Plug in location tags with `setLocation()` API
- `CameraUtils` to help with Bitmaps and orientations
- **Lightweight**, no dependencies, just support `ExifInterface`
- Works down to API level 15

# Docs
@ -57,12 +58,12 @@ See below for a [list of what was done](#roadmap) and [licensing info](#contribu
- [Center Inside](#center-inside)
- [Center Crop](#center-crop)
- [Camera Controls](#camera-controls)
- [Frame Processing](#frame-processing)
- [Other APIs](#other-apis)
- [Permissions Behavior](#permissions-behavior)
- [Logging](#logging)
- [Device-specific issues](#device-specific-issues)
- [Roadmap](#roadmap)

## Usage
@ -437,6 +438,43 @@ cameraView.setPlaySounds(true);
cameraView.setPlaySounds(false);
```
## Frame Processing
We support frame processors that will receive data from the camera preview stream:
```java
cameraView.addFrameProcessor(new FrameProcessor() {
@Override
@WorkerThread
public void process(Frame frame) {
byte[] data = frame.getData();
int rotation = frame.getRotation();
long time = frame.getTime();
Size size = frame.getSize();
int format = frame.getFormat();
// Process...
}
});
```
For your convenience, the `process()` method is run in a background thread so you can do your job
in a synchronous fashion. Once `process()` returns, we internally reuse the `Frame` instance and
fill it with new data. So:

- you can do your job synchronously in the `process()` method
- if you must hold the `Frame` instance longer, use `frame = frame.freeze()` to get a frozen instance
  that will not be affected (see the sketch below)
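For example, here is a rough sketch of the second option. The background executor (`myExecutor`) is just a
placeholder for whatever async machinery you use, not part of the library:

```java
cameraView.addFrameProcessor(new FrameProcessor() {
    @Override
    @WorkerThread
    public void process(Frame frame) {
        // Freeze the frame so its contents survive after process() returns.
        final Frame frozen = frame.freeze();
        myExecutor.execute(new Runnable() { // myExecutor: your own background executor
            @Override
            public void run() {
                byte[] data = frozen.getData();
                // ... long running analysis on data ...
                frozen.release(); // free the copied byte array when you are done
            }
        });
    }
});
```
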
|Frame API|Type|Description|
|---------|----|-----------|
|`frame.getData()`|`byte[]`|The current preview frame, in its original orientation.|
|`frame.getTime()`|`long`|The preview timestamp, in `System.currentTimeMillis()` reference.|
|`frame.getRotation()`|`int`|The rotation that should be applied to the byte array in order to see what the user sees.|
|`frame.getSize()`|`Size`|The frame size, before any rotation is applied, to access data.|
|`frame.getFormat()`|`int`|The frame `ImageFormat`. This will always be `ImageFormat.NV21` for now.|
|`frame.freeze()`|`Frame`|Clones this frame and makes it immutable. Can be expensive because requires copying the byte array.|
|`frame.release()`|`-`|Disposes the content of this frame. Should be used on frozen frames to release memory.|
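To show how these pieces fit together, here is a hedged sketch that compresses a frame to JPEG with the
standard Android `YuvImage` API. It assumes the format is `ImageFormat.NV21`, which is the case for now;
the helper name is just an illustration:

```java
byte[] frameToJpeg(Frame frame) {
    Size size = frame.getSize(); // dimensions of the data, before any rotation
    YuvImage yuv = new YuvImage(frame.getData(), ImageFormat.NV21,
            size.getWidth(), size.getHeight(), null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, size.getWidth(), size.getHeight()), 90, out);
    // The JPEG is still in the original orientation: rotate by frame.getRotation()
    // if you need it to match what the user sees.
    return out.toByteArray();
}
```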
## Other APIs

Other APIs not mentioned above are provided, and are well documented and commented in code.
@ -461,7 +499,7 @@ Other APIs not mentioned above are provided, and are well documented and comment
|`getSnapshotSize()`|Returns `getPreviewSize()`, since a snapshot is a preview frame.|
|`getPictureSize()`|Returns the size of the output picture. The aspect ratio is consistent with `getPreviewSize()`.|

Take also a look at public methods in `CameraUtils`, `CameraOptions`, `ExtraProperties`.
## Permissions behavior
@ -470,13 +508,13 @@ Take also a look at public methods in `CameraUtils`, `CameraOptions`, `ExtraProp
- `android.permission.CAMERA` : required for capturing pictures and videos
- `android.permission.RECORD_AUDIO` : required for capturing videos with `Audio.ON` (the default)

### Declaration

The library manifest file declares the `android.permission.CAMERA` permission, but not the audio one.
This means that:

- If you wish to record videos with `Audio.ON` (the default), you should also add
  `android.permission.RECORD_AUDIO` to required permissions

```xml
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
@ -490,7 +528,18 @@ The library manifest file is not strict and only asks for camera permissions. Th
    android:required="true"/>
```

If you don't request this feature, you can use `CameraUtils.hasCameras()` to detect if current
device has cameras, and then start the camera view.

### Handling

On Marshmallow+, the user must explicitly approve our permissions. You can

- handle permissions yourself and then call `cameraView.start()` once they are acquired
- or call `cameraView.start()` anyway: `CameraView` will present a permission request to the user based on
  whether they are needed or not with the current configuration.

In the second case, you should restart the camera if you have a successful response from `onRequestPermissionsResult()`.
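As a rough sketch of the second case, in your Activity (assuming `cameraView` is a field; the exact
wiring is up to you and not part of the library):

```java
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
                                       @NonNull int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    boolean allGranted = true;
    for (int result : grantResults) {
        allGranted = allGranted && result == PackageManager.PERMISSION_GRANTED;
    }
    if (allGranted) {
        // Permissions are now available: start (or restart) the preview.
        cameraView.start();
    }
}
```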
## Logging
@ -512,9 +561,21 @@ CameraLogger.registerLogger(new Logger() {
Make sure you enable the logger using `CameraLogger.setLogLevel(@LogLevel int)`. The default will only
log error events.
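For instance, a typical development setup might look like this (assuming `LEVEL_VERBOSE` is one of the
available `@LogLevel` constants):

```java
// Log everything while debugging; remember that the default only logs errors.
CameraLogger.setLogLevel(CameraLogger.LEVEL_VERBOSE);
```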
## Device-specific issues
There are a couple of known issues if you are working with certain devices. The emulator is one of
the trickiest in this sense.
- Devices, or activities, with hardware acceleration turned off: this can be the case with emulators.
In this case we will use SurfaceView as our surface provider. That is intrinsically flawed and can't
deal with all we want to do here (runtime layout changes, scaling, etc.). So, nothing to do in this case.
- Devices with no support for MediaRecorder: the emulator does not support it, officially. This means
that video/audio recording is flawed. Again, not our fault.
## Roadmap

This is what was done since the library was forked. I have kept the original structure, but practically
all the code was changed.

- *a huge number of serious bugs fixed*
- *decent orientation support for both pictures and videos*
@ -542,28 +603,18 @@ This is what was done since the library was forked. I have kept the original str
- *Tests!*
- *`CameraLogger` APIs for logging and bug reports*
- *Better threading, start() in worker thread and callbacks in UI*
- *Frame processor support*
- *inject external loggers*
These are still things that need to be done, off the top of my head:

- [ ] `Camera2` integration
- [ ] check onPause / onStop / onSaveInstanceState consistency
- [ ] add a `setPreferredAspectRatio` API to choose the capture size. Preview size will adapt, and then, if let free, the CameraView will adapt as well
- [ ] animate grid lines similar to stock camera app
- [ ] add onRequestPermissionResults for easy permission callback
- [ ] better error handling, maybe with an onError(e) method in the public listener, or have each public method return a boolean
- [ ] decent code coverage
# Contributing and licenses

The original project which served as a starting point for this library,

@ -38,6 +38,7 @@ public class CameraCallbacksTest extends BaseTest {
private CameraView camera;
private CameraListener listener;
private FrameProcessor processor;
private MockCameraController mockController;
private MockCameraPreview mockPreview;
private Task<Boolean> task;
@ -50,6 +51,7 @@ public class CameraCallbacksTest extends BaseTest {
public void run() {
Context context = context();
listener = mock(CameraListener.class);
processor = mock(FrameProcessor.class);
camera = new CameraView(context) {
@Override
protected CameraController instantiateCameraController(CameraCallbacks callbacks) {
@ -70,6 +72,7 @@ public class CameraCallbacksTest extends BaseTest {
};
camera.instantiatePreview();
camera.addCameraListener(listener);
camera.addFrameProcessor(processor);
task = new Task<>();
task.listen();
}
@ -294,4 +297,13 @@ public class CameraCallbacksTest extends BaseTest {
Bitmap bitmap = BitmapFactory.decodeByteArray(result, 0, result.length);
return new int[]{ bitmap.getWidth(), bitmap.getHeight() };
}
@Test
public void testProcessFrame() {
completeTask().when(processor).process(any(Frame.class));
camera.mCameraCallbacks.dispatchFrame(new byte[]{0, 1, 2, 3}, 1000, 90, new Size(1, 1), 0);
assertNotNull(task.await(200));
verify(processor, times(1)).process(any(Frame.class));
}
}

@ -3,6 +3,7 @@ package com.otaliastudios.cameraview;
import android.content.Context;
import android.location.Location;
import android.support.annotation.NonNull;
import android.support.test.filters.MediumTest;
import android.support.test.runner.AndroidJUnit4;
import android.view.MotionEvent;
@ -551,5 +552,49 @@ public class CameraViewTest extends BaseTest {
//endregion
//region Lists of listeners and processors
@Test
public void testCameraListenerList() {
assertTrue(cameraView.mListeners.isEmpty());
CameraListener listener = new CameraListener() {};
cameraView.addCameraListener(listener);
assertEquals(cameraView.mListeners.size(), 1);
cameraView.removeCameraListener(listener);
assertEquals(cameraView.mListeners.size(), 0);
cameraView.addCameraListener(listener);
cameraView.addCameraListener(listener);
assertEquals(cameraView.mListeners.size(), 2);
cameraView.clearCameraListeners();
assertTrue(cameraView.mListeners.isEmpty());
}
@Test
public void testFrameProcessorsList() {
assertTrue(cameraView.mFrameProcessors.isEmpty());
FrameProcessor processor = new FrameProcessor() {
public void process(@NonNull Frame frame) {}
};
cameraView.addFrameProcessor(processor);
assertEquals(cameraView.mFrameProcessors.size(), 1);
cameraView.removeFrameProcessor(processor);
assertEquals(cameraView.mFrameProcessors.size(), 0);
cameraView.addFrameProcessor(processor);
cameraView.addFrameProcessor(processor);
assertEquals(cameraView.mFrameProcessors.size(), 2);
cameraView.clearFrameProcessors();
assertTrue(cameraView.mFrameProcessors.isEmpty());
}
//endregion
// TODO: test permissions
}

@ -504,4 +504,24 @@ public class IntegrationTest extends BaseTest {
assertTrue(b.getHeight() == size.getHeight() || b.getHeight() == size.getWidth());
}
//endregion
//region Frame Processing
@Test
public void testFrameProcessing() throws Exception {
FrameProcessor processor = mock(FrameProcessor.class);
camera.addFrameProcessor(processor);
camera.start();
waitForOpen(true);
// Expect 30 frames
CountDownLatch latch = new CountDownLatch(30);
doCountDown(latch).when(processor).process(any(Frame.class));
boolean did = latch.await(4, TimeUnit.SECONDS);
assertTrue(did);
}
//endregion
}

@ -21,7 +21,7 @@ import java.util.List;
@SuppressWarnings("deprecation")
class Camera1 extends CameraController implements Camera.PreviewCallback {
private static final String TAG = Camera1.class.getSimpleName();
private static final CameraLogger LOG = CameraLogger.create(TAG);
@ -143,10 +143,14 @@ class Camera1 extends CameraController {
);
synchronized (mLock) {
Camera.Parameters params = mCamera.getParameters();
mPreviewFormat = params.getPreviewFormat();
params.setPreviewSize(mPreviewSize.getWidth(), mPreviewSize.getHeight()); // <- not allowed during preview
params.setPictureSize(mCaptureSize.getWidth(), mCaptureSize.getHeight()); // <- allowed
mCamera.setParameters(params);
}
mCamera.setPreviewCallback(this); // Frame processing
LOG.i("setup:", "Starting preview with startPreview()."); LOG.i("setup:", "Starting preview with startPreview().");
mCamera.startPreview(); mCamera.startPreview();
LOG.i("setup:", "Started preview with startPreview()."); LOG.i("setup:", "Started preview with startPreview().");
@ -475,7 +479,6 @@ class Camera1 extends CameraController {
// Got to rotate the preview frame, since byte[] data here does not include
// EXIF tags automatically set by camera. So either we add EXIF, or we rotate.
// Adding EXIF to a byte array, unfortunately, is hard.
final int sensorToDevice = computeExifRotation();
final int sensorToDisplay = computeSensorToDisplayOffset();
final boolean exifFlip = computeExifFlip();
@ -484,7 +487,7 @@ class Camera1 extends CameraController {
final int preHeight = mPreviewSize.getHeight();
final int postWidth = flip ? preHeight : preWidth;
final int postHeight = flip ? preWidth : preHeight;
final int format = mPreviewFormat;
WorkerHandler.run(new Runnable() {
@Override
public void run() {
@ -496,11 +499,21 @@ class Camera1 extends CameraController {
mIsCapturingImage = false;
}
});
mCamera.setPreviewCallback(Camera1.this);
}
});
return true;
}
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
mCameraCallbacks.dispatchFrame(data,
System.currentTimeMillis(),
computeExifRotation(),
mPreviewSize,
mPreviewFormat);
}
@Override
boolean shouldFlipSizes() {
int offset = computeSensorToDisplayOffset();

@ -32,6 +32,7 @@ abstract class CameraController implements CameraPreview.SurfaceCallback {
protected Size mCaptureSize;
protected Size mPreviewSize;
protected int mPreviewFormat;
protected ExtraProperties mExtraProperties;
protected CameraOptions mOptions;

@ -61,8 +61,9 @@ public class CameraView extends FrameLayout {
private CameraPreview mCameraPreview;
private OrientationHelper mOrientationHelper;
private CameraController mCameraController;
private MediaActionSound mSound;
/* for tests */ ArrayList<CameraListener> mListeners = new ArrayList<>(2);
/* for tests */ ArrayList<FrameProcessor> mFrameProcessors = new ArrayList<>(1);
// Views
GridLinesLayout mGridLinesLayout;
@ -74,6 +75,7 @@ public class CameraView extends FrameLayout {
// Threading
private Handler mUiHandler;
private WorkerHandler mWorkerHandler;
private WorkerHandler mFrameProcessorsHandler;
public CameraView(@NonNull Context context) {
super(context, null);
@ -119,6 +121,7 @@ public class CameraView extends FrameLayout {
mCameraController = instantiateCameraController(mCameraCallbacks);
mUiHandler = new Handler(Looper.getMainLooper());
mWorkerHandler = WorkerHandler.get("CameraViewWorker");
mFrameProcessorsHandler = WorkerHandler.get("FrameProcessorsWorker");
// Views
mGridLinesLayout = new GridLinesLayout(context);
@ -577,9 +580,14 @@ public class CameraView extends FrameLayout {
mCameraController.stop();
}
/**
* Destroys this instance, releasing immediately
* the camera resource.
*/
public void destroy() {
clearCameraListeners();
clearFrameProcessors();
mCameraController.stopImmediately();
}
@ -1092,6 +1100,40 @@ public class CameraView extends FrameLayout {
}
/**
* Adds a {@link FrameProcessor} instance to be notified of
* new frames in the preview stream.
*
* @param processor a frame processor.
*/
public void addFrameProcessor(FrameProcessor processor) {
if (processor != null) {
mFrameProcessors.add(processor);
}
}
/**
* Remove a {@link FrameProcessor} that was previously registered.
*
* @param processor a frame processor
*/
public void removeFrameProcessor(FrameProcessor processor) {
if (processor != null) {
mFrameProcessors.remove(processor);
}
}
/**
* Clears the list of {@link FrameProcessor} that have been registered
* to preview frames.
*/
public void clearFrameProcessors() {
mFrameProcessors.clear();
}
/**
* Asks the camera to capture an image of the current scene.
* This will trigger {@link CameraListener#onPictureTaken(byte[])} if a listener
@ -1310,6 +1352,7 @@ public class CameraView extends FrameLayout {
void dispatchOnFocusEnd(@Nullable Gesture trigger, boolean success, PointF where);
void dispatchOnZoomChanged(final float newValue, final PointF[] fingers);
void dispatchOnExposureCorrectionChanged(float newValue, float[] bounds, PointF[] fingers);
void dispatchFrame(byte[] frame, long time, int rotation, Size size, int previewFormat);
}
private class Callbacks implements CameraCallbacks {
@ -1321,6 +1364,9 @@ public class CameraView extends FrameLayout {
private Integer mDisplayOffset;
private Integer mDeviceOrientation;
// Frame processing
private Frame mFrame;
Callbacks() {}
@Override
@ -1557,71 +1603,29 @@ public class CameraView extends FrameLayout {
}
});
}
@Override
public void dispatchFrame(final byte[] frame, final long time, final int rotation,
                          final Size size, final int previewFormat) {
    mLogger.i("dispatchFrame", time, rotation, "processors:", mFrameProcessors.size());
    if (mFrameProcessors.isEmpty()) return;
    if (mFrame == null) mFrame = new Frame();
    mFrameProcessorsHandler.post(new Runnable() {
        @Override
        public void run() {
            mFrame.set(frame, time, rotation, size, previewFormat);
            for (FrameProcessor processor : mFrameProcessors) {
                processor.process(mFrame);
            }
        }
    });
}
}

//endregion

//region Deprecated

//endregion
}

@ -0,0 +1,106 @@
package com.otaliastudios.cameraview;
/**
* A preview frame to be processed by {@link FrameProcessor}s.
*/
public class Frame {
private byte[] mData = null;
private long mTime = -1;
private int mRotation = 0;
private Size mSize = null;
private int mFormat = -1;
Frame() {}
void set(byte[] data, long time, int rotation, Size size, int format) {
this.mData = data;
this.mTime = time;
this.mRotation = rotation;
this.mSize = size;
this.mFormat = format;
}
@Override
public boolean equals(Object obj) {
// We want a super fast implementation here, do not compare arrays.
return obj instanceof Frame && ((Frame) obj).mTime == mTime;
}
/**
* Clones the frame, returning a frozen content that will not be overwritten.
* This can be kept or safely passed to other threads.
* Using freeze without clearing with {@link #release()} can result in memory leaks.
*
* @return a frozen Frame
*/
public Frame freeze() {
byte[] data = new byte[mData.length];
System.arraycopy(mData, 0, data, 0, mData.length);
Frame other = new Frame();
other.set(data, mTime, mRotation, mSize, mFormat);
return other;
}
/**
* Disposes the contents of this frame. Can be useful for frozen frames
* that are not useful anymore.
*/
public void release() {
mData = null;
mRotation = 0;
mTime = -1;
mSize = null;
mFormat = -1;
}
/**
* Returns the frame data.
* @return the frame data
*/
public byte[] getData() {
return mData;
}
/**
* Returns the milliseconds epoch for this frame,
* in the {@link System#currentTimeMillis()} reference.
*
* @return time data
*/
public long getTime() {
return mTime;
}
/**
* Returns the clock-wise rotation that should be applied on the data
* array, such that the resulting frame matches what the user is seeing
* on screen.
*
* @return clock-wise rotation
*/
public int getRotation() {
return mRotation;
}
/**
* Returns the frame size.
*
* @return frame size
*/
public Size getSize() {
return mSize;
}
/**
* Returns the data format, in one of the
* {@link android.graphics.ImageFormat} constants.
* This will always be {@link android.graphics.ImageFormat#NV21} for now.
*
* @return the data format
* @see android.graphics.ImageFormat
*/
public int getFormat() {
return mFormat;
}
}

@ -0,0 +1,24 @@
package com.otaliastudios.cameraview;
import android.support.annotation.NonNull;
import android.support.annotation.WorkerThread;
/**
* A FrameProcessor will process {@link Frame}s coming from the camera preview.
* It must be passed to {@link CameraView#addFrameProcessor(FrameProcessor)}.
*/
public interface FrameProcessor {
/**
* Processes the given frame. The frame will hold the correct values only for the
* duration of this method. When it returns, the frame contents will be replaced.
*
* To keep working with the Frame in an async manner, please use {@link Frame#freeze()},
* which will return an immutable Frame. In that case you can pass / hold the frame for
* as long as you want, and then release its contents using {@link Frame#release()}.
*
* @param frame the new frame
*/
@WorkerThread
void process(@NonNull Frame frame);
}

@ -7,7 +7,7 @@ public class Size implements Comparable<Size> {
private final int mWidth;
private final int mHeight;
Size(int width, int height) {
mWidth = width;
mHeight = height;
}

@ -0,0 +1,78 @@
package com.otaliastudios.cameraview;
import android.graphics.ImageFormat;
import org.junit.Test;
import static org.junit.Assert.assertArrayEquals;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertNotEquals;
import static org.junit.Assert.assertNull;
import static org.junit.Assert.assertTrue;
public class FrameTest {
@Test
public void testDefaults() {
Frame frame = new Frame();
assertEquals(frame.getTime(), -1);
assertEquals(frame.getFormat(), -1);
assertEquals(frame.getRotation(), 0);
assertNull(frame.getData());
assertNull(frame.getSize());
}
@Test
public void testEquals() {
Frame f1 = new Frame();
long time = 1000;
f1.set(null, time, 90, null, ImageFormat.NV21);
Frame f2 = new Frame();
f2.set(new byte[2], time, 0, new Size(10, 10), ImageFormat.NV21);
assertEquals(f1, f2);
f2.set(new byte[2], time + 1, 0, new Size(10, 10), ImageFormat.NV21);
assertNotEquals(f1, f2);
}
@Test
public void testRelease() {
Frame frame = new Frame();
frame.set(new byte[2], 1000, 90, new Size(10, 10), ImageFormat.NV21);
frame.release();
assertEquals(frame.getTime(), -1);
assertEquals(frame.getFormat(), -1);
assertEquals(frame.getRotation(), 0);
assertNull(frame.getData());
assertNull(frame.getSize());
}
@Test
public void testFreeze() {
Frame frame = new Frame();
byte[] data = new byte[]{0, 1, 5, 0, 7, 3, 4, 5};
long time = 1000;
int rotation = 90;
Size size = new Size(10, 10);
int format = ImageFormat.NV21;
frame.set(data, time, rotation, size, format);
Frame frozen = frame.freeze();
assertArrayEquals(data, frozen.getData());
assertEquals(time, frozen.getTime());
assertEquals(rotation, frozen.getRotation());
assertEquals(size, frozen.getSize());
// Mutate the first, ensure that frozen is not affected
frame.set(new byte[]{3, 2, 1}, 50, 180, new Size(1, 1), ImageFormat.JPEG);
assertArrayEquals(data, frozen.getData());
assertEquals(time, frozen.getTime());
assertEquals(rotation, frozen.getRotation());
assertEquals(size, frozen.getSize());
assertEquals(format, frozen.getFormat());
}
}