* Fix deploy CI trigger
* Use actions/checkout@v2
* Call onImageAvailable on a separate thread
* Add setFrameProcessingPoolSize API
* Add setFrameProcessingExecutors API
* Ensure captures are not blocked by frame processing
* Wait for first frame in onStartPreview
* Enable abortCaptures()
* Improve testFrameProcessing_format
* Improve testFrameProcessing_format again
With the Camera1 engine, the incoming format will always be `ImageFormat.NV21`.
You can check which formats are available for use through `CameraOptions.getSupportedFrameProcessingFormats()`.
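For example, a minimal sketch (assuming a `cameraView` instance from your layout, and that the supported formats are exposed as a collection of `ImageFormat` constants) could switch to `ImageFormat.YUV_420_888` once the camera opens, if that format is available:
```java
cameraView.addCameraListener(new CameraListener() {
    @Override
    public void onCameraOpened(@NonNull CameraOptions options) {
        // Query the formats supported by the current engine and sensor.
        if (options.getSupportedFrameProcessingFormats().contains(ImageFormat.YUV_420_888)) {
            cameraView.setFrameProcessingFormat(ImageFormat.YUV_420_888);
        }
    }
});
```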
### Advanced: Thread Control
Starting from `v2.5.1`, you can control the number of background threads that are allocated
for frame processing work. This should further push you toward performing processing actions
synchronously, and it can be useful when processing is very slow with respect to the preview
frame rate, in order to avoid dropping too many frames.
You can change the number of threads by calling `setFrameProcessingExecutors()`. Whenever you do,
we recommend that you also change the frame processing pool size to a compatible value.
The frame processing pool size is roughly the number of `Frame` instances that can exist at any given
moment. We recommend that this value is set to the number of executors plus 1. For example:
- Single threaded (default):
```java
cameraView.setFrameProcessingExecutors(1);
cameraView.setFrameProcessingPoolSize(2);
```
- Two threads:
```java
cameraView.setFrameProcessingExecutors(2);
cameraView.setFrameProcessingPoolSize(3);
```
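To illustrate how the two values work together, here is a rough sketch (the analysis step is your own code, not a library API) of a synchronous processor backed by two executors:
```java
cameraView.setFrameProcessingExecutors(2);
cameraView.setFrameProcessingPoolSize(3);
cameraView.addFrameProcessor(new FrameProcessor() {
    @Override
    public void process(@NonNull Frame frame) {
        // Runs on one of the two background executors. Because the work stays
        // inside process(), the library can recycle this Frame as soon as we return.
        if (frame.getDataClass() == byte[].class) {
            byte[] nv21 = frame.getData();
            // ... run your (slow) analysis on nv21, frame.getSize(), frame.getRotation() ...
        }
    }
});
```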
### XML Attributes
```xml
<com.otaliastudios.cameraview.CameraView
    app:cameraFrameProcessingMaxWidth="640"
    app:cameraFrameProcessingMaxHeight="640"
    app:cameraFrameProcessingFormat="0x23"
    app:cameraFrameProcessingPoolSize="2"
    app:cameraFrameProcessingExecutors="1"/>
```
### Related APIs
|Method|Return type|Description|
|------|-----------|-----------|
|`camera.clearFrameProcessors()`|`-`|Removes all `FrameProcessor`s.|
|`camera.setFrameProcessingMaxWidth(int)`|`-`|Sets the max width for incoming frames.|
|`camera.setFrameProcessingMaxHeight(int)`|`-`|Sets the max height for incoming frames.|
|`camera.getFrameProcessingMaxWidth()`|`int`|Returns the max width for incoming frames.|
|`camera.getFrameProcessingMaxHeight()`|`int`|Returns the max height for incoming frames.|
|`camera.setFrameProcessingFormat(int)`|`-`|Sets the desired format for incoming frames. Should be one of the `ImageFormat` constants.|
|`camera.getFrameProcessingFormat()`|`int`|Returns the format for incoming frames. One of the `ImageFormat` constants.|
|`camera.setFrameProcessingPoolSize(int)`|`-`|Sets the frame pool size, roughly the number of `Frame` instances that can exist at any given moment. Defaults to 2, which fits all use cases unless you change the executors.|
|`camera.getFrameProcessingPoolSize()`|`int`|Returns the frame pool size.|
|`camera.setFrameProcessingExecutors(int)`|`-`|Sets the number of processing threads. Defaults to 1, but can be increased if your processing is slow and you are dropping too many frames. Should always be tuned together with the frame pool size.|
|`camera.getFrameProcessingExecutors()`|`int`|Returns the number of processing threads.|
|`frame.getDataClass()`|`Class<T>`|The class of the data returned by `getData()`. Either `byte[]` or `android.media.Image`.|
|`frame.getData()`|`T`|The current preview frame, in its original orientation.|
|`frame.getTime()`|`long`|The preview timestamp, in the `System.currentTimeMillis()` reference.|
|`frame.getRotation()`|`int`|The rotation that should be applied to the byte array in order to see what the user sees.|
|`frame.getSize()`|`Size`|The frame size, before any rotation is applied, to access data.|
|`frame.getFormat()`|`int`|The frame `ImageFormat`. Defaults to `ImageFormat.NV21` for Camera1 and `ImageFormat.YUV_420_888` for Camera2.|
|`frame.freeze()`|`Frame`|Clones this frame and makes it immutable. Can be expensive because it requires copying the byte array.|
|`frame.release()`|`-`|Disposes the content of this frame. Should be used on frozen frames to release memory.|
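If you need to keep a frame alive after `process()` returns, a rough sketch (the worker executor here is your own, not part of the library) is to `freeze()` the frame and `release()` the copy when you are done:
```java
final ExecutorService worker = Executors.newSingleThreadExecutor();
cameraView.addFrameProcessor(new FrameProcessor() {
    @Override
    public void process(@NonNull Frame frame) {
        // freeze() clones the frame data, so the original Frame can be recycled right away.
        final Frame frozen = frame.freeze();
        worker.execute(new Runnable() {
            @Override
            public void run() {
                try {
                    // ... inspect frozen.getData(), frozen.getRotation(), frozen.getTime() ...
                } finally {
                    frozen.release(); // dispose the copy to release its memory
                }
            }
        });
    }
});
```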