RAW support (#691)

* Check engine state after picture metering - Fixes #685

* Ensure actions are only started in a valid holder state - Fixes #669

* Improve size selection

* Add PictureFormat definition

* Change PictureResult to reflect new format

* Throw if toBitmap is called with a DNG file

* Update the CameraView interface to support PictureFormat

* Implement DNG support into the CameraEngine, restart if needed

* Make CameraOptions engine aware

* Fix action bug

* Make CameraOptions check RAW availability

* Complete engine/options logic for RAW

* Add RAW control to demo app

* Ensure toBitmap does not crash in demo

* RAW support inside Full2PictureRecorder

* Add DNG test, fix implementation bugs

* Add option to see RAW result through share option

* Enable RAW toBitmap for API 24+

* Add documentation

* Improve documentation

* Change tests

* Small change
Branch: pull/697/head
Author: Mattia Iavarone, committed by GitHub (5 years ago)
Parent: 79d00d5eba
Commit: bb3b23e69a
40 changed files (changed lines per file in parentheses):

  1. README.md (2)
  2. cameraview/build.gradle (1)
  3. cameraview/src/androidTest/java/com/otaliastudios/cameraview/CameraViewTest.java (22)
  4. cameraview/src/androidTest/java/com/otaliastudios/cameraview/PictureResultTest.java (3)
  5. cameraview/src/androidTest/java/com/otaliastudios/cameraview/engine/Camera1IntegrationTest.java (4)
  6. cameraview/src/androidTest/java/com/otaliastudios/cameraview/engine/Camera2IntegrationTest.java (3)
  7. cameraview/src/androidTest/java/com/otaliastudios/cameraview/engine/CameraIntegrationTest.java (27)
  8. cameraview/src/androidTest/java/com/otaliastudios/cameraview/engine/MockCameraEngine.java (6)
  9. cameraview/src/androidTest/java/com/otaliastudios/cameraview/engine/options/Camera1OptionsTest.java (41)
  10. cameraview/src/main/java/com/otaliastudios/cameraview/CameraOptions.java (333)
  11. cameraview/src/main/java/com/otaliastudios/cameraview/CameraUtils.java (2)
  12. cameraview/src/main/java/com/otaliastudios/cameraview/CameraView.java (30)
  13. cameraview/src/main/java/com/otaliastudios/cameraview/PictureResult.java (27)
  14. cameraview/src/main/java/com/otaliastudios/cameraview/controls/ControlParser.java (8)
  15. cameraview/src/main/java/com/otaliastudios/cameraview/controls/PictureFormat.java (52)
  16. cameraview/src/main/java/com/otaliastudios/cameraview/engine/Camera1Engine.java (14)
  17. cameraview/src/main/java/com/otaliastudios/cameraview/engine/Camera2Engine.java (68)
  18. cameraview/src/main/java/com/otaliastudios/cameraview/engine/CameraEngine.java (29)
  19. cameraview/src/main/java/com/otaliastudios/cameraview/engine/action/BaseAction.java (14)
  20. cameraview/src/main/java/com/otaliastudios/cameraview/engine/options/Camera1Options.java (132)
  21. cameraview/src/main/java/com/otaliastudios/cameraview/engine/options/Camera2Options.java (178)
  22. cameraview/src/main/java/com/otaliastudios/cameraview/internal/utils/ExifHelper.java (15)
  23. cameraview/src/main/java/com/otaliastudios/cameraview/picture/Full1PictureRecorder.java (3)
  24. cameraview/src/main/java/com/otaliastudios/cameraview/picture/Full2PictureRecorder.java (67)
  25. cameraview/src/main/java/com/otaliastudios/cameraview/picture/Snapshot1PictureRecorder.java (1)
  26. cameraview/src/main/java/com/otaliastudios/cameraview/picture/Snapshot2PictureRecorder.java (35)
  27. cameraview/src/main/java/com/otaliastudios/cameraview/picture/SnapshotGlPictureRecorder.java (1)
  28. cameraview/src/main/res/values/attrs.xml (5)
  29. cameraview/src/test/java/com/otaliastudios/cameraview/internal/utils/ExifHelperTest.java (29)
  30. demo/src/main/AndroidManifest.xml (19)
  31. demo/src/main/java/com/otaliastudios/cameraview/demo/CameraActivity.java (3)
  32. demo/src/main/java/com/otaliastudios/cameraview/demo/Option.java (9)
  33. demo/src/main/java/com/otaliastudios/cameraview/demo/PicturePreviewActivity.java (66)
  34. demo/src/main/java/com/otaliastudios/cameraview/demo/VideoPreviewActivity.java (36)
  35. demo/src/main/res/drawable/ic_share.xml (5)
  36. demo/src/main/res/menu/share.xml (8)
  37. demo/src/main/res/values/styles.xml (3)
  38. demo/src/main/res/xml/filepaths.xml (4)
  39. docs/_posts/2018-12-20-controls.md (14)
  40. docs/index.md (1)

@ -36,6 +36,7 @@ api 'com.otaliastudios:cameraview:2.4.0'
- Take super-fast snapshots with `takePictureSnapshot` and `takeVideoSnapshot` [[docs]](https://natario1.github.io/CameraView/docs/capturing-media.html)
- Smart sizing: create a `CameraView` of any size [[docs]](https://natario1.github.io/CameraView/docs/preview-size.html)
- Control HDR, flash, zoom, white balance, exposure, location, grid drawing & more [[docs]](https://natario1.github.io/CameraView/docs/controls.html)
- RAW picture support [[docs]](https://natario1.github.io/CameraView/docs/controls.html)
- Lightweight
- Works down to API level 15
- Well tested
@ -141,6 +142,7 @@ Using CameraView is extremely simple:
app:cameraFilter="@string/real_time_filter"
app:cameraPictureMetering="true|false"
app:cameraPictureSnapshotMetering="false|true"
app:cameraPictureFormat="jpeg|dng"
app:cameraExperimental="false|true">
<!-- Watermark! -->
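The XML attribute maps to a runtime setter; a minimal sketch of the programmatic equivalent, assuming a CameraView instance named camera:

camera.setPictureFormat(PictureFormat.DNG);  // matches app:cameraPictureFormat="dng"
camera.setPictureFormat(PictureFormat.JPEG); // the default, matches "jpeg"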

@ -267,6 +267,5 @@ task mergeCoverageReports(type: JacocoReport) {
//endregion
// export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_101.jdk/Contents/Home
// To deploy ./gradlew bintrayUpload

@ -20,6 +20,7 @@ import com.otaliastudios.cameraview.controls.ControlParser;
import com.otaliastudios.cameraview.controls.Engine;
import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.controls.Flash;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.controls.Preview;
import com.otaliastudios.cameraview.engine.CameraEngine;
import com.otaliastudios.cameraview.filter.Filter;
@ -161,6 +162,7 @@ public class CameraViewTest extends BaseTest {
assertEquals(cameraView.getHdr(), controls.getHdr());
assertEquals(cameraView.getAudio(), controls.getAudio());
assertEquals(cameraView.getVideoCodec(), controls.getVideoCodec());
assertEquals(cameraView.getPictureFormat(), controls.getPictureFormat());
//noinspection SimplifiableJUnitAssertion
assertEquals(cameraView.getLocation(), null);
assertEquals(cameraView.getExposureCorrection(), 0f, 0f);
@ -321,9 +323,9 @@ public class CameraViewTest extends BaseTest {
@Test
public void testGestureAction_exposureCorrection() {
CameraOptions o = mock(CameraOptions.class);
when(o.getExposureCorrectionMinValue()).thenReturn(-10f);
when(o.getExposureCorrectionMaxValue()).thenReturn(10f);
CameraOptions o = new CameraOptions() {};
o.exposureCorrectionMaxValue = 10F;
o.exposureCorrectionMinValue = -10F;
mockController.setMockCameraOptions(o);
mockController.setMockEngineState(true);
mockController.mExposureCorrectionChanged = false;
@ -563,9 +565,9 @@ public class CameraViewTest extends BaseTest {
@Test
public void testExposureCorrection() {
// This needs a valid CameraOptions value.
CameraOptions o = mock(CameraOptions.class);
when(o.getExposureCorrectionMinValue()).thenReturn(-10f);
when(o.getExposureCorrectionMaxValue()).thenReturn(10f);
CameraOptions o = new CameraOptions() {};
o.exposureCorrectionMaxValue = 10F;
o.exposureCorrectionMinValue = -10F;
mockController.setMockCameraOptions(o);
cameraView.setExposureCorrection(5f);
@ -745,6 +747,14 @@ public class CameraViewTest extends BaseTest {
assertEquals(cameraView.get(VideoCodec.class), VideoCodec.H_264);
}
@Test
public void testPictureFormat() {
cameraView.set(PictureFormat.JPEG);
assertEquals(cameraView.get(PictureFormat.class), PictureFormat.JPEG);
cameraView.set(PictureFormat.DNG);
assertEquals(cameraView.get(PictureFormat.class), PictureFormat.DNG);
}
@Test
public void testPreviewStreamSizeSelector() {
SizeSelector source = SizeSelectors.minHeight(50);

@ -4,6 +4,7 @@ package com.otaliastudios.cameraview;
import android.location.Location;
import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.size.Size;
import androidx.test.ext.junit.runners.AndroidJUnit4;
@ -24,7 +25,7 @@ public class PictureResultTest extends BaseTest {
@Test
public void testResult() {
int format = PictureResult.FORMAT_JPEG;
PictureFormat format = PictureFormat.JPEG;
int rotation = 90;
Size size = new Size(20, 120);
byte[] jpeg = new byte[]{2, 4, 1, 5, 2};

@ -3,8 +3,6 @@ package com.otaliastudios.cameraview.engine;
import com.otaliastudios.cameraview.DoNotRunOnTravis;
import com.otaliastudios.cameraview.controls.Engine;
import org.junit.Ignore;
import org.junit.Test;
import org.junit.runner.RunWith;
import androidx.annotation.NonNull;
@ -20,7 +18,7 @@ import androidx.test.filters.LargeTest;
@RunWith(AndroidJUnit4.class)
@LargeTest
@DoNotRunOnTravis(because = "These do work but fail on CI emulators, due to bugs in the Camera1 emulated devices.")
public class CameraIntegration1Test extends CameraIntegrationTest {
public class Camera1IntegrationTest extends CameraIntegrationTest {
@NonNull
@Override

@ -8,7 +8,6 @@ import com.otaliastudios.cameraview.controls.Engine;
import com.otaliastudios.cameraview.engine.action.ActionHolder;
import com.otaliastudios.cameraview.engine.action.BaseAction;
import org.junit.Ignore;
import org.junit.Test;
import org.junit.runner.RunWith;
@ -27,7 +26,7 @@ import java.util.concurrent.CountDownLatch;
@RunWith(AndroidJUnit4.class)
@LargeTest
@DoNotRunOnTravis(because = "These do work but fail on CI emulators.")
public class CameraIntegration2Test extends CameraIntegrationTest {
public class Camera2IntegrationTest extends CameraIntegrationTest {
@NonNull
@Override

@ -23,6 +23,7 @@ import com.otaliastudios.cameraview.controls.Engine;
import com.otaliastudios.cameraview.controls.Flash;
import com.otaliastudios.cameraview.controls.Hdr;
import com.otaliastudios.cameraview.controls.Mode;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.controls.WhiteBalance;
import com.otaliastudios.cameraview.frame.Frame;
import com.otaliastudios.cameraview.frame.FrameProcessor;
@ -806,6 +807,32 @@ public abstract class CameraIntegrationTest extends BaseTest {
//endregion
//region Picture Formats
// TODO this fails because setPictureFormat triggers a restart() and takePicture can be called
// in the middle of the restart, failing because the engine is not properly set up. To fix this
// we would have to change the whole CameraEngine threading scheme.
// @Test
public void testPictureFormat_DNG() {
openSync(true);
if (camera.getCameraOptions().supports(PictureFormat.DNG)) {
camera.setPictureFormat(PictureFormat.DNG);
camera.takePicture();
PictureResult result = waitForPictureResult(true);
// assert that result.getData() is a DNG file:
// We can use the first 4 bytes assuming they are the same as a TIFF file
// https://en.wikipedia.org/wiki/List_of_file_signatures
byte[] b = result.getData();
// TIFF/DNG magic bytes are "II*\0" (little endian) or "MM\0*" (big endian);
// the remaining byte is a NUL (0), not an ASCII dot.
boolean isII = b[0] == 'I' && b[1] == 'I' && b[2] == '*' && b[3] == 0;
boolean isMM = b[0] == 'M' && b[1] == 'M' && b[2] == 0 && b[3] == '*';
if (!isII && !isMM) {
throw new RuntimeException("Not a DNG file.");
}
}
}
//endregion
//region Frame Processing
private void assert30Frames(FrameProcessor mock) throws Exception {

@ -11,6 +11,7 @@ import com.otaliastudios.cameraview.PictureResult;
import com.otaliastudios.cameraview.VideoResult;
import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.controls.Flash;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.frame.FrameManager;
import com.otaliastudios.cameraview.gesture.Gesture;
import com.otaliastudios.cameraview.controls.Hdr;
@ -116,6 +117,11 @@ public class MockCameraEngine extends CameraEngine {
mLocation = location;
}
@Override
public void setPictureFormat(@NonNull PictureFormat pictureFormat) {
mPictureFormat = pictureFormat;
}
@Override
public void takePicture(@NonNull PictureResult.Stub stub) {
super.takePicture(stub);

@ -1,8 +1,10 @@
package com.otaliastudios.cameraview;
package com.otaliastudios.cameraview.engine.options;
import android.hardware.Camera;
import com.otaliastudios.cameraview.BaseTest;
import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.controls.Audio;
import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.controls.Flash;
@ -22,7 +24,6 @@ import androidx.test.filters.SmallTest;
import org.junit.Test;
import org.junit.runner.RunWith;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
@ -38,11 +39,11 @@ import static org.mockito.Mockito.when;
@RunWith(AndroidJUnit4.class)
@SmallTest
public class CameraOptions1Test extends BaseTest {
public class Camera1OptionsTest extends BaseTest {
@Test
public void testEmpty() {
CameraOptions o = new CameraOptions(mock(Camera.Parameters.class), 0, false);
CameraOptions o = new Camera1Options(mock(Camera.Parameters.class), 0, false);
assertTrue(o.getSupportedPictureAspectRatios().isEmpty());
assertTrue(o.getSupportedPictureSizes().isEmpty());
assertTrue(o.getSupportedWhiteBalance().isEmpty());
@ -72,7 +73,7 @@ public class CameraOptions1Test extends BaseTest {
);
Camera.Parameters params = mock(Camera.Parameters.class);
when(params.getSupportedPictureSizes()).thenReturn(sizes);
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
Collection<Size> supportedSizes = o.getSupportedPictureSizes();
assertEquals(supportedSizes.size(), sizes.size());
for (Camera.Size size : sizes) {
@ -91,7 +92,7 @@ public class CameraOptions1Test extends BaseTest {
);
Camera.Parameters params = mock(Camera.Parameters.class);
when(params.getSupportedPictureSizes()).thenReturn(sizes);
CameraOptions o = new CameraOptions(params, 0, true);
CameraOptions o = new Camera1Options(params, 0, true);
Collection<Size> supportedSizes = o.getSupportedPictureSizes();
assertEquals(supportedSizes.size(), sizes.size());
for (Camera.Size size : sizes) {
@ -116,7 +117,7 @@ public class CameraOptions1Test extends BaseTest {
Camera.Parameters params = mock(Camera.Parameters.class);
when(params.getSupportedPictureSizes()).thenReturn(sizes);
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
Collection<AspectRatio> supportedRatios = o.getSupportedPictureAspectRatios();
assertEquals(supportedRatios.size(), expected.size());
for (AspectRatio ratio : expected) {
@ -137,7 +138,7 @@ public class CameraOptions1Test extends BaseTest {
);
Camera.Parameters params = mock(Camera.Parameters.class);
when(params.getSupportedVideoSizes()).thenReturn(sizes);
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
Collection<Size> supportedSizes = o.getSupportedVideoSizes();
assertEquals(supportedSizes.size(), sizes.size());
for (Camera.Size size : sizes) {
@ -158,7 +159,7 @@ public class CameraOptions1Test extends BaseTest {
Camera.Parameters params = mock(Camera.Parameters.class);
when(params.getSupportedVideoSizes()).thenReturn(null);
when(params.getSupportedPreviewSizes()).thenReturn(sizes);
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
Collection<Size> supportedSizes = o.getSupportedVideoSizes();
assertEquals(supportedSizes.size(), sizes.size());
for (Camera.Size size : sizes) {
@ -177,7 +178,7 @@ public class CameraOptions1Test extends BaseTest {
);
Camera.Parameters params = mock(Camera.Parameters.class);
when(params.getSupportedVideoSizes()).thenReturn(sizes);
CameraOptions o = new CameraOptions(params, 0, true);
CameraOptions o = new Camera1Options(params, 0, true);
Collection<Size> supportedSizes = o.getSupportedVideoSizes();
assertEquals(supportedSizes.size(), sizes.size());
for (Camera.Size size : sizes) {
@ -202,7 +203,7 @@ public class CameraOptions1Test extends BaseTest {
Camera.Parameters params = mock(Camera.Parameters.class);
when(params.getSupportedVideoSizes()).thenReturn(sizes);
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
Collection<AspectRatio> supportedRatios = o.getSupportedVideoAspectRatios();
assertEquals(supportedRatios.size(), expected.size());
for (AspectRatio ratio : expected) {
@ -218,7 +219,7 @@ public class CameraOptions1Test extends BaseTest {
when(params.getMaxExposureCompensation()).thenReturn(0);
when(params.getMinExposureCompensation()).thenReturn(0);
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
assertFalse(o.supports(GestureAction.AUTO_FOCUS));
assertTrue(o.supports(GestureAction.TAKE_PICTURE));
assertTrue(o.supports(GestureAction.NONE));
@ -232,7 +233,7 @@ public class CameraOptions1Test extends BaseTest {
public void testAlwaysSupportedControls() {
// Grid, VideoQuality, SessionType and Audio are always supported.
Camera.Parameters params = mock(Camera.Parameters.class);
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
Collection<Grid> grids = o.getSupportedControls(Grid.class);
Collection<VideoCodec> video = o.getSupportedControls(VideoCodec.class);
@ -253,7 +254,7 @@ public class CameraOptions1Test extends BaseTest {
supported.add(cameraInfo.facing);
}
CameraOptions o = new CameraOptions(mock(Camera.Parameters.class), 0, false);
CameraOptions o = new Camera1Options(mock(Camera.Parameters.class), 0, false);
Camera1Mapper m = Camera1Mapper.get();
Collection<Facing> s = o.getSupportedControls(Facing.class);
assertEquals(s.size(), supported.size());
@ -272,7 +273,7 @@ public class CameraOptions1Test extends BaseTest {
Camera.Parameters.WHITE_BALANCE_SHADE // Not supported
));
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
Collection<WhiteBalance> w = o.getSupportedControls(WhiteBalance.class);
assertEquals(w.size(), 2);
assertTrue(w.contains(WhiteBalance.AUTO));
@ -291,7 +292,7 @@ public class CameraOptions1Test extends BaseTest {
Camera.Parameters.FLASH_MODE_RED_EYE // Not supported
));
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
Collection<Flash> f = o.getSupportedControls(Flash.class);
assertEquals(f.size(), 3);
assertTrue(f.contains(Flash.OFF));
@ -312,7 +313,7 @@ public class CameraOptions1Test extends BaseTest {
Camera.Parameters.SCENE_MODE_FIREWORKS // Not supported
));
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
Collection<Hdr> h = o.getSupportedControls(Hdr.class);
assertEquals(h.size(), 2);
assertTrue(h.contains(Hdr.OFF));
@ -329,7 +330,7 @@ public class CameraOptions1Test extends BaseTest {
when(params.isZoomSupported()).thenReturn(true);
//noinspection ArraysAsListWithZeroOrOneArgument
when(params.getSupportedFocusModes()).thenReturn(Arrays.asList(Camera.Parameters.FOCUS_MODE_AUTO));
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
assertTrue(o.isZoomSupported());
assertTrue(o.isAutoFocusSupported());
}
@ -340,7 +341,7 @@ public class CameraOptions1Test extends BaseTest {
when(params.getMaxExposureCompensation()).thenReturn(10);
when(params.getMinExposureCompensation()).thenReturn(-10);
when(params.getExposureCompensationStep()).thenReturn(0.5f);
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
assertTrue(o.isExposureCorrectionSupported());
assertEquals(o.getExposureCorrectionMinValue(), -10f * 0.5f, 0f);
assertEquals(o.getExposureCorrectionMaxValue(), 10f * 0.5f, 0f);
@ -355,7 +356,7 @@ public class CameraOptions1Test extends BaseTest {
new int[]{60000, 120000}
);
when(params.getSupportedPreviewFpsRange()).thenReturn(result);
CameraOptions o = new CameraOptions(params, 0, false);
CameraOptions o = new Camera1Options(params, 0, false);
assertEquals(20F, o.getPreviewFrameRateMinValue(), 0.001F);
assertEquals(120F, o.getPreviewFrameRateMaxValue(), 0.001F);
}

@ -18,6 +18,7 @@ import com.otaliastudios.cameraview.controls.Control;
import com.otaliastudios.cameraview.controls.Engine;
import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.controls.Flash;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.controls.Preview;
import com.otaliastudios.cameraview.engine.mappers.Camera1Mapper;
import com.otaliastudios.cameraview.engine.mappers.Camera2Mapper;
@ -44,263 +45,27 @@ import java.util.Set;
/**
* Options telling you what is available and what is not.
*/
public class CameraOptions {
private Set<WhiteBalance> supportedWhiteBalance = new HashSet<>(5);
private Set<Facing> supportedFacing = new HashSet<>(2);
private Set<Flash> supportedFlash = new HashSet<>(4);
private Set<Hdr> supportedHdr = new HashSet<>(2);
private Set<Size> supportedPictureSizes = new HashSet<>(15);
private Set<Size> supportedVideoSizes = new HashSet<>(5);
private Set<AspectRatio> supportedPictureAspectRatio = new HashSet<>(4);
private Set<AspectRatio> supportedVideoAspectRatio = new HashSet<>(3);
private boolean zoomSupported;
private boolean exposureCorrectionSupported;
private float exposureCorrectionMinValue;
private float exposureCorrectionMaxValue;
private boolean autoFocusSupported;
private float previewFrameRateMinValue;
private float previewFrameRateMaxValue;
public CameraOptions(@NonNull Camera.Parameters params, int cameraId, boolean flipSizes) {
List<String> strings;
Camera1Mapper mapper = Camera1Mapper.get();
// Facing
Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
for (int i = 0, count = Camera.getNumberOfCameras(); i < count; i++) {
Camera.getCameraInfo(i, cameraInfo);
Facing value = mapper.unmapFacing(cameraInfo.facing);
if (value != null) supportedFacing.add(value);
}
// WB
strings = params.getSupportedWhiteBalance();
if (strings != null) {
for (String string : strings) {
WhiteBalance value = mapper.unmapWhiteBalance(string);
if (value != null) supportedWhiteBalance.add(value);
}
}
// Flash
supportedFlash.add(Flash.OFF);
strings = params.getSupportedFlashModes();
if (strings != null) {
for (String string : strings) {
Flash value = mapper.unmapFlash(string);
if (value != null) supportedFlash.add(value);
}
}
// Hdr
supportedHdr.add(Hdr.OFF);
strings = params.getSupportedSceneModes();
if (strings != null) {
for (String string : strings) {
Hdr value = mapper.unmapHdr(string);
if (value != null) supportedHdr.add(value);
}
}
// zoom
zoomSupported = params.isZoomSupported();
// autofocus
autoFocusSupported = params.getSupportedFocusModes()
.contains(Camera.Parameters.FOCUS_MODE_AUTO);
// Exposure correction
float step = params.getExposureCompensationStep();
exposureCorrectionMinValue = (float) params.getMinExposureCompensation() * step;
exposureCorrectionMaxValue = (float) params.getMaxExposureCompensation() * step;
exposureCorrectionSupported = params.getMinExposureCompensation() != 0
|| params.getMaxExposureCompensation() != 0;
// Picture Sizes
List<Camera.Size> sizes = params.getSupportedPictureSizes();
for (Camera.Size size : sizes) {
int width = flipSizes ? size.height : size.width;
int height = flipSizes ? size.width : size.height;
supportedPictureSizes.add(new Size(width, height));
supportedPictureAspectRatio.add(AspectRatio.of(width, height));
}
// Video Sizes
// As a safety measure, remove Sizes bigger than CamcorderProfile.highest
CamcorderProfile profile = CamcorderProfiles.get(cameraId,
new Size(Integer.MAX_VALUE, Integer.MAX_VALUE));
Size videoMaxSize = new Size(profile.videoFrameWidth, profile.videoFrameHeight);
List<Camera.Size> vsizes = params.getSupportedVideoSizes();
if (vsizes != null) {
for (Camera.Size size : vsizes) {
if (size.width <= videoMaxSize.getWidth()
&& size.height <= videoMaxSize.getHeight()) {
int width = flipSizes ? size.height : size.width;
int height = flipSizes ? size.width : size.height;
supportedVideoSizes.add(new Size(width, height));
supportedVideoAspectRatio.add(AspectRatio.of(width, height));
}
}
} else {
// StackOverflow threads seem to agree that if getSupportedVideoSizes is null,
// the preview sizes can be used instead.
List<Camera.Size> fallback = params.getSupportedPreviewSizes();
for (Camera.Size size : fallback) {
if (size.width <= videoMaxSize.getWidth()
&& size.height <= videoMaxSize.getHeight()) {
int width = flipSizes ? size.height : size.width;
int height = flipSizes ? size.width : size.height;
supportedVideoSizes.add(new Size(width, height));
supportedVideoAspectRatio.add(AspectRatio.of(width, height));
}
}
}
// Preview FPS
previewFrameRateMinValue = Float.MAX_VALUE;
previewFrameRateMaxValue = -Float.MAX_VALUE;
List<int[]> fpsRanges = params.getSupportedPreviewFpsRange();
for (int[] fpsRange : fpsRanges) {
float lower = (float) fpsRange[0] / 1000F;
float upper = (float) fpsRange[1] / 1000F;
previewFrameRateMinValue = Math.min(previewFrameRateMinValue, lower);
previewFrameRateMaxValue = Math.max(previewFrameRateMaxValue, upper);
}
}
// Camera2Engine constructor.
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
public CameraOptions(@NonNull CameraManager manager,
@NonNull String cameraId,
boolean flipSizes) throws CameraAccessException {
Camera2Mapper mapper = Camera2Mapper.get();
CameraCharacteristics cameraCharacteristics = manager.getCameraCharacteristics(cameraId);
// Facing
for (String cameraId1 : manager.getCameraIdList()) {
CameraCharacteristics cameraCharacteristics1 = manager
.getCameraCharacteristics(cameraId1);
Integer cameraFacing = cameraCharacteristics1.get(CameraCharacteristics.LENS_FACING);
if (cameraFacing != null) {
Facing value = mapper.unmapFacing(cameraFacing);
if (value != null) supportedFacing.add(value);
}
}
// WB
int[] awbModes = cameraCharacteristics.get(
CameraCharacteristics.CONTROL_AWB_AVAILABLE_MODES);
//noinspection ConstantConditions
for (int awbMode : awbModes) {
WhiteBalance value = mapper.unmapWhiteBalance(awbMode);
if (value != null) supportedWhiteBalance.add(value);
}
// Flash
supportedFlash.add(Flash.OFF);
Boolean hasFlash = cameraCharacteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
if (hasFlash != null && hasFlash) {
int[] aeModes = cameraCharacteristics.get(
CameraCharacteristics.CONTROL_AE_AVAILABLE_MODES);
//noinspection ConstantConditions
for (int aeMode : aeModes) {
Set<Flash> flashes = mapper.unmapFlash(aeMode);
supportedFlash.addAll(flashes);
}
}
// HDR
supportedHdr.add(Hdr.OFF);
int[] sceneModes = cameraCharacteristics.get(
CameraCharacteristics.CONTROL_AVAILABLE_SCENE_MODES);
//noinspection ConstantConditions
for (int sceneMode : sceneModes) {
Hdr value = mapper.unmapHdr(sceneMode);
if (value != null) supportedHdr.add(value);
}
// Zoom
Float maxZoom = cameraCharacteristics.get(
CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM);
if(maxZoom != null) {
zoomSupported = maxZoom > 1;
}
// AutoFocus
// This now means 3A metering with respect to a specific region of the screen.
// Some controls (AF, AE) have special triggers that might or might not be supported.
// But they can also be on some continuous search mode so that the trigger is not needed.
// What really matters in my opinion is the availability of regions.
Integer afRegions = cameraCharacteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AF);
Integer aeRegions = cameraCharacteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AE);
Integer awbRegions = cameraCharacteristics.get(
CameraCharacteristics.CONTROL_MAX_REGIONS_AWB);
autoFocusSupported = (afRegions != null && afRegions > 0)
|| (aeRegions != null && aeRegions > 0)
|| (awbRegions != null && awbRegions > 0);
// Exposure correction
Range<Integer> exposureRange = cameraCharacteristics.get(
CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE);
Rational exposureStep = cameraCharacteristics.get(
CameraCharacteristics.CONTROL_AE_COMPENSATION_STEP);
if (exposureRange != null && exposureStep != null && exposureStep.floatValue() != 0) {
exposureCorrectionMinValue = exposureRange.getLower() / exposureStep.floatValue();
exposureCorrectionMaxValue = exposureRange.getUpper() / exposureStep.floatValue();
}
exposureCorrectionSupported = exposureCorrectionMinValue != 0
&& exposureCorrectionMaxValue != 0;
// Picture Sizes
StreamConfigurationMap streamMap = cameraCharacteristics.get(
CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
if (streamMap == null) {
throw new RuntimeException("StreamConfigurationMap is null. Should not happen.");
}
android.util.Size[] psizes = streamMap.getOutputSizes(ImageFormat.JPEG);
for (android.util.Size size : psizes) {
int width = flipSizes ? size.getHeight() : size.getWidth();
int height = flipSizes ? size.getWidth() : size.getHeight();
supportedPictureSizes.add(new Size(width, height));
supportedPictureAspectRatio.add(AspectRatio.of(width, height));
}
// Video Sizes
// As a safety measure, remove Sizes bigger than CamcorderProfile.highest
CamcorderProfile profile = CamcorderProfiles.get(cameraId,
new Size(Integer.MAX_VALUE, Integer.MAX_VALUE));
Size videoMaxSize = new Size(profile.videoFrameWidth, profile.videoFrameHeight);
android.util.Size[] vsizes = streamMap.getOutputSizes(MediaRecorder.class);
for (android.util.Size size : vsizes) {
if (size.getWidth() <= videoMaxSize.getWidth()
&& size.getHeight() <= videoMaxSize.getHeight()) {
int width = flipSizes ? size.getHeight() : size.getWidth();
int height = flipSizes ? size.getWidth() : size.getHeight();
supportedVideoSizes.add(new Size(width, height));
supportedVideoAspectRatio.add(AspectRatio.of(width, height));
}
}
// Preview FPS
Range<Integer>[] range = cameraCharacteristics.get(
CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
if (range != null) {
previewFrameRateMinValue = Float.MAX_VALUE;
previewFrameRateMaxValue = -Float.MAX_VALUE;
for (Range<Integer> fpsRange : range) {
previewFrameRateMinValue = Math.min(previewFrameRateMinValue, fpsRange.getLower());
previewFrameRateMaxValue = Math.max(previewFrameRateMaxValue, fpsRange.getUpper());
}
} else {
previewFrameRateMinValue = 0F;
previewFrameRateMaxValue = 0F;
}
}
public abstract class CameraOptions {
protected Set<WhiteBalance> supportedWhiteBalance = new HashSet<>(5);
protected Set<Facing> supportedFacing = new HashSet<>(2);
protected Set<Flash> supportedFlash = new HashSet<>(4);
protected Set<Hdr> supportedHdr = new HashSet<>(2);
protected Set<Size> supportedPictureSizes = new HashSet<>(15);
protected Set<Size> supportedVideoSizes = new HashSet<>(5);
protected Set<AspectRatio> supportedPictureAspectRatio = new HashSet<>(4);
protected Set<AspectRatio> supportedVideoAspectRatio = new HashSet<>(3);
protected Set<PictureFormat> supportedPictureFormats = new HashSet<>(2);
protected boolean zoomSupported;
protected boolean exposureCorrectionSupported;
protected float exposureCorrectionMinValue;
protected float exposureCorrectionMaxValue;
protected boolean autoFocusSupported;
protected float previewFrameRateMinValue;
protected float previewFrameRateMaxValue;
protected CameraOptions() { }
/**
* Shorthand for getSupported*().contains(value).
@ -308,7 +73,7 @@ public class CameraOptions {
* @param control value to check
* @return whether it's supported
*/
public boolean supports(@NonNull Control control) {
public final boolean supports(@NonNull Control control) {
return getSupportedControls(control.getClass()).contains(control);
}
@ -319,7 +84,7 @@ public class CameraOptions {
* @param action value to be checked
* @return whether it's supported
*/
public boolean supports(@NonNull GestureAction action) {
public final boolean supports(@NonNull GestureAction action) {
switch (action) {
case AUTO_FOCUS:
return isAutoFocusSupported();
@ -338,7 +103,8 @@ public class CameraOptions {
@SuppressWarnings("unchecked")
@NonNull
public <T extends Control> Collection<T> getSupportedControls(@NonNull Class<T> controlClass) {
public final <T extends Control> Collection<T> getSupportedControls(
@NonNull Class<T> controlClass) {
if (controlClass.equals(Audio.class)) {
return (Collection<T>) Arrays.asList(Audio.values());
} else if (controlClass.equals(Facing.class)) {
@ -359,6 +125,8 @@ public class CameraOptions {
return (Collection<T>) Arrays.asList(Engine.values());
} else if (controlClass.equals(Preview.class)) {
return (Collection<T>) Arrays.asList(Preview.values());
} else if (controlClass.equals(PictureFormat.class)) {
return (Collection<T>) getSupportedPictureFormats();
}
// Unrecognized control.
return Collections.emptyList();
@ -370,7 +138,7 @@ public class CameraOptions {
* @return a collection of supported values.
*/
@NonNull
public Collection<Size> getSupportedPictureSizes() {
public final Collection<Size> getSupportedPictureSizes() {
return Collections.unmodifiableSet(supportedPictureSizes);
}
@ -379,9 +147,8 @@ public class CameraOptions {
*
* @return a collection of supported values.
*/
@SuppressWarnings("WeakerAccess")
@NonNull
public Collection<AspectRatio> getSupportedPictureAspectRatios() {
public final Collection<AspectRatio> getSupportedPictureAspectRatios() {
return Collections.unmodifiableSet(supportedPictureAspectRatio);
}
@ -391,7 +158,7 @@ public class CameraOptions {
* @return a collection of supported values.
*/
@NonNull
public Collection<Size> getSupportedVideoSizes() {
public final Collection<Size> getSupportedVideoSizes() {
return Collections.unmodifiableSet(supportedVideoSizes);
}
@ -400,9 +167,8 @@ public class CameraOptions {
*
* @return a set of supported values.
*/
@SuppressWarnings("WeakerAccess")
@NonNull
public Collection<AspectRatio> getSupportedVideoAspectRatios() {
public final Collection<AspectRatio> getSupportedVideoAspectRatios() {
return Collections.unmodifiableSet(supportedVideoAspectRatio);
}
@ -414,7 +180,7 @@ public class CameraOptions {
* @return a collection of supported values.
*/
@NonNull
public Collection<Facing> getSupportedFacing() {
public final Collection<Facing> getSupportedFacing() {
return Collections.unmodifiableSet(supportedFacing);
}
@ -428,7 +194,7 @@ public class CameraOptions {
* @return a collection of supported values.
*/
@NonNull
public Collection<Flash> getSupportedFlash() {
public final Collection<Flash> getSupportedFlash() {
return Collections.unmodifiableSet(supportedFlash);
}
@ -443,7 +209,7 @@ public class CameraOptions {
* @return a collection of supported values.
*/
@NonNull
public Collection<WhiteBalance> getSupportedWhiteBalance() {
public final Collection<WhiteBalance> getSupportedWhiteBalance() {
return Collections.unmodifiableSet(supportedWhiteBalance);
}
@ -454,19 +220,30 @@ public class CameraOptions {
* @see Hdr#ON
* @return a collection of supported values.
*/
@SuppressWarnings("WeakerAccess")
@NonNull
public Collection<Hdr> getSupportedHdr() {
public final Collection<Hdr> getSupportedHdr() {
return Collections.unmodifiableSet(supportedHdr);
}
/**
* Set of supported picture formats.
*
* @see PictureFormat#JPEG
* @see PictureFormat#DNG
* @return a collection of supported values.
*/
@NonNull
public final Collection<PictureFormat> getSupportedPictureFormats() {
return Collections.unmodifiableSet(supportedPictureFormats);
}
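Engine-specific subclasses are expected to populate supportedPictureFormats. The Camera2Options source is not shown in this excerpt; the following is only a plausible sketch of the RAW availability check, assuming a local CameraCharacteristics named characteristics:

// JPEG is always offered; DNG only when the device advertises the RAW capability.
supportedPictureFormats.add(PictureFormat.JPEG);
int[] caps = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
if (caps != null) {
    for (int capability : caps) {
        if (capability == CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_RAW) {
            supportedPictureFormats.add(PictureFormat.DNG);
            break;
        }
    }
}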
/**
* Whether zoom is supported. If this is false, pinch-to-zoom
* will not work and {@link CameraView#setZoom(float)} will have no effect.
*
* @return whether zoom is supported.
*/
public boolean isZoomSupported() {
public final boolean isZoomSupported() {
return zoomSupported;
}
@ -478,7 +255,7 @@ public class CameraOptions {
*
* @return whether auto focus is supported.
*/
public boolean isAutoFocusSupported() {
public final boolean isAutoFocusSupported() {
return autoFocusSupported;
}
@ -490,7 +267,7 @@ public class CameraOptions {
* @see #getExposureCorrectionMaxValue()
* @return whether exposure correction is supported.
*/
public boolean isExposureCorrectionSupported() {
public final boolean isExposureCorrectionSupported() {
return exposureCorrectionSupported;
}
@ -500,7 +277,7 @@ public class CameraOptions {
*
* @return min EV value
*/
public float getExposureCorrectionMinValue() {
public final float getExposureCorrectionMinValue() {
return exposureCorrectionMinValue;
}
@ -511,7 +288,7 @@ public class CameraOptions {
*
* @return max EV value
*/
public float getExposureCorrectionMaxValue() {
public final float getExposureCorrectionMaxValue() {
return exposureCorrectionMaxValue;
}
@ -520,7 +297,7 @@ public class CameraOptions {
*
* @return the min value
*/
public float getPreviewFrameRateMinValue() {
public final float getPreviewFrameRateMinValue() {
return previewFrameRateMinValue;
}
@ -529,7 +306,7 @@ public class CameraOptions {
*
* @return the max value
*/
public float getPreviewFrameRateMaxValue() {
public final float getPreviewFrameRateMaxValue() {
return previewFrameRateMaxValue;
}
}

@ -284,7 +284,7 @@ public class CameraUtils {
ExifInterface exif = new ExifInterface(stream);
int exifOrientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION,
ExifInterface.ORIENTATION_NORMAL);
orientation = ExifHelper.readExifOrientation(exifOrientation);
orientation = ExifHelper.getOrientation(exifOrientation);
flip = exifOrientation == ExifInterface.ORIENTATION_FLIP_HORIZONTAL ||
exifOrientation == ExifInterface.ORIENTATION_FLIP_VERTICAL ||
exifOrientation == ExifInterface.ORIENTATION_TRANSPOSE ||

@ -40,6 +40,7 @@ import com.otaliastudios.cameraview.controls.Flash;
import com.otaliastudios.cameraview.controls.Grid;
import com.otaliastudios.cameraview.controls.Hdr;
import com.otaliastudios.cameraview.controls.Mode;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.controls.Preview;
import com.otaliastudios.cameraview.controls.VideoCodec;
import com.otaliastudios.cameraview.controls.WhiteBalance;
@ -246,6 +247,7 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
setPictureSize(sizeSelectors.getPictureSizeSelector());
setPictureMetering(pictureMetering);
setPictureSnapshotMetering(pictureSnapshotMetering);
setPictureFormat(controls.getPictureFormat());
setVideoSize(sizeSelectors.getVideoSizeSelector());
setVideoCodec(controls.getVideoCodec());
setVideoMaxSize(videoMaxSize);
@ -847,6 +849,8 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
setPreview((Preview) control);
} else if (control instanceof Engine) {
setEngine((Engine) control);
} else if (control instanceof PictureFormat) {
setPictureFormat((PictureFormat) control);
}
}
@ -881,6 +885,8 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
return (T) getPreview();
} else if (controlClass == Engine.class) {
return (T) getEngine();
} else if (controlClass == PictureFormat.class) {
return (T) getPictureFormat();
} else {
throw new IllegalArgumentException("Unknown control class: " + controlClass);
}
@ -948,6 +954,7 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
setAudio(oldEngine.getAudio());
setAudioBitRate(oldEngine.getAudioBitRate());
setPictureSize(oldEngine.getPictureSizeSelector());
setPictureFormat(oldEngine.getPictureFormat());
setVideoSize(oldEngine.getVideoSizeSelector());
setVideoCodec(oldEngine.getVideoCodec());
setVideoMaxSize(oldEngine.getVideoMaxSize());
@ -1418,6 +1425,29 @@ public class CameraView extends FrameLayout implements LifecycleObserver {
return mCameraEngine.getPictureSnapshotMetering();
}
/**
* Sets the format for pictures taken with {@link #takePicture()}. This format does not apply
* to picture snapshots taken with {@link #takePictureSnapshot()}.
* The {@link PictureFormat#JPEG} format is always supported - for other values, please
* check {@link CameraOptions#getSupportedPictureFormats()}.
*
* @param pictureFormat new format
*/
public void setPictureFormat(@NonNull PictureFormat pictureFormat) {
mCameraEngine.setPictureFormat(pictureFormat);
}
/**
* Returns the current picture format.
* @see #setPictureFormat(PictureFormat)
* @return the picture format
*/
@NonNull
public PictureFormat getPictureFormat() {
return mCameraEngine.getPictureFormat();
}
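A minimal usage sketch tying the two methods above to CameraOptions (the helper name is hypothetical, not part of this PR):

private void enableRawIfPossible(@NonNull CameraView cameraView) {
    CameraOptions options = cameraView.getCameraOptions(); // null while the camera is closed
    if (options != null && options.supports(PictureFormat.DNG)) {
        cameraView.setPictureFormat(PictureFormat.DNG);
    } else {
        cameraView.setPictureFormat(PictureFormat.JPEG); // always supported
    }
}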
/**
* Sets a capture size selector for video mode.
* The {@link SizeSelector} will be invoked with the list of available sizes, and the first

@ -2,8 +2,10 @@ package com.otaliastudios.cameraview;
import android.graphics.BitmapFactory;
import android.location.Location;
import android.os.Build;
import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.size.Size;
import java.io.File;
@ -31,19 +33,16 @@ public class PictureResult {
public Size size;
public Facing facing;
public byte[] data;
public int format;
public PictureFormat format;
}
public final static int FORMAT_JPEG = 0;
// public final static int FORMAT_PNG = 1;
private final boolean isSnapshot;
private final Location location;
private final int rotation;
private final Size size;
private final Facing facing;
private final byte[] data;
private final int format;
private final PictureFormat format;
PictureResult(@NonNull Stub builder) {
isSnapshot = builder.isSnapshot;
@ -118,12 +117,12 @@ public class PictureResult {
}
/**
* Returns the image format. At the moment this will always be
* {@link #FORMAT_JPEG}.
* Returns the format for {@link #getData()}.
*
* @return the current format
* @return the format
*/
public int getFormat() {
@NonNull
public PictureFormat getFormat() {
return format;
}
@ -137,8 +136,18 @@ public class PictureResult {
* @param callback a callback to be notified of image decoding
*/
public void toBitmap(int maxWidth, int maxHeight, @NonNull BitmapCallback callback) {
if (format == PictureFormat.JPEG) {
CameraUtils.decodeBitmap(getData(), maxWidth, maxHeight, new BitmapFactory.Options(),
rotation, callback);
} else if (format == PictureFormat.DNG && Build.VERSION.SDK_INT >= 24) {
// Apparently: BitmapFactory added DNG support in API 24.
// https://github.com/aosp-mirror/platform_frameworks_base/blob/nougat-mr1-release/core/jni/android/graphics/BitmapFactory.cpp
CameraUtils.decodeBitmap(getData(), maxWidth, maxHeight, new BitmapFactory.Options(),
rotation, callback);
} else {
throw new UnsupportedOperationException("PictureResult.toBitmap() does not support "
+ "this picture format: " + format);
}
}
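Callers can guard against the UnsupportedOperationException above; a minimal sketch, assuming a PictureResult named result and a BitmapCallback named callback:

if (result.getFormat() == PictureFormat.JPEG || Build.VERSION.SDK_INT >= 24) {
    result.toBitmap(1000, 1000, callback); // JPEG always decodes; DNG only on API 24+
} else {
    // DNG on API < 24: keep or save the raw bytes instead of decoding them.
}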
/**

@ -23,6 +23,7 @@ public class ControlParser {
private int audio;
private int videoCodec;
private int engine;
private int pictureFormat;
public ControlParser(@NonNull Context context, @NonNull TypedArray array) {
preview = array.getInteger(R.styleable.CameraView_cameraPreview, Preview.DEFAULT.value());
@ -38,6 +39,8 @@ public class ControlParser {
videoCodec = array.getInteger(R.styleable.CameraView_cameraVideoCodec,
VideoCodec.DEFAULT.value());
engine = array.getInteger(R.styleable.CameraView_cameraEngine, Engine.DEFAULT.value());
pictureFormat = array.getInteger(R.styleable.CameraView_cameraPictureFormat,
PictureFormat.DEFAULT.value());
}
@NonNull
@ -90,4 +93,9 @@ public class ControlParser {
public Engine getEngine() {
return Engine.fromValue(engine);
}
@NonNull
public PictureFormat getPictureFormat() {
return PictureFormat.fromValue(pictureFormat);
}
}

@ -0,0 +1,52 @@
package com.otaliastudios.cameraview.controls;
import androidx.annotation.NonNull;
import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.CameraView;
/**
* Format of the picture results for pictures that are taken with {@link CameraView#takePicture()}.
* This does not apply to picture snapshots.
*
* @see CameraView#setPictureFormat(PictureFormat)
*/
public enum PictureFormat implements Control {
/**
* The picture result data will be a JPEG file.
* This value is always supported.
*/
JPEG(0),
/**
* The picture result data will be a DNG file.
* This is only supported with the {@link Engine#CAMERA2} engine and only on
* specific devices. Please check {@link CameraOptions#getSupportedPictureFormats()}.
*/
DNG(1);
static final PictureFormat DEFAULT = JPEG;
private int value;
PictureFormat(int value) {
this.value = value;
}
int value() {
return value;
}
@NonNull
static PictureFormat fromValue(int value) {
PictureFormat[] list = PictureFormat.values();
for (PictureFormat action : list) {
if (action.value() == value) {
return action;
}
}
return DEFAULT;
}
}
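Since a DNG result cannot always be decoded to a Bitmap, consumers typically persist the bytes with a matching extension instead; a minimal sketch using plain java.io (the file location and names are assumptions):

File file = new File(context.getFilesDir(),
        result.getFormat() == PictureFormat.DNG ? "picture.dng" : "picture.jpg");
try (FileOutputStream stream = new FileOutputStream(file)) {
    stream.write(result.getData()); // raw JPEG or DNG bytes from the PictureResult
} catch (IOException e) {
    // handle the write failure
}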

@ -13,17 +13,17 @@ import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.annotation.VisibleForTesting;
import android.util.Range;
import android.view.SurfaceHolder;
import com.google.android.gms.tasks.Task;
import com.google.android.gms.tasks.Tasks;
import com.otaliastudios.cameraview.CameraException;
import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.engine.mappers.Camera1Mapper;
import com.otaliastudios.cameraview.engine.offset.Axis;
import com.otaliastudios.cameraview.engine.offset.Reference;
import com.otaliastudios.cameraview.engine.options.Camera1Options;
import com.otaliastudios.cameraview.frame.Frame;
import com.otaliastudios.cameraview.PictureResult;
import com.otaliastudios.cameraview.VideoResult;
@ -156,7 +156,7 @@ public class Camera1Engine extends CameraEngine implements
// Set parameters that might have been set before the camera was opened.
LOG.i("onStartEngine:", "Applying default parameters.");
Camera.Parameters params = mCamera.getParameters();
mCameraOptions = new CameraOptions(params, mCameraId,
mCameraOptions = new Camera1Options(params, mCameraId,
getAngles().flip(Reference.SENSOR, Reference.VIEW));
applyAllParameters(params);
mCamera.setParameters(params);
@ -717,6 +717,14 @@ public class Camera1Engine extends CameraEngine implements
return false;
}
@Override
public void setPictureFormat(@NonNull PictureFormat pictureFormat) {
if (pictureFormat != PictureFormat.JPEG) {
throw new UnsupportedOperationException("Unsupported picture format: " + pictureFormat);
}
mPictureFormat = pictureFormat;
}
//endregion
//region Frame Processing

@ -35,13 +35,13 @@ import com.google.android.gms.tasks.TaskCompletionSource;
import com.google.android.gms.tasks.Tasks;
import com.otaliastudios.cameraview.CameraException;
import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.PictureResult;
import com.otaliastudios.cameraview.VideoResult;
import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.controls.Flash;
import com.otaliastudios.cameraview.controls.Hdr;
import com.otaliastudios.cameraview.controls.Mode;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.controls.WhiteBalance;
import com.otaliastudios.cameraview.engine.action.Action;
import com.otaliastudios.cameraview.engine.action.ActionHolder;
@ -53,6 +53,7 @@ import com.otaliastudios.cameraview.engine.meter.MeterAction;
import com.otaliastudios.cameraview.engine.meter.MeterResetAction;
import com.otaliastudios.cameraview.engine.offset.Axis;
import com.otaliastudios.cameraview.engine.offset.Reference;
import com.otaliastudios.cameraview.engine.options.Camera2Options;
import com.otaliastudios.cameraview.frame.Frame;
import com.otaliastudios.cameraview.frame.FrameManager;
import com.otaliastudios.cameraview.gesture.Gesture;
@ -386,7 +387,14 @@ public class Camera2Engine extends CameraEngine implements ImageReader.OnImageAv
LOG.i("createCamera:", "Applying default parameters.");
mCameraCharacteristics = mManager.getCameraCharacteristics(mCameraId);
boolean flip = getAngles().flip(Reference.SENSOR, Reference.VIEW);
mCameraOptions = new CameraOptions(mManager, mCameraId, flip);
int format;
switch (mPictureFormat) {
case JPEG: format = ImageFormat.JPEG; break;
case DNG: format = ImageFormat.RAW_SENSOR; break;
default: throw new IllegalArgumentException("Unknown format:"
+ mPictureFormat);
}
mCameraOptions = new Camera2Options(mManager, mCameraId, flip, format);
createRepeatingRequestBuilder(CameraDevice.TEMPLATE_PREVIEW);
} catch (CameraAccessException e) {
task.trySetException(createCameraException(e));
@ -426,6 +434,12 @@ public class Camera2Engine extends CameraEngine implements ImageReader.OnImageAv
final TaskCompletionSource<Void> task = new TaskCompletionSource<>();
// Compute sizes.
// TODO preview stream should never be bigger than 1920x1080 as per
// CameraDevice.createCaptureSession. This should be probably be applied
// before all the other external selectors, to treat it as a hard limit.
// OR: pass an int into these functions to be able to take smaller dims
// when session configuration fails
// OR: both.
mCaptureSize = computeCaptureSize();
mPreviewStreamSize = computePreviewStreamSize();
@ -477,13 +491,18 @@ public class Camera2Engine extends CameraEngine implements ImageReader.OnImageAv
}
// 3. PICTURE RECORDING
// Format is supported, or it would have thrown in Camera2Options constructor.
if (getMode() == Mode.PICTURE) {
int format;
switch (mPictureFormat) {
case JPEG: format = ImageFormat.JPEG; break;
case DNG: format = ImageFormat.RAW_SENSOR; break;
default: throw new IllegalArgumentException("Unknown format:" + mPictureFormat);
}
mPictureReader = ImageReader.newInstance(
mCaptureSize.getWidth(),
mCaptureSize.getHeight(),
ImageFormat.JPEG,
2
);
format, 2);
outputSurfaces.add(mPictureReader.getSurface());
}
@ -501,8 +520,8 @@ public class Camera2Engine extends CameraEngine implements ImageReader.OnImageAv
sizes.add(new Size(aSize.getWidth(), aSize.getHeight()));
}
mFrameProcessingSize = SizeSelectors.and(
SizeSelectors.maxWidth(Math.min(700, mPreviewStreamSize.getWidth())),
SizeSelectors.maxHeight(Math.min(700, mPreviewStreamSize.getHeight())),
SizeSelectors.maxWidth(Math.min(640, mPreviewStreamSize.getWidth())),
SizeSelectors.maxHeight(Math.min(640, mPreviewStreamSize.getHeight())),
SizeSelectors.biggest()).select(sizes).get(0);
mFrameProcessingReader = ImageReader.newInstance(
mFrameProcessingSize.getWidth(),
@ -613,6 +632,7 @@ public class Camera2Engine extends CameraEngine implements ImageReader.OnImageAv
throw createCameraException(e);
}
removeRepeatingRequestBuilderSurfaces();
mLastRepeatingResult = null;
LOG.i("onStopPreview:", "Returning.");
return Tasks.forResult(null);
}
@ -690,7 +710,10 @@ public class Camera2Engine extends CameraEngine implements ImageReader.OnImageAv
action.addCallback(new CompletionCallback() {
@Override
protected void onActionCompleted(@NonNull Action action) {
onTakePictureSnapshot(stub, outputRatio, false);
// This is called on any thread, so be careful.
setPictureSnapshotMetering(false);
takePictureSnapshot(stub);
setPictureSnapshotMetering(true);
}
});
action.start(this);
@ -720,7 +743,10 @@ public class Camera2Engine extends CameraEngine implements ImageReader.OnImageAv
action.addCallback(new CompletionCallback() {
@Override
protected void onActionCompleted(@NonNull Action action) {
onTakePicture(stub, false);
// This is called on any thread, so be careful.
setPictureMetering(false);
takePicture(stub);
setPictureMetering(true);
}
});
action.start(this);
@ -1304,6 +1330,28 @@ public class Camera2Engine extends CameraEngine implements ImageReader.OnImageAv
return false;
}
@Override
public void setPictureFormat(final @NonNull PictureFormat pictureFormat) {
if (pictureFormat != mPictureFormat) {
mPictureFormat = pictureFormat;
LOG.i("setPictureFormat", "changing to", pictureFormat, "posting.");
mHandler.run(new Runnable() {
@Override
public void run() {
LOG.i("setPictureFormat", "changing to", pictureFormat,
"executing. EngineState:", getEngineState(),
"BindState:", getBindState());
if (getEngineState() == STATE_STOPPED) {
LOG.i("setPictureFormat", "not started so won't restart.");
} else {
LOG.i("setPictureFormat", "started or starting. Calling restart()");
restart();
}
}
});
}
}
//endregion
//region Frame Processing
@ -1518,8 +1566,10 @@ public class Camera2Engine extends CameraEngine implements ImageReader.OnImageAv
@Override
public void applyBuilder(@NonNull Action source, @NonNull CaptureRequest.Builder builder)
throws CameraAccessException {
if (getPreviewState() == STATE_STARTED) {
mSession.capture(builder.build(), mRepeatingRequestCallback, null);
}
}
//endregion
}

@ -18,7 +18,7 @@ import com.otaliastudios.cameraview.CameraException;
import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.PictureResult;
import com.otaliastudios.cameraview.engine.mappers.Camera1Mapper;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.overlay.Overlay;
import com.otaliastudios.cameraview.VideoResult;
import com.otaliastudios.cameraview.engine.offset.Angles;
@ -168,6 +168,7 @@ public abstract class CameraEngine implements
@SuppressWarnings("WeakerAccess") protected WhiteBalance mWhiteBalance;
@SuppressWarnings("WeakerAccess") protected VideoCodec mVideoCodec;
@SuppressWarnings("WeakerAccess") protected Hdr mHdr;
@SuppressWarnings("WeakerAccess") protected PictureFormat mPictureFormat;
@SuppressWarnings("WeakerAccess") protected Location mLocation;
@SuppressWarnings("WeakerAccess") protected float mZoomValue;
@SuppressWarnings("WeakerAccess") protected float mExposureCorrectionValue;
@ -1020,6 +1021,11 @@ public abstract class CameraEngine implements
return mLocation;
}
@NonNull
public final PictureFormat getPictureFormat() {
return mPictureFormat;
}
public final float getZoomValue() {
return mZoomValue;
}
@ -1109,6 +1115,9 @@ public abstract class CameraEngine implements
// If closed, keep. If opened, check supported and apply.
public abstract void setLocation(@Nullable Location location);
// If closed, keep. If opened, check supported and apply.
public abstract void setPictureFormat(@NonNull PictureFormat pictureFormat);
public abstract void startAutoFocus(@Nullable Gesture gesture, @NonNull PointF point);
public abstract void setPlaySounds(boolean playSounds);
@ -1125,11 +1134,11 @@ public abstract class CameraEngine implements
/* not final for tests */
public void takePicture(final @NonNull PictureResult.Stub stub) {
LOG.v("takePicture", "scheduling");
LOG.i("takePicture", "scheduling");
mHandler.run(new Runnable() {
@Override
public void run() {
LOG.v("takePicture", "performing. BindState:", getBindState(),
LOG.i("takePicture", "performing. BindState:", getBindState(),
"isTakingPicture:", isTakingPicture());
if (mMode == Mode.VIDEO) {
throw new IllegalStateException("Can't take hq pictures while in VIDEO mode");
@ -1139,6 +1148,7 @@ public abstract class CameraEngine implements
stub.isSnapshot = false;
stub.location = mLocation;
stub.facing = mFacing;
stub.format = mPictureFormat;
onTakePicture(stub, mPictureMetering);
}
});
@ -1150,17 +1160,18 @@ public abstract class CameraEngine implements
* @param stub a picture stub
*/
public final void takePictureSnapshot(final @NonNull PictureResult.Stub stub) {
LOG.v("takePictureSnapshot", "scheduling");
LOG.i("takePictureSnapshot", "scheduling");
mHandler.run(new Runnable() {
@Override
public void run() {
LOG.v("takePictureSnapshot", "performing. BindState:",
LOG.i("takePictureSnapshot", "performing. BindState:",
getBindState(), "isTakingPicture:", isTakingPicture());
if (getBindState() < STATE_STARTED) return;
if (isTakingPicture()) return;
stub.location = mLocation;
stub.isSnapshot = true;
stub.facing = mFacing;
stub.format = PictureFormat.JPEG;
// Leave the other parameters to subclasses.
//noinspection ConstantConditions
AspectRatio ratio = AspectRatio.of(getPreviewSurfaceSize(Reference.OUTPUT));
@ -1191,11 +1202,11 @@ public abstract class CameraEngine implements
}
public final void takeVideo(final @NonNull VideoResult.Stub stub, final @NonNull File file) {
LOG.v("takeVideo", "scheduling");
LOG.i("takeVideo", "scheduling");
mHandler.run(new Runnable() {
@Override
public void run() {
LOG.v("takeVideo", "performing. BindState:", getBindState(),
LOG.i("takeVideo", "performing. BindState:", getBindState(),
"isTakingVideo:", isTakingVideo());
if (getBindState() < STATE_STARTED) return;
if (isTakingVideo()) return;
@ -1223,11 +1234,11 @@ public abstract class CameraEngine implements
*/
public final void takeVideoSnapshot(@NonNull final VideoResult.Stub stub,
@NonNull final File file) {
LOG.v("takeVideoSnapshot", "scheduling");
LOG.i("takeVideoSnapshot", "scheduling");
mHandler.run(new Runnable() {
@Override
public void run() {
LOG.v("takeVideoSnapshot", "performing. BindState:", getBindState(),
LOG.i("takeVideoSnapshot", "performing. BindState:", getBindState(),
"isTakingVideo:", isTakingVideo());
if (getBindState() < STATE_STARTED) return;
if (isTakingVideo()) return;

@ -30,6 +30,7 @@ public abstract class BaseAction implements Action {
private final List<ActionCallback> callbacks = new ArrayList<>();
private int state;
private ActionHolder holder;
private boolean needsOnStart;
@Override
public final int getState() {
@ -39,7 +40,11 @@ public abstract class BaseAction implements Action {
@Override
public final void start(@NonNull ActionHolder holder) {
holder.addAction(this);
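// If the holder already has a capture result we can call onStart() immediately;
// otherwise defer it to the first onCaptureStarted() callback (see below).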
if (holder.getLastResult(this) != null) {
onStart(holder);
} else {
needsOnStart = true;
}
}
@Override
@ -49,6 +54,7 @@ public abstract class BaseAction implements Action {
onAbort(holder);
setState(STATE_COMPLETED);
}
needsOnStart = false;
}
/**
@ -58,7 +64,7 @@ public abstract class BaseAction implements Action {
*/
@CallSuper
protected void onStart(@NonNull ActionHolder holder) {
this.holder = holder; // must be here
this.holder = holder;
// Overrideable
}
@ -72,9 +78,13 @@ public abstract class BaseAction implements Action {
// Overrideable
}
@CallSuper
@Override
public void onCaptureStarted(@NonNull ActionHolder holder, @NonNull CaptureRequest request) {
// Overrideable
if (needsOnStart) {
onStart(holder);
needsOnStart = false;
}
}
@Override

@ -0,0 +1,132 @@
package com.otaliastudios.cameraview.engine.options;
import android.hardware.Camera;
import android.media.CamcorderProfile;
import androidx.annotation.NonNull;
import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.controls.Flash;
import com.otaliastudios.cameraview.controls.Hdr;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.controls.WhiteBalance;
import com.otaliastudios.cameraview.engine.mappers.Camera1Mapper;
import com.otaliastudios.cameraview.internal.utils.CamcorderProfiles;
import com.otaliastudios.cameraview.size.AspectRatio;
import com.otaliastudios.cameraview.size.Size;
import java.util.List;
public class Camera1Options extends CameraOptions {
public Camera1Options(@NonNull Camera.Parameters params, int cameraId, boolean flipSizes) {
List<String> strings;
Camera1Mapper mapper = Camera1Mapper.get();
// Facing
Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
for (int i = 0, count = Camera.getNumberOfCameras(); i < count; i++) {
Camera.getCameraInfo(i, cameraInfo);
Facing value = mapper.unmapFacing(cameraInfo.facing);
if (value != null) supportedFacing.add(value);
}
// WB
strings = params.getSupportedWhiteBalance();
if (strings != null) {
for (String string : strings) {
WhiteBalance value = mapper.unmapWhiteBalance(string);
if (value != null) supportedWhiteBalance.add(value);
}
}
// Flash
supportedFlash.add(Flash.OFF);
strings = params.getSupportedFlashModes();
if (strings != null) {
for (String string : strings) {
Flash value = mapper.unmapFlash(string);
if (value != null) supportedFlash.add(value);
}
}
// Hdr
supportedHdr.add(Hdr.OFF);
strings = params.getSupportedSceneModes();
if (strings != null) {
for (String string : strings) {
Hdr value = mapper.unmapHdr(string);
if (value != null) supportedHdr.add(value);
}
}
// zoom
zoomSupported = params.isZoomSupported();
// autofocus
autoFocusSupported = params.getSupportedFocusModes()
.contains(Camera.Parameters.FOCUS_MODE_AUTO);
// Exposure correction
float step = params.getExposureCompensationStep();
exposureCorrectionMinValue = (float) params.getMinExposureCompensation() * step;
exposureCorrectionMaxValue = (float) params.getMaxExposureCompensation() * step;
exposureCorrectionSupported = params.getMinExposureCompensation() != 0
|| params.getMaxExposureCompensation() != 0;
// Picture Sizes
List<Camera.Size> sizes = params.getSupportedPictureSizes();
for (Camera.Size size : sizes) {
int width = flipSizes ? size.height : size.width;
int height = flipSizes ? size.width : size.height;
supportedPictureSizes.add(new Size(width, height));
supportedPictureAspectRatio.add(AspectRatio.of(width, height));
}
// Video Sizes
// As a safety measure, remove Sizes bigger than CamcorderProfile.highest
CamcorderProfile profile = CamcorderProfiles.get(cameraId,
new Size(Integer.MAX_VALUE, Integer.MAX_VALUE));
Size videoMaxSize = new Size(profile.videoFrameWidth, profile.videoFrameHeight);
List<Camera.Size> vsizes = params.getSupportedVideoSizes();
if (vsizes != null) {
for (Camera.Size size : vsizes) {
if (size.width <= videoMaxSize.getWidth()
&& size.height <= videoMaxSize.getHeight()) {
int width = flipSizes ? size.height : size.width;
int height = flipSizes ? size.width : size.height;
supportedVideoSizes.add(new Size(width, height));
supportedVideoAspectRatio.add(AspectRatio.of(width, height));
}
}
} else {
// StackOverflow threads seem to agree that if getSupportedVideoSizes is null,
// the preview sizes can be used instead.
List<Camera.Size> fallback = params.getSupportedPreviewSizes();
for (Camera.Size size : fallback) {
if (size.width <= videoMaxSize.getWidth()
&& size.height <= videoMaxSize.getHeight()) {
int width = flipSizes ? size.height : size.width;
int height = flipSizes ? size.width : size.height;
supportedVideoSizes.add(new Size(width, height));
supportedVideoAspectRatio.add(AspectRatio.of(width, height));
}
}
}
// Preview FPS
previewFrameRateMinValue = Float.MAX_VALUE;
previewFrameRateMaxValue = -Float.MAX_VALUE;
List<int[]> fpsRanges = params.getSupportedPreviewFpsRange();
for (int[] fpsRange : fpsRanges) {
float lower = (float) fpsRange[0] / 1000F;
float upper = (float) fpsRange[1] / 1000F;
previewFrameRateMinValue = Math.min(previewFrameRateMinValue, lower);
previewFrameRateMaxValue = Math.max(previewFrameRateMaxValue, upper);
}
// Picture formats
supportedPictureFormats.add(PictureFormat.JPEG);
}
}

@ -0,0 +1,178 @@
package com.otaliastudios.cameraview.engine.options;
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.os.Build;
import android.util.Range;
import android.util.Rational;
import androidx.annotation.NonNull;
import androidx.annotation.RequiresApi;
import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.controls.Facing;
import com.otaliastudios.cameraview.controls.Flash;
import com.otaliastudios.cameraview.controls.Hdr;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.controls.WhiteBalance;
import com.otaliastudios.cameraview.engine.mappers.Camera2Mapper;
import com.otaliastudios.cameraview.internal.utils.CamcorderProfiles;
import com.otaliastudios.cameraview.size.AspectRatio;
import com.otaliastudios.cameraview.size.Size;
import java.util.Set;
import static android.hardware.camera2.CameraCharacteristics.*;
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
public class Camera2Options extends CameraOptions {
public Camera2Options(@NonNull CameraManager manager,
@NonNull String cameraId,
boolean flipSizes,
int pictureFormat) throws CameraAccessException {
Camera2Mapper mapper = Camera2Mapper.get();
CameraCharacteristics cameraCharacteristics = manager.getCameraCharacteristics(cameraId);
// Facing
for (String cameraId1 : manager.getCameraIdList()) {
CameraCharacteristics cameraCharacteristics1 = manager
.getCameraCharacteristics(cameraId1);
Integer cameraFacing = cameraCharacteristics1.get(LENS_FACING);
if (cameraFacing != null) {
Facing value = mapper.unmapFacing(cameraFacing);
if (value != null) supportedFacing.add(value);
}
}
// WB
int[] awbModes = cameraCharacteristics.get(CONTROL_AWB_AVAILABLE_MODES);
//noinspection ConstantConditions
for (int awbMode : awbModes) {
WhiteBalance value = mapper.unmapWhiteBalance(awbMode);
if (value != null) supportedWhiteBalance.add(value);
}
// Flash
supportedFlash.add(Flash.OFF);
Boolean hasFlash = cameraCharacteristics.get(FLASH_INFO_AVAILABLE);
if (hasFlash != null && hasFlash) {
int[] aeModes = cameraCharacteristics.get(CONTROL_AE_AVAILABLE_MODES);
//noinspection ConstantConditions
for (int aeMode : aeModes) {
Set<Flash> flashes = mapper.unmapFlash(aeMode);
supportedFlash.addAll(flashes);
}
}
// HDR
supportedHdr.add(Hdr.OFF);
int[] sceneModes = cameraCharacteristics.get(CONTROL_AVAILABLE_SCENE_MODES);
//noinspection ConstantConditions
for (int sceneMode : sceneModes) {
Hdr value = mapper.unmapHdr(sceneMode);
if (value != null) supportedHdr.add(value);
}
// Zoom
Float maxZoom = cameraCharacteristics.get(SCALER_AVAILABLE_MAX_DIGITAL_ZOOM);
if (maxZoom != null) {
zoomSupported = maxZoom > 1;
}
// AutoFocus
// This now means 3A metering with respect to a specific region of the screen.
// Some controls (AF, AE) have special triggers that might or might not be supported,
// but they can also be in some continuous search mode, so that the trigger is not needed.
// What really matters here is the availability of metering regions.
Integer afRegions = cameraCharacteristics.get(CONTROL_MAX_REGIONS_AF);
Integer aeRegions = cameraCharacteristics.get(CONTROL_MAX_REGIONS_AE);
Integer awbRegions = cameraCharacteristics.get(CONTROL_MAX_REGIONS_AWB);
autoFocusSupported = (afRegions != null && afRegions > 0)
|| (aeRegions != null && aeRegions > 0)
|| (awbRegions != null && awbRegions > 0);
// Exposure correction
Range<Integer> exposureRange = cameraCharacteristics.get(CONTROL_AE_COMPENSATION_RANGE);
Rational exposureStep = cameraCharacteristics.get(CONTROL_AE_COMPENSATION_STEP);
if (exposureRange != null && exposureStep != null && exposureStep.floatValue() != 0) {
exposureCorrectionMinValue = exposureRange.getLower() / exposureStep.floatValue();
exposureCorrectionMaxValue = exposureRange.getUpper() / exposureStep.floatValue();
}
exposureCorrectionSupported = exposureCorrectionMinValue != 0
&& exposureCorrectionMaxValue != 0;
// Picture Sizes
StreamConfigurationMap streamMap = cameraCharacteristics.get(
SCALER_STREAM_CONFIGURATION_MAP);
if (streamMap == null) {
throw new RuntimeException("StreamConfigurationMap is null. Should not happen.");
}
int[] pictureFormats = streamMap.getOutputFormats();
boolean hasPictureFormat = false;
for (int picFormat : pictureFormats) {
if (picFormat == pictureFormat) {
hasPictureFormat = true;
break;
}
}
if (!hasPictureFormat) {
throw new IllegalStateException("Picture format not supported: " + pictureFormat);
}
android.util.Size[] psizes = streamMap.getOutputSizes(pictureFormat);
for (android.util.Size size : psizes) {
int width = flipSizes ? size.getHeight() : size.getWidth();
int height = flipSizes ? size.getWidth() : size.getHeight();
supportedPictureSizes.add(new Size(width, height));
supportedPictureAspectRatio.add(AspectRatio.of(width, height));
}
// Video Sizes
// As a safety measure, remove Sizes bigger than CamcorderProfile.highest
CamcorderProfile profile = CamcorderProfiles.get(cameraId,
new Size(Integer.MAX_VALUE, Integer.MAX_VALUE));
Size videoMaxSize = new Size(profile.videoFrameWidth, profile.videoFrameHeight);
android.util.Size[] vsizes = streamMap.getOutputSizes(MediaRecorder.class);
for (android.util.Size size : vsizes) {
if (size.getWidth() <= videoMaxSize.getWidth()
&& size.getHeight() <= videoMaxSize.getHeight()) {
int width = flipSizes ? size.getHeight() : size.getWidth();
int height = flipSizes ? size.getWidth() : size.getHeight();
supportedVideoSizes.add(new Size(width, height));
supportedVideoAspectRatio.add(AspectRatio.of(width, height));
}
}
// Preview FPS
Range<Integer>[] range = cameraCharacteristics.get(CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
if (range != null) {
previewFrameRateMinValue = Float.MAX_VALUE;
previewFrameRateMaxValue = -Float.MAX_VALUE;
for (Range<Integer> fpsRange : range) {
previewFrameRateMinValue = Math.min(previewFrameRateMinValue, fpsRange.getLower());
previewFrameRateMaxValue = Math.max(previewFrameRateMaxValue, fpsRange.getUpper());
}
} else {
previewFrameRateMinValue = 0F;
previewFrameRateMaxValue = 0F;
}
// Picture formats
supportedPictureFormats.add(PictureFormat.JPEG);
int[] caps = cameraCharacteristics.get(REQUEST_AVAILABLE_CAPABILITIES);
if (caps != null) {
for (int cap : caps) {
if (cap == CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_RAW) {
supportedPictureFormats.add(PictureFormat.DNG);
}
}
}
}
}

@ -11,7 +11,7 @@ public class ExifHelper {
* Maps an {@link ExifInterface} orientation value
* to the actual degrees.
*/
public static int readExifOrientation(int exifOrientation) {
public static int getOrientation(int exifOrientation) {
int orientation;
switch (exifOrientation) {
case ExifInterface.ORIENTATION_NORMAL:
@ -34,5 +34,18 @@ public class ExifHelper {
}
return orientation;
}
/**
* Maps a degree value to {@link ExifInterface} constant.
*/
public static int getExifOrientation(int orientation) {
switch ((orientation + 360) % 360) {
case 0: return ExifInterface.ORIENTATION_NORMAL;
case 90: return ExifInterface.ORIENTATION_ROTATE_90;
case 180: return ExifInterface.ORIENTATION_ROTATE_180;
case 270: return ExifInterface.ORIENTATION_ROTATE_270;
default: throw new IllegalArgumentException("Invalid orientation: " + orientation);
}
}
}

@ -57,11 +57,10 @@ public class Full1PictureRecorder extends PictureRecorder {
int exifOrientation = exif.getAttributeInt(
ExifInterface.TAG_ORIENTATION,
ExifInterface.ORIENTATION_NORMAL);
exifRotation = ExifHelper.readExifOrientation(exifOrientation);
exifRotation = ExifHelper.getOrientation(exifOrientation);
} catch (IOException e) {
exifRotation = 0;
}
mResult.format = PictureResult.FORMAT_JPEG;
mResult.data = data;
mResult.rotation = exifRotation;
camera.startPreview(); // This is needed, read somewhere in the docs.

@ -3,12 +3,15 @@ package com.otaliastudios.cameraview.picture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.DngCreator;
import android.hardware.camera2.TotalCaptureResult;
import android.media.Image;
import android.media.ImageReader;
import android.os.Build;
import com.otaliastudios.cameraview.CameraLogger;
import com.otaliastudios.cameraview.PictureResult;
import com.otaliastudios.cameraview.controls.PictureFormat;
import com.otaliastudios.cameraview.engine.Camera2Engine;
import com.otaliastudios.cameraview.engine.action.Action;
import com.otaliastudios.cameraview.engine.action.ActionHolder;
@ -16,7 +19,9 @@ import com.otaliastudios.cameraview.engine.action.BaseAction;
import com.otaliastudios.cameraview.internal.utils.ExifHelper;
import com.otaliastudios.cameraview.internal.utils.WorkerHandler;
import java.io.BufferedOutputStream;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
@ -39,6 +44,8 @@ public class Full2PictureRecorder extends PictureRecorder
private final ImageReader mPictureReader;
private final CaptureRequest.Builder mPictureBuilder;
private DngCreator mDngCreator;
public Full2PictureRecorder(@NonNull PictureResult.Stub stub,
@NonNull Camera2Engine engine,
@NonNull CaptureRequest.Builder pictureBuilder,
@ -54,7 +61,9 @@ public class Full2PictureRecorder extends PictureRecorder
protected void onStart(@NonNull ActionHolder holder) {
super.onStart(holder);
mPictureBuilder.addTarget(mPictureReader.getSurface());
if (mResult.format == PictureFormat.JPEG) {
mPictureBuilder.set(CaptureRequest.JPEG_ORIENTATION, mResult.rotation);
}
mPictureBuilder.setTag(CameraDevice.TEMPLATE_STILL_CAPTURE);
try {
holder.applyBuilder(this, mPictureBuilder);
@ -75,6 +84,20 @@ public class Full2PictureRecorder extends PictureRecorder
setState(STATE_COMPLETED);
}
}
@Override
public void onCaptureCompleted(@NonNull ActionHolder holder,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
super.onCaptureCompleted(holder, request, result);
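// For DNG we need the full capture metadata: build the DngCreator from this result
// and carry over the picture rotation and location.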
if (mResult.format == PictureFormat.DNG) {
mDngCreator = new DngCreator(holder.getCharacteristics(this), result);
mDngCreator.setOrientation(ExifHelper.getExifOrientation(mResult.rotation));
if (mResult.location != null) {
mDngCreator.setLocation(mResult.location);
}
}
}
};
}
@ -86,37 +109,59 @@ public class Full2PictureRecorder extends PictureRecorder
@Override
public void onImageAvailable(ImageReader reader) {
LOG.i("onImageAvailable started.");
// Read the JPEG.
Image image = null;
//noinspection TryFinallyCanBeTryWithResources
try {
image = reader.acquireNextImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
mResult.data = bytes;
switch (mResult.format) {
case JPEG: readJpegImage(image); break;
case DNG: readRawImage(image); break;
default: throw new IllegalStateException("Unknown format: " + mResult.format);
}
} catch (Exception e) {
mResult = null;
mError = e;
dispatchResult();
return;
} finally {
if (image != null) image.close();
if (image != null) {
image.close();
}
}
// Leave.
LOG.i("onImageAvailable ended.");
dispatchResult();
}
private void readJpegImage(@NonNull Image image) {
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
mResult.data = bytes;
// Just like Camera1, unfortunately, the camera might rotate the image itself
// and set EXIF=0, instead of respecting our EXIF value and leaving the image unaltered.
mResult.format = PictureResult.FORMAT_JPEG;
mResult.rotation = 0;
try {
ExifInterface exif = new ExifInterface(new ByteArrayInputStream(mResult.data));
int exifOrientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION,
ExifInterface.ORIENTATION_NORMAL);
mResult.rotation = ExifHelper.readExifOrientation(exifOrientation);
mResult.rotation = ExifHelper.getOrientation(exifOrientation);
} catch (IOException ignore) { }
}
// Leave.
LOG.i("onImageAvailable ended.");
dispatchResult();
private void readRawImage(@NonNull Image image) {
ByteArrayOutputStream array = new ByteArrayOutputStream();
BufferedOutputStream stream = new BufferedOutputStream(array);
try {
mDngCreator.writeImage(stream, image);
stream.flush();
mResult.data = array.toByteArray();
} catch (IOException e) {
throw new RuntimeException(e);
} finally {
// Always release the DngCreator and the stream, on success as well as on failure.
mDngCreator.close();
try { stream.close(); } catch (IOException ignore) {}
}
}
}

@ -81,7 +81,6 @@ public class Snapshot1PictureRecorder extends PictureRecorder {
mResult.data = data;
mResult.size = new Size(outputRect.width(), outputRect.height());
mResult.rotation = 0;
mResult.format = PictureResult.FORMAT_JPEG;
dispatchResult();
}
});

@ -48,7 +48,7 @@ public class Snapshot2PictureRecorder extends SnapshotGlPictureRecorder {
private final static CameraLogger LOG = CameraLogger.create(TAG);
private final static long LOCK_TIMEOUT = 2500;
private static class FlashAction extends BaseAction {
private class FlashAction extends BaseAction {
@Override
protected void onStart(@NonNull ActionHolder holder) {
@ -82,6 +82,26 @@ public class Snapshot2PictureRecorder extends SnapshotGlPictureRecorder {
}
}
private class ResetFlashAction extends BaseAction {
@Override
protected void onStart(@NonNull ActionHolder holder) {
super.onStart(holder);
try {
// See Camera2Engine.setFlash() comments: turning TORCH off has bugs and we must do
// as follows.
LOG.i("ResetFlashAction:", "Reverting the flash changes.");
CaptureRequest.Builder builder = holder.getBuilder(this);
builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
builder.set(CaptureRequest.FLASH_MODE, CaptureResult.FLASH_MODE_OFF);
holder.applyBuilder(this, builder);
builder.set(CaptureRequest.CONTROL_AE_MODE, mOriginalAeMode);
builder.set(CaptureRequest.FLASH_MODE, mOriginalFlashMode);
holder.applyBuilder(this);
} catch (CameraAccessException ignore) {}
}
}
private final Action mAction;
private final ActionHolder mHolder;
private final boolean mActionNeeded;
@ -135,18 +155,7 @@ public class Snapshot2PictureRecorder extends SnapshotGlPictureRecorder {
@Override
protected void dispatchResult() {
// Revert our changes.
LOG.i("dispatchResult:", "Reverting the flash changes.");
try {
// See Camera2Engine.setFlash() comments: turning TORCH off has bugs and we must do
// as follows.
CaptureRequest.Builder builder = mHolder.getBuilder(mAction);
builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
builder.set(CaptureRequest.FLASH_MODE, CaptureResult.FLASH_MODE_OFF);
mHolder.applyBuilder(mAction, builder);
builder.set(CaptureRequest.CONTROL_AE_MODE, mOriginalAeMode);
builder.set(CaptureRequest.FLASH_MODE, mOriginalFlashMode);
mHolder.applyBuilder(mAction);
} catch (CameraAccessException ignore) {}
new ResetFlashAction().start(mHolder);
super.dispatchResult();
}
}

@ -238,7 +238,6 @@ public class SnapshotGlPictureRecorder extends PictureRecorder {
LOG.i("takeFrame:", "timestampUs:", timestampUs);
mViewport.drawFrame(timestampUs, mTextureId, mTransform);
if (mHasOverlay) mOverlayDrawer.render(timestampUs);
mResult.format = PictureResult.FORMAT_JPEG;
mResult.data = eglSurface.saveFrameTo(Bitmap.CompressFormat.JPEG);
// 6. Cleanup

@ -145,6 +145,11 @@
<attr name="cameraPictureMetering" format="boolean|reference"/>
<attr name="cameraPictureSnapshotMetering" format="boolean|reference"/>
<attr name="cameraPictureFormat" format="enum">
<enum name="jpeg" value="0" />
<enum name="dng" value="1" />
</attr>
<attr name="cameraExperimental" format="boolean" />
</declare-styleable>

@ -3,13 +3,8 @@ package com.otaliastudios.cameraview.internal.utils;
import androidx.exifinterface.media.ExifInterface;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import java.util.ArrayList;
import java.util.List;
import static junit.framework.Assert.assertNotNull;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
@ -20,22 +15,22 @@ public class ExifHelperTest {
@Test
public void testValues() {
assertEquals(0, ExifHelper.readExifOrientation(ExifInterface.ORIENTATION_NORMAL));
assertEquals(0, ExifHelper.readExifOrientation(ExifInterface.ORIENTATION_FLIP_HORIZONTAL));
assertEquals(180, ExifHelper.readExifOrientation(ExifInterface.ORIENTATION_ROTATE_180));
assertEquals(180, ExifHelper.readExifOrientation(ExifInterface.ORIENTATION_FLIP_VERTICAL));
assertEquals(90, ExifHelper.readExifOrientation(ExifInterface.ORIENTATION_ROTATE_90));
assertEquals(90, ExifHelper.readExifOrientation(ExifInterface.ORIENTATION_TRANSPOSE));
assertEquals(270, ExifHelper.readExifOrientation(ExifInterface.ORIENTATION_ROTATE_270));
assertEquals(270, ExifHelper.readExifOrientation(ExifInterface.ORIENTATION_TRANSVERSE));
assertEquals(0, ExifHelper.getOrientation(ExifInterface.ORIENTATION_NORMAL));
assertEquals(0, ExifHelper.getOrientation(ExifInterface.ORIENTATION_FLIP_HORIZONTAL));
assertEquals(180, ExifHelper.getOrientation(ExifInterface.ORIENTATION_ROTATE_180));
assertEquals(180, ExifHelper.getOrientation(ExifInterface.ORIENTATION_FLIP_VERTICAL));
assertEquals(90, ExifHelper.getOrientation(ExifInterface.ORIENTATION_ROTATE_90));
assertEquals(90, ExifHelper.getOrientation(ExifInterface.ORIENTATION_TRANSPOSE));
assertEquals(270, ExifHelper.getOrientation(ExifInterface.ORIENTATION_ROTATE_270));
assertEquals(270, ExifHelper.getOrientation(ExifInterface.ORIENTATION_TRANSVERSE));
}
@Test
public void testUnknownValues() {
assertEquals(0, ExifHelper.readExifOrientation(-15));
assertEquals(0, ExifHelper.readExifOrientation(-1));
assertEquals(0, ExifHelper.readExifOrientation(195));
assertEquals(0, ExifHelper.readExifOrientation(Integer.MAX_VALUE));
assertEquals(0, ExifHelper.getOrientation(-15));
assertEquals(0, ExifHelper.getOrientation(-1));
assertEquals(0, ExifHelper.getOrientation(195));
assertEquals(0, ExifHelper.getOrientation(Integer.MAX_VALUE));
}
}

@ -15,7 +15,6 @@
<activity
android:name=".CameraActivity"
android:theme="@style/Theme.MainActivity"
android:configChanges="screenLayout|keyboardHidden"
android:hardwareAccelerated="true">
<intent-filter>
@ -24,9 +23,23 @@
</intent-filter>
</activity>
<activity android:name=".PicturePreviewActivity" />
<activity
android:name=".PicturePreviewActivity"
android:label="Picture Preview" />
<activity
android:name=".VideoPreviewActivity"
android:label="Video Preview" />
<activity android:name=".VideoPreviewActivity"/>
<provider
android:name="androidx.core.content.FileProvider"
android:authorities="${applicationId}.provider"
android:grantUriPermissions="true"
android:exported="false">
<meta-data
android:name="android.support.FILE_PROVIDER_PATHS"
android:resource="@xml/filepaths" />
</provider>
</application>

@ -115,6 +115,7 @@ public class CameraActivity extends AppCompatActivity implements View.OnClickLis
// Some controls
new Option.Flash(), new Option.WhiteBalance(), new Option.Hdr(),
new Option.PictureMetering(), new Option.PictureSnapshotMetering(),
new Option.PictureFormat(),
// Video recording
new Option.PreviewFrameRate(), new Option.VideoCodec(), new Option.Audio(),
// Gestures
@ -133,7 +134,7 @@ public class CameraActivity extends AppCompatActivity implements View.OnClickLis
// Engine and preview
false, false, true,
// Some controls
false, false, false, false, true,
false, false, false, false, false, true,
// Video recording
false, false, true,
// Gestures

@ -10,6 +10,7 @@ import android.view.ViewGroup;
import com.otaliastudios.cameraview.CameraListener;
import com.otaliastudios.cameraview.CameraOptions;
import com.otaliastudios.cameraview.CameraView;
import com.otaliastudios.cameraview.controls.Control;
import com.otaliastudios.cameraview.gesture.Gesture;
import com.otaliastudios.cameraview.gesture.GestureAction;
import com.otaliastudios.cameraview.overlay.Overlay;
@ -135,7 +136,7 @@ public abstract class Option<T> {
}
}
private static abstract class ControlOption<T extends com.otaliastudios.cameraview.controls.Control> extends Option<T> {
private static abstract class ControlOption<T extends Control> extends Option<T> {
private final Class<T> controlClass;
ControlOption(@NonNull Class<T> controlClass, String name) {
@ -547,4 +548,10 @@ public abstract class Option<T> {
}
}
public static class PictureFormat extends ControlOption<com.otaliastudios.cameraview.controls.PictureFormat> {
public PictureFormat() {
super(com.otaliastudios.cameraview.controls.PictureFormat.class, "Picture Format");
}
}
}

@ -1,20 +1,36 @@
package com.otaliastudios.cameraview.demo;
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Color;
import android.graphics.drawable.ColorDrawable;
import android.net.Uri;
import android.os.Bundle;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.content.FileProvider;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.widget.ImageView;
import android.widget.Toast;
import com.otaliastudios.cameraview.CameraUtils;
import com.otaliastudios.cameraview.FileCallback;
import com.otaliastudios.cameraview.size.AspectRatio;
import com.otaliastudios.cameraview.BitmapCallback;
import com.otaliastudios.cameraview.PictureResult;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
public class PicturePreviewActivity extends Activity {
public class PicturePreviewActivity extends AppCompatActivity {
private static PictureResult picture;
@ -42,12 +58,18 @@ public class PicturePreviewActivity extends Activity {
captureLatency.setTitleAndMessage("Approx. latency", delay + " milliseconds");
captureResolution.setTitleAndMessage("Resolution", result.getSize() + " (" + ratio + ")");
exifRotation.setTitleAndMessage("EXIF rotation", result.getRotation() + "");
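// toBitmap() throws UnsupportedOperationException for formats that can't be decoded
// into a Bitmap (e.g. DNG), so fall back to a placeholder in that case.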
try {
result.toBitmap(1000, 1000, new BitmapCallback() {
@Override
public void onBitmapReady(Bitmap bitmap) {
imageView.setImageBitmap(bitmap);
}
});
} catch (UnsupportedOperationException e) {
imageView.setImageDrawable(new ColorDrawable(Color.GREEN));
Toast.makeText(this, "Can't preview this format: " + picture.getFormat(),
Toast.LENGTH_LONG).show();
}
if (result.isSnapshot()) {
// Log the real size for debugging reason.
@ -69,4 +91,46 @@ public class PicturePreviewActivity extends Activity {
setPictureResult(null);
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
getMenuInflater().inflate(R.menu.share, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
if (item.getItemId() == R.id.share) {
Toast.makeText(this, "Sharing...", Toast.LENGTH_SHORT).show();
String extension;
switch (picture.getFormat()) {
case JPEG: extension = "jpg"; break;
case DNG: extension = "dng"; break;
default: throw new RuntimeException("Unknown format.");
}
File file = new File(getFilesDir(), "picture." + extension);
CameraUtils.writeToFile(picture.getData(), file, new FileCallback() {
@Override
public void onFileReady(@Nullable File file) {
if (file != null) {
Context context = PicturePreviewActivity.this;
Intent intent = new Intent(Intent.ACTION_SEND);
intent.setType("image/*");
Uri uri = FileProvider.getUriForFile(context,
context.getPackageName() + ".provider",
file);
intent.putExtra(Intent.EXTRA_STREAM, uri);
intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
startActivity(intent);
} else {
Toast.makeText(PicturePreviewActivity.this,
"Error while writing file.",
Toast.LENGTH_SHORT).show();
}
}
});
return true;
}
return super.onOptionsItemSelected(item);
}
}

@ -1,22 +1,33 @@
package com.otaliastudios.cameraview.demo;
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.media.MediaPlayer;
import android.net.Uri;
import android.os.Bundle;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.content.FileProvider;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.view.ViewGroup;
import android.widget.MediaController;
import android.widget.Toast;
import android.widget.VideoView;
import com.otaliastudios.cameraview.CameraUtils;
import com.otaliastudios.cameraview.FileCallback;
import com.otaliastudios.cameraview.VideoResult;
import com.otaliastudios.cameraview.size.AspectRatio;
import java.io.File;
public class VideoPreviewActivity extends Activity {
public class VideoPreviewActivity extends AppCompatActivity {
private VideoView videoView;
@ -97,4 +108,27 @@ public class VideoPreviewActivity extends Activity {
setVideoResult(null);
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
getMenuInflater().inflate(R.menu.share, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
if (item.getItemId() == R.id.share) {
Toast.makeText(this, "Sharing...", Toast.LENGTH_SHORT).show();
Intent intent = new Intent(Intent.ACTION_SEND);
intent.setType("video/*");
Uri uri = FileProvider.getUriForFile(this,
this.getPackageName() + ".provider",
videoResult.getFile());
intent.putExtra(Intent.EXTRA_STREAM, uri);
intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
startActivity(intent);
return true;
}
return super.onOptionsItemSelected(item);
}
}

@ -0,0 +1,5 @@
<vector android:autoMirrored="true" android:height="24dp"
android:viewportHeight="24.0" android:viewportWidth="24.0"
android:width="24dp" xmlns:android="http://schemas.android.com/apk/res/android">
<path android:fillColor="#FFFFFF" android:pathData="M18,16.08c-0.76,0 -1.44,0.3 -1.96,0.77L8.91,12.7c0.05,-0.23 0.09,-0.46 0.09,-0.7s-0.04,-0.47 -0.09,-0.7l7.05,-4.11c0.54,0.5 1.25,0.81 2.04,0.81 1.66,0 3,-1.34 3,-3s-1.34,-3 -3,-3 -3,1.34 -3,3c0,0.24 0.04,0.47 0.09,0.7L8.04,9.81C7.5,9.31 6.79,9 6,9c-1.66,0 -3,1.34 -3,3s1.34,3 3,3c0.79,0 1.5,-0.31 2.04,-0.81l7.12,4.16c-0.05,0.21 -0.08,0.43 -0.08,0.65 0,1.61 1.31,2.92 2.92,2.92 1.61,0 2.92,-1.31 2.92,-2.92s-1.31,-2.92 -2.92,-2.92z"/>
</vector>

@ -0,0 +1,8 @@
<?xml version="1.0" encoding="utf-8"?>
<menu xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">
<item android:id="@+id/share"
android:title="Share"
app:showAsAction="always"
android:icon="@drawable/ic_share"/>
</menu>

@ -8,7 +8,4 @@
<item name="colorAccent">@color/colorAccent</item>
</style>
<style name="Theme.MainActivity" parent="AppTheme"/>
<style name="Theme.PreviewActivity" parent="Theme.AppCompat.NoActionBar"/>
</resources>

@ -0,0 +1,4 @@
<?xml version="1.0" encoding="utf-8"?>
<paths>
<files-path name="files" path="." />
</paths>

@ -25,6 +25,7 @@ or `CameraOptions.supports(Control)` to see if it is supported.
app:cameraFlash="off|on|auto|torch"
app:cameraWhiteBalance="auto|incandescent|fluorescent|daylight|cloudy"
app:cameraHdr="off|on"
app:cameraPictureFormat="jpeg|dng"
app:cameraAudio="on|off|mono|stereo"
app:cameraAudioBitRate="0"
app:cameraVideoCodec="deviceDefault|h263|h264"
@ -100,6 +101,19 @@ cameraView.setHdr(Hdr.OFF);
cameraView.setHdr(Hdr.ON);
```
##### cameraPictureFormat
The format for pictures taken with `takePicture()`. It does not apply to picture snapshots taken
with `takePictureSnapshot()`. The `JPEG` value is always supported, while support for other values
depends on the camera engine and the device sensor.
The available values are exposed through the `CameraOptions` object.
```java
cameraView.setPictureFormat(PictureFormat.JPEG);
cameraView.setPictureFormat(PictureFormat.DNG);
```
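Since `DNG` might not be available on every device, you can check the current camera's support
at runtime through the `CameraOptions` object before applying it. A minimal sketch, assuming a
`camera` field holding your `CameraView` and the `CameraOptions.supports(Control)` check
mentioned at the top of this page:

```java
camera.addCameraListener(new CameraListener() {
    @Override
    public void onCameraOpened(@NonNull CameraOptions options) {
        // Fall back to JPEG (always supported) when DNG is not available.
        if (options.supports(PictureFormat.DNG)) {
            camera.setPictureFormat(PictureFormat.DNG);
        } else {
            camera.setPictureFormat(PictureFormat.JPEG);
        }
    }
});
```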
##### cameraAudio
Turns on or off audio stream while recording videos.

@ -23,6 +23,7 @@ addressing most of the common issues and needs, and still leaving you with flexi
- Take super-fast snapshots with `takePictureSnapshot` and `takeVideoSnapshot` [[docs]](docs/capturing-media.html)
- Smart sizing: create a `CameraView` of any size [[docs]](docs/preview-size.html)
- Control HDR, flash, zoom, white balance, exposure, location, grid drawing & more [[docs]](docs/controls.html)
- RAW picture support [[docs]](docs/controls.html)
- Lightweight
- Works down to API level 15
- Well tested
