Code Bug Fix: Android Camera conflict between OCR and taking picture from preview

Original Source Link

I am developing an Android application that combines a camera preview (used to take pictures) with OCR text blocks shown over the preview. The application first worked fine when it only took pictures from the preview, but since I added the OCR I no longer get the preview (just a white screen), although the OCR is working and shows the detected characters. I think there is a conflict between the camera opened by the preview and the one opened by the OCR, but I can't find it.

Here is a sample of my code:

Camera camera;
SurfaceView surfaceView;
SurfaceHolder surfaceHolder;
boolean previewing = false;
Context context;

LinearLayout previewLayout;

View borderCamera;
TextView resBorderSizeTV;

private OnFragmentInteractionListener mListener;

private TextRecognizer textRecognizer;
private CameraSource cameraSource;
private TextView textBlockContent;

Camera.Size previewSizeOptimal;

public interface OnFragmentInteractionListener {
    void onFragmentInteraction(Bitmap bitmap);
}

@Override
public void onCreate(@Nullable Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
}

@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
    // Inflate the layout for this fragment
    View view = inflater.inflate(R.layout.fragment_photo, container, false);
    ButterKnife.bind(this, view);
    context = getContext();
    surfaceView = (SurfaceView) view.findViewById(R.id.surfaceView); // id name assumed, lost in the paste
    surfaceHolder = surfaceView.getHolder();
    surfaceHolder.addCallback(this);

    //--------------------------------------------------------------------- OCR elements -----------------------------------------------------
    textBlockContent = view.findViewById(R.id.textBlockContent); // id name assumed, lost in the paste
    textRecognizer = new TextRecognizer.Builder(getActivity().getApplicationContext()).build();

    return view;
}

@Override
public void onAttach(Context context) {
    super.onAttach(context);
    if (context instanceof OnFragmentInteractionListener) {
        mListener = (OnFragmentInteractionListener) context;
    } else {
        throw new RuntimeException(context.toString() + " must implement OnFragmentInteractionListener");
    }
}

@Override
public void onDetach() {
    super.onDetach();
    mListener = null;
}

@Override
public void surfaceCreated(SurfaceHolder holder) {
    camera = Camera.open();

    //--------------------------------------------------------------------- OCR Process-----------------------------------------------------

    try {
        //noinspection MissingPermission
        int rc = ActivityCompat.checkSelfPermission(getActivity().getApplicationContext(), Manifest.permission.CAMERA);
        if (rc == PackageManager.PERMISSION_GRANTED) {
            if (!textRecognizer.isOperational()) {
                Log.w("MainActivity", "Detector dependencies are not yet available.");
            }
            cameraSource = new CameraSource.Builder(getActivity().getApplicationContext(), textRecognizer)
                    .setRequestedPreviewSize(1280, 1024)
                    .build();
            cameraSource.start(); // opens its own camera instance
        } else {
            Log.w("MainActivity", "Camera permission not granted.");
        }
    } catch (IOException ex) {
        Log.e("MainActivity", "Could not start the camera source.", ex);
    }

    textRecognizer.setProcessor(new Detector.Processor<TextBlock>() {
        @Override
        public void release() {
        }

        @Override
        public void receiveDetections(Detector.Detections<TextBlock> detections) {
            Log.d("Main", "receiveDetections");
            final SparseArray<TextBlock> items = detections.getDetectedItems();
            if (items.size() != 0) {
                Log.d("Main", "receiveDetections : " + items.toString());
                textBlockContent.post(new Runnable() { // receiver of post() assumed, lost in the paste
                    @Override
                    public void run() {
                        StringBuilder value = new StringBuilder();
                        for (int i = 0; i < items.size(); ++i) {
                            TextBlock item = items.valueAt(i);
                            value.append(item.getValue()).append("\n");
                        }
                        // update text block content in the TextView
                        textBlockContent.setText(value.toString());
                    }
                });
            }
        }
    });
    //--------------------------------------------------------------------- End OCR -----------------------------------------------------
}

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    if (previewing) {
        camera.stopPreview();
        previewing = false;
    }
    if (camera != null) {
        try {
            Camera.Parameters parameters = camera.getParameters();

            //get preview sizes
            List<Camera.Size> previewSizes = parameters.getSupportedPreviewSizes();

            //find the optimal one - it is very important
            previewSizeOptimal = getOptimalPreviewSize(previewSizes, parameters.getPictureSize().width, parameters.getPictureSize().height);

            //set parameters
            if (previewSizeOptimal != null) {
                parameters.setPreviewSize(previewSizeOptimal.width, previewSizeOptimal.height);
            }
            if (camera.getParameters().getFocusMode().contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
                parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
            }
            if (camera.getParameters().getFlashMode() != null && camera.getParameters().getFlashMode().contains(Camera.Parameters.FLASH_MODE_AUTO)) {
                parameters.setFlashMode(Camera.Parameters.FLASH_MODE_AUTO);
            }
            camera.setParameters(parameters);

            //rotate screen, because the camera sensor is usually in landscape mode
            Display display = ((WindowManager) context.getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
            if (display.getRotation() == Surface.ROTATION_0) {
                camera.setDisplayOrientation(90); // rotation value assumed
            } else if (display.getRotation() == Surface.ROTATION_270) {
                camera.setDisplayOrientation(180); // rotation value assumed
            }

            //write some info
            int x1 = previewLayout.getWidth();
            int y1 = previewLayout.getHeight();
            int x2 = borderCamera.getWidth();
            int y2 = borderCamera.getHeight();
            String info =
                    "Preview width:" + x1 + "\n" + "Preview height:" + y1 + "\n" + "Border width:" + x2 + "\n" + "Border height:" + y2;
            resBorderSizeTV.setText(info);

            camera.setPreviewDisplay(surfaceHolder);
            camera.startPreview();
            previewing = true;

        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
public Camera.Size getOptimalPreviewSize(List<Camera.Size> sizes, int w, int h) {
    final double ASPECT_TOLERANCE = 0.1;
    double targetRatio = (double) w / h;
    if (sizes == null) return null;
    Camera.Size optimalSize = null;
    double minDiff = Double.MAX_VALUE;
    int targetHeight = h;
    // Try to find a size that matches both aspect ratio and height
    for (Camera.Size size : sizes) {
        double ratio = (double) size.width / size.height;
        if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE) continue;
        if (Math.abs(size.height - targetHeight) < minDiff) {
            optimalSize = size;
            minDiff = Math.abs(size.height - targetHeight);
        }
    }
    // Cannot find one matching the aspect ratio, ignore the requirement
    if (optimalSize == null) {
        minDiff = Double.MAX_VALUE;
        for (Camera.Size size : sizes) {
            if (Math.abs(size.height - targetHeight) < minDiff) {
                optimalSize = size;
                minDiff = Math.abs(size.height - targetHeight);
            }
        }
    }
    return optimalSize;
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    if (camera != null) {
        camera.stopPreview();
        camera.release();
    }
    camera = null;
    previewing = false;
}

void makePhoto() {
    if (camera != null) {
        camera.takePicture(myShutterCallback, myPictureCallback_RAW, myPictureCallback_JPG);
    }
}

Camera.ShutterCallback myShutterCallback = new Camera.ShutterCallback() {
    @Override
    public void onShutter() {
    }
};

Camera.PictureCallback myPictureCallback_RAW = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
    }
};

Camera.PictureCallback myPictureCallback_JPG = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        Bitmap bitmapPicture = BitmapFactory.decodeByteArray(data, 0, data.length);
        Bitmap croppedBitmap = null;
        Display display = ((WindowManager) context.getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
        if (display.getRotation() == Surface.ROTATION_0) {
            //rotate bitmap, because the camera sensor is usually in landscape mode
            Matrix matrix = new Matrix();
            matrix.postRotate(90); // rotation value assumed
            Bitmap rotatedBitmap = Bitmap.createBitmap(bitmapPicture, 0, 0, bitmapPicture.getWidth(), bitmapPicture.getHeight(), matrix, true);
            Log.i("onPictureTaken", "rotatedBitmap width " + rotatedBitmap.getWidth());
            Log.i("onPictureTaken", "rotatedBitmap height " + rotatedBitmap.getHeight());
            Log.i("onPictureTaken", "previewLayout.getWidth() " + previewLayout.getWidth());
            Log.i("onPictureTaken", "previewLayout.getHeight() " + previewLayout.getHeight());
            //calculate aspect ratio
            float koefX = (float) rotatedBitmap.getWidth() / (float) previewLayout.getWidth();
            float koefY = (float) rotatedBitmap.getHeight() / (float) previewLayout.getHeight();
            Log.i("onPictureTaken", "koefx " + koefX);
            Log.i("onPictureTaken", "koefy " + koefY);
            //get viewfinder border size and position on the screen
            int x1 = borderCamera.getLeft();
            int y1 = borderCamera.getTop();
            int x2 = borderCamera.getWidth();
            int y2 = borderCamera.getHeight();
            Log.i("onPictureTaken", "borderCamera.getWidth() " + borderCamera.getWidth());
            Log.i("onPictureTaken", "borderCamera.getHeight() " + borderCamera.getHeight());
            //calculate position and size for cropping
            int cropStartX = Math.round(x1 * koefX);
            int cropStartY = Math.round(y1 * koefY);
            int cropWidthX = Math.round(x2 * koefX);
            int cropHeightY = Math.round(y2 * koefY);
            //check limits and make the crop
            if (cropStartX + cropWidthX <= rotatedBitmap.getWidth() && cropStartY + cropHeightY <= rotatedBitmap.getHeight()) {
                croppedBitmap = Bitmap.createBitmap(rotatedBitmap, cropStartX, cropStartY, cropWidthX, cropHeightY);
            } else {
                croppedBitmap = null;
            }
            //save result
            if (croppedBitmap != null) {
                // Scale down to the output size
                //Bitmap scaledBitmap = Bitmap.createScaledBitmap(croppedBitmap, 1028, 1251, true);
                createImageFile(croppedBitmap);
            }
        } else if (display.getRotation() == Surface.ROTATION_270) {
            // for Landscape mode
        }
        //pass to another fragment
        if (mListener != null) {
            if (croppedBitmap != null) mListener.onFragmentInteraction(croppedBitmap);
        }
        if (camera != null) {
            camera.startPreview(); // resume the preview after taking the picture
        }
    }
};

public void createImageFile(final Bitmap bitmap) {
    File path = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
    String timeStamp = new SimpleDateFormat("MMdd_HHmmssSSS").format(new Date());
    String imageFileName = "pic_" + timeStamp + ".jpg";
    final File file = new File(path, imageFileName);
    try {
        // Make sure the Pictures directory exists.
        if (path.mkdirs()) {
            Toast.makeText(context, "Created: " + path.getName(), Toast.LENGTH_SHORT).show();
        }
        OutputStream os = new FileOutputStream(file);
        bitmap.compress(Bitmap.CompressFormat.JPEG, 100, os);
        os.close();
        Log.i("ExternalStorage", "Wrote " + path + file.getName());
        // Tell the media scanner about the new file so that it is
        // immediately available to the user.
        MediaScannerConnection.scanFile(context, new String[]{file.toString()}, null, new MediaScannerConnection.OnScanCompletedListener() {
            @Override
            public void onScanCompleted(String path, Uri uri) {
                Log.i("ExternalStorage", "Scanned " + path + ":");
                Log.i("ExternalStorage", "-> uri=" + uri);
            }
        });
        Toast.makeText(context, file.getName(), Toast.LENGTH_SHORT).show();

    } catch (Exception e) {
        // Unable to create the file, likely because external storage is
        // not currently mounted.
        Log.w("ExternalStorage", "Error writing " + file, e);
    }
}

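A note on the likely cause: `Camera.open()` in `surfaceCreated()` and the Vision `CameraSource` each open their own camera client, and the hardware camera allows only one client at a time, which matches the symptoms (the OCR pipeline runs while the SurfaceView stays white). One hedged way out is to let the `CameraSource` own both the camera and the preview surface instead of opening a second `Camera`. A sketch (device-only, so not runnable standalone; the auto-focus flag is an assumption, not taken from the question):

```
// Sketch: let the Vision CameraSource own the camera AND the preview surface.
@Override
public void surfaceCreated(SurfaceHolder holder) {
    try {
        cameraSource = new CameraSource.Builder(getActivity().getApplicationContext(), textRecognizer)
                .setRequestedPreviewSize(1280, 1024)
                .setAutoFocusEnabled(true) // assumption
                .build();
        // No separate Camera.open() here - a second client opening the same
        // camera is what blanks the preview.
        cameraSource.start(holder);
    } catch (IOException | SecurityException e) {
        Log.e("PhotoFragment", "Could not start camera source.", e);
    }
}
```

With this arrangement, stills can go through `CameraSource.takePicture(CameraSource.ShutterCallback, CameraSource.PictureCallback)` in place of `camera.takePicture(...)` in `makePhoto()`, so the picture path and the OCR path share the single camera client.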

Code Bug Fix: How to detect when user’s done entering the value to EditText in RecyclerView?



I have a RecyclerView that contains EditTexts. How can I get the value of each EditText only after the user has finished entering the amount in any of them, so I can update the total amount?

What I’m trying to achieve is adding up the values of the EditTexts and sending the total to the Activity. In the Activity I have the Proceed button (where I will perform some validations on the total amount) and a total-amount TextView (retrieved from the RecyclerView adapter using a listener).

I tried to use setOnEditorActionListener, but this will not help if the user presses the back button instead of hitting enter.

Also, I tried to use the focus-change listener, but the problem is that the EditText never loses focus, even when clicking elsewhere on the page.

And of course, the TextWatcher is not an ideal solution, as it could be very expensive in onBindViewHolder.

I need to ensure that whenever the user clicks the Proceed button, the total amount has already been updated.


The best way is to add two communication channels (via interfaces):

  • FIRST between Activity and Adapter

  • SECOND between Adapter and single ViewHolder

After adding this kind of communication you can calculate the sum “live”.


Step 1

Create FIRST interface, for example:

interface AdapterContentChanged {
    fun valuesChanged()
}

and implement it in your Activity, or create a new variable (as an anonymous class).

Step 2

Pass your activity (or instance of above interface) when you are creating adapter, for example:

private val ownAdapter = OwnAdapter(
    items,    // elements inside the list
    this      // interface implementation
)

Step 3

Create SECOND interface, for example:

interface OwnViewHolderTextChanged {
    fun onTextChanged(position: Int, newValue: Int)
}

and implement it in your adapter, or create a new variable (anonymous class) – the same as in Step 1.

Step 4

Pass your adapter (or variable) and position (of the item) when you are binding viewHolder, for example:

override fun onBindViewHolder(holder: OwnViewHolder, position: Int) {
    val number = list[position]
    holder.bind(number, position, this)
}

Step 5

In the bind() method (from the example above), add a new TextWatcher to your EditText.

In the afterTextChanged() method (from the TextWatcher), call the interface method and pass the new value. For example:

fun bind(
    // TODO - add here more information which you need,
    position: Int,
    listener: OwnViewHolderTextChanged
) {
    itemView.edit_text.addTextChangedListener(object : TextWatcher {
        override fun afterTextChanged(s: Editable?) {
            // Get EditText content
            val newValue = getNumber()

            // Call method from interface
            listener.onTextChanged(position = position, newValue = newValue)
        }

        override fun beforeTextChanged(s: CharSequence?, start: Int, count: Int, after: Int) {
            // Not used
        }

        override fun onTextChanged(s: CharSequence?, start: Int, before: Int, count: Int) {
            // Not used
        }
    })
}

To read the value from the EditText you can use something like this:

private fun getNumber(): Int =
    try {
        itemView.edit_text.text.toString().toInt() // exact expression assumed
    } catch (e: Exception) {
        0 // fallback when the field is empty or not a number
    }
Step 6

When the text has changed, inside that method you have to:

  • update content of the list

  • notify the adapter that “something has changed”

For example:

override fun onTextChanged(position: Int, newValue: Int) {
    list[position] = newValue
    listener.valuesChanged() // notify the Activity; field name assumed
}

Step 7

When something has changed (and the Activity knows about it), you can calculate the new sum:

override fun valuesChanged() {
    val sum: Int = ownAdapter.getCurrentSum()
    text_view.text = "Sum: $sum"
}
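The whole chain from the steps above can be exercised off-device. Here is a plain-Java miniature of the same two-interface pattern (the snippets above are Kotlin; names such as OwnAdapter and getCurrentSum follow them, everything else, including the int field standing in for the TextView, is illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// First interface: Adapter -> Activity (Step 1)
interface AdapterContentChanged {
    void valuesChanged();
}

// Second interface: ViewHolder -> Adapter (Step 3)
interface OwnViewHolderTextChanged {
    void onTextChanged(int position, int newValue);
}

class OwnAdapter implements OwnViewHolderTextChanged {
    private final List<Integer> items;
    private final AdapterContentChanged listener;

    OwnAdapter(List<Integer> items, AdapterContentChanged listener) {
        this.items = items;
        this.listener = listener;
    }

    int getCurrentSum() {
        int sum = 0;
        for (int v : items) sum += v;
        return sum;
    }

    // In the real adapter this is called from the ViewHolder's TextWatcher.
    @Override
    public void onTextChanged(int position, int newValue) {
        items.set(position, newValue); // Step 6: update the list content
        listener.valuesChanged();      // Step 6: notify the Activity
    }
}

public class SumDemo implements AdapterContentChanged {
    final List<Integer> items = new ArrayList<>(List.of(0, 0, 0)); // three rows
    final OwnAdapter adapter = new OwnAdapter(items, this);
    int shownSum; // stands in for the total-amount TextView

    @Override
    public void valuesChanged() {
        shownSum = adapter.getCurrentSum(); // Step 7
    }

    public static void main(String[] args) {
        SumDemo activity = new SumDemo();
        // Simulate the user typing 10 into row 0 and 5 into row 2.
        activity.adapter.onTextChanged(0, 10);
        activity.adapter.onTextChanged(2, 5);
        System.out.println("Sum: " + activity.shownSum); // Sum: 15
    }
}
```

Because every keystroke already pushes the new total into the Activity, the Proceed click never has to ask the adapter for anything; the total is up to date by construction.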




Code Bug Fix: symbol: class FlutterSocketIoPlugin location: package com.itsclicking.clickapp.fluttersocketio


What is the reason I get an error like the following when using the flutter_socket_io package? (Running on an Android device.)

Launching lib\main.dart on LLD L31 in debug mode...
Running Gradle task 'assembleDebug'... error: cannot find symbol
  symbol:   class FlutterSocketIoPlugin
  location: package com.itsclicking.clickapp.fluttersocketio
1 error

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':app:compileDebugJavaWithJavac'.
> Compilation failed; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at

BUILD FAILED in 1m 33s
Exception: Gradle task assembleDebug failed with exit code 1


Code Bug Fix: android-Memory usage not decrease in Profiler without LeakCanary


I have just started to use LeakCanary. I use this code to avoid memory leaks in a Fragment.

@Override
public void onDestroyView() {
    super.onDestroyView();
    view = null;
    imageView = null;
    recyclerView = null;
    progressBar = null;
    if (list != null) {
        list.clear(); // body of this check assumed
        list = null;
    }
    recyclerAdapter = null;
    swipeRefreshLayout = null;
}

So when I use the above code with LeakCanary, I see memory usage decreasing in the Profiler (for example, from 120 MB to 80 MB). But when I remove LeakCanary (from Gradle), the memory usage does not decrease; it stays at 120 MB. Is it a bug?


Code Bug Fix: Image Caching With Android Libraries


My app gets a list of movie objects, each including a movie-poster URL, and displays those posters and titles. When a user clicks on a movie, it goes to a movie-detail activity that also has the poster and title, along with some more info. Pretty simple. I’ve noticed, though, that when I click the movie there’s still a second or two where the poster hasn’t loaded, and I’d like to eliminate that, which seems relatively easy with caching. I was using Picasso and recently switched to Coil for image loading; both claim to handle caching behind the scenes, but given this loading time I’m assuming that isn’t happening. In both activities I’m just loading it with:

posterImageView.load(it.posterURL ?: {url of "no poster found" image})

Is there anything extra I need to be doing in order to load these from the cache instead of fetching the image from the URL every time?


Linux HowTo: Generate Quick Documentation in Android Studio


In Xcode, there’s a shortcut (Option+Command+/) to add documentation to the current method/class/etc. For methods, this shortcut generates a template that includes the parameters, if they are available.


func foo(bar: Bool) { }

The generated documentation is…

/// Description
/// - Parameter bar: bar description
func foo(bar: Bool) { }

Is this type of action available in Android Studio?


Code Bug Fix: Accessing raw capacitive touchscreen data


I wanted to get access to the raw capacitive-touchscreen data in my Android app. I noticed that the following link walks through the steps to do this.

However, it uses an LG Nexus 5 phone and requires flashing the Android kernel.

I want to follow the example, but I have a Pixel phone and I have never flashed a kernel. So I wanted to know whether there are other methods of getting raw capacitive-touchscreen data, or whether it is okay for me to just follow the tutorial above even with a different Android phone.


Code Bug Fix: TaskStackBuilder not working from notification


I’m trying to preserve the normal workflow to the home activity when opening the app from a push notification. I’m creating a pending intent using TaskStackBuilder:

    Intent childActivityIntent = new Intent(context, ChildActivity.class);

    TaskStackBuilder taskStackBuilder = TaskStackBuilder.create(context);
    taskStackBuilder.addNextIntentWithParentStack(childActivityIntent); // call assumed; something must add the intent before getPendingIntent()

    PendingIntent pendingIntent = taskStackBuilder.getPendingIntent(0, PendingIntent.FLAG_UPDATE_CURRENT);

    int iconResource = getIconResource(icon);

    Uri defaultSoundUri = RingtoneManager.getDefaultUri(RingtoneManager.TYPE_NOTIFICATION);
    NotificationCompat.Builder notificationBuilder = new NotificationCompat.Builder(context, "fcm_default_channel")
            .setSmallIcon(iconResource)
            .setContentIntent(pendingIntent)
            .setSound(defaultSoundUri);

And this is the manifest:

        <activity android:name=".HomeActivity"> <!-- activity name assumed; the wrapper element was cut off -->
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>


When I click the notification, the correct activity (ChildActivity) is opened, but when I press back, HomeActivity is not opened; the application just closes.

I already tried reinstalling the app and changing the PendingIntent flag to FLAG_CANCEL_CURRENT, but nothing seems to work.

However, if I start the activities from the TaskStackBuilder directly, the workflow is applied correctly:

    Intent childActivityIntent = new Intent(context, ChildActivity.class);

    TaskStackBuilder taskStackBuilder = TaskStackBuilder.create(context);
    taskStackBuilder.addNextIntentWithParentStack(childActivityIntent);
    taskStackBuilder.startActivities(); // calls assumed; "start the activities" per the text
Any idea why this is not working?
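One thing worth checking, given that back closes the app: TaskStackBuilder can only synthesize the HomeActivity back stack behind ChildActivity if the manifest declares each activity's parent. Without that declaration the pending intent contains only ChildActivity, so back exits the task. A sketch, with the activity names assumed from the question:

```
<activity
    android:name=".ChildActivity"
    android:parentActivityName=".HomeActivity">
    <!-- Support-library fallback for pre-API-16 devices -->
    <meta-data
        android:name="android.support.PARENT_ACTIVITY"
        android:value=".HomeActivity" />
</activity>
```

When the activities are started directly (the second snippet), the stack is built explicitly in the foreground task, which is why that path behaves even without the manifest attribute.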


Code Bug Fix: Making GET Request after a POST Request in the same Session with OkHttp in Android Studio


I am trying to retrieve some JSON data using OkHttp in Android Studio from the URL:

Before I can get the data using this URL, it requires me to log into the Duolingo server first (which makes sense, since I want it to return data from my profile), so I make a POST request with my credentials using OkHttp. This is my code to achieve that:

        OkHttpClient client = new OkHttpClient();

        String postUrl = "";
        String getUrl = "";
        String credentials = "{\"identifier\": \"[email protected]\", \"password\": \"Password\"}";
        RequestBody body = RequestBody.create(credentials, MediaType.parse("application/json; charset=utf-8"));

        Request request = new Request.Builder()
                .url(postUrl)
                .post(body)
                .build();

        client.newCall(request).enqueue(new Callback() {
            @Override
            public void onFailure(@NotNull Call call, @NotNull IOException e) {
                Log.i("TAG", "ERROR - " + e.getMessage());
            }

            @Override
            public void onResponse(@NotNull Call call, @NotNull Response response) throws IOException {
                if (response.isSuccessful()) {
                    Log.i("LOG", "LOGGED IN");
                } else {
                    Log.i("LOG", "FAILED - " + response.toString());
                }
            }
        });
The response is successful and I get a 200 response code.

Now that I have logged in, I want to make a GET request to the URL mentioned above to get the JSON data. The problem is that I do not know how to make a GET request in succession to the POST request I just made. OkHttp treats the two consecutive requests as separate, while I want them to be treated as the same session.

Someone told me cookies can help, but I am totally oblivious to them and how they work. All help is appreciated!
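On the cookies point: HTTP sessions are usually carried by a cookie. The login response includes a Set-Cookie header, and every follow-up request to the same host has to send that value back in a Cookie header. With OkHttp you get this automatically by building one client with a cookie jar, e.g. `new OkHttpClient.Builder().cookieJar(new JavaNetCookieJar(new java.net.CookieManager())).build()` (JavaNetCookieJar ships in the okhttp-urlconnection artifact), and then reusing that same client instance for both the POST and the GET. The mechanism itself can be sketched with JDK classes only (the cookie name below is made up, not Duolingo's real one):

```java
import java.net.HttpCookie;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CookieJarSketch {
    // host -> cookies received from that host
    private final Map<String, List<HttpCookie>> store = new HashMap<>();

    // Called with each Set-Cookie header of a response.
    public void saveFromResponse(String host, String setCookieHeader) {
        store.computeIfAbsent(host, h -> new ArrayList<>())
             .addAll(HttpCookie.parse(setCookieHeader));
    }

    // Builds the Cookie header for the next request to the same host.
    public String loadForRequest(String host) {
        StringBuilder sb = new StringBuilder();
        for (HttpCookie c : store.getOrDefault(host, List.of())) {
            if (sb.length() > 0) sb.append("; ");
            sb.append(c.getName()).append("=").append(c.getValue());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        CookieJarSketch jar = new CookieJarSketch();
        // Pretend the POST /login response carried this header:
        jar.saveFromResponse("www.duolingo.com", "session_token=abc123; Path=/");
        // The follow-up GET to the same host should then send:
        System.out.println(jar.loadForRequest("www.duolingo.com")); // session_token=abc123
    }
}
```

A client with a cookie jar does exactly this behind the scenes, which is why the two calls then look like one logged-in session to the server.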


Making Game: How to copy Chrome cookies on Android to Chrome on Windows?


Is there a way to do this? I found a few extensions that can do it, but they don’t work on Android Chrome. Is rooting required?

There is no way to do that using Google Chrome on your Android device, since Google Chrome does not support extensions and does not allow you to export your cookies. I am also not aware of any way to access Chrome’s cookies even by rooting your phone.

If you don’t mind using a different browser, you could try Firefox for Android, since it fully supports extensions. You can run the same extensions on your phone as the ones available for PC.

I have made a Firefox extension for this exact purpose. It is called Cookie-Editor. You can view, edit and delete your cookies directly on your phone. You can also import and export your cookies to transfer them between your phone and your computer. It is compatible with Chrome extensions like EditThisCookie too, so you don’t even have to switch browsers on your PC.

Cookie-Editor lets you efficiently create, edit and delete cookies for the current tab. Perfect for developing, quickly testing or even manually managing your cookies for your privacy.
Also supports Firefox for Android.

You can find the extension for Firefox here

Or for Google Chrome here

Have you tried logging in to Chrome on the PC with the same Google account you are logged in with on the Android phone, and then enabling sync in the Chrome options?
