
Book Reader Application Using General Text Recognition by Huawei HiAI in Android


Basavaraj.navi

Senior Member
Dec 4, 2020
Introduction


In this article, we will learn how to integrate Huawei General Text Recognition using Huawei HiAI, and build a book reader application with it.

About application:

Reading a book manually can get tedious. This application lets users listen to a book instead: they capture a photo of a page or select one from the gallery, and then, while travelling or in their free time, listen to the recognized text like music.

Huawei General Text Recognition is built on OCR technology, so let us first understand OCR.

What is optical character recognition (OCR)?

Optical Character Recognition (OCR) technology is a business solution for automating data extraction from printed or written text from a scanned document or image file and then converting the text into a machine-readable form to be used for data processing like editing or searching.

Now let us understand General Text Recognition (GTR).

At the core of GTR is Optical Character Recognition (OCR) technology, which extracts text from screenshots and photos taken by the phone camera. For photos, the API can correct for tilt, camera angle, reflections, and messy backgrounds to a certain degree. It can also be used for document and streetscape photography among a wide range of scenarios, and it features strong anti-interference capability. Processing runs on the device, through a connection to the HiAI service.

Features

  • For photos: Provides text area detection and text recognition for Chinese, English, Japanese, Korean, Russian, Italian, Spanish, Portuguese, German, and French texts in multiple printed fonts. A wide range of scenarios is supported, and high recognition accuracy can be achieved even under complex lighting conditions and against cluttered backgrounds.
  • For screenshots: Optimizes the text extraction algorithms for the characteristics of screenshots captured on mobile phones. Currently, this function is available in the Chinese mainland and supports Chinese and English text.
OCR features

  • Lightweight: This API greatly reduces the computing time and ROM space the algorithm model takes up, making your app more lightweight.
  • Customized hierarchical result return: You can choose to return the coordinates of text blocks, text lines, and text characters in the screenshot based on app requirements.
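The hierarchical result return can be sketched in plain Java. Note that `TextResultDemo` and its nested lists are hypothetical stand-ins: the real SDK exposes the hierarchy through `Text.getBlocks()` and `TextBlock.getTextLines()`, as used later in this article. This sketch only illustrates walking blocks and lines and flattening them into one speakable string.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-in for the SDK's block/line hierarchy:
// each inner list plays the role of a text block, each string a text line.
public class TextResultDemo {

    // Flatten every line of every block into one space-separated string,
    // the same traversal shape the detection handler performs.
    public static String joinLines(List<List<String>> blocks) {
        StringBuilder sb = new StringBuilder();
        for (List<String> block : blocks) {
            for (String line : block) {
                if (sb.length() > 0) sb.append(' ');
                sb.append(line);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        List<List<String>> blocks = Arrays.asList(
                Arrays.asList("Once upon a time,", "in a quiet town,"),
                Arrays.asList("there lived a reader."));
        // prints: Once upon a time, in a quiet town, there lived a reader.
        System.out.println(joinLines(blocks));
    }
}
```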
How to integrate General Text Recognition

1. Configure the application on the AGC.

2. Apply for HiAI Engine Library

3. Client application development process.

Configure application on the AGC

Follow the steps

Step 1: Register a developer account in AppGallery Connect. If you are already a registered developer, skip this step.

Step 2: Create an app by referring to Creating a Project and Creating an App in the Project

Step 3: Set the data storage location based on the current location.

Step 4: Generating a Signing Certificate Fingerprint.

Step 5: Configuring the Signing Certificate Fingerprint.

Step 6: Download the agconnect-services.json file and copy it into your project's app directory.

Apply for HiAI Engine Library

What is Huawei HiAI?


HUAWEI HiAI is Huawei's mobile terminal-oriented artificial intelligence (AI) computing platform. It opens up three layers of capability: service capability, application capability, and chip capability. This three-layer open platform, which integrates terminals, chips, and the cloud, delivers a richer experience for both users and developers.

How to apply for HiAI Engine?

Follow the steps

Step 1: Navigate to this URL, choose App Service > Development and click HUAWEI HiAI.


Step 2: Click Apply for HUAWEI HiAI kit.


Step 3: Enter required information like Product name and Package name, click Next button.


Step 4: Verify the application details and click Submit button.

Step 5: Click the Download SDK button to open the SDK list.


Step 6: Unzip the downloaded SDK and copy its libraries into the libs folder of your Android project.


Step 7: Add the library dependencies to the app-level build.gradle file.


Code:
implementation fileTree(dir: 'libs', include: ['*.aar', '*.jar'])
implementation 'com.google.code.gson:gson:2.8.6'
repositories {
    flatDir {
        dirs 'libs'
    }
}
Client application development process

Follow the steps

Step 1: Create an Android application in Android Studio (or any IDE you prefer).

Step 2: Add the app-level Gradle plugins. In the Project view, open Android > app > build.gradle.


Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root-level (project) build.gradle dependencies:


Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add permissions in AndroidManifest.xml.


XML:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
Step 4: Build application.

Initialize all views.


Java:
private void initializeView() {
        mPlayAudio = findViewById(R.id.playAudio);
        mTxtViewResult = findViewById(R.id.result);
        mImageView = findViewById(R.id.imgViewPicture);
    }

Request the runtime permission



Java:
private void requestPermissions() {
        try {
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
                int permission1 = ActivityCompat.checkSelfPermission(this,
                        Manifest.permission.WRITE_EXTERNAL_STORAGE);
                int permission2 = ActivityCompat.checkSelfPermission(this,
                        Manifest.permission.CAMERA);
                if (permission1 != PackageManager.PERMISSION_GRANTED || permission2 != PackageManager
                        .PERMISSION_GRANTED) {
                    ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
                            Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.CAMERA}, 0x0010);
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }


    @Override
    public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (grantResults.length <= 0
                || grantResults[0] != PackageManager.PERMISSION_GRANTED) {
            Toast.makeText(this, "Permission denied", Toast.LENGTH_SHORT).show();
        }

    }

Initialize vision base



Java:
private void initVision() {
        VisionBase.init(this, new ConnectionCallback() {
            @Override
            public void onServiceConnect() {
                Log.e(TAG, " onServiceConnect");
            }

            @Override
            public void onServiceDisconnect() {
                Log.e(TAG, " onServiceDisconnect");
            }
        });
    }

Initialize text to speech


Java:
private void initializeTextToSpeech() {
        textToSpeech = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
            @Override
            public void onInit(int status) {
                if (status != TextToSpeech.ERROR) {
                    textToSpeech.setLanguage(Locale.UK);
                }
            }
        });
    }
Create TextDetector instance.


Java:
mTextDetector = new TextDetector(this);
Define Vision image.


Java:
VisionImage image = VisionImage.fromBitmap(mBitmap);
Create instance of Text class.


Java:
final Text result = new Text();
Create and set VisionTextConfiguration


Java:
VisionTextConfiguration config = new VisionTextConfiguration.Builder()
        .setAppType(VisionTextConfiguration.APP_NORMAL)
        .setProcessMode(VisionTextConfiguration.MODE_IN)
        .setDetectType(TextDetectType.TYPE_TEXT_DETECT_FOCUS_SHOOT)
        .setLanguage(TextConfiguration.AUTO)
        .build();
//Set vision configuration
mTextDetector.setVisionConfiguration(config);
Call detect method to get the result



Java:
int result_code = mTextDetector.detect(image, result, new VisionCallback<Text>() {
            @Override
            public void onResult(Text text) {
                dismissDialog();
                Message message = Message.obtain();
                message.what = TYPE_SHOW_RESULT;
                message.obj = text;
                mHandler.sendMessage(message);
            }

            @Override
            public void onError(int i) {
                Log.d(TAG, "Callback: onError " + i);
                mHandler.sendEmptyMessage(TYPE_TEXT_ERROR);
            }

            @Override
            public void onProcessing(float v) {
                Log.d(TAG, "Callback: onProcessing:" + v);
            }
        });
Create Handler


Java:
private final Handler mHandler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            super.handleMessage(msg);
            int status = msg.what;
            Log.d(TAG, "handleMessage status = " + status);
            switch (status) {
                case TYPE_CHOOSE_PHOTO: {
                    if (mBitmap == null) {
                        Log.e(TAG, "bitmap is null");
                        return;
                    }
                    mImageView.setImageBitmap(mBitmap);
                    mTxtViewResult.setText("");
                    showDialog();
                    detectTex();
                    break;
                }
                case TYPE_SHOW_RESULT: {
                    Text result = (Text) msg.obj;
                    if (dialog != null && dialog.isShowing()) {
                        dialog.dismiss();
                    }
                    if (result == null) {
                        mTxtViewResult.setText("Failed to detect text lines, result is null.");
                        break;
                    }
                    String textValue = result.getValue();
                    Log.d(TAG, "text value: " + textValue);
                    StringBuffer textResult = new StringBuffer();
                    // Iterate every detected text block and its lines
                    // (avoids crashing when no blocks are returned).
                    for (int i = 0; i < result.getBlocks().size(); i++) {
                        for (TextLine line : result.getBlocks().get(i).getTextLines()) {
                            textResult.append(line.getValue()).append(" ");
                        }
                    }
                    Log.d(TAG, "OCR Detection succeeded.");
                    mTxtViewResult.setText(textResult.toString());
                    textToSpeechString = textResult.toString();
                    break;
                }

                case TYPE_TEXT_ERROR: {
                    mTxtViewResult.setText("Failed to detect text.");
                    break;
                }
                default:
                    break;
            }
        }
    };

Complete code as follows



Java:
import android.Manifest;
import android.app.Activity;
import android.app.ProgressDialog;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Build;
import android.os.Handler;
import android.os.Message;
import android.provider.MediaStore;
import android.speech.tts.TextToSpeech;
import android.support.v4.app.ActivityCompat;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;

import com.huawei.hiai.vision.common.ConnectionCallback;
import com.huawei.hiai.vision.common.VisionBase;
import com.huawei.hiai.vision.common.VisionCallback;
import com.huawei.hiai.vision.common.VisionImage;
import com.huawei.hiai.vision.text.TextDetector;
import com.huawei.hiai.vision.visionkit.text.Text;
import com.huawei.hiai.vision.visionkit.text.TextDetectType;
import com.huawei.hiai.vision.visionkit.text.TextLine;
import com.huawei.hiai.vision.visionkit.text.config.TextConfiguration;
import com.huawei.hiai.vision.visionkit.text.config.VisionTextConfiguration;

import java.util.List;
import java.util.Locale;

public class MainActivity extends AppCompatActivity {
    private static final String TAG = MainActivity.class.getSimpleName();
    private static final int REQUEST_CHOOSE_PHOTO_CODE = 2;

    private Bitmap mBitmap;
    private ImageView mPlayAudio;
    private ImageView mImageView;
    private TextView mTxtViewResult;
    protected ProgressDialog dialog;
    private TextDetector mTextDetector;
    Text imageText = null;
    TextToSpeech textToSpeech;
    String textToSpeechString = "";

    private static final int TYPE_CHOOSE_PHOTO = 1;
    private static final int TYPE_SHOW_RESULT = 2;
    private static final int TYPE_TEXT_ERROR = 3;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        initializeView();
        requestPermissions();
        initVision();
        initializeTextToSpeech();
    }

    private void initializeView() {
        mPlayAudio = findViewById(R.id.playAudio);
        mTxtViewResult = findViewById(R.id.result);
        mImageView = findViewById(R.id.imgViewPicture);
    }

    private void initVision() {
        VisionBase.init(this, new ConnectionCallback() {
            @Override
            public void onServiceConnect() {
                Log.e(TAG, " onServiceConnect");
            }

            @Override
            public void onServiceDisconnect() {
                Log.e(TAG, " onServiceDisconnect");
            }
        });
    }

    private void initializeTextToSpeech() {
        textToSpeech = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
            @Override
            public void onInit(int status) {
                if (status != TextToSpeech.ERROR) {
                    textToSpeech.setLanguage(Locale.UK);
                }
            }
        });
    }

    public void onChildClick(View view) {
        switch (view.getId()) {
            case R.id.btnSelect: {
                Log.d(TAG, "Select an image");
                Intent intent = new Intent(Intent.ACTION_PICK);
                intent.setType("image/*");
                startActivityForResult(intent, REQUEST_CHOOSE_PHOTO_CODE);
                break;
            }
            case R.id.playAudio: {
                if (textToSpeechString != null && !textToSpeechString.isEmpty()) {
                    textToSpeech.speak(textToSpeechString, TextToSpeech.QUEUE_FLUSH, null);
                }
                break;
            }
        }
    }

    private void detectTex() {
        /* create a TextDetector instance firstly */
        mTextDetector = new TextDetector(this);

        /*Define VisionImage and transfer the Bitmap image to be detected*/
        VisionImage image = VisionImage.fromBitmap(mBitmap);

        /*Define the Text class.*/
        final Text result = new Text();

        /*Use VisionTextConfiguration to select the type of the image to be called. */
        VisionTextConfiguration config = new VisionTextConfiguration.Builder()
                .setAppType(VisionTextConfiguration.APP_NORMAL)
                .setProcessMode(VisionTextConfiguration.MODE_IN)
                .setDetectType(TextDetectType.TYPE_TEXT_DETECT_FOCUS_SHOOT)
                .setLanguage(TextConfiguration.AUTO).build();
        //Set vision configuration
        mTextDetector.setVisionConfiguration(config);

        /*Call the detect method of TextDetector to obtain the result*/
        int result_code = mTextDetector.detect(image, result, new VisionCallback<Text>() {
            @Override
            public void onResult(Text text) {
                dismissDialog();
                Message message = Message.obtain();
                message.what = TYPE_SHOW_RESULT;
                message.obj = text;
                mHandler.sendMessage(message);
            }

            @Override
            public void onError(int i) {
                Log.d(TAG, "Callback: onError " + i);
                mHandler.sendEmptyMessage(TYPE_TEXT_ERROR);
            }

            @Override
            public void onProcessing(float v) {
                Log.d(TAG, "Callback: onProcessing:" + v);
            }
        });
    }

    private void showDialog() {
        if (dialog == null) {
            dialog = new ProgressDialog(MainActivity.this);
            dialog.setTitle("Detecting text...");
            dialog.setMessage("Please wait...");
            dialog.setIndeterminate(true);
            dialog.setCancelable(false);
        }
        dialog.show();
    }


    private final Handler mHandler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            super.handleMessage(msg);
            int status = msg.what;
            Log.d(TAG, "handleMessage status = " + status);
            switch (status) {
                case TYPE_CHOOSE_PHOTO: {
                    if (mBitmap == null) {
                        Log.e(TAG, "bitmap is null");
                        return;
                    }
                    mImageView.setImageBitmap(mBitmap);
                    mTxtViewResult.setText("");
                    showDialog();
                    detectTex();
                    break;
                }
                case TYPE_SHOW_RESULT: {
                    Text result = (Text) msg.obj;
                    if (dialog != null && dialog.isShowing()) {
                        dialog.dismiss();
                    }
                    if (result == null) {
                        mTxtViewResult.setText("Failed to detect text lines, result is null.");
                        break;
                    }
                    String textValue = result.getValue();
                    Log.d(TAG, "text value: " + textValue);
                    StringBuffer textResult = new StringBuffer();
                    // Iterate every detected text block and its lines
                    // (avoids crashing when no blocks are returned).
                    for (int i = 0; i < result.getBlocks().size(); i++) {
                        for (TextLine line : result.getBlocks().get(i).getTextLines()) {
                            textResult.append(line.getValue()).append(" ");
                        }
                    }
                    Log.d(TAG, "OCR Detection succeeded.");
                    mTxtViewResult.setText(textResult.toString());
                    textToSpeechString = textResult.toString();
                    break;
                }

                case TYPE_TEXT_ERROR: {
                    mTxtViewResult.setText("Failed to detect text.");
                    break;
                }
                default:
                    break;
            }
        }
    };

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_CHOOSE_PHOTO_CODE && resultCode == Activity.RESULT_OK) {
            if (data == null) {
                return;
            }

            Uri selectedImage = data.getData();
            getBitmap(selectedImage);
        }
    }

    private void requestPermissions() {
        try {
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
                int permission1 = ActivityCompat.checkSelfPermission(this,
                        Manifest.permission.WRITE_EXTERNAL_STORAGE);
                int permission2 = ActivityCompat.checkSelfPermission(this,
                        Manifest.permission.CAMERA);
                if (permission1 != PackageManager.PERMISSION_GRANTED || permission2 != PackageManager
                        .PERMISSION_GRANTED) {
                    ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
                            Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.CAMERA}, 0x0010);
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void getBitmap(Uri imageUri) {
        String[] pathColumn = {MediaStore.Images.Media.DATA};

        Cursor cursor = getContentResolver().query(imageUri, pathColumn, null, null, null);
        if (cursor == null) return;
        cursor.moveToFirst();
        int columnIndex = cursor.getColumnIndex(pathColumn[0]);
        /* get image path */
        String picturePath = cursor.getString(columnIndex);
        cursor.close();

        mBitmap = BitmapFactory.decodeFile(picturePath);
        if (mBitmap == null) {
            return;
        }
        //You can set image here
        //mImageView.setImageBitmap(mBitmap);
        // You can pass it  handler as well
        mHandler.sendEmptyMessage(TYPE_CHOOSE_PHOTO);
        mTxtViewResult.setText("");
        mPlayAudio.setEnabled(true);
    }

    private void dismissDialog() {
        if (dialog != null && dialog.isShowing()) {
            dialog.dismiss();
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (grantResults.length <= 0
                || grantResults[0] != PackageManager.PERMISSION_GRANTED) {
            Toast.makeText(this, "Permission denied", Toast.LENGTH_SHORT).show();
        }

    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        /* release ocr instance and free the npu resources*/
        if (mTextDetector != null) {
            mTextDetector.release();
        }
        dismissDialog();
        if (mBitmap != null) {
            mBitmap.recycle();
        }
    }
}

activity_main.xml


XML:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
android:layout_height="match_parent"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:fitsSystemWindows="true"
android:orientation="vertical"
android:background="@android:color/darker_gray">
    <android.support.v7.widget.Toolbar
        android:layout_width="match_parent"
android:layout_height="50dp"
android:background="#ff0000"
android:elevation="10dp">
        <LinearLayout
            android:layout_width="match_parent"
android:layout_height="match_parent"
androidrientation="horizontal"
            <TextView
                android:layout_width="match_parent"
android:layout_height="match_parent"
android:text="Book Reader"
android:layout_gravity="center"
android:gravity="center|start"
android:layout_weight="1"
android:textColor="@android:color/white"
android:textStyle="bold"
android:textSize="20sp"/>

            <ImageView
                android:layout_width="40dp"
android:layout_height="40dp"
android:src="@drawable/ic_baseline_play_circle_outline_24"
android:layout_gravity="center|end"
android:layout_marginEnd="10dp"
android:id="@+id/playAudio"
android:padding="5dp"/>
        </LinearLayout>
    </android.support.v7.widget.Toolbar>

    <ScrollView
        android:layout_width="match_parent"
android:layout_height="match_parent"
android:fitsSystemWindows="true">

        <LinearLayout
            android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
android:background="@android:color/darker_gray"
            >



            <android.support.v7.widget.CardView
                android:layout_width="match_parent"
android:layout_height="wrap_content"
app:cardCornerRadius="5dp"
app:cardElevation="10dp"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginTop="20dp"
android:layout_gravity="center">

                <ImageView
                    android:id="@+id/imgViewPicture"
android:layout_width="300dp"
android:layout_height="300dp"
android:layout_margin="8dp"
android:layout_gravity="center_horizontal"
android:scaleType="fitXY" />

            </android.support.v7.widget.CardView>


            <android.support.v7.widget.CardView
                android:layout_width="match_parent"
android:layout_height="wrap_content"
app:cardCornerRadius="5dp"
app:cardElevation="10dp"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginTop="10dp"
android:layout_gravity="center"
android:layout_marginBottom="20dp">

                <LinearLayout
                    android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="vertical"
                    >
                    <TextView
                        android:layout_margin="5dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/black"
android:text="Text on the image"
android:textStyle="normal"
                        />
                    <TextView
                        android:id="@+id/result"
android:layout_margin="5dp"
android:layout_marginBottom="20dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textSize="18sp"
android:textColor="#ff0000"/>
                </LinearLayout>

            </android.support.v7.widget.CardView>

            <Button
                android:id="@+id/btnSelect"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:onClick="onChildClick"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginBottom="10dp"
android:text="@string/select_picture"
android:background="@drawable/round_button_bg"
android:textColor="@android:color/white"
android:textAllCaps="false"/>

        </LinearLayout>
    </ScrollView>

</LinearLayout>
Result



Tips and Tricks


  • Maximum image dimensions: 1440 px in width and 15210 px in height. If the image is larger than this, you will receive error code 200.
  • Recommended image properties for optimal recognition accuracy: resolution greater than 720p and aspect ratio less than 2:1.
  • If you capture images from the camera or gallery, make sure your app has the camera and storage permissions.
  • Add the downloaded huawei-hiai-vision-ove-10.0.4.307.aar and huawei-hiai-pdk-1.0.0.aar files to the libs folder.
  • Check that all dependencies are added properly.
  • The latest HMS Core APK is required on the device.
  • The minimum SDK version is 21; otherwise you will get a manifest merge issue.
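The minimum-SDK tip above corresponds to the defaultConfig block of the app-level build.gradle. A minimal sketch; the target SDK and version values here are illustrative, not taken from the article:

```groovy
android {
    defaultConfig {
        minSdkVersion 21      // required; lower values cause a manifest merge failure with the HiAI SDK
        targetSdkVersion 29   // illustrative value
        versionCode 1
        versionName "1.0"
    }
}
```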
Conclusion


In this article, we have learned the following concepts:

  1. What OCR is
  2. What General Text Recognition (GTR) is
  3. The features of GTR
  4. The features of OCR
  5. How to integrate General Text Recognition using Huawei HiAI
  6. How to apply for Huawei HiAI
  7. How to build the application
Reference


General Text Recognition

Apply for Huawei HiAI

Happy coding
 