
Create a Virtual Fitting Room for Glasses in an E-commerce App

source link: https://www.codeproject.com/Tips/5319657/Create-a-Virtual-Fitting-Room-for-Glasses-in-an-E

Background

The ubiquity of the Internet and smart devices has made e-commerce the preferred choice for countless consumers. However, many longtime users have grown weary of the unchanging shopping model, so enhancing the user experience is critical to stimulating further growth in e-commerce and attracting a broader user base. If an app can use intelligent graphics processing to identify a user's facial and physical features, and combine that with a new display paradigm that lets users try on products virtually through their phones, it can deliver a groundbreaking digital shopping experience.

Effects

A user opens a shopping app and taps a product's picture to view the product's 3D model, which they can rotate, enlarge, and shrink for interactive viewing.

Getting Started

Configuring the Maven Repository Address for the HMS Core SDK

Open the project-level build.gradle file in your Android Studio project. Go to buildscript > repositories and allprojects > repositories to configure the Maven repository address for the HMS Core SDK.

buildscript {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
allprojects {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

Adding Build Dependencies for the HMS Core SDK

Open the app-level build.gradle file of your project. In the dependencies block, add build dependencies on the Scene Kit Full-SDK and the AR Engine SDK.

dependencies {
    ...
    implementation 'com.huawei.scenekit:full-sdk:5.0.2.302'
    implementation 'com.huawei.hms:arenginesdk:2.13.0.4'
}

Adding Permissions in the AndroidManifest.xml File

Open the AndroidManifest.xml file in the main directory and add the camera permission above the <application> element.

<!--Camera permission-->
<uses-permission android:name="android.permission.CAMERA" />

Development Procedure

Configuring MainActivity

Add two buttons to the layout configuration file of MainActivity. Set the background of the onBtnShowProduct button to the product's preview image, and set the text of the onBtnTryProductOn button to Try it on! to guide the user to the feature.

<Button
    android:layout_width="260dp"
    android:layout_height="160dp"
    android:background="@drawable/sunglasses"
    android:onClick="onBtnShowProduct" />

<Button
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Try it on!"
    android:textAllCaps="false"
    android:textSize="24sp"
    android:onClick="onBtnTryProductOn" />

Tapping the onBtnShowProduct button loads the product's 3D model; tapping the onBtnTryProductOn button opens the AR fitting screen.

Configuring the 3D Model Display for a Product

  1. Create a SceneSampleView inherited from SceneView.

    public class SceneSampleView extends SceneView {
        public SceneSampleView(Context context) {
            super(context);
        }
        public SceneSampleView(Context context, AttributeSet attributeSet) {
            super(context, attributeSet);
        }
    }

    Override the surfaceCreated method to create and initialize SceneView. Then call loadScene to load the materials, which should be in glTF or GLB format, so that they are rendered and displayed. Call loadSkyBox to load skybox materials, loadSpecularEnvTexture to load specular maps, and loadDiffuseEnvTexture to load diffuse maps. These texture files should be in the DDS (cubemap) format.

    All loaded materials are stored in the src > main > assets > SceneView folder.

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        super.surfaceCreated(holder);
        // Load the materials to be rendered.
        loadScene("SceneView/sunglasses.glb");
        // Call loadSkyBox to load skybox texture materials.
        loadSkyBox("SceneView/skyboxTexture.dds");
        // Call loadSpecularEnvTexture to load specular texture materials.
        loadSpecularEnvTexture("SceneView/specularEnvTexture.dds");
        // Call loadDiffuseEnvTexture to load diffuse texture materials.
        loadDiffuseEnvTexture("SceneView/diffuseEnvTexture.dds");
    }
  2. Create a SceneViewActivity inherited from Activity.
    Call setContentView in the onCreate method, passing it the layout file in which the SceneSampleView you created is declared via an XML tag.
    public class SceneViewActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_sample);
        }
    }

    Create SceneSampleView in the layout file as follows:

    <com.huawei.scene.demo.sceneview.SceneSampleView
        android:layout_width="match_parent"
        android:layout_height="match_parent"/>
  3. Create an onBtnShowProduct method in MainActivity.
    When the user taps the onBtnShowProduct button, SceneViewActivity is called to load, render, and finally display the 3D model of the product.
    public void onBtnShowProduct(View view) {
        startActivity(new Intent(this, SceneViewActivity.class));
    }

Configuring AR Fitting for a Product

Product virtual try-on is easily accessible, thanks to the facial recognition, graphics rendering, and AR display capabilities offered by HMS Core.

  1. Create a FaceViewActivity inherited from Activity, and create the corresponding layout file.
    Create face_view in the layout file to display the try-on effect.

    <com.huawei.hms.scene.sdk.FaceView
        android:id="@+id/face_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:sdk_type="AR_ENGINE" />

    Create a switch. Toggling it lets the user compare how they look with and without the virtual glasses.

    <Switch
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:id="@+id/switch_view"
        android:layout_alignParentTop="true"
        android:layout_marginTop="15dp"
        android:layout_alignParentEnd="true"
        android:layout_marginEnd="15dp"
        android:text="Try it on"
        android:theme="@style/AppTheme"
        tools:ignore="RelativeOverlap" />
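
    Note that the app:sdk_type attribute on FaceView comes from the custom attribute namespace, so the root element of this layout file must declare it alongside the standard namespaces. A minimal sketch of the enclosing layout (a RelativeLayout is assumed here because the switch uses layout_alignParent* attributes):

    ```xml
    <RelativeLayout
        xmlns:android="http://schemas.android.com/apk/res/android"
        xmlns:app="http://schemas.android.com/apk/res-auto"
        xmlns:tools="http://schemas.android.com/tools"
        android:layout_width="match_parent"
        android:layout_height="match_parent">
        <!-- The FaceView and the try-on switch shown above go here. -->
    </RelativeLayout>
    ```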
  2. Override the onCreate method in FaceViewActivity to obtain FaceView.
    public class FaceViewActivity extends Activity {
        private FaceView mFaceView;
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_face_view);
            mFaceView = findViewById(R.id.face_view);
        }
    }
  3. Create a listener method for the switch. When the switch is turned on, the loadAsset method is called to load the product's 3D model, and LandmarkType specifies the facial position to which the model is anchored (here, the tip of the nose).
    // Obtain the switch defined in the layout file.
    Switch mSwitch = findViewById(R.id.switch_view);
    mSwitch.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
        @Override
        public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
            mFaceView.clearResource();
            if (isChecked) {
                // Load the materials.
                int index = mFaceView.loadAsset("FaceView/sunglasses.glb",
                        LandmarkType.TIP_OF_NOSE);
            }
        }
    });

    Use setInitialPose to adjust the size and position of the model. Create the position, rotation, and scale arrays and pass values to them.

    final float[] position = { 0.0f, 0.0f, -0.15f };
    final float[] rotation = { 0.0f, 0.0f, 0.0f, 0.0f };
    final float[] scale = { 2.0f, 2.0f, 0.3f };

    Put the following code below the loadAsset line:

    mFaceView.setInitialPose(index, position, scale, rotation);
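
    The all-zero rotation array above leaves the model unrotated; the four values appear to form a quaternion. If a non-identity rotation is ever needed, it can be derived from a rotation axis and an angle. The helper below is purely illustrative (QuaternionUtil is not part of the Scene Kit API), and it assumes an (x, y, z, w) component order:

    ```java
    // Illustrative helper: builds a rotation quaternion from an axis and an
    // angle in radians, assuming an (x, y, z, w) component layout.
    public final class QuaternionUtil {
        private QuaternionUtil() {}

        public static float[] fromAxisAngle(float ax, float ay, float az, float angleRad) {
            // Normalize the rotation axis (it must be non-zero).
            float len = (float) Math.sqrt(ax * ax + ay * ay + az * az);
            float s = (float) Math.sin(angleRad / 2.0) / len;
            return new float[] {
                ax * s, ay * s, az * s, (float) Math.cos(angleRad / 2.0)
            };
        }
    }
    ```

    For example, fromAxisAngle(0f, 1f, 0f, (float) (Math.PI / 2)) yields roughly {0, 0.707, 0, 0.707}, a 90-degree turn about the vertical axis.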
  4. Create an onBtnTryProductOn method in MainActivity. When the user taps the button, FaceViewActivity is launched, enabling the user to view the try-on effect.
    public void onBtnTryProductOn(View view) {
        // FACE_VIEW_REQUEST_CODE is an app-defined permission request code.
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
                != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(
                    this,
                    new String[]{ Manifest.permission.CAMERA },
                    FACE_VIEW_REQUEST_CODE);
        } else {
            startActivity(new Intent(this, FaceViewActivity.class));
        }
    }
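
The permission request above hands control to the system dialog; the user's answer arrives in the activity's onRequestPermissionsResult callback, which the article does not show. A minimal sketch for MainActivity (assuming FACE_VIEW_REQUEST_CODE is the same app-defined constant used in the request) might look like:

```java
@Override
public void onRequestPermissionsResult(int requestCode,
        String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == FACE_VIEW_REQUEST_CODE
            && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        // The user granted camera access; open the try-on screen.
        startActivity(new Intent(this, FaceViewActivity.class));
    }
}
```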

History

  • 10th December, 2021: Initial version
