Create a Virtual Fitting Room for Glasses in an E-commerce App
source link: https://www.codeproject.com/Tips/5319657/Create-a-Virtual-Fitting-Room-for-Glasses-in-an-E
Background
The ubiquity of the Internet and smart devices has made e-commerce the preferred choice for countless consumers. However, many longtime users have grown wary of the stagnant shopping model, so enhancing the user experience is critical to stimulating further growth in e-commerce and attracting a broader user base. If an app can offer intelligent graphics processing capabilities that identify a user's facial and physical features and, combined with a new display paradigm, let users try on products virtually through their phones, it will deliver a groundbreaking digital shopping experience.
Effects
A user opens the shopping app and taps a product's picture to view the product's 3D model, which they can rotate, enlarge, and shrink for interactive viewing.
Getting Started
Configuring the Maven Repository Address for the HMS Core SDK
Open the project-level build.gradle file in your Android Studio project. Go to buildscript > repositories and allprojects > repositories to configure the Maven repository address for the HMS Core SDK.
buildscript {
    repositories {
        ...
        maven { url 'http://developer.huawei.com/repo/' }
    }
}
allprojects {
    repositories {
        ...
        maven { url 'http://developer.huawei.com/repo/' }
    }
}
Adding Build Dependencies for the HMS Core SDK
Open the app-level build.gradle file of your project. Add build dependencies in the dependencies block and use the Full-SDK of Scene Kit and AR Engine SDK.
dependencies {
    ...
    implementation 'com.huawei.scenekit:full-sdk:5.0.2.302'
    implementation 'com.huawei.hms:arenginesdk:2.13.0.4'
}
Adding Permissions in the AndroidManifest.xml File
Open the AndroidManifest.xml file in the main directory and add the camera permission above the <application> element.
<!-- Camera permission -->
<uses-permission android:name="android.permission.CAMERA" />
Development Procedure
Configuring MainActivity
Add two buttons to the layout configuration file of MainActivity. Set the background of the onBtnShowProduct button to the preview image of the product, and add the text Try it on! to the onBtnTryProductOn button to guide the user to the feature.
<Button
    android:layout_width="260dp"
    android:layout_height="160dp"
    android:background="@drawable/sunglasses"
    android:onClick="onBtnShowProduct" />
<Button
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Try it on!"
    android:textAllCaps="false"
    android:textSize="24sp"
    android:onClick="onBtnTryProductOn" />
If the user taps the onBtnShowProduct button, the 3D model of the product is loaded. After tapping the onBtnTryProductOn button, the user enters the AR fitting screen.
Configuring the 3D Model Display for a Product
Create a SceneSampleView class inherited from SceneView.

public class SceneSampleView extends SceneView {
    public SceneSampleView(Context context) {
        super(context);
    }

    public SceneSampleView(Context context, AttributeSet attributeSet) {
        super(context, attributeSet);
    }
}
Override the surfaceCreated method to create and initialize SceneView. Then call loadScene to load the materials, which should be in glTF or GLB format, to have them rendered and displayed. Call loadSkyBox to load skybox materials, loadSpecularEnvTexture to load specular maps, and loadDiffuseEnvTexture to load diffuse maps. These files should be in the DDS (cubemap) format. All loaded materials are stored in the src > main > assets > SceneView folder.

@Override
public void surfaceCreated(SurfaceHolder holder) {
    super.surfaceCreated(holder);
    // Load the materials to be rendered.
    loadScene("SceneView/sunglasses.glb");
    // Call loadSkyBox to load skybox texture materials.
    loadSkyBox("SceneView/skyboxTexture.dds");
    // Call loadSpecularEnvTexture to load specular texture materials.
    loadSpecularEnvTexture("SceneView/specularEnvTexture.dds");
    // Call loadDiffuseEnvTexture to load diffuse texture materials.
    loadDiffuseEnvTexture("SceneView/diffuseEnvTexture.dds");
}
- Create a SceneViewActivity inherited from Activity. In the onCreate method, call setContentView and pass it the layout file that declares the SceneSampleView you created via an XML tag.

public class SceneViewActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_sample);
    }
}
Create SceneSampleView in the layout file as follows:

<com.huawei.scene.demo.sceneview.SceneSampleView
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
- Create an onBtnShowProduct method in MainActivity. When the user taps the onBtnShowProduct button, SceneViewActivity is started to load, render, and finally display the 3D model of the product.

public void onBtnShowProduct(View view) {
    startActivity(new Intent(this, SceneViewActivity.class));
}
Configuring AR Fitting for a Product
Product virtual try-on is easily accessible, thanks to the facial recognition, graphics rendering, and AR display capabilities offered by HMS Core.
Create a FaceViewActivity inherited from Activity, and create the corresponding layout file. Create face_view in the layout file to display the try-on effect.

<com.huawei.hms.scene.sdk.FaceView
    android:id="@+id/face_view"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    app:sdk_type="AR_ENGINE">
</com.huawei.hms.scene.sdk.FaceView>
Create a switch. When the user taps it, they can compare how they look with and without the virtual glasses.

<Switch
    android:id="@+id/switch_view"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_alignParentTop="true"
    android:layout_marginTop="15dp"
    android:layout_alignParentEnd="true"
    android:layout_marginEnd="15dp"
    android:text="Try it on"
    android:theme="@style/AppTheme"
    tools:ignore="RelativeOverlap" />
- Override the onCreate method in FaceViewActivity to obtain FaceView.

public class FaceViewActivity extends Activity {
    private FaceView mFaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_face_view);
        mFaceView = findViewById(R.id.face_view);
    }
}
- Create a listener method for the switch. When the switch is enabled, the loadAsset method is called to load the 3D model of the product. Set the position for facial recognition in LandmarkType.

mSwitch.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
    @Override
    public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
        mFaceView.clearResource();
        if (isChecked) {
            // Load materials.
            int index = mFaceView.loadAsset("FaceView/sunglasses.glb", LandmarkType.TIP_OF_NOSE);
        }
    }
});
Use setInitialPose to adjust the size and position of the model. Create the position, rotation, and scale arrays and pass values to them.

final float[] position = { 0.0f, 0.0f, -0.15f };
final float[] rotation = { 0.0f, 0.0f, 0.0f, 0.0f };
final float[] scale = { 2.0f, 2.0f, 0.3f };
Put the following code below the loadAsset line:

mFaceView.setInitialPose(index, position, scale, rotation);
- Create an onBtnTryProductOn method in MainActivity. When the user taps the onBtnTryProductOn button, FaceViewActivity is started, enabling the user to view the try-on effect.

public void onBtnTryProductOn(View view) {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(
            this, new String[] { Manifest.permission.CAMERA }, FACE_VIEW_REQUEST_CODE);
    } else {
        startActivity(new Intent(this, FaceViewActivity.class));
    }
}
History
- 10th December, 2021: Initial version