A React Native application to detect color waves using the mobile camera

Nov 09, 2021 4 min read

Colorwaver

An app to detect colorwaves (swatches/palettes) in the real world – powered by VisionCamera and Reanimated.

I wrote this app in less than a day, a speed simply not matched by native app development. Because it's written in React Native (TypeScript/JavaScript), it runs both on iOS and Android, but performance-critical parts (e.g. the image processing algorithm or the animations) are backed by native Objective-C/Swift/Java code.

See this Tweet for more information.

Buy Me a Coffee at ko-fi.com

Try it!

Download the repository and run the following commands to try Colorwaver for yourself:

iOS

yarn try-ios

Android

yarn try-android

Project structure

This is a bare React Native project, created with create-react-native-app.


  • 📁 src: Contains the actual TypeScript + React (-Native) front-end for the Colorwaver App.
    • 📄 src/getColorPalette.ts: Exposes the native iOS/Android frame processor plugin as a JS function with TypeScript types. This function has to be called from inside a frame processor ('worklet'). A simplified sketch of this wrapper appears after this list.

    • 📄 src/Router.tsx: The main component that gets registered by index.js. This acts as a main navigator, navigating either to the Splash Page (Permissions) or the App Page (Main App) depending on whether the user has granted permissions or not.

    • 📄 src/Splash.tsx: The first Splash page to ask the user for Camera permission.

    • 📄 src/App.tsx: Contains the actual source code for the App's front-end.

      • VisionCamera is used to get a Camera device and display a Camera component. Also, a frame processor (a function that gets called for every frame the camera "sees") gets attached, which calls the native iOS/Android frame processor plugin.
      • Reanimated is used to smoothly animate between color changes.

      Because VisionCamera also uses the Worklet API, the entire process between receiving a camera frame and actually displaying the palette's colors does not use the React-JS Thread at all.
      The frame processing runs on a separate Thread from VisionCamera, which then dispatches the animations on the Reanimated UI Thread.
      This is why the App runs as smoothly as a native iOS or Android app. (A simplified sketch of this wiring appears after this list.)

    • 📄 src/useAnimatedColor.ts: A helper function to animate color changes with SharedValues. (An illustrative sketch of this idea appears after this list.)


  • 📁 ios: Contains the basic skeleton for a React Native iOS app, plus the native getColorPalette(...) Frame Processor Plugin.
    • 📄 ios/PaletteFrameProcessorPlugin.m: Declares the Swift frame processor plugin "getColorPalette(...)".

    • 📄 ios/PaletteFrameProcessorPlugin.swift: Contains the actual Swift code for the native iOS frame processor plugin "getColorPalette(...)".

      This uses the CoreImage API to convert the CMSampleBuffer to a UIImage, and then uses the UIImageColors library to build the color palette.

      VisionCamera's frame processor API is used to expose this function as a frame processor plugin.

    • 📄 ios/Colorwaver-Bridging-Header.h: A Bridging Header to import Objective-C headers into Swift.

    • 📄 ios/Podfile: Adds the UIImageColors library.

    • 📄 ios/UIColor+hexString.swift: An extension for UIColor to convert UIColor instances to strings. This is required because React Native handles colors as strings.


  • 📁 android: Contains the basic skeleton for a React Native Android app, plus the native getColorPalette(...) Frame Processor Plugin.
    • 📄 android/app/build.gradle: The Gradle build file for the Android project. The following third-party dependencies are installed:

      • androidx.camera: camera-core
      • androidx.palette: palette
    • 📁 android/app/src/main/java/com/colorwaver/utils: Contains two files copied from android/camera-samples to convert ImageProxy instances to Bitmaps. (YuvToRgbConverter.kt)

    • 📁 android/app/src/main/java/com/colorwaver: Contains the actual Java source code of the project.

    • 📄 android/app/src/main/java/com/colorwaver/MainApplication.java: Sets up react-native-reanimated.

    • 📄 android/app/src/main/java/com/colorwaver/MainActivity.java: Installs the PaletteFrameProcessorPlugin frame processor plugin inside of the onCreate method.

    • 📄 android/app/src/main/java/com/colorwaver/PaletteFrameProcessorPlugin.java: Contains the actual Java code for the native Android frame processor plugin "getColorPalette(...)".

      This uses the YuvToRgbConverter to convert the ImageProxy to a Bitmap, and then passes that to the Palette API from AndroidX to build the color palette.

      VisionCamera's frame processor API is used to expose this function as a frame processor plugin.


  • 📄 babel.config.js: Adds the native frame processor plugin getColorPalette (called __getColorPalette) to Reanimated's list of globals. (A sketch of this configuration appears below.)
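
To make the structure above more concrete, here is a minimal sketch of what the src/getColorPalette.ts wrapper could look like. The Palette shape (primary/secondary/background/detail, following what UIImageColors produces) is an assumption; the repository's exact types may differ:

```typescript
// Simplified sketch of src/getColorPalette.ts (the Palette shape is an assumption)
import type { Frame } from 'react-native-vision-camera';

export interface Palette {
  primary: string;
  secondary: string;
  background: string;
  detail: string;
}

// VisionCamera injects registered frame processor plugins into the worklet
// runtime as globals prefixed with "__", so the native "getColorPalette"
// plugin becomes the global __getColorPalette.
declare const __getColorPalette: (frame: Frame) => Palette;

export function getColorPalette(frame: Frame): Palette {
  'worklet';
  return __getColorPalette(frame);
}
```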
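
The camera and frame processor wiring in src/App.tsx can be pictured roughly as follows. This is an illustrative sketch, not the app's exact source; the component layout and the frameProcessorFps value are assumptions:

```tsx
// Simplified sketch of the App.tsx camera + frame processor wiring
import React from 'react';
import { StyleSheet } from 'react-native';
import { Camera, useCameraDevices, useFrameProcessor } from 'react-native-vision-camera';
import { useSharedValue } from 'react-native-reanimated';
import { getColorPalette } from './getColorPalette';

export function App() {
  const devices = useCameraDevices();
  const device = devices.back;

  // Written from the frame processor thread, read on the Reanimated UI thread;
  // the React-JS thread is never involved.
  const primaryColor = useSharedValue('#000000');

  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    const palette = getColorPalette(frame);
    primaryColor.value = palette.primary;
  }, [primaryColor]);

  if (device == null) return null;
  return (
    <Camera
      style={StyleSheet.absoluteFill}
      device={device}
      isActive={true}
      frameProcessor={frameProcessor}
      frameProcessorFps={5} // assumed throttle for palette updates
    />
  );
}
```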
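
src/useAnimatedColor.ts could follow a pattern along these lines. This is only an illustration of the idea (animating color changes purely with SharedValues on the UI thread); the repository's actual implementation may differ:

```typescript
// Illustrative sketch: ease towards each new color instead of jumping
import Animated, {
  interpolateColor,
  useAnimatedReaction,
  useDerivedValue,
  useSharedValue,
  withTiming,
} from 'react-native-reanimated';

export function useAnimatedColor(color: Animated.SharedValue<string>) {
  const previousColor = useSharedValue('#000000');
  const progress = useSharedValue(1);

  // When the target color changes, remember the old color and restart
  // a 0 -> 1 timing animation on the UI thread.
  useAnimatedReaction(
    () => color.value,
    (next, previous) => {
      if (previous != null && next !== previous) {
        previousColor.value = previous;
        progress.value = 0;
        progress.value = withTiming(1, { duration: 500 });
      }
    },
    [color]
  );

  // Blend from the previous color to the current one as the progress advances.
  return useDerivedValue(
    () => interpolateColor(progress.value, [0, 1], [previousColor.value, color.value]),
    [color]
  );
}
```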
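
Finally, the babel.config.js entry described above presumably looks like the standard Reanimated/VisionCamera plugin registration. The preset line is the usual React Native default and is an assumption here:

```js
// babel.config.js: registers the native plugin global with Reanimated's worklet runtime
module.exports = {
  presets: ['module:metro-react-native-babel-preset'],
  plugins: [
    [
      'react-native-reanimated/plugin',
      {
        // Allow worklets to reference the injected __getColorPalette global
        globals: ['__getColorPalette'],
      },
    ],
  ],
};
```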

Credits

GitHub

View GitHub
