A time-bound technical assessment integrating Android Camera, OpenCV (C++), OpenGL ES, JNI, and a minimal TypeScript Web Viewer.
This project demonstrates:
- Real-time camera capture on Android using Camera2 API.
- Frame processing in native C++ via OpenCV (e.g., Canny edge detection / grayscale).
- Rendering the processed frame via OpenGL ES 2.0.
- Exporting or previewing the processed frame in a TypeScript-based web viewer.
- Camera capture using `TextureView`/`SurfaceTexture` (Camera2 recommended)
- JNI bridge to native C++ (NDK + CMake)
- OpenCV (C++) processing: Grayscale + Canny Edge Detection
- Render processed frames using OpenGL ES 2.0 as a texture
- Minimal TypeScript web viewer showing a static processed frame + FPS text overlay
- Bonus: Toggle button (raw / processed) and FPS counter (optional)
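The bonus FPS counter can live entirely in Java, independent of the camera and GL code. A minimal sketch (the class and method names are illustrative, not prescribed by the assessment):

```java
// Illustrative FPS counter: call tick() once per rendered frame.
// Class and method names are assumptions, not part of the assessment spec.
public class FpsCounter {
    private long windowStartNanos = 0;
    private int framesInWindow = 0;
    private double fps = 0.0;

    /** Record one frame; recompute FPS once per one-second window. */
    public void tick(long nowNanos) {
        if (windowStartNanos == 0) windowStartNanos = nowNanos;
        framesInWindow++;
        long elapsed = nowNanos - windowStartNanos;
        if (elapsed >= 1_000_000_000L) {
            fps = framesInWindow * 1e9 / elapsed;
            framesInWindow = 0;
            windowStartNanos = nowNanos;
        }
    }

    public double getFps() { return fps; }
}
```

On Android the caller would pass `System.nanoTime()` from the render loop and draw `getFps()` into the overlay.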
🧩 Folder Structure
```
edge_assessment/
├── app/
│   ├── src/main/java/com/example/edgeviewer/
│   │   └── MainActivity.java
│   └── res/layout/
│       └── activity_main.xml
│
├── jni/
│   ├── CMakeLists.txt
│   └── native-lib.cpp
│
├── gl/
│   └── GLRenderer.java
│
├── web/
│   ├── index.html
│   ├── main.ts
│   ├── tsconfig.json
│   └── processed_sample.png
│
└── README.md
```
High-level flow:
Camera (SurfaceTexture) -> Java/Kotlin frame callback -> Mat (Java-side or direct native Mat) -> JNI -> native C++ (OpenCV) -> processed Mat -> upload to OpenGL texture -> GLSurfaceView renders texture
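The grayscale half of the native processing step is a single `cv::cvtColor(src, dst, cv::COLOR_RGBA2GRAY)` call. As a rough illustration of the per-pixel work that conversion performs, here is OpenCV's BT.601 luma formula in plain Java (the class name and ARGB packing are assumptions made for this sketch; the real conversion happens in native code):

```java
// Illustrative only: shows the per-pixel BT.601 luma formula
// (Y = 0.299 R + 0.587 G + 0.114 B) that OpenCV's RGBA->GRAY
// conversion applies. Not the production path, which runs in C++.
public class GraySketch {
    /** pixels are packed ARGB ints, as returned by Bitmap.getPixels(). */
    public static int[] toGray(int[] argb) {
        int[] gray = new int[argb.length];
        for (int i = 0; i < argb.length; i++) {
            int p = argb[i];
            int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
            // BT.601 luma weights, rounded to the nearest 8-bit value
            gray[i] = (int) Math.round(0.299 * r + 0.587 * g + 0.114 * b);
        }
        return gray;
    }
}
```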
Modules:
- `/app` — Android app (camera setup, JNI calls, UI controls, lifecycle)
- `/jni` — native C++ code (OpenCV processing). Use `Mat` objects and JNI functions that accept `jlong` pointers to `cv::Mat`.
- `/gl` — OpenGL ES 2.0 renderer, shader programs, texture upload helper.
- `/web` — TypeScript + HTML viewer that displays a sample processed frame and a small stats overlay.
1. Install prerequisites
   - Android Studio (prefer latest stable)
   - Android NDK (r23b or compatible)
   - CMake (bundled with Android Studio, or install separately)
   - OpenCV Android SDK (download from opencv.org) — extract it and reference `sdk/native/jni/include` and `sdk/native/libs` in `CMakeLists.txt`
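A minimal `jni/CMakeLists.txt` might look like the sketch below; the `OpenCV_DIR` path is an assumption and must point at wherever the OpenCV Android SDK was actually extracted:

```cmake
# Sketch of jni/CMakeLists.txt -- adjust OpenCV_DIR to the local SDK layout.
cmake_minimum_required(VERSION 3.18)
project(edgeviewer)

# OpenCVConfig.cmake lives under sdk/native/jni in the Android SDK bundle
set(OpenCV_DIR ${CMAKE_SOURCE_DIR}/../opencv/sdk/native/jni)
find_package(OpenCV REQUIRED)

add_library(native-lib SHARED native-lib.cpp)
target_include_directories(native-lib PRIVATE ${OpenCV_INCLUDE_DIRS})
# log and GLESv2 are NDK system libraries
target_link_libraries(native-lib ${OpenCV_LIBS} log GLESv2)
```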
2. Project-level setup
   - Ensure `android.ndkVersion` in `build.gradle` matches the installed NDK
   - In `app/build.gradle`, enable `externalNativeBuild` with CMake and point it to `CMakeLists.txt`
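The corresponding `app/build.gradle` pieces might look like this sketch; the NDK and CMake version strings are placeholders to be matched against the locally installed tools:

```groovy
// Sketch of the relevant app/build.gradle fragment (version numbers
// are placeholders -- match them to the locally installed tools).
android {
    ndkVersion "23.1.7779620"
    externalNativeBuild {
        cmake {
            path file("../jni/CMakeLists.txt")
            version "3.18.1"
        }
    }
}
```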
3. Build & Run
   - Build in Android Studio (Gradle will invoke CMake to produce the `.so`)
   - Grant camera permission on first run

4. TypeScript Web Viewer
   - `cd web && npm install` (if using dev dependencies)
   - `npx tsc` (compiles `main.ts` -> `main.js`)
   - Open `index.html` in a browser (static; no server needed)
Make small focused commits. Push early & often.
- `chore: repo skeleton + README` — add top-level folders and README
- `feat: initial Android project (app module, activity)` — minimal MainActivity and gradle config
- `feat: camera capture (TextureView) + frame callback` — capture frame buffer, log frames
- `feat: add NDK integration + CMakeLists` — native-lib stub exported
- `feat: JNI bridge + OpenCV native processing (canny skeleton)` — static image test
- `feat: GL renderer skeleton (GLSurfaceView + GLRenderer)` — renders solid color
- `feat: connect processed Mat -> upload to GL texture` — show processed texture
- `feat: web viewer (TypeScript) + sample image` — basic viewer working
- `fix: performance tweaks (frame throttling, conversions)`
- `docs: add screenshots + final README updates`
```bash
git init
git add README.md
git commit -m "chore: repo skeleton + README"

# create app boilerplate
git add app/
git commit -m "feat: initial Android app skeleton (MainActivity, gradle)"

# add native + CMake
git add jni/
git commit -m "feat: add NDK integration and CMakeLists"

# add JNI processing
git add jni/native-lib.cpp
git commit -m "feat: JNI bridge + OpenCV canny skeleton"

# add GL renderer
git add gl/
git commit -m "feat: add GLRenderer skeleton"

# add web viewer
git add web/
git commit -m "feat: add simple TypeScript web viewer and sample image"

# push
git remote add origin <repo-url>
git push -u origin main
```