Living Optics SpectralCamera
Snapshot hyperspectral.
96 contiguous spectral bands captured simultaneously across the whole scene, 30 frames per second. No scanning, no motion blur — the snapshot hyperspectral platform Easepect deploys for defence, research and industrial inspection.
- 400–1000 nm contiguous coverage
- Snapshot — no scanning, no motion blur
- First-class SDK for research and robotics
- Runs on NVIDIA Jetson for real-time deployment
Traditional hyperspectral systems scan line-by-line and freeze the scene; any motion ruins the cube. Living Optics captures the full spectrum in a single snapshot — image moving objects, flying platforms and live conveyor processes without motion artefacts. Every pixel in every frame carries its own 96-band signature.
Easepect supports Living Optics customers across Southeast Asia with scoping workshops, loan units, airframe integration and model-training support — so you arrive at a deployable pipeline, not just a beautiful cube.
Python + ROS bindings.
First-class Python API and ROS 2 driver. Stream raw cubes, run pre-trained classifiers or slot in your own PyTorch / scikit-learn models.
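The basic pattern is the same whatever model you slot in: treat each frame as a (height, width, bands) cube, flatten it to a (pixels, bands) matrix, and feed it to any per-pixel classifier. The sketch below uses a synthetic NumPy cube and a toy rule in place of the real SDK stream and a trained model — the actual Living Optics class and method names are not shown here, so consult the SDK docs for the streaming API.

```python
import numpy as np

# Stand-in for one frame from the camera: (height, width, bands).
# A real pipeline would pull this from the SDK's streaming interface.
H, W, BANDS = 64, 64, 96
cube = np.random.rand(H, W, BANDS).astype(np.float32)

# Flatten to (pixels, bands) so any scikit-learn / PyTorch model that
# expects a 2-D feature matrix can consume it directly.
pixels = cube.reshape(-1, BANDS)

# Toy "model": label each pixel by which half of the spectrum carries
# more energy (a placeholder for a real trained classifier).
labels = (pixels[:, 48:].sum(axis=1) > pixels[:, :48].sum(axis=1)).astype(np.uint8)
label_map = labels.reshape(H, W)
print(label_map.shape)  # (64, 64)
```

Swapping the toy rule for `model.predict(pixels)` or a PyTorch forward pass is the only change needed for a real classifier.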
Starting-point classifiers.
Pre-trained models for material discrimination, vegetation, polymer sorting and food quality. Fine-tune on a few hundred labelled samples.
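Fine-tuning on a few hundred labelled spectra is a standard tabular-learning problem once each sample is a 96-value vector. A minimal scikit-learn sketch on synthetic data (the two "materials" and their spectra are invented purely for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
BANDS = 96

# Synthetic stand-in for "a few hundred labelled samples": two materials
# with opposite spectral slopes plus measurement noise.
n = 400
y = rng.integers(0, 2, n)                    # 0 = material A, 1 = material B
base = np.linspace(0.2, 0.8, BANDS)
X = np.where(y[:, None] == 1, base, base[::-1]) + rng.normal(0, 0.05, (n, BANDS))

# 300 samples to train, 100 held out to check the fit.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[:300], y[:300])
acc = clf.score(X[300:], y[300:])
print(f"hold-out accuracy: {acc:.2f}")
```

With real cubes, `X` comes from labelled pixel regions and `y` from the scoping workshop's ground truth; the fit/score loop is unchanged.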
NVIDIA Jetson deployment.
Move the trained model to a Jetson for real-time production use — conveyor inspection, UAV onboard inference, ISR.
1. Define the target material or signal. Easepect runs a scoping workshop — bring samples, leave with a data plan.
2. Benchtop, drone or conveyor — image samples and live processes to build a labelled spectral dataset.
3. Start from a pre-trained classifier or fine-tune your own on the captured cube. Python / PyTorch workflow.
4. Move the trained model to edge compute (Jetson) or host PC for real-time production use.
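At deployment the per-frame work reduces to a tight loop: grab a cube, classify every pixel, act on the label map. The sketch below uses a nearest-mean classifier over assumed per-class mean spectra — Jetson-specific acceleration (TensorRT, CUDA) is out of scope here, but the loop structure is the same.

```python
import numpy as np

BANDS = 96
rng = np.random.default_rng(1)

# Assumed artefact from the training step: one mean spectrum per class.
class_means = np.stack([np.linspace(0.2, 0.8, BANDS),
                        np.linspace(0.8, 0.2, BANDS)])   # (classes, bands)

def classify_frame(cube):
    """Nearest-mean label for every pixel of one (H, W, bands) cube."""
    px = cube.reshape(-1, cube.shape[-1])                          # (pixels, bands)
    d = ((px[:, None, :] - class_means[None, :, :]) ** 2).sum(-1)  # squared distance
    return d.argmin(axis=1).reshape(cube.shape[:2])

# Stand-in for frames arriving from the camera at 30 fps.
for _ in range(3):
    frame = rng.random((32, 32, BANDS)).astype(np.float32)
    labels = classify_frame(frame)
print(labels.shape)  # (32, 32)
```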
| Dimension | Hyperspectral (Living Optics) | Multispectral (MicaSense) |
|---|---|---|
| Bands | 96 contiguous (400–1000 nm) | 5 fixed bands |
| Spectral resolution | ≈ 6 nm | Broad bands |
| Capture mode | Snapshot, every frame | Snapshot, every frame |
| Motion-safe | Yes — 30 fps full cube | Yes |
| Use case | Research, novel materials, defence | Routine agronomy NDVI / NDRE |
| Indicative cost | SGD 5–6 figures | SGD 4–5 figures |
For routine agronomy NDVI / NDRE, multispectral is the better-value pick. For novel materials, defence applications, polymer sorting or industrial inline inspection, only hyperspectral resolves the signal.
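Contiguous coverage also means the classic multispectral indices fall out for free: pick the band centres nearest the wavelengths the index needs. A sketch, assuming 96 evenly spaced band centres across 400–1000 nm (real calibration tables may differ slightly):

```python
import numpy as np

# Assumption: evenly spaced band centres matching the quoted ~6 nm spacing.
wavelengths = np.linspace(400, 1000, 96)

def band_for(nm):
    """Index of the band centre closest to a target wavelength in nm."""
    return int(np.abs(wavelengths - nm).argmin())

red, nir = band_for(668), band_for(840)      # typical NDVI red / NIR centres

cube = np.random.rand(16, 16, 96)            # stand-in hyperspectral frame
ndvi = (cube[..., nir] - cube[..., red]) / (cube[..., nir] + cube[..., red] + 1e-9)
print(ndvi.shape)  # (16, 16)
```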
Multispectral gives you 5–10 pre-selected broad bands. Living Optics gives you 96 contiguous bands across 400–1000 nm in a single snapshot. You can discriminate materials multispectral cannot see — similar polymers, mineral phases, camouflage, early-stage disease — because the full spectral fingerprint is there.
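A common way to compare full fingerprints is the spectral angle: the angle between two 96-dimensional spectra, small when the materials match. The two "polymers" below are invented for illustration — a subtle absorption feature near 930 nm that broad multispectral bands would average away, but ~6 nm sampling resolves.

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two spectra; small angle = similar material."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

bands = np.linspace(400, 1000, 96)

# Hypothetical polymers: identical except for a narrow dip near 930 nm.
poly_a = 0.5 + 0.1 * np.sin(bands / 80)
poly_b = poly_a - 0.08 * np.exp(-((bands - 930) ** 2) / (2 * 15 ** 2))

angle = spectral_angle(poly_a, poly_b)
print(f"{angle:.4f} rad")  # nonzero: the full fingerprints separate
```

Classifying a pixel then reduces to taking the smallest angle against a library of reference spectra.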
Traditional hyperspectral systems build the cube line-by-line as the sensor or scene moves. Any motion blurs the cube. Living Optics captures every band of the 2D scene simultaneously at 30 fps — so you can image moving objects, live processes and flying platforms without stitching.
Not always. Pre-trained classifiers ship for common tasks (plastics sorting, vegetation stress, material discrimination). For novel targets you fine-tune on a few hundred labelled samples — we run the workshop as part of the sale.
Yes — the UAV payload variant mounts on DJI M300 / M350 class airframes. Typical flight: 80 m AGL, 3 m/s, full spectral cube recorded per frame for airborne hyperspectral surveys.
Living Optics is research / industrial-grade hyperspectral — priced for capital procurement. Typical SpectralCamera + SDK is in the SGD high-five to low-six figures. Evaluation loans are available to qualified buyers.
On-site demos, loaner units and scoping workshops. Tell us the target material or signal — we design the evaluation.