Easepect
Solutions / Hyperspectral

Spectral fingerprints, at video rate.

96 contiguous spectral bands captured simultaneously across the whole scene, 30 frames per second. No scanning, no motion blur — the snapshot hyperspectral platform Easepect deploys for defence, research and industrial inspection.

96 bands

400–1000 nm contiguous coverage

30 fps

Snapshot — no scanning, no motion blur

Python / ROS

First-class SDK for research + robotics

Edge-ready

Runs on Jetson for real-time deployment

Why hyperspectral

Every pixel, every band, every frame — simultaneously.

Traditional hyperspectral systems scan line-by-line and freeze the scene; any motion ruins the cube. Living Optics captures the full spectrum in a single snapshot — image moving objects, flying platforms and live conveyor processes without motion artefacts. Every pixel in every frame carries its own 96-band signature.

Easepect supports Living Optics customers across Southeast Asia with scoping workshops, loan units, airframe integration and model-training support — so you arrive at a deployable pipeline, not just a beautiful cube.

Snapshot · 96 bands · Python / ROS · Real-time
[Image: advanced hyperspectral imaging research]
Stack

Camera, SDK, model, edge.

01 Hyperspectral

Living Optics SpectralCamera

Snapshot hyperspectral.

96 contiguous spectral bands captured simultaneously across the scene at 30 fps. No scanning, no motion blur. The reference snapshot hyperspectral camera.

96 bands · Snapshot · 30 fps
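The numbers above fix the shape of the data product: every frame is a (height × width × 96) cube with band centres spanning 400–1000 nm. A minimal NumPy sketch (dimensions and band centres here are illustrative, not the camera's calibration table):

```python
import numpy as np

# Hypothetical snapshot frame: height x width x 96 bands.
H, W, BANDS = 480, 640, 96
cube = np.zeros((H, W, BANDS), dtype=np.float32)

# Evenly spaced band centres across 400-1000 nm (illustrative only;
# the camera ships its own calibration).
wavelengths = np.linspace(400.0, 1000.0, BANDS)
spacing = float(wavelengths[1] - wavelengths[0])

print(cube.shape)         # (480, 640, 96)
print(round(spacing, 2))  # 6.32, roughly the ~6 nm spacing quoted below
```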
02 Hyperspectral

Analyser SDK

Python + ROS bindings.

First-class Python API and ROS 2 driver. Stream raw cubes, run pre-trained classifiers or slot in your own PyTorch / scikit-learn models.

Python · ROS 2 · PyTorch
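The Analyser SDK's actual API is documented with the product; as a purely illustrative sketch of the per-frame loop a snapshot stream enables (every name below is hypothetical, not the real SDK):

```python
import numpy as np

# Hypothetical stand-in for a cube stream; these functions are
# illustrative only, NOT the real Analyser SDK API.
def stream_cubes(n_frames, shape=(480, 640, 96)):
    """Yield synthetic (H, W, 96) spectral cubes, one per frame."""
    rng = np.random.default_rng(0)
    for _ in range(n_frames):
        yield rng.random(shape, dtype=np.float32)

def classify(cube):
    """Toy per-pixel 'classifier': index of the strongest band."""
    return cube.argmax(axis=-1)  # (H, W) label map, one label per pixel

labels = [classify(c) for c in stream_cubes(3)]
print(len(labels), labels[0].shape)  # 3 (480, 640)
```

Because every frame is a full cube, the classifier runs per frame with no scan reassembly step in the loop.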
03 Hyperspectral

Pre-trained models

Starting-point classifiers.

Pre-trained models for material discrimination, vegetation, polymer sorting and food quality. Fine-tune on a few hundred labelled samples.

Pre-trained · Transfer
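Fine-tuning on a few hundred labelled samples amounts to fitting a per-pixel classifier over 96-band spectra. A self-contained scikit-learn sketch on synthetic data (the two "materials" are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in dataset: 300 labelled pixel spectra, 96 bands,
# two "materials" that reflect more strongly in different band ranges.
X = rng.normal(size=(300, 96)).astype(np.float32)
y = rng.integers(0, 2, size=300)
X[y == 0, 10:20] += 2.0   # material A: bump in the blue-green bands
X[y == 1, 60:70] += 2.0   # material B: bump in the near-infrared bands

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```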
04 Hyperspectral

Edge compute

NVIDIA Jetson deployment.

Move the trained model to a Jetson for real-time production use — conveyor inspection, UAV onboard inference, ISR.

Jetson · Edge · Real-time
Workflow

From scoping to deployed pipeline.

01

Scope

Define the target material or signal. Easepect runs a scoping workshop — bring samples, leave with a data plan.

02

Capture

Benchtop, drone or conveyor — image samples and live processes to build a labelled spectral dataset.

03

Model

Start from a pre-trained classifier or fine-tune your own on the captured cube. Python / PyTorch workflow.
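The fine-tuning step can be sketched in plain PyTorch; the network and labels below are placeholders, not a shipped model:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder data: 256 labelled pixel spectra (96 bands, 3 classes).
X = torch.randn(256, 96)
y = torch.randint(0, 3, (256,))

# Small per-pixel classifier head, a stand-in for fine-tuning a
# pre-trained model on your own labelled cube.
model = nn.Sequential(nn.Linear(96, 32), nn.ReLU(), nn.Linear(32, 3))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(model(X), y)  # cross-entropy over the 3 classes
    loss.backward()
    opt.step()
```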

04

Deploy

Move the trained model to edge compute (Jetson) or host PC for real-time production use.

Hyperspectral vs Multispectral

Which one is right for the job?

Dimension           | Hyperspectral (Living Optics)      | Multispectral (MicaSense)
Bands               | 96 contiguous (400–1000 nm)        | 5 fixed bands
Spectral resolution | ≈ 6 nm                             | Wide bands
Capture mode        | Snapshot, every frame              | Snapshot, every frame
Motion-safe         | Yes — 30 fps full cube             | Yes
Use case            | Research, novel materials, defence | Routine agronomy (NDVI / NDRE)
Indicative cost     | SGD 5–6 figures                    | SGD 4–5 figures

For routine agronomy (NDVI / NDRE), multispectral is the better-value pick. For novel materials, defence applications, polymer sorting or industrial inline inspection, only hyperspectral resolves the signal.
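The NDVI comparison in the table is easy to make concrete: with contiguous bands you pick the band centres nearest red (~670 nm) and NIR (~800 nm) rather than relying on fixed filters. A sketch assuming evenly spaced 400–1000 nm band centres (illustrative, not the camera's calibration):

```python
import numpy as np

wavelengths = np.linspace(400.0, 1000.0, 96)  # assumed band centres, nm

def ndvi(cube, red_nm=670.0, nir_nm=800.0):
    """NDVI = (NIR - Red) / (NIR + Red), per pixel, from a (H, W, 96) cube."""
    red = cube[..., np.abs(wavelengths - red_nm).argmin()]
    nir = cube[..., np.abs(wavelengths - nir_nm).argmin()]
    return (nir - red) / (nir + red + 1e-8)

# Toy cube: healthy vegetation reflects strongly in NIR, weakly in red.
cube = np.full((2, 2, 96), 0.1, dtype=np.float32)
cube[..., wavelengths > 700] = 0.5
print(ndvi(cube).round(2))  # ~0.67 everywhere
```

With 96 bands the same cube also supports NDRE or any custom index without changing hardware; a 5-band camera fixes those choices at purchase time.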

Applications

Where Living Optics opens new signal.

  • Defence — material discrimination, camouflaged-target detection, ISR.
  • Food & pharma — inline quality inspection and contamination screening.
  • Recycling — polymer and metal sorting by spectral signature.
  • Mineralogy — real-time ore-grade estimation on conveyors.
  • Precision agriculture — early disease detection before visible symptoms.
  • Environmental science — water quality, algal blooms, pollutant plumes.
  • Forensics, cultural heritage and document authentication.
  • Medical and life-science research imaging.
FAQ

Common questions.

How is this different from multispectral (e.g. MicaSense)?

Multispectral gives you 5–10 pre-selected broad bands. Living Optics gives you 96 contiguous bands across 400–1000 nm in a single snapshot. You can discriminate materials that multispectral cannot see — similar polymers, mineral phases, camouflage, early-stage disease — because the full spectral fingerprint is there.

Snapshot vs push-broom / line-scan hyperspectral?

Traditional hyperspectral systems build the cube line-by-line as the sensor or scene moves. Any motion blurs the cube. Living Optics captures every band of the 2D scene simultaneously at 30 fps — so you can image moving objects, live processes and flying platforms without stitching.

Do I need to train my own model?

Not always. Pre-trained classifiers ship for common tasks (plastics sorting, vegetation stress, material discrimination). For novel targets you fine-tune on a few hundred labelled samples — we run the workshop as part of the sale.

Can this fly on a drone?

Yes — the UAV payload variant mounts on DJI M300 / M350 class airframes. Typical flight: 80 m AGL, 3 m/s, full spectral cube recorded per frame for airborne hyperspectral surveys.

What does it cost?

Living Optics is research / industrial-grade hyperspectral — priced for capital procurement. Typical SpectralCamera + SDK is in the SGD high-five to low-six figures. Evaluation loans are available to qualified buyers.

Next step

Evaluate Living Optics for your use case.

On-site demos, loaner units and scoping workshops. Tell us the target material or signal — we design the evaluation.

Typical response · under 24 h · SGT business hours