From visible to SWIR
A fleet of miniaturized satellites will soon circle our planet. With hyperspectral sensing technology on board, they can reveal what happens on and beneath its surface.
Which parts of a mountain range are teeming with valuable metals? Where are the islands of waste plastic in our oceans? What’s the moisture level of a piece of land after a heat wave? ...
All that information – and more – can be obtained through hyperspectral remote sensing.
By equipping a satellite with hyperspectral imaging capabilities, you're able to see the big picture in sharp detail. Each pixel contains a complete breakdown of the visible and invisible wavelengths of the reflected light, revealing the spectral signatures of minerals, soil, vegetation, ...
Hyperspectral aerial image of large test fields of soybean cultures in Brazil. The color image is augmented with spectral features to show specific variations within crop cultures (e.g. disease patterns) – Courtesy of GAMAYA.
Applications of hyperspectral remote sensing range from precision agriculture to geology, environmental monitoring and archaeology.
Until recently, hyperspectral remote sensing suffered from low temporal resolution: images of a given location were taken one or two weeks apart – the revisit time of a large satellite. For many applications, like agriculture, that frequency was too low to fully realize the benefits of the technology.
That’s why the advent of low-cost miniaturized satellites – so-called CubeSats – promises to be a game-changer for hyperspectral remote sensing. They allow for the development of a constellation of satellites that scan the same location almost daily.
In space, and especially on small satellites, the constraints differ from those on the ground: instruments must be compact, perform well in low light, and withstand platform instability.
Those conditions are difficult to meet with grating-based cameras: they are bulky, they suffer from a low signal-to-noise ratio in low light, and because they reconstruct images line by line, platform instability greatly degrades image quality.
Watch this webinar to learn more about the data-processing pipeline in imec's hyperspectral software, which enables you to acquire reliable spectral images and videos in real-life circumstances.
As a leading R&D center for semiconductor technology, imec can deposit and pattern spectral filters on the surface of area-scan image sensors. That makes it possible to create push-frame hyperspectral sensors that relax the constraints on the line-of-sight pointing accuracy in small satellites.
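As a sketch of the push-frame principle, the following Python snippet simulates a hypothetical 8-row area-scan sensor carrying 4 spectral band stripes, with idealized one-row-per-frame ground motion. Because every ground line passes under every band stripe in successive frames, full-band hypercube lines can be assembled from the frame sequence (the geometry and numbers here are illustrative assumptions, not imec's actual sensor layout):

```python
import numpy as np

# Hypothetical geometry: 8 sensor rows, 4 spectral bands patterned
# in stripes of 2 adjacent rows each (top of sensor = band 0, etc.).
N_ROWS, N_BANDS, N_COLS = 8, 4, 16
band_of_row = np.repeat(np.arange(N_BANDS), N_ROWS // N_BANDS)

# Synthetic ground scene: reflectance per (ground line, band, column).
N_LINES = 32
rng = np.random.default_rng(0)
scene = rng.random((N_LINES, N_BANDS, N_COLS))

def capture_frame(t):
    """One area-scan frame: sensor row r sees ground line t + r
    through the spectral filter deposited on that row."""
    frame = np.empty((N_ROWS, N_COLS))
    for r in range(N_ROWS):
        frame[r] = scene[t + r, band_of_row[r]]
    return frame

# Push-frame reconstruction: every ground line is imaged by every
# band stripe in successive frames, so a full hypercube can be filled.
cube = np.full((N_LINES, N_BANDS, N_COLS), np.nan)
for t in range(N_LINES - N_ROWS + 1):
    frame = capture_frame(t)
    for r in range(N_ROWS):
        cube[t + r, band_of_row[r]] = frame[r]

# Interior ground lines are covered in all bands and match the scene.
interior = slice(N_ROWS, N_LINES - N_ROWS)
assert not np.isnan(cube[interior]).any()
assert np.allclose(cube[interior], scene[interior])
```

Because each frame is a full 2-D image rather than a single line, small pointing errors shift the frame as a whole and can be corrected by image registration, which is why push-frame sensing relaxes the line-of-sight stability requirements.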
The same patterning capability can also be used to increase the image's signal-to-noise ratio: by depositing exactly the same spectral filter on adjacent rows of pixels, the sensor gains TDI (time delay integration) functionality.
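A minimal simulation of the TDI principle (with hypothetical signal and read-noise values, and shot noise ignored for simplicity) shows the expected √N gain in signal-to-noise ratio from accumulating N identically filtered rows:

```python
import numpy as np

# Sketch of TDI: N_TDI adjacent rows carry the SAME spectral filter,
# so as the scene scrolls past, each ground line is exposed N_TDI
# times and the samples are accumulated.
rng = np.random.default_rng(1)
signal = 100.0        # hypothetical photo-signal per exposure
noise_sigma = 10.0    # hypothetical read noise per exposure
N_TDI, N_TRIALS = 8, 20000

single = signal + rng.normal(0, noise_sigma, N_TRIALS)
# Accumulate N_TDI independent exposures, normalized to one exposure.
tdi = (signal + rng.normal(0, noise_sigma, (N_TRIALS, N_TDI))).mean(axis=1)

snr_single = signal / single.std()
snr_tdi = signal / tdi.std()
print(f"SNR gain with {N_TDI}-row TDI: {snr_tdi / snr_single:.2f} "
      f"(theory: {np.sqrt(N_TDI):.2f})")
```

With 8 TDI rows the measured gain comes out close to √8 ≈ 2.83, which is why even a modest number of duplicated filter rows noticeably improves low-light performance.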
In an alternative configuration, push-frame hyperspectral imaging can be combined with snapshot multispectral imaging. This opens up the possibility of video imaging from space – even in low-earth orbit.
Last but not least: with spectral filters operating in the SWIR range, combined with an off-the-shelf InGaAs detector, imec extends spectral imaging beyond the visible and NIR spectrum.
This makes these sensors ideal for use in satellites and high-altitude drones – for the classification of agricultural soils, minerals, plastic pollution, surveillance ... With imec's current off-the-shelf sensors, you can evaluate the technology for these use cases before moving to a space-grade sensing solution.
Imec has a long history of applying its innovative technologies to research in and from space. Our hyperspectral imaging sensors were already used in the CHIEM project that developed a novel compact hyperspectral imager – compatible with a 12U CubeSat satellite.
Contact us if you want to use our sensors for your satellite projects. Our off-the-shelf sensors, which have not been designed for remote sensing and are not space-qualified, can be used for technology validation. And we're happy to help you with the custom development of your sensor or system.
Presenter: Wouter Charle, imec
Using hyperspectral imaging on a drone is challenging for multiple reasons: integration in the drone may require combining many sensors, and to output good data, pushbroom cameras require a controlled environment – or multiple attempts and a crew of several people.
Live from the imec demo studio, we will demonstrate how imec's newest snapshot drone camera can be connected to a DJI M600Pro drone and controlled remotely with live streaming of data. A stitched hypercube will be shown to the audience to illustrate how the pre-processing pipeline reduces the time researchers spend before starting their analysis.
Presenter: Paul Danini, imec
Spectral imaging has matured over the last five years, and the technology is now progressively moving from the lab to real-life use cases. In this presentation, the audience will learn how imec technology drives new applications and benefits for the remote sensing community, illustrated with examples from our track record and use cases. The 60-minute talk will also include a live demonstration of the snapshot SWIR camera to show how imec technology is implemented in practice.
Contact our business development team.