
Image & Video Processing

Basics of Image Fusion



With the development of new imaging sensors arises the need for a meaningful combination of all employed imaging sources. The fusion process can take place at different levels of information representation; a generic categorization is to consider the levels, sorted in ascending order of abstraction, as signal, pixel, feature and symbolic level. This article focuses on the so-called pixel-level fusion process, in which a composite image is built from several input images. To date, the result of pixel-level image fusion is primarily intended for presentation to a human observer, especially in image sequence fusion (where the input data consist of image sequences). A typical application is the fusion of forward-looking infrared (FLIR) and low-light television (LLTV) images obtained by an airborne sensor platform to help a pilot navigate in poor weather conditions or darkness.

Some generic requirements can be imposed on the result of pixel-level image fusion:

  • Pattern conservation: the fusion process should preserve all relevant information of the input imagery in the composite image.
  • No artifacts: the fusion scheme should not introduce any artifacts or inconsistencies that would distract the human observer or subsequent processing stages.
  • Shift and rotational invariance: the fusion result should not depend on the location or orientation of an object in the input imagery.

In the case of image sequence fusion, the additional problem of temporal stability and consistency of the fused sequence arises. The human visual system is primarily sensitive to moving light stimuli, so moving artifacts or time-dependent contrast changes introduced by the fusion process are highly distracting to the human observer. Two additional requirements therefore apply.

Temporal stability: The fused image sequence should be temporally stable, i.e. gray level changes in the fused sequence must be caused only by gray level changes in the input sequences; they must not be introduced by the fusion scheme itself.

Temporal consistency: Gray level changes occurring in the input sequences must be present in the fused sequence without any delay or contrast change.
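As a concrete illustration of the pixel-level process described above, the following Python sketch fuses two co-registered grayscale images with the simplest possible rule, a per-pixel average. The tiny FLIR/LLTV arrays are made-up stand-ins for real sensor frames.

```python
# Minimal sketch of pixel-level fusion: each output pixel is computed
# directly from the co-registered input pixels. The fusion rule here
# is a simple per-pixel average of two grayscale images (lists of rows).

def fuse_average(img_a, img_b):
    """Fuse two equally sized grayscale images by averaging each pixel."""
    assert len(img_a) == len(img_b)
    fused = []
    for row_a, row_b in zip(img_a, img_b):
        fused.append([(pa + pb) // 2 for pa, pb in zip(row_a, row_b)])
    return fused

# Two tiny 2x3 "images" standing in for, e.g., FLIR and LLTV frames
flir = [[200, 10, 30], [40, 250, 60]]
lltv = [[100, 90, 30], [80, 50, 220]]
print(fuse_average(flir, lltv))  # [[150, 50, 30], [60, 150, 140]]
```

Because the output depends only on the co-located input pixels, this rule is trivially shift-invariant and temporally stable; its weakness is that averaging lowers the contrast of features present in only one input, which motivates the more selective rules discussed later.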


Application of Image Fusion

Image fusion finds application in a very wide range of areas involving image processing. Some of the areas in which image fusion plays a critical role are the following.

Intelligent robots

  • Require motion control, based on feedback from the environment from visual, tactile, force/torque, and other types of sensors
  • Stereo camera fusion
  • Intelligent viewing control
  • Automatic target recognition and tracking

The images were taken with a Sony MVC-FD7 digital still camera.

Image 1 (focus on left part) | Image 2 (focus on right part)

Image taken using auto-focus function | The fused image (all in focus)

Medical image

With the development of new imaging methods in medical diagnostics arises the need for a meaningful (and spatially correct) combination of all available image datasets. Examples of imaging devices include computed tomography (CT), magnetic resonance imaging (MRI) and the newer positron emission tomography (PET). The images in result 1.3 illustrate the fusion of a CT and an MRI image.

  • Fusing X-ray computed tomography (CT) and magnetic resonance (MR) images
  • Computer assisted surgery
  • Spatial registration of 3-D surface




Industrial inspection

  • Electronic circuit inspection
  • Product surface inspection and measurement
  • Non-destructive material inspection
  • Manufacture process monitoring
  • Complex machine/device diagnostics
  • Intelligent robots on assembly lines

Image 1 (Visual) | Image 2 (MMW)

The region edges in image 1 | The region edges in image 2

The region/object detection of image 1 | The region/object detection of image 2

Military and law enforcement

  • Detection, tracking and identification of ocean, air and ground targets/events
  • Concealed weapon detection
  • Battle-field monitoring
  • Night pilot guidance


Image 1 | Image 2

The region edges in image 1 | The region edges in image 2

The region/object detection of image 1 | The region/object detection of image 2

The decision map of fusion | The result of region/object-based fusion


Remote sensing

  • Using various parts of the electro-magnetic spectrum
  • Sensors: from black-and-white aerial photography to multi-spectral active microwave space-borne imaging radar
  • Fusion techniques are classified into photographic method and numerical method



Navigation Aid

To allow helicopter pilots to navigate under poor visibility conditions (such as fog or heavy rain), helicopters are equipped with several imaging sensors, which the pilot can view on a helmet-mounted display. A typical sensor suite includes both a low-light television (LLTV) sensor and a thermal forward-looking infrared (FLIR) sensor. In the current configuration, the pilot can choose one of the two sensors to watch on the display. A possible improvement is to combine both imaging sources into a single fused image that contains the relevant image information of both devices. The images in result 1.1 illustrate this application.

Merging Out-Of-Focus Images

Due to the limited depth of focus of optical lenses (especially those with long focal lengths), it is often not possible to get an image in which all relevant objects are 'in focus'. One way to overcome this problem is to take several pictures with different focus points and combine them into a single frame that contains the focused regions of all input images. The images in result 1.2 illustrate this approach.
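The approach above can be sketched in Python: for each pixel, keep the value from whichever input is locally sharper. The absolute horizontal gradient used here as the focus measure is a deliberately crude stand-in for the block-variance or multiresolution measures that practical methods use.

```python
# Hedged sketch of multi-focus fusion: per-pixel selection driven by a
# simple local sharpness estimate.

def sharpness(img, r, c):
    """Absolute horizontal gradient as a crude per-pixel focus measure."""
    left = img[r][max(c - 1, 0)]
    right = img[r][min(c + 1, len(img[r]) - 1)]
    return abs(right - left)

def fuse_by_focus(img_a, img_b):
    """Pick each output pixel from the locally sharper input image."""
    rows, cols = len(img_a), len(img_a[0])
    fused = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if sharpness(img_a, r, c) >= sharpness(img_b, r, c):
                fused[r][c] = img_a[r][c]
            else:
                fused[r][c] = img_b[r][c]
    return fused

# One-row toy example: img_a is "sharp" on the left, img_b on the right
img_a = [[0, 255, 0, 128, 128, 128]]
img_b = [[80, 90, 100, 0, 255, 0]]
print(fuse_by_focus(img_a, img_b))
```

A per-pixel decision like this tends to produce a noisy decision map; real multi-focus methods smooth or regularize the map (e.g. region-based decisions, as in the figures below) before selecting pixels.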

Registered Images with Different Focus Points

Image 1 | Image 2

The region edges in image 1 | The region edges in image 2

The region/object detection of image 1 | The region/object detection of image 2

The decision map of fusion | The result of region/object-based fusion


Other related Articles

Matlab code for Pixel level Image fusion using Minimum Method
Matlab code for Pixel level Image fusion using Maximum Method
Matlab Code for Pixel Image fusion using Average Method
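For readers without MATLAB, the three pixel-level fusion rules named in these articles (minimum, maximum, average) each reduce to a one-line per-pixel operation; the following is a Python equivalent, sketched for illustration rather than taken from the linked code.

```python
# The minimum, maximum and average fusion rules applied per pixel to two
# co-registered grayscale images represented as lists of rows.

def fuse(img_a, img_b, rule):
    """Apply a two-argument fusion rule to every co-located pixel pair."""
    return [[rule(pa, pb) for pa, pb in zip(ra, rb)]
            for ra, rb in zip(img_a, img_b)]

a = [[10, 200], [60, 90]]
b = [[30, 100], [50, 130]]
print(fuse(a, b, min))                        # [[10, 100], [50, 90]]
print(fuse(a, b, max))                        # [[30, 200], [60, 130]]
print(fuse(a, b, lambda x, y: (x + y) // 2))  # [[20, 150], [55, 110]]
```

The maximum rule favors bright features from either input, the minimum rule favors dark ones, and the average trades contrast for noise reduction.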

Brain Computer Interface

What are Brainwaves?

At the root of all our thoughts, emotions and behaviours is the communication between neurons within our brains. Brainwaves are produced by synchronised electrical pulses from masses of neurons communicating with each other.

Brainwaves are detected using sensors placed on the scalp. They are divided into bandwidths to describe their functions (below), but are best thought of as a continuous spectrum of consciousness, from Delta (slow, loud and functional) to Gamma (fast, subtle and complex). It is a handy analogy to think of brainwaves as musical notes: the low-frequency waves are like a deeply penetrating drum beat, while the higher-frequency brainwaves are like a subtle high-pitched flute.

Our brainwaves change according to what we’re doing and feeling. When slower brainwaves are dominant we can feel tired, slow, sluggish, or dreamy. The higher frequencies are dominant when we feel wired, or hyper-alert.

The descriptions that follow are only broad characterizations; in practice things are far more complex, and brainwaves reflect different aspects when they occur in different locations in the brain.

Brainwave speed is measured in Hertz (cycles per second), and brainwaves are divided into bands delineating slow, moderate and fast waves.
Delta waves (.5 to 3 Hz)

Delta brainwaves are the slowest but loudest brainwaves (low frequency and deeply penetrating, like a drum beat). They are generated in deepest meditation and dreamless sleep. Delta waves suspend external awareness and are the source of empathy. Healing and regeneration are stimulated in this state, and that is why deep restorative sleep is so essential to the healing process.
Theta waves (3 to 8 Hz)

Theta brainwaves occur most often in sleep but are also dominant in deep meditation. Theta acts as our gateway to learning and memory. In theta, our senses are withdrawn from the external world and focused on signals originating from within. It is that twilight state which we normally only experience fleetingly as we wake or drift off to sleep. In theta we are in a dream: vivid imagery, intuition and information beyond our normal conscious awareness. It is where we hold our 'stuff', our fears, troubled history and nightmares.
Alpha waves (8 to 12 Hz)

Alpha brainwaves are present during quietly flowing thoughts, but not quite meditation. Alpha is ‘the power of now’, being here, in the present. Alpha is the resting state for the brain. Alpha waves aid overall mental coordination, calmness, alertness, mind/body integration and learning.
Beta waves (12 to 38 Hz)

Beta brainwaves dominate our normal waking state of consciousness when attention is directed towards cognitive tasks and the outside world. Beta is a 'fast' activity, present when we are alert, attentive, engaged in problem solving, judgment, decision making, and focused mental activity. Beta brainwaves are further divided into three bands: Low Beta (Beta1, 12-15 Hz) can be thought of as a 'fast idle', or musing; Beta (a.k.a. Beta2, 15-22 Hz) as high engagement; Hi-Beta (Beta3, 22-38 Hz) is highly complex thought, integrating new experiences, high anxiety, or excitement. Continual high-frequency processing is not a very efficient way to run the brain, as it takes a tremendous amount of energy.
Gamma waves (38 to 42 Hz)

Gamma brainwaves are the fastest of brain waves (high frequency, like a flute), and relate to simultaneous processing of information from different brain areas. It passes information rapidly, and as the most subtle of the brainwave frequencies, the mind has to be quiet to access it. Gamma was traditionally dismissed as ‘spare brain noise’ until researchers discovered it was highly active when in states of universal love, altruism, and the ‘higher virtues’. Gamma rhythms modulate perception and consciousness, disappearing under anaesthesia. Gamma is also above the frequency of neuronal firing, so how it is generated remains a mystery. The presence of Gamma relates to expanded consciousness and spiritual emergence.
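The band boundaries listed above can be captured in a small lookup table. The following Python sketch simply encodes the ranges from this article and maps a measured frequency to its band name; the boundary handling (lower bound inclusive, upper exclusive) is an assumption for illustration.

```python
# Brainwave bands as described in the text (name, lower Hz, upper Hz).
BANDS = [
    ("Delta", 0.5, 3),
    ("Theta", 3, 8),
    ("Alpha", 8, 12),
    ("Beta", 12, 38),
    ("Gamma", 38, 42),
]

def classify(freq_hz):
    """Return the band name containing a frequency, or None if out of range."""
    for name, lo, hi in BANDS:
        if lo <= freq_hz < hi:
            return name
    return None

print(classify(10))  # Alpha
print(classify(2))   # Delta
print(classify(40))  # Gamma
```

In real EEG processing the band power would be computed from a spectral estimate (e.g. a Fourier transform of the scalp signal) rather than from a single frequency, but the band definitions themselves are just these ranges.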
What brainwaves mean to you

Our brainwave profile and our daily experience of the world are inseparable. When our brainwaves are out of balance, there will be corresponding problems in our emotional or neuro-physical health. Research has identified brainwave patterns associated with all sorts of emotional and neurological conditions.

Over-arousal in certain brain areas is linked with anxiety disorders, sleep problems, nightmares, hyper-vigilance, impulsive behaviour, anger/aggression, agitated depression, chronic nerve pain and spasticity. Under-arousal in certain brain areas leads to some types of depression, attention deficit, chronic pain and insomnia. A combination of under-arousal and over-arousal is seen in cases of anxiety, depression and ADHD.

Instabilities in brain rhythms correlate with tics, obsessive-compulsive disorder, aggressive behaviour, rage, bruxism, panic attacks, bipolar disorder, migraines, narcolepsy, epilepsy, sleep apnoea, vertigo, tinnitus, anorexia/bulimia, PMT, diabetes, hypoglycaemia and explosive behaviour.
