The following Java method takes an array of BufferedImage objects and returns the average image computed from them.

The main application of image averaging is noise removal. Image noise is unwanted random variation in the pixel values of an image. It is inherent to digital cameras and is generated, in part, by heat and low-light conditions; it is often prominent in long exposures and in photographs taken at high ISO sensitivity. Its effect is analogous to film grain.

When images of an unchanging scene are corrupted by random noise, a sequence of these images can be averaged together in order to reduce the effects of the noise. This works because noise perturbs pixel grey levels, and a positive perturbation of a given magnitude tends to be just as likely as a negative perturbation of the same magnitude. Hence there is a tendency for these 'errors' in pixel grey level to cancel each other out to an increasing degree, as the number of averaged images increases.
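This cancellation effect can be demonstrated numerically without any images at all. The sketch below (class and method names are my own, not part of the article's code) averages n noisy observations of a fixed grey level of 128, with zero-mean Gaussian noise of standard deviation 20, and shows the error shrinking as n grows:

```java
import java.util.Random;

public class NoiseDemo {
    // Average n noisy observations of a constant grey level.
    // The noise is zero-mean Gaussian with standard deviation 20,
    // so positive and negative perturbations are equally likely.
    static double averageNoisy(int trueLevel, int n, long seed) {
        Random rng = new Random(seed);
        double sum = 0.0;
        for (int i = 0; i < n; ++i)
            sum += trueLevel + rng.nextGaussian() * 20.0;
        return sum / n;
    }

    public static void main(String[] args) {
        for (int n : new int[]{1, 10, 100, 1000}) {
            double avg = averageNoisy(128, n, 42L);
            System.out.printf("n=%4d  average=%.2f  error=%.2f%n",
                    n, avg, Math.abs(avg - 128));
        }
    }
}
```

Because the standard deviation of the mean falls as 1/sqrt(n), averaging 100 frames reduces the noise amplitude roughly tenfold.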

Although the example is written for grey-level images, it can easily be adapted to RGB images: simply compute the average value separately for each channel (red, green, and blue).

   import java.awt.image.BufferedImage;
   import java.awt.image.WritableRaster;

   public static BufferedImage average(BufferedImage[] images) {
        int n = images.length;
        // Assuming that all images have the same dimensions
        int w = images[0].getWidth();
        int h = images[0].getHeight();
        BufferedImage average =
                new BufferedImage(w, h, BufferedImage.TYPE_BYTE_GRAY);
        WritableRaster raster = average.getRaster();
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                // Sum the grey-level samples of all images at (x, y)
                float sum = 0.0f;
                for (int i = 0; i < n; ++i)
                    sum += images[i].getRaster().getSample(x, y, 0);
                raster.setSample(x, y, 0, Math.round(sum / n));
            }
        }
        return average;
   }
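As noted above, the RGB adaptation only requires averaging each channel independently. A possible sketch (the class and method names `AverageRGB` and `averageRGB` are my own; the bands 0, 1, 2 of a TYPE_INT_RGB raster correspond to red, green, and blue):

```java
import java.awt.image.BufferedImage;
import java.awt.image.WritableRaster;

public class AverageRGB {
    // RGB variant: average each of the three channels independently.
    public static BufferedImage averageRGB(BufferedImage[] images) {
        int n = images.length;
        // Assuming that all images have the same dimensions
        int w = images[0].getWidth();
        int h = images[0].getHeight();
        BufferedImage average =
                new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        WritableRaster raster = average.getRaster();
        for (int y = 0; y < h; ++y)
            for (int x = 0; x < w; ++x)
                for (int band = 0; band < 3; ++band) { // red, green, blue
                    float sum = 0.0f;
                    for (int i = 0; i < n; ++i)
                        sum += images[i].getRaster().getSample(x, y, band);
                    raster.setSample(x, y, band, Math.round(sum / n));
                }
        return average;
    }
}
```

The only changes from the grey-level version are the image type (TYPE_INT_RGB instead of TYPE_BYTE_GRAY) and the extra loop over the three bands.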