Time Limit: Java: 2000 ms / Others: 2000 ms
Memory Limit: Java: 65536 KB / Others: 65536 KB
Aliasing is the stair-step effect observed when attempting to represent a smooth curve using a finite number of discrete pixels. Of course, all computer displays consist of a finite number of pixels, and many strategies have been devised to smooth the jagged edges with varying degrees of success.
Boudreaux and Thibodeaux are writing video game rendering software for the next big first-person shooter, and they don't know much about any of the progress made in the field of antialiasing. Therefore, they've decided to use a very simplistic (and visually unappealing) method to smooth the ragged edges. Unfortunately, it blurs the entire image, but at least it gets rid of those jaggies!
Normally, the game displays in m x n pixels, but they perform an extra antialiasing step that converts that image into an (m - 1) x (n - 1) image. Nobody will notice a pixel missing from each dimension, and they can calculate the new pixels by averaging squares of 4 pixels from the original image (and rounding down). For example, the images below show an original 4 x 4 image (left, taken from the third sample data set) and its antialiased 3 x 3 image (right), using numbers to represent varying shades of black and white.

    4440        431
    4400        310
    4000        100
    0000
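The averaging step described above can be sketched in Python (a minimal sketch; the `antialias` name and the input representation, a list of digit strings, are illustrative rather than part of the problem):

```python
def antialias(image):
    """Average each 2x2 square of pixels, rounding down.

    image: list of equal-length strings of digits 0-9, one digit per pixel.
    Returns an (m - 1) x (n - 1) grid of ints.
    """
    m, n = len(image), len(image[0])
    return [
        [(int(image[i][j]) + int(image[i][j + 1])
          + int(image[i + 1][j]) + int(image[i + 1][j + 1])) // 4  # // floors
         for j in range(n - 1)]
        for i in range(m - 1)
    ]
```

On the third sample data set, `antialias(["4440", "4400", "4000", "0000"])` yields `[[4, 3, 1], [3, 1, 0], [1, 0, 0]]`.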
Input to this problem will consist of a (non-empty) series of up to 100 data sets. Each data set will be formatted according to the following description, and there will be no blank lines separating data sets.
A single data set has 3 components:
1. Start line - a single line: START R C
   where R and C are the row and column dimensions of the Original Image.
2. Original Image - R rows, each with C grayscale pixel values, where each pixel is a single digit (0-9).
3. End line - a single line: END
After the last data set, there will be a single line: ENDOFINPUT
The output will be the antialiased image, which will be R - 1 rows, each with C - 1 integer pixel values. Each pixel in the output will be generated by averaging (and rounding down) the grayscale pixel values of the corresponding square of four pixels in the Original Image.
START 2 2
00
00
END
START 2 9
012345678
012345678
END
START 4 4
4440
4400
4000
0000
END
START 9 9
900000009
090000090
009000900
000909000
000090000
000909000
009000900
090000090
900000009
END
ENDOFINPUT
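A complete pass over this input format can be sketched as follows (a sketch, not the reference solution; the exact output formatting, here single spaces between pixel values and one output row per line, is an assumption not confirmed by the problem statement):

```python
def solve(data):
    # Tokenize the whole input; each pixel row is one whitespace-separated token.
    tokens = data.split()
    i = 0
    lines = []
    while tokens[i] != "ENDOFINPUT":
        # Start line: START R C
        r, c = int(tokens[i + 1]), int(tokens[i + 2])
        rows = tokens[i + 3:i + 3 + r]   # the R rows of the Original Image
        i += 3 + r + 1                   # skip START, R, C, the rows, and END
        for a in range(r - 1):
            lines.append(" ".join(
                str((int(rows[a][b]) + int(rows[a][b + 1])
                     + int(rows[a + 1][b]) + int(rows[a + 1][b + 1])) // 4)
                for b in range(c - 1)))
    return "\n".join(lines)
```

To wire it to standard input, one would call `print(solve(sys.stdin.read()))`.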