A Scalable Tile-Based Framework for Region-Merging Segmentation

Processing very high-resolution remote sensing images on resource-constrained devices is challenging because of the sheer size of the data. For applications such as environmental monitoring or natural resource management, complex algorithms must be used to extract information from the images. The memory required to store the images and the data structures of such algorithms can reach hundreds of gigabytes, making processing infeasible on commonly available computers. Segmentation algorithms are an essential step in extracting objects of interest from a scene and are the focus of this paper. The objective of the present work is to adapt image segmentation algorithms to very large amounts of data. To overcome the memory issue, large images are usually divided into smaller tiles, which are processed independently. Region-merging algorithms do not cope well with image tiling, since artifacts appear along the tile edges in the final result, caused by inconsistencies of the regions across tiles. In this paper, we propose a scalable tile-based framework for region-merging algorithms that segments large images while guaranteeing a result identical to that obtained by processing the whole image at once. We introduce the original concept of the stability margin of a tile, which ensures this equivalence with the untiled segmentation. Finally, we discuss the benefits of this framework and demonstrate its scalability by applying it to real large images.
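To make the tiling idea concrete, the following is a minimal illustrative sketch (not the authors' implementation) of decomposing a large image into tiles, each extended by a margin of extra pixels. The function name, tile size, and margin values are assumptions for illustration; in the proposed framework the margin width would be derived from the stability analysis of the region-merging process.

```python
def tile_with_margin(height, width, tile_size, margin):
    """Yield (core, extended) pixel bounds for each tile.

    core     -- (row0, row1, col0, col1): region the tile is responsible for
    extended -- core bounds grown by `margin` pixels, clipped to the image,
                giving each tile enough surrounding context so that merges
                inside the core are unaffected by the tile boundary
    """
    tiles = []
    for r0 in range(0, height, tile_size):
        for c0 in range(0, width, tile_size):
            r1 = min(r0 + tile_size, height)
            c1 = min(c0 + tile_size, width)
            core = (r0, r1, c0, c1)
            extended = (max(r0 - margin, 0), min(r1 + margin, height),
                        max(c0 - margin, 0), min(c1 + margin, width))
            tiles.append((core, extended))
    return tiles

# Example: a 1000 x 1000 image cut into four 500-pixel tiles,
# each extended by a 50-pixel margin where the image allows it.
tiles = tile_with_margin(1000, 1000, tile_size=500, margin=50)
```

Each extended tile can then be segmented independently, and only the regions belonging to the core are kept, so that neighboring tiles agree along their shared edges.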