
Program Systems: Theory and Applications, 2025 Volume 16, Issue 2, Pages 111–152 (Mi ps470)


Artificial intelligence and machine learning

Improving the accuracy of segmentation masks using a generative-adversarial network model

I. V. Vinokurov

Financial University under the Government of the Russian Federation, Moscow, Russia

Abstract: Masks produced by the Mask R-CNN deep learning model may in some cases contain fragmented contours, uneven boundaries, false merging of adjacent objects, and regions with missed segmentation. The more objects an image contains and the smaller they are, the more frequently such mask defects occur. Typical examples are aerial photographs of cottage and garden associations and cooperatives, which are characterized by high building density. To correct these defects, a generative adversarial network model is proposed that post-processes the masks predicted by Mask R-CNN.
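To illustrate this kind of post-processing, below is a minimal PyTorch sketch, not the architecture from the paper: a hypothetical encoder-decoder generator that takes an RGB tile concatenated with the coarse Mask R-CNN mask and outputs refined mask logits. The adversarial part (a discriminator and its loss) is omitted, and the class name MaskRefinementGenerator and all layer sizes are illustrative assumptions.

# Hypothetical sketch (not the paper's model): a small encoder-decoder
# generator that refines a coarse instance mask given the source image.
import torch
import torch.nn as nn

class MaskRefinementGenerator(nn.Module):
    def __init__(self, in_channels=4, base=32):
        super().__init__()
        # encoder: one full-resolution block, one downsampling block
        self.enc1 = nn.Sequential(nn.Conv2d(in_channels, base, 3, padding=1), nn.ReLU(inplace=True))
        self.enc2 = nn.Sequential(nn.Conv2d(base, base * 2, 3, stride=2, padding=1), nn.ReLU(inplace=True))
        # decoder: upsample back and fuse with the encoder features (skip connection)
        self.dec1 = nn.Sequential(nn.ConvTranspose2d(base * 2, base, 4, stride=2, padding=1), nn.ReLU(inplace=True))
        self.out = nn.Conv2d(base * 2, 1, 1)  # logits of the refined mask

    def forward(self, image, coarse_mask):
        x = torch.cat([image, coarse_mask], dim=1)   # (B, 4, H, W): RGB + coarse mask
        e1 = self.enc1(x)
        e2 = self.enc2(e1)
        d1 = self.dec1(e2)
        return self.out(torch.cat([d1, e1], dim=1))  # refined mask logits

# usage: refine a predicted mask for a 256x256 aerial tile
gen = MaskRefinementGenerator()
img = torch.rand(1, 3, 256, 256)
coarse = (torch.rand(1, 1, 256, 256) > 0.5).float()
refined = torch.sigmoid(gen(img, coarse))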
A qualitative assessment of the resulting model demonstrated that it restores contour integrity at an acceptable level, fills in missing regions, and separates erroneously merged objects. A quantitative analysis using the IoU, precision, recall, and F1-score metrics showed a statistically significant improvement in segmentation quality over the original Mask R-CNN masks. These results confirm that the proposed approach raises the accuracy of object mask generation to a level suitable for practical use in automated aerial photograph analysis systems.
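For reference, the following is a minimal sketch of how the metrics named above can be computed pixel-wise for a pair of binary masks; the function name mask_metrics, the boolean inputs, and the eps smoothing term are illustrative assumptions, not the paper's exact evaluation protocol.

# Pixel-wise IoU, precision, recall and F1 for binary masks (illustrative sketch).
import numpy as np

def mask_metrics(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-8):
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()      # pixels correctly marked as object
    fp = np.logical_and(pred, ~gt).sum()     # spurious object pixels
    fn = np.logical_and(~pred, gt).sum()     # missed object pixels
    iou = tp / (tp + fp + fn + eps)
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    f1 = 2 * precision * recall / (precision + recall + eps)
    return {"IoU": iou, "precision": precision, "recall": recall, "F1": f1}

# usage: compare a refined mask against a ground-truth annotation
pred = np.zeros((256, 256), dtype=bool); pred[40:120, 40:120] = True
gt = np.zeros((256, 256), dtype=bool);   gt[50:130, 50:130] = True
print(mask_metrics(pred, gt))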

Key words and phrases: computer vision, image segmentation, object masks, generative adversarial networks, Mask R-CNN, PyTorch.

UDC: 004.932.75'1, 004.89
BBK: 32.813.5: 32.973.202-018

MSC: Primary 68T20; Secondary 68T07, 68T45

Received: 21.04.2025
Accepted: 11.06.2025

Language: Russian and English

DOI: 10.25209/2079-3316-2025-16-2-111-152


