Determination of water depths in coastal zones is a common requirement for the majority of coastal engineering and coastal science applications. However, producing high-quality bathymetric maps requires expensive field surveys, high-technology equipment, and expert personnel. Remotely sensed images can be conveniently used to reduce the cost and labor needed for bathymetric measurements and to overcome difficulties in spatial and temporal depth provision. An Artificial Neural Network (ANN) methodology is introduced in this study to derive bathymetric maps of shallow waters from remote sensing images and sample depth measurements. This methodology provides a fast and practical solution for depth estimation in shallow waters, coupling the temporal and spatial capabilities of remote sensing imagery with the modeling flexibility of ANNs. Its main practical advantage is that it enables direct use of image reflectance values in depth estimation, without separating depth-induced reflectance variation from that caused by other environmental factors (e.g., bottom material and vegetation). Its function-free structure allows nonlinear relationships between multi-band images and in-situ depth measurements to be evaluated, and therefore yields more reliable depth estimates than classical regression approaches. The west coast of Foca, Izmir, Turkey was used as a test bed. The first three Aster bands and Quickbird pan-sharpened images were used to derive ANN-based bathymetric maps of this study area. In-situ depth measurements were supplied by the General Command of Mapping, Turkey (HGK). Two models were set up, one for Aster and one for Quickbird image inputs. Bathymetric maps relying solely on in-situ depth measurements were used to evaluate the resulting derived bathymetric maps. The efficiency of the methodology is discussed at the end of the paper.
It is concluded that the proposed methodology could decrease spatial and repetitive depth measurement requirements in bathymetric mapping, especially for preliminary engineering applications. (C) 2010 Elsevier Ltd. All rights reserved.
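The core idea above (a neural network regressing in-situ depths on multi-band reflectance values) can be sketched in miniature. This is an illustrative assumption-laden example, not the paper's actual model: the network architecture, training scheme, and data below are all hypothetical. Synthetic "reflectances" are generated with a Beer-Lambert-like exponential decay with depth, standing in for Aster or Quickbird band values, and a small multilayer perceptron is fit to recover depth.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for in-situ soundings and three band reflectances.
# Real inputs would be image pixel values co-located with survey depths.
n = 500
depth = rng.uniform(0.5, 15.0, n)  # shallow-water depths (m), hypothetical
bands = np.column_stack([
    0.30 * np.exp(-k * depth) + rng.normal(0, 0.005, n)  # exponential light
    for k in (0.10, 0.18, 0.35)    # attenuation, per-band coefficients (assumed)
])

# Standardize band inputs before training the network.
scaler = StandardScaler()
X = scaler.fit_transform(bands)

# Small MLP mapping band reflectances to depth; architecture is illustrative.
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=3000, random_state=0)
model.fit(X[:400], depth[:400])

# Evaluate on a held-out subset, mimicking comparison against survey-only maps.
pred = model.predict(X[400:])
rmse = float(np.sqrt(np.mean((pred - depth[400:]) ** 2)))
print(f"hold-out RMSE: {rmse:.2f} m")
```

In practice each pixel's band vector would be fed through the trained network to produce a gridded bathymetric map, which is then compared against maps interpolated from the in-situ soundings alone.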