Authors: Kabakuş, Abdullah Talha; Erdogmus, Pakize
Date available: 2023-07-26
Date issued: 2022
ISSN: 1532-0626 (print); 1532-0634 (online)
DOI: https://doi.org/10.1002/cpe.7216
Handle: https://hdl.handle.net/20.500.12684/13415

Abstract: The easiest way to build a solution based on deep neural networks is to use pre-trained models through the transfer-learning technique. Deep learning platforms provide various pre-trained deep neural networks that can be readily applied to image classification tasks. Thus, "Which pre-trained model provides the best performance for image classification tasks?" is a question that naturally comes to mind and should be addressed by the research community. To this end, we propose an experimental comparison of six popular pre-trained deep neural networks, namely, (i) VGG19, (ii) ResNet50, (iii) DenseNet201, (iv) MobileNetV2, (v) InceptionV3, and (vi) Xception, employed through the transfer-learning technique. The proposed benchmark models were then trained and evaluated under the same configurations on two gold-standard datasets, namely, (i) CIFAR-10 and (ii) Stanford Dogs. Three evaluation metrics were used to measure the performance differences between the pre-trained models: (i) accuracy, (ii) training duration, and (iii) inference time. The key findings obtained through a wide variety of experiments are discussed.

Language: en
Access rights: info:eu-repo/semantics/closedAccess
Keywords: Convolutional Neural Network; Deep Learning; Deep Neural Network; Keras; Tensorflow; Transfer-Learning
Title: An experimental comparison of the widely used pre-trained deep neural networks for image classification tasks towards revealing the promise of transfer-learning
Type: Article
Volume/Issue: 34(24)
Scopus ID: 2-s2.0-85134766477
Web of Science ID: WOS:000830194200001
Quartile: Q3