Browsing by Subject "Images classification"
Now showing 1 - 2 of 2
Item: Cracked Wall Image Classification Based on Deep Neural Network Using Visibility Graph Features (Institute of Electrical and Electronics Engineers Inc., 2021). Altundogan T.G.; Karakose M.
Visibility graphs encode the relations between elements of a signal according to whether they can "see" one another, and are now used frequently in signal processing applications. In this study, cracked and non-cracked wall images taken from a dataset are classified by a deep neural network based on visibility graph properties. In the proposed method, the histogram of each image is obtained first and then expressed as a visibility graph. A feature vector for each image is built from the maximum-clique and maximum-degree features of the resulting visibility graph, and a deep neural network is trained on these feature vectors. The proposed method achieves 99% classification accuracy on the images held out for testing. © 2021 IEEE.

Item: Hermos: An annotated image dataset for visual detection of grape leaf diseases (SAGE Publications Ltd, 2024). Özacar T.; Öztürk Ö.; Güngör Savaş N.
Powdery mildew, dead arm and downy mildew are diseases frequently seen in vineyards in the Gediz River Basin, in western Anatolia, Turkey. These diseases can be detected early using artificial intelligence (AI)-based systems, which can improve crop yields while reducing farmers' labour and the amount of pesticides used. This article presents a dataset, named Hermos, for use in such AI-based systems. Hermos contains four classes of grape leaf images: leaves with powdery mildew, leaves with dead arm, leaves with downy mildew and healthy leaves. The dataset currently contains 492 images and 13,913 labels. Hermos has been published in the Linked Open Data (LOD) cloud to make it easier for consumers to access, process and manipulate the data. © The Author(s) 2022.
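The first abstract describes a pipeline of histogram, then visibility graph, then maximum-clique and maximum-degree features. A minimal sketch of that feature-extraction step, assuming the standard natural visibility criterion (the abstract does not specify which visibility-graph variant is used); the toy histogram and all function names here are illustrative, not the authors' implementation:

```python
from itertools import combinations

def visibility_graph(series):
    """Natural visibility graph of a 1-D series: nodes are sample
    indices; (i, j) is an edge if every intermediate sample lies
    strictly below the straight line joining points i and j."""
    n = len(series)
    edges = set()
    for i, j in combinations(range(n), 2):
        visible = all(
            series[k] < series[j] + (series[i] - series[j]) * (j - k) / (j - i)
            for k in range(i + 1, j)
        )
        if visible:
            edges.add((i, j))
    return edges

def max_degree(n, edges):
    """Largest number of edges incident to any single node."""
    deg = [0] * n
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return max(deg)

def max_clique_size(n, edges):
    """Brute-force maximum clique; acceptable for small graphs
    such as those built from coarse histograms."""
    adj = {v: set() for v in range(n)}
    for i, j in edges:
        adj[i].add(j)
        adj[j].add(i)
    best = 1
    for size in range(2, n + 1):
        for nodes in combinations(range(n), size):
            if all(b in adj[a] for a, b in combinations(nodes, 2)):
                best = size
                break  # a clique of this size exists; try the next size
    return best

# Toy 8-bin "histogram" standing in for an image's grey-level histogram
hist = [3, 1, 4, 1, 5, 9, 2, 6]
edges = visibility_graph(hist)
features = [max_clique_size(len(hist), edges), max_degree(len(hist), edges)]
```

The two-element `features` vector is what would then be fed to the classifier; a real histogram would have far more bins, for which the brute-force clique search would need to be replaced by a proper algorithm (e.g. Bron-Kerbosch).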