
Browsing by Author "Borandaǧ E."

Now showing 1 - 2 of 2
  • The average scattering number of graphs
    (EDP Sciences, 2016) Aslan E.; Kilinç D.; Yücalar F.; Borandaǧ E.
    The scattering number of a graph is a measure of its vulnerability. In this paper we investigate a refinement based on the average of a local version of the parameter. If v is a vertex in a connected graph G, then sc_v(G) = max {ω(G − S_v) − |S_v|}, where the maximum is taken over all disconnecting sets S_v of G that contain v and ω denotes the number of components of the resulting graph. The average scattering number of G, denoted by sc_av(G), is defined as sc_av(G) = Σ_{v ∈ V(G)} sc_v(G) / n, where n denotes the number of vertices of G. Like the scattering number itself, this is a measure of the vulnerability of a graph, but it is more sensitive. Relations between the average scattering number and other parameters are established, and the average scattering number is determined for some graph classes. Moreover, results on the average scattering number of graphs obtained by graph operations are given. © EDP Sciences 2016.
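    The definitions above can be checked by brute force on a small graph. A minimal sketch, assuming the graph is given as an adjacency dict and that every vertex lies in at least one disconnecting set (true for the path P4 used below; complete graphs, which have no disconnecting sets, would need separate handling):

    ```python
    from itertools import combinations

    def components(adj, removed):
        """Count connected components after deleting the vertex set `removed`."""
        remaining = set(adj) - removed
        seen, count = set(), 0
        for start in remaining:
            if start in seen:
                continue
            count += 1
            stack = [start]
            while stack:
                u = stack.pop()
                if u in seen:
                    continue
                seen.add(u)
                stack.extend(w for w in adj[u] if w in remaining and w not in seen)
        return count

    def local_scattering(adj, v):
        """sc_v(G): max of ω(G - S) - |S| over disconnecting sets S containing v."""
        n = len(adj)
        others = [u for u in adj if u != v]
        best = None
        for k in range(n - 1):          # S = {v} plus k other vertices
            for extra in combinations(others, k):
                S = {v} | set(extra)
                w = components(adj, S)
                if w >= 2:              # S actually disconnects G
                    val = w - len(S)
                    best = val if best is None else max(best, val)
        return best

    def average_scattering(adj):
        """sc_av(G) = (sum of sc_v(G) over all vertices) / n."""
        return sum(local_scattering(adj, v) for v in adj) / len(adj)

    # Path P4 (0-1-2-3): sc_v is 0, 1, 1, 0 for the four vertices
    p4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
    print(average_scattering(p4))  # 0.5
    ```

    The exponential enumeration of vertex subsets is only feasible for small graphs, but it follows the definition literally, which makes it useful as a reference check.
    
    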
  • A hybrid approach based on deep learning for gender recognition using human ear images [Insan kulaǧi görüntüleri kullanarak cinsiyet tanima için derin öǧrenme tabanli melez bir yaklaşim]
    (Gazi Universitesi, 2022) Karasulu B.; Yücalar F.; Borandaǧ E.
    The use of human ear images is gaining importance for the sustainability of biometric authorization and surveillance systems. Contemporary studies show that such processes can be performed semi-automatically or fully automatically instead of manually. Because deep learning works with abstract features (i.e., representation learning), it reaches considerably higher performance than classical methods. In this study, a synergistic gender recognition approach based on hybrid deep learning was developed to classify people fully automatically by gender from human ear images. The hybrid deep neural network architectures combine a convolutional neural network component with recurrent neural network components, where long short-term memory and gated recurrent units serve as the recurrent components. These components allow the hybrid model to capture the relational dependencies between pixel regions in the image. Owing to this synergistic approach, the gender classification accuracy of the hybrid models is higher than that of the standalone convolutional neural network model in our study. Two image datasets with gender labels were used in the experiments, and the reliability of the experimental results was verified with objective metrics. The hybrid models achieved the highest gender recognition results, with test accuracies of 85.16% on the EarVN dataset and 87.61% on the WPUT dataset. Discussion and conclusions are included in the last section of our study. © 2022 Gazi Universitesi Muhendislik-Mimarlik. All rights reserved.
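    The hybrid idea described above, convolutional features handed to a recurrent reader for classification, can be sketched in a few lines. This is an illustrative PyTorch sketch only: the layer sizes, the 64×64 input, and the row-by-row sequencing of feature maps are assumptions, not the paper's actual configuration.

    ```python
    import torch
    import torch.nn as nn

    class HybridEarNet(nn.Module):
        """Sketch of a CNN + recurrent hybrid: conv feature maps are read
        row by row as a sequence by a GRU (an LSTM could be swapped in),
        and the final recurrent state feeds a gender classifier."""

        def __init__(self, hidden=64, n_classes=2):
            super().__init__()
            # illustrative sizes, not the paper's configuration
            self.conv = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            # each of the 16 feature-map rows becomes one 32*16-dim time step
            self.rnn = nn.GRU(input_size=32 * 16, hidden_size=hidden,
                              batch_first=True)
            self.fc = nn.Linear(hidden, n_classes)

        def forward(self, x):                 # x: (B, 3, 64, 64)
            f = self.conv(x)                  # (B, 32, 16, 16)
            b, c, h, w = f.shape
            seq = f.permute(0, 2, 1, 3).reshape(b, h, c * w)
            _, hidden = self.rnn(seq)         # hidden: (1, B, hidden)
            return self.fc(hidden[-1])        # (B, n_classes) gender logits

    model = HybridEarNet()
    logits = model(torch.randn(2, 3, 64, 64))
    print(logits.shape)  # torch.Size([2, 2])
    ```

    Treating feature-map rows as a sequence is one simple way to let a recurrent unit model spatial dependencies between pixel regions, which is the role the abstract attributes to the LSTM/GRU components.
    
    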

Manisa Celal Bayar University copyright © 2002-2025 LYRASIS
