Search results

Results: 1

Abstract

In this paper, we present an improved, efficient capsule network (CN) model for the classification of the Kuzushiji-MNIST and Kuzushiji-49 benchmark datasets. CNs are a promising approach in the field of deep learning, offering advantages such as robustness, better generalization, and a simpler network structure compared to traditional convolutional neural networks (CNNs). The proposed model, based on the Efficient CapsNet architecture, incorporates a self-attention routing mechanism, resulting in improved efficiency and a reduced parameter count. Experiments conducted on the Kuzushiji-MNIST and Kuzushiji-49 datasets demonstrate that the model achieves competitive performance, ranking within the top ten solutions for both benchmarks. Despite using significantly fewer parameters than higher-ranked competitors, the presented model achieves comparable accuracy, with differences of only 0.91% and 1.97% on the Kuzushiji-MNIST and Kuzushiji-49 datasets, respectively. Furthermore, the training time required to achieve these results is substantially reduced, enabling training on non-specialized workstations. The proposed novelties of the capsule architecture, including the integration of the self-attention mechanism and the efficient network structure, contribute to the improved efficiency and performance of the presented model. These findings highlight the potential of CNs as a more efficient and effective approach to character classification tasks, with broader applications in various domains.
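The self-attention routing mentioned in the abstract can be pictured with a minimal sketch in the spirit of Efficient CapsNet; the tensor shapes, function names, and scaling below are assumptions for illustration, not the authors' implementation. Agreement between capsule "votes" is measured with a scaled dot product and turned into coupling coefficients by a softmax, replacing the iterative routing loop of classic CapsNets.

```python
# Hedged sketch of self-attention routing over capsule votes (assumed
# shapes and names, not the paper's code).
import torch
import torch.nn.functional as F

def squash(s, dim=-1, eps=1e-8):
    """Squashing non-linearity: keeps vector orientation, bounds length below 1."""
    norm = s.norm(dim=dim, keepdim=True)
    return (norm / (1.0 + norm ** 2 + eps)) * s

def self_attention_routing(votes):
    """votes: (batch, n_out_caps, n_in_caps, out_dim) prediction vectors."""
    d = votes.size(-1)
    # Pairwise agreement between votes aimed at the same output capsule.
    attn = torch.einsum('boid,bojd->boij', votes, votes) / d ** 0.5
    # Coupling coefficients: softmax over input capsules of the summed agreement.
    c = F.softmax(attn.sum(dim=-1), dim=-1).unsqueeze(-1)   # (batch, n_out, n_in, 1)
    # Weighted sum of votes, squashed to give the output capsule poses.
    return squash((c * votes).sum(dim=-2))

# Example: 8 input capsules of dim 16 voting for 10 output capsules.
out = self_attention_routing(torch.randn(2, 10, 8, 16))
print(out.shape)  # torch.Size([2, 10, 16])
```

Because the coefficients are computed in a single pass rather than refined iteratively, this kind of routing keeps the parameter count and training time low, which is consistent with the efficiency claims made in the abstract.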

Authors and Affiliations

Michał Bukowski 1
Izabella Antoniuk 1
Jarosław Kurek 1

  1. Department of Artificial Intelligence, Institute of Information Technology, Warsaw University of Life Sciences, Nowoursynowska 159, Warsaw, 02-776, Poland
