GCRIS

Browsing by Author "Ke, Chang-Qing"

Now showing 1 - 1 of 1
    Article
    Citation - WoS: 3
    Citation - Scopus: 4
    S-Transformer: a new deep learning model enhanced by sequential transformer encoders for drought forecasting
    (SPRINGER HEIDELBERG, 2025) Ali Danandeh Mehr; Amir A. Ghavifekr; Elman Ghazaei; Mir Jafar Sadegh Safari; Chang-Qing Ke; Vahid Nourani
    Droughts are prolonged periods of rainfall deficit, the frequency of which has increased due to global warming, causing severe impacts on water resources, agriculture, ecosystems, and food security. Given their significance, accurate monitoring and forecasting of droughts are crucial for effective water resource management. This paper introduces the sequential-based transformer (S-Transformer), a novel deep-learning approach for predicting meteorological droughts from their historical events. The core of the S-Transformer algorithm is the orderly computation of an output by utilizing the sequence of inputs. Training the S-Transformer involves forward and backward passes through the network to adjust the weights and biases using gradient-descent optimization. This process uses fixed-size dynamic windows to minimize the difference between the observed and forecasted outputs. To demonstrate the effectiveness and performance of the new model, two case studies were presented based on the observed standardized precipitation index in the cities of Isparta and Burdur, Türkiye. In addition, the efficiency of the S-Transformer was compared with those of three benchmark models: a classic multilayer perceptron, a deep-learning long short-term memory network, and a classic deep transformer model. The promising results of the proposed model proved its superiority over its counterparts in terms of different performance metrics. In the Isparta and Burdur cities, the S-Transformer achieved root mean squared error values of 0.096 and 0.098, respectively, on the testing set.
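    The abstract mentions training on fixed-size windows over a historical standardized precipitation index (SPI) record, but the exact windowing scheme is not given here. As a minimal sketch of the general idea, the snippet below builds fixed-size input windows and next-step targets from a 1-D series; the function name `make_windows` and the synthetic series are illustrative assumptions, not the paper's actual pipeline.

    ```python
    import numpy as np

    def make_windows(series, window, horizon=1):
        """Build fixed-size input windows X and targets y from a 1-D
        series (e.g. a monthly SPI record), for next-step forecasting."""
        X, y = [], []
        for i in range(len(series) - window - horizon + 1):
            X.append(series[i:i + window])          # past `window` values
            y.append(series[i + window + horizon - 1])  # value to forecast
        return np.array(X), np.array(y)

    # Synthetic stand-in for an SPI series (illustration only)
    spi = np.sin(np.linspace(0, 12, 120))
    X, y = make_windows(spi, window=12)
    print(X.shape, y.shape)  # (108, 12) (108,)
    ```

    Each row of X would then be fed to the sequence model (here, the paper's transformer encoders), and the loss is minimized between y and the model's forecasts.
    
    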
Powered by Research Ecosystems