
Occluded Multi-Lingual Offline Handwriting Inpainting Based on Multi-Head Attention and Stacked-LSTM Decoder

EasyChair Preprint 9696

16 pages · Date: February 14, 2023

Abstract

The encoder-decoder architecture with attention has become a common framework for online handwriting recovery, with a Convolutional Neural Network (CNN) serving as the encoder and a Long Short-Term Memory (LSTM) network with an attention mechanism as the decoder. Inspired by the recent success of Transformers across many tasks, we introduce in this paper a novel recovery-inpainting framework, named Temporal Order with Multi-head Attention Network and stacked-LSTM decoder (TO-MultiNet), which denoises corrupted offline handwriting and recovers its online counterpart signal characterized by dynamic features. First, the TO-MultiNet framework is trained to generate the temporal order and the pen velocity from offline handwriting. The trained model is then used to inpaint occluded handwriting images. The approach is validated with the Beta-GRU recognition system applied to Arabic, Latin, and Indian on/off dual handwriting datasets. Experimental results demonstrate the effectiveness of multi-head attention combined with a stacked-LSTM decoder, which improves the quality of the restored images and raises the recognition rate obtained with the novel Beta-GRU model.
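To make the described pipeline concrete, below is a minimal PyTorch sketch of the encoder/attention/decoder arrangement the abstract outlines: a CNN encodes the offline image into a column-wise feature sequence, multi-head self-attention contextualizes that sequence, and a stacked LSTM decodes a dynamic per-step signal. The class name TOMultiNetSketch, all layer sizes, and the three-channel output (e.g. dx, dy, pen velocity) are illustrative assumptions, not the authors' published configuration; it assumes a recent PyTorch with batch_first attention support.

import torch
import torch.nn as nn

class TOMultiNetSketch(nn.Module):
    """Illustrative sketch of a CNN encoder + multi-head attention +
    stacked-LSTM decoder; sizes and output format are assumptions."""

    def __init__(self, d_model=256, n_heads=8, n_lstm_layers=3, out_dim=3):
        super().__init__()
        # CNN encoder: turns the offline handwriting image into a
        # sequence of column-wise feature vectors.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, d_model, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, None)),  # collapse height, keep width
        )
        # Multi-head self-attention over the encoded feature sequence.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Stacked-LSTM decoder producing the dynamic (online) signal.
        self.decoder = nn.LSTM(d_model, d_model, num_layers=n_lstm_layers,
                               batch_first=True)
        # Per-step regression head, e.g. (dx, dy, pen velocity).
        self.head = nn.Linear(d_model, out_dim)

    def forward(self, img):                     # img: (B, 1, H, W)
        feats = self.cnn(img)                   # (B, d_model, 1, W')
        seq = feats.squeeze(2).transpose(1, 2)  # (B, W', d_model)
        ctx, _ = self.attn(seq, seq, seq)       # self-attention context
        out, _ = self.decoder(ctx)              # stacked-LSTM decoding
        return self.head(out)                   # (B, W', out_dim)

For example, TOMultiNetSketch()(torch.randn(2, 1, 64, 256)) returns a (2, 128, 3) tensor, one predicted dynamic-feature vector per encoded image column.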

Keyphrases: Beta-elliptic model, GRU, LSTM, occluded offline handwriting, multi-head attention, Transformer

BibTeX entry
BibTeX does not have a dedicated entry type for preprints; the following @booklet entry is a workaround that produces a correct reference:
@booklet{EasyChair:9696,
  author       = {Besma Rabhi and Yahia Hamdi and Abdelkarim Elbaati and Habib Dhahri and Adel M. Alimi},
  title        = {Occluded Multi-Lingual Offline Handwriting Inpainting Based on Multi-Head Attention and Stacked-LSTM Decoder},
  howpublished = {EasyChair Preprint 9696},
  year         = {2023}}