AttentionHTR: Handwritten Text Recognition

This work proposes an attention-based sequence-to-sequence model for handwritten word recognition and explores transfer learning for data-efficient training of HTR systems.

BibTeX:

@dataset{Dmitrijs_Kass_and_Ekta_Vats_2024,
  abstract    = {This work proposes an attention-based sequence-to-sequence model for handwritten word recognition and explores transfer learning for data-efficient training of HTR systems.},
  author      = {Dmitrijs Kass and Ekta Vats},
  doi         = {10.57702/sb0i0mif},
  institution = {No Organization},
  keyword     = {Attention Encoder-Decoder Networks, Handwritten Text Recognition, Transfer Learning},
  month       = {dec},
  publisher   = {TIB},
  title       = {AttentionHTR: Handwritten Text Recognition},
  url         = {https://service.tib.eu/ldmservice/dataset/attentionhtr--handwritten-text-recognition},
  year        = {2024}
}
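
The abstract above refers to an attention-based encoder-decoder for handwritten word recognition. As a rough, self-contained illustration of that family of models (not the authors' implementation), the sketch below wires a small CNN feature extractor, a BiLSTM encoder, and an attention-driven LSTM decoder in PyTorch. All layer sizes, the greedy per-step decoding, and the AttentionHTRSketch name are illustrative assumptions, not details taken from the dataset or paper.

# Minimal sketch of an attention-based encoder-decoder for handwritten word
# recognition, assuming grayscale word images and a character-level vocabulary.
# Layer sizes and decoding strategy are illustrative, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionHTRSketch(nn.Module):
    def __init__(self, num_chars, hidden=256):
        super().__init__()
        # CNN feature extractor: collapses image height, keeps width as the time axis.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((1, None)),  # -> (B, 128, 1, W')
        )
        # BiLSTM encoder over the width dimension.
        self.encoder = nn.LSTM(128, hidden // 2, bidirectional=True, batch_first=True)
        # Decoder: one LSTM cell fed with the previous character embedding plus context.
        self.embed = nn.Embedding(num_chars, hidden)
        self.decoder = nn.LSTMCell(2 * hidden, hidden)
        self.attn_score = nn.Linear(2 * hidden, 1)
        self.out = nn.Linear(hidden, num_chars)

    def forward(self, images, max_len=20, sos_idx=0):
        B = images.size(0)
        feats = self.cnn(images).squeeze(2).permute(0, 2, 1)       # (B, W', 128)
        enc, _ = self.encoder(feats)                                # (B, W', hidden)
        h = enc.new_zeros(B, self.out.in_features)                  # decoder hidden state
        c = torch.zeros_like(h)
        prev = torch.full((B,), sos_idx, dtype=torch.long, device=images.device)
        logits = []
        for _ in range(max_len):
            # Additive-style attention: score each encoder step against the decoder state.
            scores = self.attn_score(torch.cat(
                [enc, h.unsqueeze(1).expand_as(enc)], dim=-1)).squeeze(-1)
            alpha = F.softmax(scores, dim=1)                        # (B, W')
            context = torch.bmm(alpha.unsqueeze(1), enc).squeeze(1)  # (B, hidden)
            h, c = self.decoder(
                torch.cat([self.embed(prev), context], dim=-1), (h, c))
            step_logits = self.out(h)
            logits.append(step_logits)
            prev = step_logits.argmax(dim=-1)                       # greedy decoding
        return torch.stack(logits, dim=1)                           # (B, max_len, num_chars)


# Usage: a batch of 32x128 grayscale word images, 80-character vocabulary.
model = AttentionHTRSketch(num_chars=80)
dummy = torch.randn(4, 1, 32, 128)
print(model(dummy).shape)  # torch.Size([4, 20, 80])

For transfer learning as mentioned in the abstract, a typical recipe with such a model would be to pretrain on a large word-image corpus and then fine-tune (optionally freezing the CNN encoder) on the smaller target dataset; the exact training setup used by the authors is described in the work itself.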