Awni Hannun 12/17/2017

Training Sequence Models with Attention


This technical article provides practical advice for training sequence-to-sequence models with attention mechanisms. It covers how to diagnose whether a model is learning to condition on its input by visualizing attention alignments, and it addresses the "inference gap" that arises when teacher forcing is used during training but not at inference time. The content is aimed at developers and researchers working with deep learning for sequence tasks such as speech recognition or machine translation.
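As a concrete illustration of the alignment-visualization idea, the sketch below plots an attention weight matrix as a heatmap; this is a minimal example, not the article's own code, and the function name, array shapes, and random placeholder weights are assumptions for demonstration only.

```python
import matplotlib.pyplot as plt
import numpy as np

def plot_attention(attention, input_len, output_len):
    """Plot an attention alignment as a heatmap.

    attention: array of shape (output_len, input_len), where
    attention[t, s] is the weight the decoder placed on input
    step s when producing output step t. (Names and shapes here
    are illustrative assumptions, not the article's code.)
    """
    fig, ax = plt.subplots()
    im = ax.imshow(attention[:output_len, :input_len],
                   aspect="auto", origin="lower", cmap="viridis")
    ax.set_xlabel("Input timestep")
    ax.set_ylabel("Output timestep")
    fig.colorbar(im, ax=ax, label="Attention weight")
    plt.show()

# Random placeholder weights; a model that has learned to condition on
# its input should instead show a clear, often roughly monotonic band
# for tasks like speech recognition.
dummy = np.random.dirichlet(np.ones(50), size=20)
plot_attention(dummy, input_len=50, output_len=20)
```

If the heatmap shows diffuse or uniform weights rather than a structured alignment, that is a sign the decoder may be ignoring the encoder and behaving like a language model over the outputs.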
