Building a Recurrent Neural Network

What is meant by zero padding and clipping here? I didn't get it.

Hey @Bhawna,

The input to the Embedding layer is a 2D tensor of shape (batch_size, sequence_length). Each row of the input holds one sentence, and the columns hold the words of that sentence. Deep learning models are designed to work on fixed-length inputs, but real-world sentences have varying lengths, so we set a threshold max_len that forces every sentence to a fixed length.

If a sentence is shorter than max_len, the remaining positions are filled with zeros; this is called zero padding. If a sentence is longer than max_len, the part beyond max_len is clipped (truncated).

In this way, every sentence ends up containing a fixed number of words.
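For illustration, here is a minimal sketch using Keras's pad_sequences utility (assuming the course uses Keras; the tokenized sentences below are made-up word indices, not from the course data):

```python
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Hypothetical tokenized sentences of varying lengths (word indices)
sentences = [
    [12, 45, 7],            # 3 words: shorter than max_len
    [3, 8, 19, 22, 5, 31],  # 6 words: longer than max_len
]

max_len = 4  # the fixed length we want every sentence to have

# padding='post' fills zeros at the end; truncating='post' clips the tail
padded = pad_sequences(sentences, maxlen=max_len,
                       padding='post', truncating='post')
print(padded)
# [[12 45  7  0]   <- zero padding: missing positions filled with 0
#  [ 3  8 19 22]]  <- clipping: words after max_len are dropped
```

The result is a 2D array of shape (batch_size, max_len), which is exactly the fixed-size input the Embedding layer expects.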

I hope this clears your doubt.

ok… got it…