Embedding
keras.layers.Embedding(
    input_dim,
    output_dim,
    embeddings_initializer="uniform",
    embeddings_regularizer=None,
    embeddings_constraint=None,
    mask_zero=False,
    weights=None,
    lora_rank=None,
    **kwargs
)
Turns nonnegative integers (indexes) into dense vectors of fixed size.
e.g. [[4], [20]] -> [[[0.25, 0.1]], [[0.6, -0.2]]]
This layer can only be used on nonnegative integer inputs of a fixed range.
Example
>>> import numpy as np
>>> import keras
>>> model = keras.Sequential()
>>> model.add(keras.layers.Embedding(1000, 64))
>>> # The model will take as input an integer matrix of size (batch,
>>> # input_length), and the largest integer (i.e. word index) in the input
>>> # should be no larger than 999 (vocabulary size).
>>> # Once the model has been built on (batch, 10) inputs, its output
>>> # shape will be (None, 10, 64), where `None` is the batch dimension.
>>> input_array = np.random.randint(1000, size=(32, 10))
>>> model.compile('rmsprop', 'mse')
>>> output_array = model.predict(input_array)
>>> print(output_array.shape)
(32, 10, 64)
Arguments
input_dim: Integer. Size of the vocabulary, i.e. maximum integer index + 1.
output_dim: Integer. Dimension of the dense embedding.
embeddings_initializer: Initializer for the embeddings matrix (see keras.initializers).
embeddings_regularizer: Regularizer function applied to the embeddings matrix (see keras.regularizers).
embeddings_constraint: Constraint function applied to the embeddings matrix (see keras.constraints).
mask_zero: Boolean, whether or not the input value 0 is a special "padding" value that should be masked out. This is useful when using recurrent layers which may take variable-length input. If this is True, then all subsequent layers in the model need to support masking or an exception will be raised. If mask_zero is set to True, as a consequence, index 0 cannot be used in the vocabulary (input_dim should equal size of vocabulary + 1). A short sketch follows this list.
weights: Optional floating-point matrix of size (input_dim, output_dim). The initial embeddings values to use.
lora_rank: Optional integer. If set, the layer's forward pass will implement LoRA (Low-Rank Adaptation) with the provided rank. LoRA sets the layer's embeddings matrix to non-trainable and replaces it with a delta over the original matrix, obtained by multiplying two lower-rank trainable matrices. This can be useful to reduce the computation cost of fine-tuning large embedding layers. You can also enable LoRA on an existing Embedding layer by calling layer.enable_lora(rank); a sketch appears after the Output shape section.
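For example, a minimal sketch of mask_zero in action, using an invented 0-padded toy batch (the values below are illustrative, not from this page):

>>> import numpy as np
>>> import keras
>>> # Index 0 is reserved for padding, so a 9-token vocabulary
>>> # needs input_dim = 10 (vocabulary size + 1).
>>> layer = keras.layers.Embedding(input_dim=10, output_dim=4, mask_zero=True)
>>> padded = np.array([[1, 2, 0, 0], [3, 4, 5, 0]])
>>> vectors = layer(padded)  # shape: (2, 4, 4)
>>> mask = layer.compute_mask(padded)
>>> # `mask` is a boolean tensor that is False wherever the input was 0:
>>> # [[True, True, False, False],
>>> #  [True, True, True, False]]

When the Embedding output is consumed inside a model, mask-consuming layers (e.g. keras.layers.LSTM) receive this mask automatically.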
Input shape
2D tensor with shape: (batch_size, input_length).
Output shape
3D tensor with shape: (batch_size, input_length, output_dim).
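And a minimal sketch of LoRA fine-tuning on this layer, assuming a made-up 1000x64 pretrained matrix (the `pretrained` array below is a random stand-in, not a real embedding table):

>>> import numpy as np
>>> import keras
>>> # Random stand-in for real pretrained vectors.
>>> pretrained = np.random.rand(1000, 64)
>>> layer = keras.layers.Embedding(1000, 64, weights=[pretrained])
>>> # Freeze the full (1000, 64) matrix and train only two low-rank
>>> # factors; passing lora_rank=4 to the constructor is equivalent.
>>> layer.enable_lora(4)
>>> # The trainable weights are now the (1000, 4) and (4, 64) LoRA factors.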