RotaryEmbedding

keras_hub.layers.RotaryEmbedding(
    max_wavelength=10000, scaling_factor=1.0, sequence_axis=1, feature_axis=-1, **kwargs
)
Rotary positional encoding layer.
This layer encodes absolute positional information with a rotation
matrix. It calculates the rotary encoding with a mix of sine and
cosine functions with geometrically increasing wavelengths.
Defined and formulated in RoFormer: Enhanced Transformer with Rotary Position Embedding.
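As a rough illustration of that formulation, the NumPy sketch below rotates each feature pair (x_2i, x_2i+1) at position m by the angle m * theta_i, where theta_i = max_wavelength**(-2i / feature_length). The function name rope_reference and the interleaved pairing convention follow the paper; the layer's internal pairing and implementation may differ.

import numpy as np

def rope_reference(x, max_wavelength=10000):
    # x: (sequence_length, feature_length), feature_length must be even.
    seq_len, dim = x.shape
    # Geometrically increasing wavelengths: theta_i = max_wavelength**(-2i/dim).
    inv_freq = 1.0 / (max_wavelength ** (np.arange(0, dim, 2) / dim))
    # Rotation angle for each (position, frequency) pair: (seq_len, dim // 2).
    angles = np.arange(seq_len)[:, None] * inv_freq[None, :]
    cos, sin = np.cos(angles), np.sin(angles)
    # Rotate each 2D feature pair (x1, x2) by its angle.
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out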
The input must be a tensor with a sequence dimension and a feature
dimension. Typically, this will be an input with shape
(batch_size, sequence_length, feature_length)
or
(batch_size, sequence_length, num_heads, feature_length).
This layer will return a new tensor with the rotary embedding applied to
the input tensor.
Arguments

max_wavelength: int. The maximum angular wavelength of the sine/cosine curves.
scaling_factor: float. The scaling factor used to scale positions of the tokens.
sequence_axis: int. Sequence axis in the input tensor.
feature_axis: int. Feature axis in the input tensor.
**kwargs: other keyword arguments passed to keras.layers.Layer, including name, trainable, dtype etc.
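For illustration, a non-default configuration might look like the following; the specific values are arbitrary, not recommendations.

from keras_hub.layers import RotaryEmbedding

# A larger max_wavelength slows the frequency decay across features;
# scaling_factor rescales token positions (as used in position-interpolation
# style context extension).
rot_emb_layer = RotaryEmbedding(max_wavelength=50000, scaling_factor=2.0)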
Call arguments

inputs: The tensor inputs to apply the embedding to. This can have any shape, but must contain both a sequence and feature axis. The rotary embedding will be applied to inputs and returned.
start_index: An integer or integer tensor. The starting position to compute the embedding from. This is useful during cached decoding, where each position is predicted separately in a loop.
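For instance, start_index can keep positions consistent during step-by-step decoding; a minimal sketch, assuming a feature length of 16:

import numpy as np
from keras_hub.layers import RotaryEmbedding

rot_emb_layer = RotaryEmbedding()

# Full sequence: positions 0 through 7 encoded in one call.
full = rot_emb_layer(np.ones((1, 8, 16)))

# Cached decoding: encode a single new token as if it sat at position 5.
step = rot_emb_layer(np.ones((1, 1, 16)), start_index=5)
# step[0, 0] matches full[0, 5], since both use the same rotation angle.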
Examples

import numpy as np
from keras_hub.layers import RotaryEmbedding

batch_size = 16
feature_length = 18
sequence_length = 256
num_heads = 8

# No multi-head dimension.
tensor = np.ones((batch_size, sequence_length, feature_length))
rot_emb_layer = RotaryEmbedding()
tensor_rot = rot_emb_layer(tensor)

# With multi-head dimension.
tensor = np.ones((batch_size, sequence_length, num_heads, feature_length))
tensor_rot = rot_emb_layer(tensor)
References

RoFormer: Enhanced Transformer with Rotary Position Embedding (https://arxiv.org/abs/2104.09864)