Transformers

Architecture for sequence modeling

Description

Transformers use self-attention to relate every position in a sequence to every other position in parallel, avoiding the step-by-step recurrence of RNNs; this architecture powers models such as BERT and GPT.
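To make the self-attention mechanism concrete, here is a minimal sketch of scaled dot-product attention using NumPy. The function name and the toy dimensions are illustrative, not from any particular library; in self-attention the queries, keys, and values are all derived from the same input sequence.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Scores: similarity of each query to each key,
    # scaled by sqrt(d_k) to keep softmax gradients stable
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over keys: each row of weights sums to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output: weighted average of the value vectors
    return weights @ V

# Toy example: a sequence of 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
# Self-attention: Q, K, and V all come from the same sequence
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

Real Transformer layers additionally project the input through learned query, key, and value matrices and run several such attention heads in parallel, but the core computation is the one shown above.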