Glossary

Transformer

Explore the impact of Transformer models in AI with Ultralytics. Discover their architecture, key components, and applications in NLP and vision.


The Transformer model has become a cornerstone in the field of artificial intelligence, especially in natural language processing (NLP) and, more recently, in computer vision tasks. First introduced in the paper "Attention is All You Need" by Vaswani et al. in 2017, the Transformer architecture fundamentally changed how machines process and understand language by leveraging self-attention mechanisms.

Understanding Transformers

Transformers are designed to handle sequential data with more flexibility than previous models like recurrent neural networks (RNNs) and convolutional neural networks (CNNs). Unlike RNNs, which process data sequentially, Transformers allow for much greater parallelization, greatly reducing training time and enhancing performance on large datasets.

Central to the Transformer model is the self-attention mechanism. This allows the model to weigh the importance of different words in a sentence, offering context-aware understanding and generation of language. Read more about self-attention in the Self-Attention glossary page.
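The self-attention computation described above can be sketched in a few lines of NumPy. This is a minimal single-head illustration (the projection matrices and dimensions here are arbitrary toy values, not taken from any particular model): each token's query is compared against every token's key, the scaled scores are normalized with softmax, and the result weights a sum over the values.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) pairwise relevance
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one context-aware vector per input token
```

Each row of `weights` shows how much one token "attends" to every other token, which is what gives the model its context-aware representation.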

Key Components

  1. Encoder-Decoder Structure: The Transformer is built on an encoder-decoder structure, where the encoder processes input text, and the decoder generates the output. Each consists of multiple layers that contain a self-attention mechanism and a feed-forward neural network.

  2. Positional Encoding: As Transformers don't inherently understand the order of sequences, positional encoding is added to input embeddings to help encode the position of words within the sequence.

  3. Attention Mechanism: At the heart of the Transformer is the attention mechanism that assigns different levels of importance to each part of the input sequence, allowing it to focus on relevant parts while generating outputs.
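The positional-encoding component from the list above can be illustrated with the sinusoidal scheme used in the original "Attention is All You Need" paper: even dimensions get a sine, odd dimensions a cosine, at wavelengths that grow geometrically across the embedding. This sketch follows that formula directly; the sequence length and embedding size are arbitrary examples.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: PE[pos, 2i] = sin(pos / 10000^(2i/d)),
    PE[pos, 2i+1] = cos(pos / 10000^(2i/d))."""
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]       # (1, d_model // 2)
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)               # even dimensions
    pe[:, 1::2] = np.cos(angles)               # odd dimensions
    return pe

pe = positional_encoding(seq_len=50, d_model=16)
print(pe.shape)  # (50, 16)
```

These vectors are simply added to the input embeddings, giving the otherwise order-agnostic attention layers a signal about where each token sits in the sequence.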

Real-World Applications

Natural Language Processing

Transformers have powered major advancements in NLP. Models based on the Transformer architecture, such as GPT-3 and BERT, have set new benchmarks in tasks ranging from text generation to sentiment analysis and machine translation. These models handle tasks better than their predecessors by understanding context at a nuanced level.

  • BERT is known for tasks that require understanding both the left and right context of words through its bidirectional attention mechanism.

Computer Vision

While initially designed for NLP, Transformers are increasingly being applied to computer vision tasks. Models like ViT (Vision Transformer) use Transformers to achieve state-of-the-art results in image classification, segmentation, and more. Delve into the role of Transformers in vision models to understand their impact on computer vision.
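ViT's key trick for applying a sequence model to images is to cut the image into fixed-size patches and flatten each patch into a token. A rough sketch of that first "patchify" step (the 224x224 input and 16-pixel patch size mirror the common ViT configuration, but are illustrative here):

```python
import numpy as np

def patchify(image, patch_size):
    """Split an (H, W, C) image into flattened non-overlapping patches,
    the token sequence a Vision Transformer would then linearly project."""
    H, W, C = image.shape
    p = patch_size
    patches = image.reshape(H // p, p, W // p, p, C)
    patches = patches.transpose(0, 2, 1, 3, 4)     # group by patch grid position
    return patches.reshape(-1, p * p * C)          # (num_patches, patch_dim)

img = np.random.rand(224, 224, 3)
tokens = patchify(img, patch_size=16)
print(tokens.shape)  # (196, 768): a 14x14 grid of patches, each 16*16*3 values
```

From here on, a ViT treats the 196 patch tokens exactly like word embeddings, adding positional encodings and stacking self-attention layers.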

For those interested, the Ultralytics HUB offers tools to integrate Transformer models into a range of projects, enhancing performance and scalability. Learn more about deploying models in real-world applications with Ultralytics HUB.

Distinctions from Related Models

  • RNNs and LSTMs: Unlike RNNs and LSTMs, Transformers can process sequences in parallel, leading to faster training and improved effectiveness in capturing long-range dependencies.

  • CNNs: While CNNs remain the traditional choice for image data, extracting features through local convolutions, Transformers are proving effective because self-attention can capture relationships between distant parts of an input without being constrained by fixed spatial hierarchies.

Further Exploration

Explore the potential of Transformers in AI by reading the paper "Attention is All You Need" and related literature. For more on the evolution of these architectures, consider learning about model variations like Transformer-XL and Longformer, which tackle sequence limitations in original Transformer designs.

Transformers continue to drive innovation across AI domains, with applications expanding from NLP to fields like healthcare, finance, and beyond. Stay updated with Ultralytics' blog for the latest trends and advancements in Transformer technology.
