How to Use Ultralytics YOLO11 for Object Tracking

Join us as we take a closer look at how to use Ultralytics YOLO11 for object tracking in real-time applications like surveillance, farming, and manufacturing.

Let’s say you want to monitor and track the movement of components on an assembly line in a manufacturing facility to ensure quality control and improve workflow efficiency. Typically, this would involve manual inspections or using basic sensors to track items, which can be time-consuming and prone to errors. However, computer vision and object tracking can be used to automate and enhance this process. 

Object tracking is a computer vision task that helps detect, identify, and track objects in a video. It can be used for a wide variety of applications, from animal monitoring on farms to security and surveillance in retail stores. The objects being tracked in a video are usually visualized using bounding boxes to help the user see exactly where they are located and detected within the video frame.

Launched during Ultralytics’ annual hybrid event, YOLO Vision 2024 (YV24), Ultralytics YOLO11 is a computer vision model that can handle a wide variety of Vision AI tasks, including object tracking. In this article, we’ll explore how object tracking works and discuss real-world applications. We’ll also take a look at how you can try out object tracking using YOLO11. Let’s get started!

Fig 1. An example of using YOLO11 for object tracking in a retail store.

AI-Powered Object Tracking with YOLO11

Object tracking is an essential computer vision technique. It makes it possible for objects in a video to be identified and tracked over time. Object tracking can seem very similar to another computer vision task - object detection. The key difference between the two lies in how they handle video frames. Object detection looks at each frame individually, identifying and classifying objects without considering previous or future frames. Object tracking, on the other hand, connects the dots between frames, following the same objects over time and keeping track of their movements.
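
To make that difference concrete, here is a minimal sketch contrasting the two calls in the Ultralytics Python package; "video.mp4" is just a placeholder for your own footage:

from ultralytics import YOLO

model = YOLO("yolo11n.pt")

# Object detection: every frame is analyzed on its own, with no identity carried over
detections = model.predict(source="video.mp4")

# Object tracking: the same physical object keeps a consistent track ID across frames
tracks = model.track(source="video.mp4")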

Here’s a more detailed walk-through of how object tracking works, with a short code sketch after the list:

  • Object detection: The process starts by detecting objects in a single frame of a video. YOLO11 can be used to identify multiple objects and their locations accurately.
  • Assign unique IDs: Each detected object is given a unique ID to distinguish it from others and make it easier to track.
  • Track movement across frames: A tracking algorithm follows the objects across subsequent frames, updating their positions while maintaining the association with their unique IDs.
  • Handle occlusions: If an object temporarily disappears from view (e.g., blocked by another object), the system ensures the tracking resumes once the object reappears.
  • Update object information: As objects move, their positions and attributes (like speed or direction) are continuously updated to reflect changes over time.
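
As a rough sketch of these steps in code, the loop below runs YOLO11 tracking frame by frame with OpenCV; "input_video.mp4" is a placeholder path, and occlusion handling is left to the tracker itself:

import cv2
from ultralytics import YOLO

model = YOLO("yolo11n.pt")  # step 1: a pretrained YOLO11 Nano detection model
cap = cv2.VideoCapture("input_video.mp4")  # placeholder video path

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # persist=True carries the tracker state between calls, so each object
    # keeps the unique ID it was assigned when it first appeared (steps 2-4)
    results = model.track(frame, persist=True)

    boxes = results[0].boxes
    if boxes.id is not None:  # IDs are only present once tracking has started
        for track_id, xyxy in zip(boxes.id.int().tolist(), boxes.xyxy.tolist()):
            print(f"Object {track_id} is at {xyxy}")  # updated every frame (step 5)

    cv2.imshow("YOLO11 tracking", results[0].plot())  # draw boxes and track IDs
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()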

Ultralytics supports real-time object tracking by leveraging advanced tracking algorithms like BoT-SORT and ByteTrack. It also works seamlessly with segmentation and pose estimation YOLO11 models, making it a flexible tool for a wide range of tracking tasks.
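
If you want to pick a specific tracker, the track call accepts a tracker configuration file. A brief sketch, assuming the configuration files that ship with the package and a placeholder video path:

from ultralytics import YOLO

model = YOLO("yolo11n.pt")

# BoT-SORT ("botsort.yaml", the default) or ByteTrack ("bytetrack.yaml") can be
# selected via the tracker argument; "video.mp4" is a placeholder source
results = model.track(source="video.mp4", tracker="bytetrack.yaml", show=True)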

Applications of YOLO11 Object Tracking

The versatile capabilities of the Ultralytics YOLO11 model open up a wide range of possible applications in many industries. Let’s take a closer look at some YOLO11 object-tracking use cases.

YOLO11 for Autonomous Vehicle Tracking

Object tracking is crucial for helping self-driving cars operate safely and efficiently. These vehicles need to constantly understand their surroundings to make real-time decisions, like stopping, turning, or changing lanes. Object detection allows the car to identify key elements in its environment, such as pedestrians, cyclists, other vehicles, and traffic signs. However, detecting these objects in a single moment isn’t enough for safe navigation.

That’s where object tracking comes in. It lets the car follow these objects over time, tracking their movements across multiple frames. For instance, it helps autonomous vehicles predict where a pedestrian is heading, monitor the speed and direction of nearby vehicles, or recognize that a traffic light hasn’t changed. By combining detection and tracking, self-driving cars can anticipate the movement of objects around them, respond proactively, and drive safely and smoothly.

Fig 2. YOLO11 can be used to detect and track cars.

Using YOLO11 Object Tracking to Monitor Animals

Tracking animals on a farm, like cattle, is vital for effective management, but it can be a tedious and time-consuming task. Traditional methods, such as using sensors or tags, often have downsides. These devices can stress the animals when attached and are prone to falling off or getting damaged, which disrupts tracking.

Computer vision provides a better solution for farmers to monitor and track animals without the need for physical tags. Object tracking can give farmers valuable insights into the animals’ behavior and health. For example, it can help detect conditions like lameness that affect the way an animal walks. By using object tracking, farmers can spot subtle changes in movement and address health issues early.

Beyond health monitoring, computer vision can also help farmers understand other behaviors, such as social interactions, eating habits, and movement patterns. These insights can improve herd management, optimize feeding schedules, and promote the animals’ overall well-being. By reducing manual labor and minimizing stress for the animals, computer vision-based tracking is a practical and efficient tool for modern farming.

Fig 3. Using YOLO11 to track farmers and a cow.

Object Tracking in Manufacturing Using YOLO11

Object tracking has many use cases in the manufacturing sector. For example, object detection and tracking systems can monitor production lines. Products or raw materials can be easily tracked and counted as they move on a conveyor belt. These systems can also be integrated with other computer vision systems to perform additional tasks. For instance, an item with a defect can be identified using a defect detection system and tracked using object tracking to ensure it’s properly taken care of.
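
As an illustration, here is a simple sketch of counting tracked items as they cross a virtual line on a conveyor; the video path, line position, and counting logic are assumptions you would adapt to your own setup:

from ultralytics import YOLO

model = YOLO("yolo11n.pt")

LINE_X = 640          # hypothetical x-coordinate of the counting line, in pixels
counted_ids = set()   # track IDs that have already been counted

# stream=True yields results one frame at a time; "conveyor.mp4" is a placeholder path
for result in model.track(source="conveyor.mp4", stream=True):
    if result.boxes.id is None:
        continue
    for track_id, (x1, y1, x2, y2) in zip(result.boxes.id.int().tolist(),
                                          result.boxes.xyxy.tolist()):
        center_x = (x1 + x2) / 2
        # count each tracked item the first time its center passes the line
        if center_x > LINE_X and track_id not in counted_ids:
            counted_ids.add(track_id)

print(f"Items counted: {len(counted_ids)}")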

Another important application of object tracking in manufacturing is related to safety. Object tracking systems can be used to detect and track workers in potentially hazardous manufacturing environments. Hazardous regions can be marked and monitored constantly using computer vision systems, and supervisors can be notified if tracked workers come near such areas. Such safety systems can also be used to detect and track equipment, helping to prevent theft.
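
A possible sketch of that safety idea, assuming a rectangular hazard region defined in pixel coordinates, a placeholder camera feed, and the pretrained model’s person class (index 0):

from ultralytics import YOLO

model = YOLO("yolo11n.pt")

# Hypothetical hazard region as a pixel-space rectangle: (x1, y1, x2, y2)
HAZARD_ZONE = (100, 200, 400, 600)

def in_zone(cx, cy, zone):
    zx1, zy1, zx2, zy2 = zone
    return zx1 <= cx <= zx2 and zy1 <= cy <= zy2

# classes=[0] keeps only the "person" class of the pretrained COCO model;
# "factory_cam.mp4" stands in for a live camera or recorded feed
for result in model.track(source="factory_cam.mp4", stream=True, classes=[0]):
    if result.boxes.id is None:
        continue
    for track_id, (x1, y1, x2, y2) in zip(result.boxes.id.int().tolist(),
                                          result.boxes.xyxy.tolist()):
        cx, cy = (x1 + x2) / 2, (y1 + y2) / 2
        if in_zone(cx, cy, HAZARD_ZONE):
            print(f"Alert: worker {track_id} entered the hazard zone")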

Fig 4. An example of YOLO11 object detection being used to detect workers.

Object Tracking and Surveillance Using YOLO11

Real-time object tracking is widely used in security and surveillance systems. These systems can be used to monitor public places, transportation hubs, and large retail environments like shopping malls. In large, crowded spaces, this technology can track suspicious individuals or monitor crowd behavior, providing a seamless surveillance solution. For example, during the pandemic, object tracking systems were used to monitor crowded areas and make sure people were maintaining social distancing.

Object tracking can also be used in traffic surveillance, where it makes it possible to track and analyze how vehicles behave, spotting unusual or suspicious actions in real time to help prevent accidents or crimes. A good example is speed estimation systems, which detect and track a vehicle to determine its speed.
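
As a rough illustration, the sketch below estimates speed from how far each tracked vehicle’s center moves between frames; the frame rate and pixels-per-meter calibration are assumptions that depend on your camera setup:

from ultralytics import YOLO

model = YOLO("yolo11n.pt")

FPS = 30                 # assumed frame rate of the footage
PIXELS_PER_METER = 20.0  # hypothetical calibration for this camera view

last_pos = {}  # previous center position of each track ID

# "traffic.mp4" is a placeholder for your own traffic footage
for result in model.track(source="traffic.mp4", stream=True):
    if result.boxes.id is None:
        continue
    for track_id, (x1, y1, x2, y2) in zip(result.boxes.id.int().tolist(),
                                          result.boxes.xyxy.tolist()):
        cx, cy = (x1 + x2) / 2, (y1 + y2) / 2
        if track_id in last_pos:
            # per-frame displacement in pixels, converted to meters per second, then km/h
            px, py = last_pos[track_id]
            dist_px = ((cx - px) ** 2 + (cy - py) ** 2) ** 0.5
            speed_kmh = (dist_px / PIXELS_PER_METER) * FPS * 3.6
            print(f"Vehicle {track_id}: ~{speed_kmh:.1f} km/h")
        last_pos[track_id] = (cx, cy)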

Fig 5. Speed estimation can be done using object tracking.

Try Out Object Tracking with Ultralytics YOLO11

Now that we’ve explored some of the object tracking applications, let’s discuss how you can try it out using the Ultralytics YOLO11 model.

To get started, install the Ultralytics Python package using pip, conda, or Docker. If you face any challenges during installation, our Common Issues Guide offers helpful troubleshooting tips. 

Once you’ve installed the package successfully, run the following code. It shows how to load the Ultralytics YOLO11 model and use it to track objects in a video file. The model used in the code is “yolo11n.pt”, where the ‘n’ stands for Nano, the smallest variant of the YOLO11 model. There are also other variants to choose from: small, medium, large, and extra-large.

Fig 6. A code snippet that showcases object tracking using the YOLO11 model.
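
Here is a minimal version of the snippet shown in Fig 6 that you can copy and run, assuming "path/to/video.mp4" is replaced with your own file:

# Install first if needed: pip install ultralytics
from ultralytics import YOLO

# Load the pretrained YOLO11 Nano model
model = YOLO("yolo11n.pt")

# Track objects in a video file; show=True displays the annotated frames
# and save=True writes the annotated output to disk
results = model.track(source="path/to/video.mp4", show=True, save=True)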

You can also choose to use a custom-trained model instead of a pre-trained model. Custom training involves fine-tuning a pre-trained model to fit your specific application.

As mentioned earlier, object tracking is supported for the following YOLO11 task models: object detection, pose estimation, and instance segmentation. If you have a specific tracking application in mind, you can custom-train any of these models to suit it, using either the Ultralytics Python package or the no-code platform, Ultralytics HUB.
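
For example, tracking works the same way with the segmentation and pose variants or with your own custom-trained weights; the "path/to/best.pt" checkpoint below is a hypothetical output of a custom training run:

from ultralytics import YOLO

# Pretrained instance segmentation and pose estimation variants of YOLO11
seg_model = YOLO("yolo11n-seg.pt")
pose_model = YOLO("yolo11n-pose.pt")

# A custom-trained checkpoint (hypothetical path from your own training run)
custom_model = YOLO("path/to/best.pt")

# Any of these can be used for tracking in exactly the same way
results = seg_model.track(source="video.mp4", show=True)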

Key Takeaways

Ultralytics YOLO11 is a great tool for tracking objects in videos, and it can be used in many different fields, such as self-driving cars, farming, manufacturing, and security. It can detect and follow objects in real-time, helping businesses and industries keep track of their workers and equipment. The model is easy to use and can be customized for specific needs, making it a good option for anyone interested in adopting computer vision capabilities seamlessly. 

To learn more, visit our GitHub repository, and engage with our community. Explore AI applications in self-driving cars and agriculture on our solutions pages. 🚀
