Metadata-Version: 2.4
Name: ultralytics
Version: 8.3.63
Summary: Ultralytics YOLO 🚀 for SOTA object detection, multi-object tracking, instance segmentation, pose estimation and image classification.
Author-email: Glenn Jocher <glenn.jocher@ultralytics.com>, Jing Qiu <jing.qiu@ultralytics.com>
Maintainer-email: Ultralytics <hello@ultralytics.com>
License: AGPL-3.0
Project-URL: Homepage, https://ultralytics.com
Project-URL: Source, https://github.com/ultralytics/ultralytics
Project-URL: Documentation, https://docs.ultralytics.com
Project-URL: Bug Reports, https://github.com/ultralytics/ultralytics/issues
Project-URL: Changelog, https://github.com/ultralytics/ultralytics/releases
Keywords: machine-learning,deep-learning,computer-vision,ML,DL,AI,YOLO,YOLOv3,YOLOv5,YOLOv8,YOLOv9,YOLOv10,YOLO11,HUB,Ultralytics
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Education
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development
Classifier: Topic :: Scientific/Engineering
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Scientific/Engineering :: Image Recognition
Classifier: Operating System :: POSIX :: Linux
Classifier: Operating System :: MacOS
Classifier: Operating System :: Microsoft :: Windows
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: numpy>=1.23.0
Requires-Dist: numpy<2.0.0; sys_platform == "darwin"
Requires-Dist: matplotlib>=3.3.0
Requires-Dist: opencv-python>=4.6.0
Requires-Dist: pillow>=7.1.2
Requires-Dist: pyyaml>=5.3.1
Requires-Dist: requests>=2.23.0
Requires-Dist: scipy>=1.4.1
Requires-Dist: torch>=1.8.0
Requires-Dist: torch!=2.4.0,>=1.8.0; sys_platform == "win32"
Requires-Dist: torchvision>=0.9.0
Requires-Dist: tqdm>=4.64.0
Requires-Dist: psutil
Requires-Dist: py-cpuinfo
Requires-Dist: pandas>=1.1.4
Requires-Dist: seaborn>=0.11.0
Requires-Dist: ultralytics-thop>=2.0.0
Provides-Extra: dev
Requires-Dist: ipython; extra == "dev"
Requires-Dist: pytest; extra == "dev"
Requires-Dist: pytest-cov; extra == "dev"
Requires-Dist: coverage[toml]; extra == "dev"
Requires-Dist: mkdocs>=1.6.0; extra == "dev"
Requires-Dist: mkdocs-material>=9.5.9; extra == "dev"
Requires-Dist: mkdocstrings[python]; extra == "dev"
Requires-Dist: mkdocs-redirects; extra == "dev"
Requires-Dist: mkdocs-ultralytics-plugin>=0.1.8; extra == "dev"
Requires-Dist: mkdocs-macros-plugin>=1.0.5; extra == "dev"
Provides-Extra: export
Requires-Dist: onnx>=1.12.0; extra == "export"
Requires-Dist: coremltools>=7.0; (platform_system != "Windows" and python_version <= "3.11") and extra == "export"
Requires-Dist: scikit-learn>=1.3.2; (platform_system != "Windows" and python_version <= "3.11") and extra == "export"
Requires-Dist: openvino>=2024.0.0; extra == "export"
Requires-Dist: tensorflow>=2.0.0; extra == "export"
Requires-Dist: tensorflowjs>=3.9.0; extra == "export"
Requires-Dist: tensorstore>=0.1.63; (platform_machine == "aarch64" and python_version >= "3.9") and extra == "export"
Requires-Dist: keras; extra == "export"
Requires-Dist: flatbuffers<100,>=23.5.26; platform_machine == "aarch64" and extra == "export"
Requires-Dist: numpy==1.23.5; platform_machine == "aarch64" and extra == "export"
Requires-Dist: h5py!=3.11.0; platform_machine == "aarch64" and extra == "export"
Provides-Extra: solutions
Requires-Dist: shapely>=2.0.0; extra == "solutions"
Requires-Dist: streamlit; extra == "solutions"
Provides-Extra: logging
Requires-Dist: comet; extra == "logging"
Requires-Dist: tensorboard>=2.13.0; extra == "logging"
Requires-Dist: dvclive>=2.12.0; extra == "logging"
Provides-Extra: extra
Requires-Dist: hub-sdk>=0.0.12; extra == "extra"
Requires-Dist: ipython; extra == "extra"
Requires-Dist: albumentations>=1.4.6; extra == "extra"
Requires-Dist: pycocotools>=2.0.7; extra == "extra"
Dynamic: license-file

<div align="center">
<h1>YOLOv12</h1>
<h3>YOLOv12: Attention-Centric Real-Time Object Detectors</h3>

[Yunjie Tian](https://sunsmarterjie.github.io/)<sup>1</sup>, [Qixiang Ye](https://people.ucas.ac.cn/~qxye?language=en)<sup>2</sup>, [David Doermann](https://cse.buffalo.edu/~doermann/)<sup>1</sup>

<sup>1</sup> University at Buffalo, SUNY, <sup>2</sup> University of Chinese Academy of Sciences.

<p align="center">
<img src="assets/tradeoff_turbo.svg" width=90%> <br>
Comparison with popular methods in terms of latency-accuracy (left) and FLOPs-accuracy (right) trade-offs
</p>
</div>

[![arXiv](https://img.shields.io/badge/arXiv-2502.12524-b31b1b.svg)](https://arxiv.org/abs/2502.12524) [![Hugging Face Demo](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/sunsmarterjieleaf/yolov12) <a href="https://colab.research.google.com/github/roboflow-ai/notebooks/blob/main/notebooks/train-yolov12-object-detection-model.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> [![Kaggle Notebook](https://img.shields.io/badge/Kaggle-Notebook-blue?logo=kaggle)](https://www.kaggle.com/code/jxxn03x/yolov12-on-custom-data) [![LightlyTrain Notebook](https://img.shields.io/badge/LightlyTrain-Notebook-blue?)](https://colab.research.google.com/github/lightly-ai/lightly-train/blob/main/examples/notebooks/yolov12.ipynb) [![deploy](https://media.roboflow.com/deploy.svg)](https://blog.roboflow.com/use-yolov12-with-roboflow/#deploy-yolov12-models-with-roboflow) [![Openbayes](https://img.shields.io/static/v1?label=Demo&message=OpenBayes%E8%B4%9D%E5%BC%8F%E8%AE%A1%E7%AE%97&color=green)](https://openbayes.com/console/public/tutorials/A4ac4xNrUCQ)
## Updates

- 2025/06/17: **Use this repo for YOLOv12 instead of [ultralytics](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/cfg/models/12). Their implementation is less efficient, requires more memory, and trains unstably; these issues are fixed here!**
- 2025/07/01: YOLOv12 **classification** models are released; see the [code](https://github.com/sunsmarterjie/yolov12/tree/Cls).
- 2025/06/04: YOLOv12 **instance segmentation** models are released; see the [code](https://github.com/sunsmarterjie/yolov12/tree/Seg).
- 2025/04/15: Pretrain a YOLOv12 model with [LightlyTrain](https://docs.lightly.ai/train/stable/index.html), a framework for pretraining any computer vision model on unlabeled data, with [YOLOv12 support](https://docs.lightly.ai/train/stable/models/yolov12.html). There is also a [Colab tutorial](https://colab.research.google.com/github/lightly-ai/lightly-train/blob/main/examples/notebooks/yolov12.ipynb)!
- 2025/03/18: For those interested in heatmap visualization, see this [issue](https://github.com/sunsmarterjie/yolov12/issues/74).
- 2025/03/09: **YOLOv12-turbo** is released: a faster version of YOLOv12.
- 2025/02/24: Blogs: [ultralytics](https://docs.ultralytics.com/models/yolo12/), [LearnOpenCV](https://learnopencv.com/yolov12/). Thanks to them!
- 2025/02/22: [YOLOv12 TensorRT C++ inference repo + Google Colab notebook](https://github.com/mohamedsamirx/YOLOv12-TensorRT-CPP).
- 2025/02/22: [Android deployment](https://github.com/mpj1234/ncnn-yolov12-android/tree/main) / [TensorRT-YOLO](https://github.com/laugh12321/TensorRT-YOLO) accelerates YOLOv12. Thanks to them!
- 2025/02/21: Try YOLOv12 for classification, oriented bounding boxes, pose estimation, and instance segmentation at [ultralytics](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/cfg/models/12). Please pay attention to this [issue](https://github.com/sunsmarterjie/yolov12/issues/29). Thanks to them!
- 2025/02/20: [Deploy on any computer or edge device](https://github.com/roboflow/inference) / [ONNX C++ version](https://github.com/mohamedsamirx/YOLOv12-ONNX-CPP). Thanks to them!
- 2025/02/20: Train a YOLOv12 model on a custom dataset: [blog](https://blog.roboflow.com/train-yolov12-model/) and [YouTube video](https://www.youtube.com/watch?v=fksJmIMIfXo) / [step-by-step instructions](https://youtu.be/dO8k5rgXG0M). Thanks to them!
- 2025/02/19: The [arXiv version](https://arxiv.org/abs/2502.12524) is public, and a [demo](https://huggingface.co/spaces/sunsmarterjieleaf/yolov12) is available (try [Demo2](https://huggingface.co/spaces/sunsmarterjieleaf/yolov12_demo2) or [Demo3](https://huggingface.co/spaces/sunsmarterjieleaf/yolov12_demo3) if it is busy).
<details>
<summary>
<font size="+1">Abstract</font>
</summary>

Enhancing the network architecture of the YOLO framework has long been crucial, but improvements have focused on CNN-based designs despite the proven superiority of attention mechanisms in modeling capability. This is because attention-based models have been unable to match the speed of CNN-based models. This paper proposes an attention-centric YOLO framework, namely YOLOv12, that matches the speed of previous CNN-based frameworks while harnessing the performance benefits of attention mechanisms.

YOLOv12 surpasses all popular real-time object detectors in accuracy with competitive speed. For example, YOLOv12-N achieves 40.6% mAP with an inference latency of 1.64 ms on a T4 GPU, outperforming advanced YOLOv10-N / YOLOv11-N by 2.1% / 1.2% mAP at a comparable speed. This advantage extends to other model scales. YOLOv12 also surpasses end-to-end real-time detectors that improve DETR, such as RT-DETR / RT-DETRv2: YOLOv12-S beats RT-DETR-R18 / RT-DETRv2-R18 while running 42% faster, using only 36% of the computation and 45% of the parameters.
</details>
## Main Results

**Turbo (default)**:

| Model (det) | size<br><sup>(pixels) | mAP<sup>val<br>50-95 | Speed (ms)<br><sup>T4 TensorRT10 | params<br><sup>(M) | FLOPs<br><sup>(G) |
| :---------- | :-------------------: | :------------------: | :------------------------------: | :----------------: | :---------------: |
| [YOLO12n](https://github.com/sunsmarterjie/yolov12/releases/download/turbo/yolov12n.pt) | 640 | 40.4 | 1.60 | 2.5 | 6.0 |
| [YOLO12s](https://github.com/sunsmarterjie/yolov12/releases/download/turbo/yolov12s.pt) | 640 | 47.6 | 2.42 | 9.1 | 19.4 |
| [YOLO12m](https://github.com/sunsmarterjie/yolov12/releases/download/turbo/yolov12m.pt) | 640 | 52.5 | 4.27 | 19.6 | 59.8 |
| [YOLO12l](https://github.com/sunsmarterjie/yolov12/releases/download/turbo/yolov12l.pt) | 640 | 53.8 | 5.83 | 26.5 | 82.4 |
| [YOLO12x](https://github.com/sunsmarterjie/yolov12/releases/download/turbo/yolov12x.pt) | 640 | 55.4 | 10.38 | 59.3 | 184.6 |

[**v1.0**](https://github.com/sunsmarterjie/yolov12/tree/V1.0):

| Model (det) | size<br><sup>(pixels) | mAP<sup>val<br>50-95 | Speed (ms)<br><sup>T4 TensorRT10 | params<br><sup>(M) | FLOPs<br><sup>(G) |
| :---------- | :-------------------: | :------------------: | :------------------------------: | :----------------: | :---------------: |
| [YOLO12n](https://github.com/sunsmarterjie/yolov12/releases/download/v1.0/yolov12n.pt) | 640 | 40.6 | 1.64 | 2.6 | 6.5 |
| [YOLO12s](https://github.com/sunsmarterjie/yolov12/releases/download/v1.0/yolov12s.pt) | 640 | 48.0 | 2.61 | 9.3 | 21.4 |
| [YOLO12m](https://github.com/sunsmarterjie/yolov12/releases/download/v1.0/yolov12m.pt) | 640 | 52.5 | 4.86 | 20.2 | 67.5 |
| [YOLO12l](https://github.com/sunsmarterjie/yolov12/releases/download/v1.0/yolov12l.pt) | 640 | 53.7 | 6.77 | 26.4 | 88.9 |
| [YOLO12x](https://github.com/sunsmarterjie/yolov12/releases/download/v1.0/yolov12x.pt) | 640 | 55.2 | 11.79 | 59.1 | 199.0 |

[**Instance segmentation**](https://github.com/sunsmarterjie/yolov12/tree/Seg):

| Model (seg) | size<br><sup>(pixels) | mAP<sup>box<br>50-95 | mAP<sup>mask<br>50-95 | Speed (ms)<br><sup>T4 TensorRT10 | params<br><sup>(M) | FLOPs<br><sup>(G) |
| :---------- | :-------------------: | :------------------: | :-------------------: | :------------------------------: | :----------------: | :---------------: |
| [YOLOv12n-seg](https://github.com/sunsmarterjie/yolov12/releases/download/seg/yolov12n-seg.pt) | 640 | 39.9 | 32.8 | 1.84 | 2.8 | 9.9 |
| [YOLOv12s-seg](https://github.com/sunsmarterjie/yolov12/releases/download/seg/yolov12s-seg.pt) | 640 | 47.5 | 38.6 | 2.84 | 9.8 | 33.4 |
| [YOLOv12m-seg](https://github.com/sunsmarterjie/yolov12/releases/download/seg/yolov12m-seg.pt) | 640 | 52.4 | 42.3 | 6.27 | 21.9 | 115.1 |
| [YOLOv12l-seg](https://github.com/sunsmarterjie/yolov12/releases/download/seg/yolov12l-seg.pt) | 640 | 54.0 | 43.2 | 7.61 | 28.8 | 137.7 |
| [YOLOv12x-seg](https://github.com/sunsmarterjie/yolov12/releases/download/seg/yolov12x-seg.pt) | 640 | 55.2 | 44.2 | 15.43 | 64.5 | 308.7 |

[**Classification**](https://github.com/sunsmarterjie/yolov12/tree/Cls):

| Model (cls) | size<br><sup>(pixels) | Acc.<br><sup>top-1 | Acc.<br><sup>top-5 | Speed (ms)<br><sup>T4 TensorRT10 | params<br><sup>(M) | FLOPs<br><sup>(G) |
| :---------- | :-------------------: | :----------------: | :----------------: | :------------------------------: | :----------------: | :---------------: |
| [YOLOv12n-cls](https://github.com/sunsmarterjie/yolov12/releases/download/cls/yolov12n-cls.pt) | 224 | 71.7 | 90.5 | 1.27 | 2.9 | 0.5 |
| [YOLOv12s-cls](https://github.com/sunsmarterjie/yolov12/releases/download/cls/yolov12s-cls.pt) | 224 | 76.4 | 93.3 | 1.52 | 7.2 | 1.5 |
| [YOLOv12m-cls](https://github.com/sunsmarterjie/yolov12/releases/download/cls/yolov12m-cls.pt) | 224 | 78.8 | 94.4 | 2.03 | 12.7 | 4.5 |
| [YOLOv12l-cls](https://github.com/sunsmarterjie/yolov12/releases/download/cls/yolov12l-cls.pt) | 224 | 79.5 | 94.5 | 2.73 | 16.8 | 6.2 |
| [YOLOv12x-cls](https://github.com/sunsmarterjie/yolov12/releases/download/cls/yolov12x-cls.pt) | 224 | 80.1 | 95.3 | 3.64 | 35.5 | 13.7 |
## Installation

```
wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu11torch2.2cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
conda create -n yolov12 python=3.11
conda activate yolov12
pip install -r requirements.txt
pip install -e .
```
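
After installation, a quick sanity check can confirm the environment. This is a minimal sketch; it assumes a CUDA GPU is present and that `requirements.txt` installed the FlashAttention wheel downloaded above:

```python
import torch

# PyTorch must see the GPU: FlashAttention runs on CUDA only.
print(torch.__version__, torch.cuda.is_available())

# The downloaded wheel installs the `flash_attn` package.
import flash_attn
print(flash_attn.__version__)
```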
## Validation

Pretrained weights:
[`yolov12n`](https://github.com/sunsmarterjie/yolov12/releases/download/turbo/yolov12n.pt)
[`yolov12s`](https://github.com/sunsmarterjie/yolov12/releases/download/turbo/yolov12s.pt)
[`yolov12m`](https://github.com/sunsmarterjie/yolov12/releases/download/turbo/yolov12m.pt)
[`yolov12l`](https://github.com/sunsmarterjie/yolov12/releases/download/turbo/yolov12l.pt)
[`yolov12x`](https://github.com/sunsmarterjie/yolov12/releases/download/turbo/yolov12x.pt)

```python
from ultralytics import YOLO

model = YOLO('yolov12{n/s/m/l/x}.pt')  # substitute one scale, e.g. 'yolov12n.pt'
model.val(data='coco.yaml', save_json=True)
```
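
The `val()` call returns a metrics object; assuming the standard ultralytics `DetMetrics` interface, the COCO-style summaries can be read directly:

```python
from ultralytics import YOLO

model = YOLO('yolov12n.pt')  # nano model as a concrete example
metrics = model.val(data='coco.yaml', save_json=True)

# COCO-style summary values from the ultralytics metrics object.
print(metrics.box.map)    # mAP 50-95
print(metrics.box.map50)  # mAP at IoU 0.50
print(metrics.box.map75)  # mAP at IoU 0.75
```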
## Training

```python
from ultralytics import YOLO

model = YOLO('yolov12n.yaml')

# Train the model
results = model.train(
    data='coco.yaml',
    epochs=600,
    batch=256,
    imgsz=640,
    scale=0.5,       # S: 0.9; M: 0.9; L: 0.9; X: 0.9
    mosaic=1.0,
    mixup=0.0,       # S: 0.05; M: 0.15; L: 0.15; X: 0.2
    copy_paste=0.1,  # S: 0.15; M: 0.4; L: 0.5; X: 0.6
    device="0,1,2,3",
)

# Evaluate model performance on the validation set
metrics = model.val()

# Perform object detection on an image
results = model("path/to/image.jpg")
results[0].show()
```
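
The inline comments above give the per-scale augmentation settings. As a sketch, the same recipe instantiated for the S scale looks like this (hyperparameters taken from those comments, everything else unchanged):

```python
from ultralytics import YOLO

# Same training recipe, with the S-scale values from the comments above.
model = YOLO('yolov12s.yaml')
results = model.train(
    data='coco.yaml',
    epochs=600,
    batch=256,
    imgsz=640,
    scale=0.9,
    mosaic=1.0,
    mixup=0.05,
    copy_paste=0.15,
    device="0,1,2,3",
)
```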
## Prediction

```python
from ultralytics import YOLO

model = YOLO('yolov12{n/s/m/l/x}.pt')
results = model.predict('path/to/image.jpg')  # source: image, directory, video, or stream
```
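
Each element of the returned list is an ultralytics `Results` object. A short sketch of pulling out the raw detections, assuming the standard `Results` API on a recent ultralytics version:

```python
from ultralytics import YOLO

model = YOLO('yolov12n.pt')  # any scale works here
results = model('path/to/image.jpg')

# Boxes carry coordinates, confidences, and class ids as tensors.
for box in results[0].boxes:
    print(box.xyxy, box.conf, box.cls)

results[0].save('annotated.jpg')  # write the annotated image to disk
```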
## Export

```python
from ultralytics import YOLO

model = YOLO('yolov12{n/s/m/l/x}.pt')
model.export(format="engine", half=True)  # or format="onnx"
```
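
The exported file loads back through the same `YOLO` interface for inference; a sketch, where the filename assumes the default export naming:

```python
from ultralytics import YOLO

# Run inference with the exported TensorRT engine (or .onnx file).
model = YOLO('yolov12n.engine')
results = model('path/to/image.jpg')
results[0].show()
```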
## Demo

```
python app.py
# Please visit http://127.0.0.1:7860
```
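
The demo serves on Gradio's default port. As a rough, hypothetical sketch of what such an app looks like (this is not the repository's `app.py`):

```python
import gradio as gr
from ultralytics import YOLO

model = YOLO('yolov12n.pt')

def detect(image):
    results = model(image)
    return results[0].plot()[..., ::-1]  # plot() returns BGR; flip to RGB for display

# launch() serves on http://127.0.0.1:7860 by default.
gr.Interface(fn=detect, inputs=gr.Image(type="numpy"), outputs=gr.Image(type="numpy")).launch()
```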
## Acknowledgement

The code is based on [ultralytics](https://github.com/ultralytics/ultralytics). Thanks for their excellent work!
## Citation

```BibTeX
@article{tian2025yolov12,
  title={YOLOv12: Attention-Centric Real-Time Object Detectors},
  author={Tian, Yunjie and Ye, Qixiang and Doermann, David},
  journal={arXiv preprint arXiv:2502.12524},
  year={2025}
}
```