Large Language Model (LLM) model size, parameter size, and parameter count

  • 1. Model Size and Memory Usage
  • 2. Hugging Face
    • 2.1. GPT-Neo 2.7B
    • 2.2. Massive Text Embedding Benchmark (MTEB) Leaderboard
    • 2.3. Open LLM Leaderboard
    • 2.4. LLM-Perf Leaderboard
    • 2.5. Open Ko-LLM Leaderboard
  • References

The model size is the number of parameters in the Large Language Model (LLM).

If an LLM has 70B parameters, it means that the model has 70 billion adjustable parameters.

1. Model Size and Memory Usage

The total memory usage of the model is:

memory usage (in bytes) = number of parameters × size of each parameter (in bytes)

| Number of parameters | Precision | Computation | Memory usage |
| --- | --- | --- | --- |
| 1 B | 32-bit | 1 × 1,000,000,000 × 4 bytes = 4,000,000,000 bytes | 4,000,000,000 / 1024³ ≈ 3.7253 GiB |
| 1 B | 16-bit | 1 × 1,000,000,000 × 2 bytes = 2,000,000,000 bytes | ≈ 1.8626 GiB |
| 1 B | 8-bit | 1 × 1,000,000,000 × 1 byte = 1,000,000,000 bytes | ≈ 0.9313 GiB |
| 3 B | 32-bit | 3 × 1,000,000,000 × 4 bytes = 12,000,000,000 bytes | ≈ 11.1759 GiB |
| 3 B | 16-bit | 3 × 1,000,000,000 × 2 bytes = 6,000,000,000 bytes | ≈ 5.5879 GiB |
| 3 B | 8-bit | 3 × 1,000,000,000 × 1 byte = 3,000,000,000 bytes | ≈ 2.7940 GiB |
| 7 B | 32-bit | 7 × 1,000,000,000 × 4 bytes = 28,000,000,000 bytes | ≈ 26.0770 GiB |
| 7 B | 16-bit | 7 × 1,000,000,000 × 2 bytes = 14,000,000,000 bytes | ≈ 13.0385 GiB |
| 7 B | 8-bit | 7 × 1,000,000,000 × 1 byte = 7,000,000,000 bytes | ≈ 6.5193 GiB |

(Dividing by 1024³ converts bytes to binary gigabytes, GiB.)
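This table can be reproduced in a few lines of Python. The sketch below just evaluates the formula above; the function name and the GiB convention (1 GiB = 1024³ bytes) are my own choices, not from any particular library:

```python
def weights_memory_gib(num_params: int, bytes_per_param: int) -> float:
    """Memory for the model weights alone, in GiB (1 GiB = 1024**3 bytes)."""
    return num_params * bytes_per_param / 1024**3

# Reproduce the table: 1 B / 3 B / 7 B parameters at 32-, 16-, and 8-bit precision.
for billions in (1, 3, 7):
    for precision, nbytes in (("32-bit", 4), ("16-bit", 2), ("8-bit", 1)):
        gib = weights_memory_gib(billions * 1_000_000_000, nbytes)
        print(f"{billions} B parameters, {precision}: {gib:.4f} GiB")
```

Note that this accounts for the weights only; at inference or training time, activations, the KV cache, gradients, and optimizer states add to the footprint.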

thousand [ˈθaʊznd]

  1. the number 1,000 (千)
  2. a large number

million ['mɪljən]

  1. the number 1,000,000 (百万)
  2. a large number

billion ['bɪljən]

  1. the number 1,000,000,000 (十亿)
  2. a very large number

trillion ['trɪljən]

  1. the number 1,000,000,000,000 (万亿)

2. Hugging Face

2.1. GPT-Neo 2.7B

https://huggingface.co/EleutherAI/gpt-neo-2.7B/blob/main/README.md
https://huggingface.co/EleutherAI/gpt-neo-1.3B/blob/main/README.md

GPT-Neo refers to the class of models, while 2.7B and 1.3B represent the number of parameters of each particular pre-trained model.
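To check such a parameter count directly, one can load a checkpoint and sum the element counts of its parameter tensors. Below is a minimal sketch using the Hugging Face transformers library (an assumption here is that transformers and PyTorch are installed; the weights are downloaded from the Hub on first use):

```python
from transformers import AutoModelForCausalLM

# Download (on first use) and load the pre-trained GPT-Neo 1.3B checkpoint.
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")

# The parameter count is the total number of elements across all parameter tensors.
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params:,} parameters (~{num_params / 1e9:.2f} B)")
```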

2.2. Massive Text Embedding Benchmark (MTEB) Leaderboard

https://huggingface.co/spaces/mteb/leaderboard

Massive Text Embedding Benchmark
https://huggingface.co/mteb

Model Size (in millions of parameters)
Memory Usage (GB, fp32)

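The two columns are tied together by the formula from section 1. Here is a small sketch of the conversion, under the assumption (mine, not the leaderboard's documentation) that fp32 means 4 bytes per parameter and GB means 1024³ bytes, as in the table above:

```python
def fp32_memory_gb(model_size_million_params: float) -> float:
    """Approximate fp32 memory (GB) from a model size in millions of parameters.

    Assumes 4 bytes per parameter and GB = 1024**3 bytes; the leaderboard's
    exact convention may differ.
    """
    return model_size_million_params * 1e6 * 4 / 1024**3

# Example: a hypothetical 335-million-parameter embedding model.
print(f"{fp32_memory_gb(335):.2f} GB")  # ≈ 1.25 GB
```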

2.3. Open LLM Leaderboard

https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard

#Params (B): the number of parameters, in billions


2.4. LLM-Perf Leaderboard

https://huggingface.co/spaces/optimum/llm-perf-leaderboard

Memory (MB)
Params (B)


2.5. Open Ko-LLM Leaderboard

https://huggingface.co/spaces/upstage/open-ko-llm-leaderboard

#Params (B)


