https://github.com/huggingface/transformers/releases/tag/v4.41.0
huggingface/transformers v4.41.0
https://llm.stockmark.co.jp/
https://twitter.com/kosukearima/status/1790970299103809957
Stockmark-100b, a 100-billion-parameter Japanese LLM
https://twitter.com/npaka123/status/1790455092203700317
Gemini 1.5 Pro
https://twitter.com/gyakuse/status/1790110045814010327
https://huggingface.co/spaces/sakasegawa/gpt-4o-tokenizer-vs-gpt-4-tokenizer
gpt-4o-tokenizer-vs-gpt-4-tokenizer
https://note.com/ngc_shj/n/n4481c5cd76dd?sub_rt=share_h
Japanese Stable LM 2 1.6B on WSL2
https://twitter.com/CurveWeb/status/1789109044202074508
TTM
>TinyTimeMixers (TTMs) are compact pre-trained models for Multivariate Time-Series Forecasting