# ChatGPT Is a 20B-Parameter Model

> Clean Markdown view of GeekNews topic #11616. Use the original source for factual precision when an external source URL is present.

## Metadata

- GeekNews HTML: [https://news.hada.io/topic?id=11616](https://news.hada.io/topic?id=11616)
- GeekNews Markdown: [https://news.hada.io/topic/11616.md](https://news.hada.io/topic/11616.md)
- Type: news
- Author: [jonghwanhyeon](https://news.hada.io/@jonghwanhyeon)
- Published: 2023-10-31T19:00:57+09:00
- Updated: 2023-10-31T19:00:57+09:00
- Original source: [arxiv.org](https://arxiv.org/abs/2310.17680)
- Points: 9
- Comments: 5

## Topic Body

In "CodeFusion: A Pre-trained Diffusion Model for Code Generation", a paper Microsoft Research submitted to EMNLP 2023, the parameter count of ChatGPT (gpt-3.5-turbo) was disclosed as 20B.  
  
- T5 (t5-large): 770M
- CodeT5 (codet5-large): 770M
- GPT-3 (text-davinci-003): 175B
- ChatGPT (gpt-3.5-turbo): 20B

## Comments

### Comment 20315

- Author: hyeonseokoh94
- Created: 2023-11-02T14:37:19+09:00
- Points: 1

Does this size even make sense? From what I've heard, people working in AI were stunned by it...  
All you need is 4090

### Comment 20291

- Author: geekarxiv
- Created: 2023-11-01T16:13:44+09:00
- Points: 1

Was GPT-3.5's parameter size never publicly disclosed in the first place?  
I had simply assumed it was GPT-3 175B with RLHF applied, but apparently that wasn't the case.

### Comment 20288

- Author: spark
- Created: 2023-11-01T15:59:22+09:00
- Points: 1

https://arxiv.org/abs/2310.17680v1 (v1 is still viewable)

### Comment 20284

- Author: verilogeek
- Created: 2023-11-01T12:47:32+09:00
- Points: 1

"There are some errors in the paper and we need to retract it" 라고 하네요

### Comment 20285

- Author: verilogeek
- Created: 2023-11-01T12:52:53+09:00
- Points: 1
- Parent comment: 20284
- Depth: 1

I don't know where the errors are, but if it were just a few wrong numbers they would have simply issued a revision... so chances are the 20B figure is correct, right?
