# Wikipedia 의 robots.txt

> Clean Markdown view of GeekNews topic #90. Use the original source for factual precision when an external source URL is present.

## Metadata

- GeekNews HTML: [https://news.hada.io/topic?id=90](https://news.hada.io/topic?id=90)
- GeekNews Markdown: [https://news.hada.io/topic/90.md](https://news.hada.io/topic/90.md)
- Type: news
- Author: [xguru](https://news.hada.io/@xguru)
- Published: 2019-07-17T06:53:12+09:00
- Updated: 2019-07-17T06:53:12+09:00
- Original source: [en.wikipedia.org](https://en.wikipedia.org/robots.txt)
- Points: 7
- Comments: 2

## Topic Body

A robots.txt made entertaining by its comments — misbehaving bots, bots that ignore the convention, capture bots, Google's ad bot, the troublesome wget recursive mode, and more.
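Entries of this kind follow the standard robots.txt format: a comment explaining why a crawler is blocked, a `User-agent` line naming it, and a `Disallow` rule. The sketch below is illustrative only (the bot names and comments are assumptions, not a verbatim excerpt from Wikipedia's file):

```text
# Illustrative sketch of annotated robots.txt entries,
# not copied from en.wikipedia.org/robots.txt.

# Advertising-related crawler: block everything.
User-agent: Mediapartners-Google*
Disallow: /

# Misbehaving bot that hammers the site with requests.
# (Doesn't follow robots.txt anyway, but...)
User-agent: SomeBadBot
Disallow: /

# wget in recursive mode fetches every linked page,
# which is far too expensive for a site this large.
User-agent: wget
Disallow: /
```

An empty `Disallow:` line would instead permit a named crawler full access, which is how such files whitelist well-behaved bots.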

## Comments

### Comment 135

- Author: rtyu1120
- Created: 2019-07-20T07:33:05+09:00
- Points: 1

"Doesn't follow robots.txt anyway, but..." 부분이 재미있네요 ㅋㅋ

### Comment 89

- Author: iolothebard
- Created: 2019-07-17T10:58:42+09:00
- Points: 1

A model example of how to write a robots.txt!

Thanks for the good info.
