ITA-Dnipro / Dp-230-Golang-Retraining


Investigation. Crawling & 5XX errors check. #4

Closed DmytroTHR closed 2 years ago

DmytroTHR commented 2 years ago

A crawler is a program that visits websites and reads their pages and other information in order to create entries for a search engine index. The major search engines on the Web all have such a program, also known as a "spider" or a "bot." Crawlers are typically programmed to visit sites that have been submitted by their owners as new or updated. Entire sites or specific pages can be selectively visited and indexed. Crawlers apparently gained the name because they crawl through a site one page at a time, following the links to other pages until all pages have been read.

https://youtu.be/CDXOcvUNBaA
https://github.com/gocolly/colly

5XX error codes occur when a server is unable to complete a request from a user. Several different 5XX errors can be generated, for a number of different reasons (for example, internal server errors, bad gateways, or a server timeout).

https://youtu.be/GCRXkeAQXRQ
https://komodor.com/learn/5xx-server-errors-the-complete-guide/