What's crawling and why is it useful?
It means fetching the contents of a web page using an app or a script. This is what Google does when its bot explores the web and analyzes webpage content.
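As a minimal sketch in R, using the rvest package (the URL is just a placeholder):

```r
library(rvest)

# fetch a page and pull out its <title> — the simplest form of crawling
page <- read_html("https://example.com")
html_text(html_node(page, "title"))
```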
As someone doing SEO, you need to know what you are showing to Google: what your website looks like from a (Google) bot's perspective. You need to check the quality of your XML sitemap if you are submitting one, and review your webpages and metadata. Checking the web server logs is also a good idea, to see what Googlebot is doing on your website.
You can also respectfully crawl your competitors' websites to better understand their SEO strategy.
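For the sitemap check mentioned above, here is a sketch using the xml2 package (the sitemap URL is hypothetical; point it at your own):

```r
library(xml2)

# read the XML sitemap and list the URLs it declares
sitemap <- read_xml("https://www.example.com/sitemap.xml")  # hypothetical URL
urls <- xml_text(xml_find_all(sitemap, "//*[local-name()='loc']"))

# spot-check the result: how many URLs were submitted?
length(urls)
head(urls)
```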
library(rvest)

url <- "https://en.wikipedia.org/wiki/World_population"
# fetch the page and extract the population table
# (the CSS selector is an assumption; adjust it to the table you need)
population <- url %>%
  read_html() %>%
  html_node("table.wikitable") %>%
  html_table()
# remove the extra header row
population <- population[-1, ]
# convert to numeric (strip the thousands separators first)
population$Population <- as.numeric(gsub(",", "", population$Population))
population$Year <- as.numeric(population$Year)
The result can then be displayed as a plot:
library(ggplot2)
ggplot(population, aes(x = Year, y = Population)) +
  geom_line() +
  scale_y_continuous(labels = scales::comma)
It's not really SEO, but crawling can be useful beyond it: I've also been using it to check the quality of the data on websites, like product prices, image availability, etc.
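A sketch of that kind of data-quality check with rvest (the product URL and CSS selectors are assumptions; adapt them to the site you are auditing):

```r
library(rvest)

page <- read_html("https://www.example.com/product/123")  # hypothetical page
# does the page expose a price, and is a product image actually referenced?
price   <- page %>% html_node(".price") %>% html_text(trim = TRUE)
img_src <- page %>% html_node("img.product-image") %>% html_attr("src")
is.na(price)    # TRUE would flag a missing price
is.na(img_src)  # TRUE would flag a missing image
```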