f6da179820
If the robots.txt file is invalid, abort mission.
2018-02-23 10:36:14 +01:00
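A check like the one this commit describes can be done with Python's standard `urllib.robotparser`. This is only a sketch of the idea, not the project's actual code; the function name `allowed` and the conservative "deny on failure" fallback are assumptions:

```python
from urllib import robotparser

def allowed(robots_lines, user_agent, target_url):
    """Return True if robots.txt (given as a list of lines) permits
    user_agent to fetch target_url.

    RobotFileParser tolerates malformed lines, so "invalid" here means
    content that cannot be parsed at all; in that case we refuse to
    crawl ("abort mission") rather than guess.
    """
    rp = robotparser.RobotFileParser()
    try:
        rp.parse(robots_lines)
    except Exception:
        return False
    return rp.can_fetch(user_agent, target_url)
```

In the real crawler the lines would come from fetching `<site>/robots.txt`; feeding the parser directly keeps the sketch self-contained.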
0e02f22d08
Exception handling
...
Big problem with the URL https:/plus.google.com/+Python concerning
robots parsing.
Didn't find the bug. @tobast, if you have some time to look at it :)
2018-02-23 00:37:36 +01:00
77ca7ebcb9
Silly me.
2018-02-22 15:35:46 +01:00
9b78e268c9
Nearly working crawler
2018-02-22 14:33:07 +01:00
e19e623df1
Multiple bug fixes. TODO: remove <div id=footer>-like patterns
2018-02-22 14:07:53 +01:00
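Stripping `<div id=footer>`-like blocks can be sketched with the stdlib `html.parser`, without guessing at the project's actual approach. The set of ids to drop and the names `FooterStripper`/`strip_footers` are illustrative assumptions:

```python
from html.parser import HTMLParser

class FooterStripper(HTMLParser):
    """Echoes HTML back verbatim, except that everything inside a
    <div id="footer"> (nested tags included) is dropped.

    SKIP_IDS is a placeholder; a real crawler would likely skip
    headers, navigation bars, etc. as well.
    """
    SKIP_IDS = {"footer"}

    def __init__(self):
        super().__init__()
        self.out = []
        self.skip_depth = 0  # >0 while inside a skipped <div>

    def handle_starttag(self, tag, attrs):
        if self.skip_depth:
            if tag == "div":
                self.skip_depth += 1  # track nested divs so we know when to resume
            return
        if tag == "div" and dict(attrs).get("id") in self.SKIP_IDS:
            self.skip_depth = 1
            return
        self.out.append(self.get_starttag_text())

    def handle_endtag(self, tag):
        if self.skip_depth:
            if tag == "div":
                self.skip_depth -= 1
            return
        self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self.skip_depth:
            self.out.append(data)

def strip_footers(html):
    parser = FooterStripper()
    parser.feed(html)
    return "".join(parser.out)
```

A depth counter is needed because footers routinely contain nested `<div>`s; matching only the first closing tag would leave the tail of the footer in the output.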
236e15296c
It can be useful to return the links list
2018-02-21 23:11:57 +01:00
4e6ac5ac7b
URL getter function: retrieves the list of so-called relevant links
2018-02-21 22:51:05 +01:00
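Collecting a page's links, as this commit's URL getter does, can be sketched with `html.parser` and `urljoin`. The names `LinkCollector`/`get_links` are assumptions, and the "relevant" filtering the commit mentions is deliberately left out:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects href targets of <a> tags, resolved against the page URL
    so that relative links become absolute."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base_url, href))

def get_links(base_url, html):
    # Returns every link found in `html`; deciding which ones are
    # "relevant" would be a separate filtering step.
    collector = LinkCollector(base_url)
    collector.feed(html)
    return collector.links
```

Resolving against `base_url` up front means the caller can enqueue the results directly without worrying about relative paths.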
a907cad33d
Start of URL getter function
2018-02-21 19:06:46 +01:00
b05e642c79
Make the code somewhat readable
2018-02-21 11:54:41 +01:00
c97acb22b5
Add tentative crawl file
...
Nothing functional, just tests
2018-02-20 12:48:53 +01:00
bef1fca5b9
Init app 'crawl'
2018-02-20 08:51:16 +01:00