// first command line arg is the URL to start crawling with
// second command line arg is the number of threads
// third is debug print
// fourth is stdout vs file
// fifth is filename
// sixth is the number of iterations to run
// default values will be used if any of these are not provided
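
The argument order documented above maps naturally onto a plain `main` method. The following is a minimal sketch, assuming a hypothetical class name, default values, and output handling (none of these are specified here); it only illustrates the documented argument order, not the project's actual code.

```java
// Hypothetical entry point illustrating the documented argument order.
// Class name, defaults, and behavior are assumptions for illustration.
public class CrawlerMain {

    public static void main(String[] args) {
        // Defaults are used whenever an argument is not provided.
        String startUrl  = args.length > 0 ? args[0] : "https://example.com";
        int numThreads   = args.length > 1 ? Integer.parseInt(args[1]) : 4;
        boolean debug    = args.length > 2 && Boolean.parseBoolean(args[2]);
        boolean toFile   = args.length > 3 && args[3].equalsIgnoreCase("file");
        String fileName  = args.length > 4 ? args[4] : "crawl-output.txt";
        int iterations   = args.length > 5 ? Integer.parseInt(args[5]) : 10;

        if (debug) {
            System.out.printf("Crawling %s with %d threads for %d iterations (output: %s)%n",
                    startUrl, numThreads, iterations, toFile ? fileName : "stdout");
        }
        // ... crawler startup would go here ...
    }
}
```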
SunnyLannie/jjvcrawler
About
Distributed web crawler written in Java. Utilizes MapReduce.
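
The MapReduce framing can be read as: the map phase fetches each frontier URL and emits outgoing links, and the reduce phase merges and de-duplicates them into the next frontier. The sketch below is a conceptual, single-process illustration of that split; the names, the faked fetch, and the in-memory model are assumptions, not the project's actual implementation.

```java
import java.util.*;
import java.util.stream.*;

// Conceptual sketch of one MapReduce-style crawl iteration (hypothetical model).
public class MapReduceCrawlSketch {

    // Map phase: for each URL in the frontier, emit the links found on that page.
    // The fetch is faked here; a real crawler would download and parse HTML.
    static List<String> map(String url) {
        return List.of(url + "/a", url + "/b"); // placeholder "discovered" links
    }

    // Reduce phase: merge all emitted links, dropping duplicates and already-seen URLs.
    static Set<String> reduce(List<List<String>> emitted, Set<String> seen) {
        return emitted.stream()
                .flatMap(List::stream)
                .filter(link -> !seen.contains(link))
                .collect(Collectors.toCollection(LinkedHashSet::new));
    }

    public static void main(String[] args) {
        Set<String> seen = new HashSet<>(List.of("https://example.com"));
        List<String> frontier = new ArrayList<>(seen);

        List<List<String>> emitted = frontier.stream()
                .map(MapReduceCrawlSketch::map)
                .collect(Collectors.toList());
        Set<String> nextFrontier = reduce(emitted, seen);
        System.out.println("Next frontier: " + nextFrontier);
    }
}
```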