
Commit 4166b0b

break apart concurrency examples into separate projects

1 parent: 18f6900
7 files changed, 20 additions & 0 deletions

File tree

- ch04-concurrency/src/com/clojurebook/concurrency.clj renamed to ch04-concurrency-game/src/com/clojurebook/concurrency.clj (file renamed without changes)
- ch04-concurrency/src/com/clojurebook/concurrency/game.clj renamed to ch04-concurrency-game/src/com/clojurebook/concurrency/game.clj (file renamed without changes)
- ch04-concurrency/src/com/clojurebook/concurrency/game_validators.clj renamed to ch04-concurrency-game/src/com/clojurebook/concurrency/game_validators.clj (file renamed without changes)
Lines changed: 8 additions & 0 deletions

New file (presumably the webcrawler project's project.clj), @@ -0,0 +1,8 @@:

+(defproject com.clojurebook.concurrency.webcrawler "1.0.0-SNAPSHOT"
+  :description "A naive agent-based webcrawler, explored in chapter 4 of
+  'Clojure Programming' by Emerick, Carper, and Grand."
+  :url "http://github.com/clojurebook/ClojureProgramming"
+  :dependencies [[org.clojure/clojure "1.3.0"]
+                 [enlive "1.0.0"]]
+  :profiles {:1.4 {:dependencies [[org.clojure/clojure "1.4.0-beta6"]]}}
+  :main ^:skip-aot com.clojurebook.concurrency.webcrawler)
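
The :main and :profiles entries suggest how this project is meant to be exercised from the command line. A hedged usage sketch (assumes Leiningen 2, which introduced `with-profile`, and this checked-out project directory; the agent count is optional):

    cd ch04-concurrency-webcrawler
    lein run http://www.bbc.co.uk 5                    # crawl with 5 agents for 60 seconds
    lein with-profile 1.4 run http://www.bbc.co.uk 5   # same, against Clojure 1.4.0-beta6

The :1.4 profile only swaps the Clojure dependency, so the second invocation tests the identical code against the newer runtime.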

ch04-concurrency/src/com/clojurebook/concurrency/webcrawler.clj renamed to ch04-concurrency-webcrawler/src/com/clojurebook/concurrency/webcrawler.clj

Lines changed: 12 additions & 0 deletions

@@ -109,3 +109,15 @@
     (Thread/sleep 60000)
     (pause)
     [(count @crawled-urls) (count url-queue)])
+
+(defn -main
+  [& [starting-url agent-count]]
+  (if-not starting-url
+    (println "Must provide a starting URL.
+e.g. `lein run http://www.bbc.co.uk [agent-count]`")
+    (let [agent-count (or agent-count "10")
+          [crawled-count queued-count] (test-crawler (Integer/parseInt agent-count) starting-url)]
+      (println (format "Crawled %s URLs in 60 seconds, %s additional URLs left in the queue"
+                       crawled-count queued-count))
+      (shutdown-agents))))
+
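
The (shutdown-agents) call at the end of -main matters: agents run their actions on non-daemon thread pools, so without it the JVM would keep running after -main returns. A minimal standalone sketch of the same pattern (counter is a hypothetical name, not from the crawler):

    ;; Agents execute actions on background (non-daemon) thread pools;
    ;; shutdown-agents stops those pools so the JVM can exit promptly.
    (def counter (agent 0))
    (send counter inc)   ; queue an action on the agent's thread pool
    (await counter)      ; block until the queued action has run
    (println @counter)   ; prints 1
    (shutdown-agents)    ; without this, agent threads keep the process alive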
