Clojure Neural Network
Problem
After reading this article about Neural Networks, I was inspired to write my own implementation that allows for more than one hidden layer.

I am interested in how to make this code more idiomatic. For example, I read somewhere that in Clojure you should rarely need to use the `for` macro (not sure if this is true or not) because of the functions in the standard library. I am also interested in any performance improvements. The simple example below runs fairly quickly, but it is a very small network (an XOR network).
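On the `for` question: many uses of `for` can indeed be replaced by plain sequence functions. As an aside (not part of the original question), the nested `for` used below to build the random weight matrix could be written with `repeatedly` alone; `random-weights` is a hypothetical helper name:

```clojure
;; Hypothetical alternative to the for-based weight initialisation used in
;; generate-layer below: build `neurons` rows of `next-neurons` random
;; weights without `for`.
(defn random-weights [neurons next-neurons]
  (vec (repeatedly neurons
                   #(vec (repeatedly next-neurons rand)))))
```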
Implementation:
```
(ns neural-net-again.ann
  (:refer-clojure :exclude [+ - * == /])
  (:use clojure.core.matrix)
  (:use clojure.core.matrix.operators))

(set-current-implementation :vectorz)

(defn activation-fn [x] (Math/tanh x))

(defn dactivation-fn [y] (- 1.0 (* y y)))

(defn get-layers
  [network]
  (conj (apply (partial conj [(:inputs network)]) (:hidden network)) (:outputs network)))

(defn generate-layer
  [neurons next-neurons]
  (let [values (vec (repeat neurons 1))
        weights (vec (for [i (range neurons)] (vec (repeatedly next-neurons rand))))]
    {:values values :weights weights}))

(defn generate-network
  [& {:keys [inputs hidden outputs]}]
  (if (empty? hidden)
    {:inputs (generate-layer (inc inputs) outputs) :outputs (generate-layer outputs 1)} ; add one to inputs for an extra bias neuron
    (loop [current-layer (first hidden)
           next-layer (first (rest hidden))
           others (rest (rest hidden))
           network {:inputs (generate-layer (inc inputs) (first hidden))}] ; add one to inputs for extra bias neuron
      (if (nil? next-layer)
        (-> network
            (update-in [:hidden] #(conj % (generate-layer current-layer outputs)))
            (assoc :outputs (generate-layer outputs 1)))
        (recur next-layer (first others) (rest others)
               (update-in network [:hidden] #(conj % (generate-layer current-layer next-layer))))))))

(defn activate-layer
  [{:keys [values weights]}]
  ;; NOTE: the source is truncated mid-expression here; the body below is a
  ;; plausible reconstruction based on the answer's remarks, not the
  ;; original code: dot each weight column with the values, then apply the
  ;; activation function.
  (->> (transpose weights)
       (mapv (fn [ws] (reduce + (map * values ws))))
       (mapv activation-fn)))
```
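As another aside on idiom (not part of the original post): the `loop`/`recur` in `generate-network` can likely be expressed with `partition` over consecutive layer sizes. A sketch, assuming the layer sizes are plain integers and reusing `generate-layer` from the code above; `generate-network*` is a hypothetical name. Note this variant also keeps the hidden layers in a vector in forward order, whereas `conj` onto the list produced by `update-in` prepends:

```clojure
;; Hypothetical rewrite: pair up consecutive layer sizes and build each
;; hidden layer with mapv instead of an explicit loop/recur.
(defn generate-network*
  [& {:keys [inputs hidden outputs]}]
  (let [sizes (concat [(inc inputs)] hidden [outputs])] ; inc adds the bias neuron
    {:inputs  (generate-layer (first sizes) (second sizes))
     :hidden  (mapv (fn [[a b]] (generate-layer a b))
                    (partition 2 1 (rest sizes)))
     :outputs (generate-layer outputs 1)}))
```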
Solution
I'm the author of core.matrix, so hopefully I can give you some tips from that perspective.

If you want to improve performance, it's much better to use vectors in an optimised format throughout (vectorz-clj is a fine choice) rather than mixing in Clojure vectors everywhere. This saves the overhead of converting to/from Clojure vectors all the time, which is possibly your biggest performance bottleneck in this code. Typically, these will be significantly (probably 10-30x) faster than regular Clojure vectors for numerical operations with core.matrix.

Here's an illustration of the difference:
```
;; add with regular Clojure vectors
=> (let [v (vec (range 1000))] (time (dotimes [i 1000] (add v v))))
"Elapsed time: 625.66391 msecs"

;; add with Vectorz vectors
=> (let [v (array (range 1000))] (time (dotimes [i 1000] (add v v))))
"Elapsed time: 18.917637 msecs"
```

Some more specific tips:
- Use `(array ...)` instead of `(vec ...)` to produce Vectorz format vectors (actually it will produce whatever format you have set as your current implementation, so you can switch back and forth as needed).
- Use the core.matrix function `emap` (element map) rather than `mapv`. This should produce vectors in the format of the first vector argument, so will maintain Vectorz types. Even better, find a specialised function that does what you want: `(add x y)` is likely to be much faster than `(emap + x y)`.
- `activate-layer` looks like a big bottleneck. It would be much better written as an array operation that exploits matrix multiplication. I think `(mmul (transpose weights) value)` should do the trick. To make this extra quick, I suggest storing the weights in pre-transposed format, then you can just do `(mmul transposed-weights value)`.
- I see you are using the core.matrix operators for `+`, `-`, `*` etc. That's fine, but be aware that they are somewhat slower than the equivalent clojure.core operators if you are applying them to single numbers rather than whole arrays. Normally I use the named core.matrix functions instead (`add`, `sub`, `mul` etc.) if there is any risk of confusion.
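Putting the `emap` and `mmul` tips together, `activate-layer` might look something like the following. This is an illustrative sketch under the assumption that `values` and `weights` are core.matrix arrays (e.g. Vectorz), not code from the original answer:

```clojure
;; Sketch: matrix-multiply the transposed weights by the values, then apply
;; the activation function element-wise with emap.
(defn activate-layer
  [{:keys [values weights]}]
  (emap activation-fn (mmul (transpose weights) values)))
```

Storing the weights pre-transposed, as suggested above, would drop the `transpose` call from this hot path entirely.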
Context
StackExchange Code Review Q#38498, answer score: 9