Jeff Heon
75 followers
About
Jeff Heon's posts

"Where do the impressive performance gains of deep neural networks come from? Is their power due to the learning rules which adjust the connection weights or is it simply a function of the network architecture (i.e., many layers)? These two properties of networks are hard to disentangle. One way to tease apart the contributions of network architecture versus those of the learning regimen is to consider networks with randomised weights. To the extent that random networks show interesting behaviors, we can infer that the learning rule has not played a role in them. At the same time, examining these random networks allows us to evaluate what learning does add to the network’s abilities over and above minimising some loss function."

Artificial Neural Networks with Random Weights are Baseline Models
http://neuroplausible.com/random-network
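
The quoted idea — that a network with random, untrained weights can serve as a baseline — can be sketched in a few lines: fix random hidden weights and fit only a linear readout, in the style of an extreme learning machine. The toy task, layer sizes, and seed below are my own illustrative choices, not taken from the linked post.

```python
import numpy as np

# Illustrative sketch (assumed setup, not from the article):
# the hidden layer's weights are random and never trained;
# only a linear readout is fit on top of the random features.

rng = np.random.default_rng(0)

# Toy regression task: learn y = sin(x) on [-3, 3]
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel()

# Random, untrained hidden layer (100 tanh units)
n_hidden = 100
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)  # random nonlinear features

# Only the readout weights are learned, via least squares
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)
pred = H @ w_out

mse = np.mean((pred - y) ** 2)
```

If the random-feature model already fits the task well, any gain from also training the hidden weights is what the learning rule contributes beyond the architecture itself.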

Post has attachment
Break the busy cycle!

https://youtu.be/sQKrt1-IDaE

Post has attachment
We're not doing anything objective anyway; we might as well embrace bias, and our humanity, in doing so.

Post has attachment
Yes, less zombie dependence on machines and more creating authentically.

Post has attachment
This is both funny and serious.

https://youtu.be/wewAC5X_CZ8

Post has attachment

Post has attachment

Post has attachment
I am not a (randomly generated) number ;)

Post has attachment