Matthew J Price
9,182 followers
Father, Transhumanist, Futurist, Philosopher, Economist, Master Floss Roller

Posts

Hey everyone. I'm sorry I bailed without a word. Is anyone still around?

Post has shared content
For those who are wondering how newspapers around the world translated "shithole," the Strong Language blog comes to the rescue. Most seemed to split between focusing on the shit ("shit countries") and focusing on the holes ("latrines" or the like), but the real winner is the Croatian Express, which very idiomatically translated it (in the headline!) as "vukojebina": "the place where wolves fuck," or perhaps "wolffuckington."

Post has attachment

Post has attachment
Best gif I saw today
imgur.com

Post has attachment

Post has shared content
Interesting to read and consider, especially the discussion around friend-zoning and the different expectations between men and women.
[Album: 5 photos, 12/21/17]

Post has attachment

Post has shared content
Impressive: on the left, a real video of a snowy trip; on the right, the same video re-rendered as if it were a summer's day.
AI research at NVIDIA. Other examples include a day scene converted to night.

Post has shared content
Yet more exciting work from DeepMind. Tools like this will accelerate AI research in lots of different areas.
A new hyperparameter system for deep learning from DeepMind. "The success of a neural network at a particular application is often determined by a series of choices made at the start of the research, including what type of network to use and the data and method used to train it. Currently, these choices -- known as hyperparameters -- are chosen through experience, random search, or computationally intensive search processes."

"We introduce a new method for training neural networks which allows an experimenter to quickly choose the best set of hyperparameters and model for the task. This technique -- known as Population Based Training (PBT) -- trains and optimises a series of networks at the same time, allowing the optimal set-up to be quickly found. Crucially, this adds no computational overhead, can be done as quickly as traditional techniques and is easy to integrate into existing machine learning pipelines."

"We rigorously tested the algorithm on a suite of challenging reinforcement learning problems with state-of-the-art methods on DeepMind Lab, Atari, and StarCraft II. In all cases, PBT stabilised training, quickly found good hyperparameters, and delivered results that were beyond state-of-the-art baselines."

"We have also found PBT to be effective for training Generative Adversarial Network (GAN), which are notoriously difficult to tune."

"We have also applied it to one of Google's state-of-the-art machine translation neural networks, which are usually trained with carefully hand tuned hyperparameter schedules that take months to perfect. With PBT we automatically found hyperparameter schedules that match and even exceed existing performance, but without any tuning and in the same time it normally takes to do a single training run."

Post has attachment