Research at Google
1,056,931 followers | 22,306,792 views

Stream

Research at Google

New 2014 satellite-based data released by +Global Forest Watch, the University of Maryland, and Google show over 18 million hectares of tree cover loss globally. This 2014 data represents the most up-to-date estimate of global tree cover change available today. http://bit.ly/1Ukhk1q
By Rachael Petersen, Nigel Sizer, Matt Hansen, Peter Potapov and David Thau. This post originally appeared on WRI Insights. This article is also available in French and Spanish. The world's tropical forests are in trouble, confirm new satellite-based data from the ...
Satellites Uncover 5 Surprising Hotspots for Tree Cover Loss | Global Forest Watch
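For readers who want to explore the underlying data directly, here is a minimal sketch using the Earth Engine Python API to total tree cover loss over a region of interest. The asset ID, band name and region below are assumptions based on the publicly documented Hansen/UMD Global Forest Change dataset, not details taken from this post, so check the current Earth Engine catalog before relying on them.

import ee

ee.Initialize()

# Assumed catalog entry and band name for the Hansen/UMD Global Forest
# Change data (30 m pixels; 'loss' is 1 where tree cover was lost).
gfc = ee.Image('UMD/hansen/global_forest_change_2015')
loss = gfc.select('loss')

# Illustrative region of interest (a rectangle over part of the Amazon basin).
region = ee.Geometry.Rectangle([-62.0, -8.0, -58.0, -4.0])

# Convert the binary loss mask to area (m^2 per pixel) and sum it over the region.
stats = loss.multiply(ee.Image.pixelArea()).reduceRegion(
    reducer=ee.Reducer.sum(),
    geometry=region,
    scale=30,
    maxPixels=1e12,
)
print('Tree cover loss (ha):', stats.getInfo()['loss'] / 1e4)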

Research at Google

This week, Kohala, Hawaii, hosts the 41st International Conference on Very Large Data Bases (VLDB 2015), a premier annual international forum for data management and database researchers, vendors, practitioners, application developers and users.

As a leader in database research and a Gold Sponsor, Google will have a strong presence at #VLDB2015, with many Googlers publishing work, organizing workshops and presenting demos. Click through to the Google Research blog, linked below, to learn more about the research Google is presenting.

Research at Google

In the last few years, there has been incredible success applying Recurrent Neural Networks (RNNs) to a variety of problems, such as speech recognition, language modeling, translation, image captioning and more.

Essential to these successes is a special kind of recurrent neural network architecture called Long Short-Term Memory (LSTM), which captures long-term dependencies and relationships in data.

But what are LSTMs, and how do they work? In a blog post, linked below, Google intern +Christopher Olah gives an overview of LSTMs, explaining why and how they work so well.
Recurrent Neural Networks. Humans don't start their thinking from scratch every second. As you read this essay, you understand each word based on your understanding of previous words. You don't throw everything away and start thinking from scratch again. Your thoughts have persistence.
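To make the architecture concrete, here is a minimal numpy sketch of a single LSTM step following the standard formulation the post describes; the weight packing, dimensions and toy usage are illustrative assumptions, not any particular production implementation.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step. W has shape (4H, H + D) so a single matrix multiply
    yields the pre-activations of all four gates at once."""
    H = h_prev.shape[0]
    z = W @ np.concatenate([h_prev, x_t]) + b
    f = sigmoid(z[0:H])        # forget gate: how much old cell state to keep
    i = sigmoid(z[H:2*H])      # input gate: how much new information to write
    o = sigmoid(z[2*H:3*H])    # output gate: how much cell state to expose
    g = np.tanh(z[3*H:4*H])    # candidate values to write into the cell
    c_t = f * c_prev + i * g   # cell state: the long-term memory track
    h_t = o * np.tanh(c_t)     # hidden state: the per-step output
    return h_t, c_t

# Toy usage: a 10-step sequence of 3-dimensional inputs, hidden size 5.
D, H = 3, 5
rng = np.random.default_rng(0)
W, b = rng.normal(scale=0.1, size=(4 * H, H + D)), np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x_t in rng.normal(size=(10, D)):
    h, c = lstm_step(x_t, h, c, W, b)

The additive update to the cell state is what lets information and gradients flow across many timesteps without vanishing, which is the behavior the post unpacks in detail.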

Research at Google

Congratulations to the recipients of the Summer 2015 Google Faculty Research Awards! This round we are funding 113 projects, with 27% of the funding awarded to universities outside the U.S. Check out the blog post to see the full list of recipients, as well as the areas they are working on.

Research at Google

Pursuing Google’s mission of organizing the world’s information to make it universally accessible and useful takes an enormous amount of computing and storage. This week, at the ACM SIGCOMM conference (http://goo.gl/MzySsn), we presented Jupiter Rising: A Decade of Clos Topologies and Centralized Control in Google’s Datacenter Network (http://goo.gl/xeItvp), a paper with the technical details of five generations of our in-house datacenter network architecture.

Head over to the Google Research blog to learn the story behind building and deploying five generations of datacenter network infrastructure, with our latest-generation Jupiter network improving capacity by more than 100x relative to our first-generation network.
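To make the Clos idea concrete, here is a small back-of-envelope sketch of how a fabric built only from identical commodity switches scales as the switch radix grows; the radix and link speeds are illustrative assumptions, not Jupiter's actual parameters.

def leaf_spine_capacity(radix, link_gbps):
    """Hosts, switch count and aggregate host bandwidth of a non-blocking
    two-stage Clos (leaf-spine) fabric built from identical switches."""
    hosts_per_leaf = radix // 2      # half of each leaf's ports face hosts...
    uplinks_per_leaf = radix // 2    # ...the other half face the spine stage (1:1)
    num_spines = uplinks_per_leaf    # one uplink from every leaf to every spine
    num_leaves = radix               # each spine port reaches a distinct leaf
    hosts = num_leaves * hosts_per_leaf
    switches = num_leaves + num_spines
    aggregate_gbps = hosts * link_gbps   # every host can drive its full line rate
    return hosts, switches, aggregate_gbps

# Scaling out, not up: doubling the switch radix roughly quadruples the hosts
# a flat fabric built from the same class of switch can serve.
print(leaf_spine_capacity(radix=32, link_gbps=40))   # (512, 48, 20480)
print(leaf_spine_capacity(radix=64, link_gbps=40))   # (2048, 96, 81920)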

Research at Google

Here there be photons!

Check out Project Sunroof, which uses information from Google Maps to estimate how many solar panels you’d probably need to power your house, and how much you could save on your electric bill. Find out more in the video below.
6 comments
From what I've experienced and heard, most people are getting the "not available yet in your area/please sign up for updates" message.

The last bullet in their first FAQ raises a few more questions:

How do I know if solar power might be right for me?

A few key factors go a long way in determining whether solar power can save you money. You could be a great candidate for rooftop solar if:

- You own your home
- You have a roof that isn't heavily shaded and is in good condition
- Your home uses at least a moderate amount of electricity
- You have a good credit score (this matters only if you prefer to finance your system)
- You live in one of the roughly 35 states with good solar policy

For example, what defines "good solar policy"?
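For a rough sense of the kind of estimate Project Sunroof automates, here is a back-of-envelope sketch that sizes a rooftop system from annual electricity use and local sun hours. This is only an illustration, not Project Sunroof's actual model, and every parameter is an assumed value.

def estimate_system(annual_kwh, peak_sun_hours=4.5, panel_watts=300,
                    system_losses=0.20, price_per_kwh=0.13):
    """Rough panel count and yearly bill savings for a home that wants to
    offset `annual_kwh` of electricity use (all defaults are assumptions)."""
    # Energy one panel produces per year, after typical system losses.
    kwh_per_panel = panel_watts / 1000 * peak_sun_hours * 365 * (1 - system_losses)
    panels = -(-annual_kwh // kwh_per_panel)            # ceiling division
    yearly_savings = min(annual_kwh, panels * kwh_per_panel) * price_per_kwh
    return int(panels), round(yearly_savings, 2)

# Example: a household using 10,000 kWh per year.
print(estimate_system(10_000))   # about (26, 1300.0) with these assumptions

Project Sunroof's value is in refining these inputs for your specific roof rather than relying on broad averages like the ones above.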

Research at Google

Neural networks are used in an increasingly wide variety of machine learning tasks, and are typically trained by calculating how the error changes with respect to each of the network's weights, then using those derivatives to update the weights and improve the network. While this may sound fairly straightforward, the process can quickly become computationally expensive when a neural network has many layers and millions of parameters!

A key algorithm that makes this training tractable is backpropagation, short for “backward propagation of errors.” For modern neural networks, backpropagation can make training with gradient descent as much as ten million times faster than a naive implementation.

For a great description of exactly how this works, head over to a blog post written by Google intern +Christopher Olah, who gives a thorough and intuitive description of backpropagation, and why it’s “an essential trick to have in your bag, not only in deep learning, but in a wide variety of numerical computing situations.”
Introduction. Backpropagation is the key algorithm that makes training deep models computationally tractable. For modern neural networks, it can make training with gradient descent as much as ten million times faster, relative to a naive implementation. That's the difference between a model ...
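To complement the post, here is a minimal numpy sketch of backpropagation on a tiny two-layer network: the forward pass caches intermediate values, and the backward pass reuses each layer's gradient to build the one below it, which is what keeps the full gradient roughly as cheap as one extra pass over the network. The data, architecture and hyperparameters are toy assumptions.

import numpy as np

# Toy data: 100 examples with 3 features and a binary target (illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

# A tiny 2-layer network: 3 inputs -> 8 tanh units -> 1 sigmoid output.
W1 = rng.normal(scale=0.1, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 1)); b2 = np.zeros(1)
lr = 0.5

for step in range(500):
    # Forward pass: cache the intermediate values the backward pass will reuse.
    z1 = X @ W1 + b1
    a1 = np.tanh(z1)
    z2 = a1 @ W2 + b2
    p = 1.0 / (1.0 + np.exp(-z2))                 # predicted probability
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

    # Backward pass: each layer's gradient is built from the one above it.
    dz2 = (p - y) / len(X)                        # dL/dz2 for sigmoid + cross-entropy
    dW2 = a1.T @ dz2;  db2 = dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (1.0 - a1 ** 2)          # chain rule through tanh
    dW1 = X.T @ dz1;   db1 = dz1.sum(axis=0)

    # Gradient descent: nudge every weight against its gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

    if step % 100 == 0:
        print(step, round(float(loss), 4))

Computing the same derivatives naively, one weight at a time, would require a separate pass per weight, which is the gap behind the "ten million times faster" figure the post discusses.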

Research at Google

In 2009, Google created the PhD Fellowship program to recognize and support outstanding graduate students doing exceptional research in Computer Science and related disciplines. Now in its seventh year, our fellowship programs have collectively supported over 200 graduate students across the globe who seek to shape and influence the future of technology. Head over to the Google Research blog to see the recipients!

Research at Google

+Google and +Gallup have released new research aimed at understanding the state of computer science education in the U.S. Learn more on the +Google for Education blog at http://goo.gl/u3rNy.
 
Searching for CS: Google’s New Study on Access & Barriers to K-12 Computer Science Education in the U.S.

+Google just released a new landscape study with +Gallup to understand the state of computer science (CS) in the U.S.  By surveying nearly 16,000 respondents (from students to superintendents), the study sheds light on perceptions of CS and associated opportunities, participation and barriers in K-12 education. Read more about the full report and what it means for your classroom here: http://goo.gl/u3rNye #googlecsedu

Research at Google

“I think you will see deep learning make a lot of progress in many areas. It doesn’t make any assumptions about the nature of problems, so it is applicable to many things.” 
-+Ilya Sutskever 

Congratulations to Google Research Scientist +Ilya Sutskever, who was named to the 2015 +MIT Technology Review list of "35 Innovators Under 35" (http://goo.gl/0N9MjM)! Read more about his career in the link below, and check out some of his research on his Google Scholar profile at https://goo.gl/pHOKZI.

Research at Google

Google is excited to both sponsor and help USENIX build Enigma, a new conference focused on security, privacy and electronic crime through the lens of emerging threats and novel attacks. Get more information about USENIX Enigma on the Google Research blog, linked below.

Research at Google

KDD 2015: Google Publications and the Best Research Paper Award

Last week at the 21st ACM conference on Knowledge Discovery and Data Mining (KDD’15 http://goo.gl/ib7Kkv), the work titled Efficient Algorithms for Public-Private Social Networks (http://goo.gl/OzPoLg) was awarded Best Research Paper.

Co-authored by Googlers Ravi Kumar, +silvio lattanzi and +Vahab Mirrokni, former Google intern Alessandro Epasto and research visitor +Flavio Chierichetti, this paper introduces the public-private model of graphs and explores two powerful computational paradigms that can be used to develop efficient social network algorithms.

Learn more about this work, and see all the Google publications presented at #KDD2015, on the Google Research blog, linked below.
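As a rough illustration of the model itself (not the paper's algorithms), the sketch below keeps one shared public graph plus a small private edge set per user and answers neighborhood queries from each user's point of view. The class and method names are hypothetical; the paper's contribution is preprocessing the public graph once so that per-user answers need extra work only proportional to that user's small private part.

from collections import defaultdict

class PublicPrivateGraph:
    """Illustrative sketch: a public graph everyone sees, plus private
    edges visible only to the user who owns them."""

    def __init__(self):
        self.public = defaultdict(set)                        # node -> neighbors
        self.private = defaultdict(lambda: defaultdict(set))  # owner -> node -> neighbors

    def add_public_edge(self, u, v):
        self.public[u].add(v)
        self.public[v].add(u)

    def add_private_edge(self, owner, u, v):
        g = self.private[owner]
        g[u].add(v)
        g[v].add(u)

    def neighbors(self, viewer, node):
        """The graph as `viewer` sees it: public edges plus their own."""
        return self.public[node] | self.private[viewer][node]

# Bob's private edge to Carol is invisible to Alice.
g = PublicPrivateGraph()
g.add_public_edge('alice', 'bob')
g.add_private_edge('bob', 'bob', 'carol')
print(g.neighbors('bob', 'bob'))     # contains 'alice' and 'carol'
print(g.neighbors('alice', 'bob'))   # contains only 'alice'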
Story
Tagline
∀x, CS+x
Introduction
Google is full of smart people working on some of the most difficult problems in computer science today. Most people know about the research activities that back our major products, such as search algorithms, systems infrastructure, machine learning, and programming languages. Those are just the tip of the iceberg; Google has a tremendous number of exciting challenges that only arise through the vast amount of data and sheer scale of systems we build.

What we discover affects the world both through better Google products and services, and through the dissemination of our findings to the broader academic research community. We value each kind of impact, and often the most successful projects achieve both.