## Profile

Ran Manor

Attends Ben-Gurion University of the Negev

526,805 views

## Stream

### Ran Manor

Shared publicly - The Choose Your Own Premium Android Giveaway - Win a Galaxy S6 Edge, an LG G4, or the Huawei P8 - The Choice is Yours to Make!

1

Add a comment...

### Ran Manor

NLP - Hi,

I'm looking for a public benchmark dataset to test algorithms for entity relation extraction.

The only thing I found is this:

http://www.itl.nist.gov/iad/mig/tests/ace/2008/

But I couldn't find where I can download it.

Does anyone know another data set which can be downloaded for free?

Thanks.
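On free alternatives: the SemEval-2010 Task 8 corpus is a relation-classification benchmark that can be downloaded at no cost. Assuming its usual plain-text layout (a tab-separated, quoted sentence with `<e1>`/`<e2>` entity tags, followed by a relation-label line and an optional `Comment:` line; worth verifying against the actual file), a minimal parser could look like this:

```python
import re

def parse_task8(text):
    """Parse records in the (assumed) SemEval-2010 Task 8 text layout."""
    # Drop blank lines and "Comment:" lines, then read (sentence, label) pairs.
    lines = [l for l in text.splitlines()
             if l.strip() and not l.startswith("Comment")]
    records = []
    for sent_line, label in zip(lines[0::2], lines[1::2]):
        sent_id, sentence = sent_line.split("\t", 1)
        sentence = sentence.strip().strip('"')
        e1 = re.search(r"<e1>(.*?)</e1>", sentence).group(1)
        e2 = re.search(r"<e2>(.*?)</e2>", sentence).group(1)
        records.append({"id": int(sent_id), "e1": e1, "e2": e2,
                        "relation": label.strip()})
    return records
```

This is only a sketch of the interleaved format; the real download also carries directionality in the labels, e.g. `Cause-Effect(e1,e2)`, which the parser keeps as an opaque string.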

1

Add a comment...

### Ran Manor

Discussion - tl;dr: how do I force a network to learn "low-level" features?

I'm working on a convolutional neural network on some biological data. The data has some variance between subjects, so up until now I trained a network per subject.

Now I want to try a network on multiple subjects, assuming that the network will learn some common features between the subjects and so the extra data will improve my classification performance.

Unfortunately, that doesn't happen.

The performance I get with multiple subjects is very low.

My net is deep: 3 convolutional layers and 2 fully connected layers.

Is there any trick to force the net to learn low-level features better?

Thanks.
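One generic trick (a suggestion, not something from this thread) is to standardize each subject's data separately before pooling subjects, so the first convolutional layers don't spend their capacity absorbing per-subject offset and scale differences. A minimal numpy sketch; the `(trials, channels, samples)` layout is an assumption for illustration:

```python
import numpy as np

def standardize_per_subject(X, subject_ids):
    """Z-score each channel separately within each subject.

    X           : (n_trials, n_channels, n_samples) array -- assumed layout
    subject_ids : (n_trials,) array mapping each trial to its subject
    """
    X = X.astype(float).copy()
    for s in np.unique(subject_ids):
        m = subject_ids == s
        mu = X[m].mean(axis=(0, 2), keepdims=True)  # per-channel mean
        sd = X[m].std(axis=(0, 2), keepdims=True)   # per-channel std
        X[m] = (X[m] - mu) / (sd + 1e-8)
    return X
```

After this step every subject's channels share the same scale, so whatever low-level structure is common across subjects is at least presented to the network in a comparable range.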

3

21 comments

+Ran Manor Thinking about the network's ability to learn an FFT-like transformation, my intuition is that a layer with a sine activation function would be necessary, imho. I feel that logistic, ReLU, tanh, etc. cannot learn FFT filters. But I never tried this idea ;-)

Add a comment...

### Ran Manor

Discussion - *The function of stride in convolutional neural networks*

I often see in implementations of CNNs that there is a small amount of stride in the convolution layer and in the pooling layer.

I understand that stride helps to reduce the dimensionality by skipping nearby samples, which are usually highly correlated.

My question is: why is there stride in both the convolution and the pooling? Why not just in one of them?

Can someone give me a better intuition behind it?

Thank you.
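Part of the intuition is just bookkeeping: a size-k window with stride s over n samples yields floor((n - k) / s) + 1 outputs, so the strides of the convolution and the pooling multiply into the overall downsampling factor, and how that factor is split between the two layers is a design choice. A quick sketch:

```python
def out_len(n, k, s):
    # Output length for a size-k window with stride s over n samples
    # ("valid" mode, no padding).
    return (n - k) // s + 1

# 100 samples -> conv (k=5, s=2) -> pool (k=2, s=2):
n_conv = out_len(100, 5, 2)     # 48
n_pool = out_len(n_conv, 2, 2)  # 24
```

The same 4x reduction could be obtained with stride 4 in the convolution alone, but then the pooling would summarize non-overlapping statistics of an already-decimated signal; splitting the stride keeps some overlap at each stage.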

1

6 comments

pleasure :)

Add a comment...

### Ran Manor

Shared publicly - A Knesset member, a deputy mayor, and a senior municipality official explain why politicians have to serve interest groups

1

Add a comment...

### Ran Manor

Discussion - I'm trying a supervised neural network on data that has two unbalanced classes: class 0 is 90% of the data and class 1 is 10%. I replicate the smaller class for training so the gradients won't pull in only one direction. It works relatively well and the network's performance is almost balanced. I'm training with standard gradient descent, a fixed learning rate, and sigmoid units.

I've noticed that if I start using momentum or rectified linear units (separately) then the network performance starts to skew towards class 0 (the bigger class).

This is a weird effect and I don't have a good idea on how to explain it.

Any ideas?

Thanks.
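A standard alternative to replicating the minority class is to weight each example's loss inversely to its class frequency, which changes the gradient balance without duplicating data. The sketch below is a generic illustration of that idea, not the poster's actual setup:

```python
import numpy as np

def class_weights(y):
    """Per-class weights inversely proportional to class frequency,
    normalized so the average weight over classes is 1."""
    classes, counts = np.unique(y, return_counts=True)
    w = counts.sum() / (len(classes) * counts)
    return dict(zip(classes, w))

def weighted_log_loss(y, p, weights):
    # Binary cross-entropy with each example scaled by its class weight,
    # so the rare class contributes as much total gradient as the common one.
    w = np.array([weights[c] for c in y])
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return np.mean(w * -(y * np.log(p) + (1 - y) * np.log(1 - p)))
```

With a 90/10 split this gives the rare class 9x the per-example weight, which is equivalent in expectation to the replication scheme but interacts differently with momentum, since no duplicated examples appear in consecutive mini-batches.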

8

14 comments

We have also been working with unbalanced binary tasks. One possibility for improving results is to replace the loss function with one that takes precision/recall into account, such as the F1-score or similar loss functions. Approximating the gradient of this kind of measure is possible, and it leads to better F1 performance on the test set. In case you are interested, here are the slides of our work:

http://www.slideshare.net/franciscozamoraceu/iwann2013
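The general idea behind such losses (a generic "soft F1", not necessarily the exact formulation in the linked slides): replace the non-differentiable counts in F1 with sums of predicted probabilities, so the measure admits a gradient.

```python
import numpy as np

def soft_f1(y_true, p):
    # "Soft" counts: hard 0/1 decisions are replaced by the raw
    # predicted probabilities, which makes the measure differentiable.
    tp = np.sum(p * y_true)
    fp = np.sum(p * (1 - y_true))
    fn = np.sum((1 - p) * y_true)
    return 2 * tp / (2 * tp + fp + fn + 1e-12)

def soft_f1_loss(y_true, p):
    # Minimizing this pushes the soft F1 toward 1.
    return 1.0 - soft_f1(y_true, p)
```

Unlike per-example cross-entropy, this is a batch-level objective, so larger mini-batches give a less noisy estimate of the true F1.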

Add a comment...

### Communities

15 communities

### Ran Manor

Shared publicly - Use my Uber promo code, erpac, and get ₪30 off your first Uber ride. Redeem it at https://www.uber.com/invite/erpac

A gift from me to you!
Claim your first Uber ride free, up to 30 ₪.
You'll never need a taxi again. Available on iPhone and Android.

1

Add a comment...

### Ran Manor

Shared publicly - Welcome to the Sunday Giveaway, the place where we give away a new Android phone or tablet each and every Sunday.
A big congratulations to last week’s winner of the LG G4 giveaway: Reinaldo from the United States of America.
This week we are giving away a Samsung Galaxy S6!
The Samsung Galaxy S6 has landed, bringing with it a much needed injection of premium materials. Trading in its typical plastic design language, the latest Galaxy ...

1

Add a comment...

### Ran Manor

Discussion - Should be interesting!

I design learning algorithms for neural networks. My aim is to discover a learning procedure that is efficient at finding complex structure in...

4

1

Add a comment...

### Ran Manor

Open Source Apps - Easy Notifications is a new notification app with the missing toggles for a...

1

Add a comment...

Communities

15 communities

Work

Occupation

PhD Student

Skills

drums, guitar and machine learning.

Basic Information

Gender

Male

Apps with Google+ Sign-in

- Modern Combat 5: Blackout
- Seabeard
- Mortal Kombat X

Story

Tagline

Piled higher and deeper

Introduction

PhD student with interest in machine learning and deep learning.

Education

- Ben-Gurion University of the Negev, Electrical & Computer Engineering, 2006 - present

Links

YouTube

Other profiles