Amit Sheth
1,881 followers
educator, researcher, entrepreneur

Post has attachment
What is it that you have that others do not?
Note: This content will appear as a guest Career Advice Column in the ACM XRDS magazine. I have had significant involvement in advising and mentoring graduate students in Computer Science -- especially Ph.D. (graduated 28 so far) and MS-Thesis (graduated ab...

Post has attachment
The description is worth reading -- and for those of us interested in the topic, there is a live stream on the evening of May 21. In a group meeting after this event, we will ask those who attend to discuss what they heard and learned!

Post has attachment
This is a #mustread -- especially see the answer to: How is the neocortex relevant to machine learning?

+Ruwan Wickramarachchi, look at the third figure and we will chat further. The four characteristics identified are very important (we have talked about abstraction, but the others are also very valuable).

A thought: it is not easy to do introspection, have self-awareness, or have good explanation capability if you remain at the data or information level -- one really needs to go to higher-level abstractions for these important capabilities.

When I look at the answer to "Q) What does that mean in terms of machine learning?", it seems ML has a long way to go -- does anyone feel differently? Can reinforcement learning set goals on the go?

Post has attachment
This is an exciting example of brain-inspired computing: I would particularly like SoonJye, Joey, Ninni, and Revathy to look at this for potential parallels. For my research mentees, once again this will reinforce that any significant research these days requires synthesis of multiple areas -- in this particular example, notice how ontology, cognitive models, statistical learning/mining, sensors, big data cloud computing, etc., come together. The corollary: find partnerships with complementary interests if you want to do high-impact research.

Post has attachment
Relevant to what we have discussed

Post has attachment
Our 2013 paper on OTC loperamide abuse in the context of opioid abuse was the first to talk about the subject -- see my blog on its lineage to the FDA warning [1]. Dr. Raminta Daniulaityte (she and I were the PIs) of our project [2] has been invited to participate in an expert panel, "Understanding and Addressing OTC Loperamide Use, Abuse, Misuse," organized by the Consumer Healthcare Products Association. This will be held at the Georgetown University Medical Center, Washington, DC, on Thursday, May 10, 2018.
[1] https://lnkd.in/eiFJfZY
[2] https://lnkd.in/-2C6tm

Post has attachment
The peer review process – what happens when you send your manuscript to a journal http://flip.it/ukFfFd

Post has shared content
Check out the interview with our PhD student Manas Gaur: "Can you “teach” a machine a metaphor? A deep and very pleasant dive into the world of computers and thought with Manas Gaur" [with +Teodora Petkova]

https://plus.google.com/u/0/+Ontotext/posts/FcavCCy83Gy
part of a series: https://plus.google.com/u/0/collection/ACuxLF
Can you “teach” a machine a metaphor?
A deep and very pleasant dive into the world of computers and thought with Manas Gaur

Manas Gaur is a research assistant at Kno.e.sis, the Ohio Center of Excellence in Knowledge-enabled Computing [http://knoesis.org/] at Wright State University in Dayton, Ohio. He is advised by Prof. +Amit Sheth, LexisNexis Ohio Eminent Scholar and Director of Kno.e.sis.

Manas works as part of a research team spanning knowledge graphs, natural language processing, data mining, and deep learning. His research addresses problems in biomedical and clinical natural language text. He recently co-founded a start-up called AlbertPinto (https://www.albertpinto.co), a financial advisor that uses AI and background knowledge to provide precise judgment on your spending and buying behavior, so that you buy sensibly, save smartly, and invest intelligently.
You will also find him active on Quora [http://bit.ly/2GWuNLB], answering questions like “Will knowledge graphs be the future of natural language processing?” or “Does a self-driving car use machine learning or data science?”

This week Manas was patient and kind enough to answer almost all the questions that +Teodora Petkova, having found out about his passion for machine learning and natural language processing, bombarded him with.

Learn from Manas whether you can “teach” a machine a metaphor and what the easiest way is to explain NLP to complete newbies, and prepare for a deep but very pleasant dive into the world of computers and thought.

Why is the Knowledge Graph so essential a concept and a technology, and what are some of its best uses you have seen so far?

A knowledge graph is a concept that took the shape of a technology because of its eminence in today's interlinked environment. Its emergence as an idea is primarily attributed to the way our mind works: as Prof. Vannevar Bush stated in his seminal 1945 article "As We May Think," our mind works by trailblazing -- we try to link multiple concepts to deduce meaning. A network of entities and relations can create a domain model that supports search enhancement through term disambiguation/resolution, term recognition and categorization, context identification, and entity-based summarization. For example, a user searching for information about a disease being supported with its definition, symptoms, causes, medicines, and related doctors is an important use case of a KG for human betterment.

The reason I consider the knowledge graph an essential technology is its capability to bring in contextually "supportive" information that aids effective decision making. Consider an example from the domain of life sciences. A user looks for "Asthma" in a non-semantic web search and is provided with a set of links containing some relevant information; the user needs to scroll through each link and filter out the relevant information. On the other hand, with a knowledge-graph-powered search, the user receives the definition, symptoms, severity, medicines, and devices used by an asthmatic patient. At first glance, a KG saves time, provides contextually relevant information through its interlinked structure, and provides an interface for reasoning, further enrichment, and comprehensible interpretation.
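To make the idea concrete, here is a minimal sketch (in Python, with invented entities and relations, not any production KG system) of a knowledge graph stored as (subject, predicate, object) triples and the kind of contextual "card" a KG-powered search could assemble for "Asthma":

```python
# Toy knowledge graph: a list of (subject, predicate, object) triples.
# All facts below are illustrative placeholders, not medical advice.
triples = [
    ("Asthma", "definition", "a chronic inflammatory disease of the airways"),
    ("Asthma", "symptom", "wheezing"),
    ("Asthma", "symptom", "shortness of breath"),
    ("Asthma", "medicine", "albuterol"),
    ("Asthma", "device", "inhaler"),
]

def describe(entity):
    """Group every known fact about an entity by relation type."""
    card = {}
    for subj, pred, obj in triples:
        if subj == entity:
            card.setdefault(pred, []).append(obj)
    return card

print(describe("Asthma"))
# {'definition': [...], 'symptom': ['wheezing', 'shortness of breath'], ...}
```

The interlinked structure is what lets a single query return the definition, symptoms, medicines, and devices together, instead of a flat list of links.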

Freebase and the Taalee semantic search engine [1] are some of the early promising knowledge-centric systems embodying the concept of a knowledge graph. After the acquisition of Freebase by Google came the Google Knowledge Graph, covering both generic and health-related information. Moreover, Maana is developing a corporate-centric knowledge graph for improving business intelligence using data silos containing historical data, and the Baidu knowledge graph is the first knowledge graph created in the Chinese language, aimed at enhancing the business, health, and service sectors in China.

A fact: as of 2016, the Google Knowledge Graph contained 70 billion entities and relationships, improving Google's "semantic" search.

One-liner: ontologies are sometimes regarded as synonymous with knowledge graphs, but there is a principal difference: ontologies require ontological commitment from experts, whereas that is not the case with knowledge graphs.

[1] "Managing Semantic Content for the Web," IEEE Journals & Magazine: http://ieeexplore.ieee.org

Why does AI have a language problem, and what solutions do you see for this? [ref. Kris Hammond’s “AI’s Language Problem”]

It is indeed a question that circulates among linguists, cognitive scientists, philosophers, and computer scientists. Why is that? Because people who are adroit in language know that language is not a mere aggregation of words; writing is a particular organization of words that makes sense. When a computer scientist designs a so-called artificially intelligent system, he is very well able to latch onto the principles of machine learning and AI. Slightly older AI (before 2010), circumscribing machine learning and deep learning, barricaded itself within applications of classification, prediction, representation, recognition, visualization, and vector generation, whereas current AI works toward conversational agents, intelligent agents, natural text generation, and personal assistants. Then the question arises: why do people say that AI has a language problem? It is because AI is still a model learning words and sentences statistically; it can only do things that can be metered and monitored.

What is needed is a learning methodology that incorporates an understanding of the data and the meaning of the data -- in short, a world model of the problem before solving the problem. A world model is a semantic representation of the data. Defining constraints, conditions, or probabilistic soft rules over the model will give the system a cognitive sense. In addition, enriching the world model and rules using the notion of perceptual computing is what is needed for AI to solve the language problem. The intertwined braid of semantic, cognitive, and perceptual computing is critical for language generation in AI.

My solution to the problem is broad, but intuitively several technologies can perform these jobs. Start with perceptual computing. The World Wide Web has so many natural language documents (e.g., the Brown corpus) that can be given to a probabilistic model to learn the following:
which word fits best alongside which word, which word to write after the current word, which word starts a sentence, which word ends a sentence, and which set of words is coherent for a sentence.
Supporting the probabilistic perceptual model, we have a world model which provides interlinking between the different entities in the world around us, restricted to the context and domain. Furthermore, since we can be certain about the explosion of different combinations that can be generated from the semantic and perceptual systems, we define cognitive rules (using probabilistic soft logic, Gaussian models, or LSTMs/RNNs) to filter out irrelevancy in the intermediate outcomes during the learning cycle.
(LSTM: Long Short-Term Memory; RNN: Recurrent Neural Network)
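As a concrete illustration of that first, perceptual step, here is a minimal sketch (in Python, over a toy two-sentence corpus; a real system would train on something like the Brown corpus) of a bigram model that learns which word tends to follow which, which word starts a sentence, and which ends one:

```python
from collections import Counter, defaultdict

# Toy corpus standing in for a large collection such as the Brown corpus.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
]

counts = defaultdict(Counter)
for sentence in corpus:
    words = ["<s>"] + sentence.split() + ["</s>"]  # sentence boundary markers
    for prev, cur in zip(words, words[1:]):
        counts[prev][cur] += 1  # count how often `cur` follows `prev`

def next_word_probs(prev):
    """P(word | prev), estimated by relative frequency."""
    total = sum(counts[prev].values())
    return {w: c / total for w, c in counts[prev].items()}

print(next_word_probs("the"))  # which word fits best after "the"
print(next_word_probs("<s>"))  # which word starts a sentence
```

The world model and the cognitive rules described above would then sit on top of such a model, restricting and filtering its combinatorial output.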

An important article that can trigger an idea on how to solve the language problem for AI: http://www.dtic.mil/dtic/tr/fulltext/u2/a022584.pdf

I was wondering: can metaphors be described in mathematical equations, and can you “teach” a machine a metaphor?

Metaphors are word conjunctions in which the subject is linked to an object with terms that suggest similarity -- for example, "Eating pizza is like eating stale bread." But the literal meaning of a metaphor is not clear. Moreover, there are a limited number of metaphors in English, and a supervised learning model (e.g., a neural network or support vector machine) can be developed to recognize their presence within text. Furthermore, metaphor rarely follows the grammatical rules of English; hence even dependency parsing does not turn out to be a fruitful exercise.

A word in English can be used metaphorically. For instance, "This food tastes like Punjab." In such a case, it is difficult to distinguish between a word's literal and metaphorical use. There are approaches that define rules or mappings for metaphor identification; their failure is attributed to the atypical semantic formation of metaphors. For instance, "You will have to taste the failure."
It is true that a metaphor requires semantic association to make sense of it [2]. Careful tokenization of sentences, with discovery and ranking of semantic associations, can be a possible way to identify and interpret a metaphor. But this is a largely unexplored area in NLP.

A clever trick is to generate a large labeled dataset of words and discover syntactic features, relationships, and compositional patterns that can presumably identify metaphors. One can even augment these with semantic features (like the presence of an entity, or modifier-entity pairs) to discriminate patterns for metaphor identification.

A link [3] at the end shows a demo of metaphor identification using an approach like the one described above.

So far, we have talked about how we can teach a machine to learn metaphor, and yes, the answer is through the process of "identification." But it requires a bit of thinking as to how we can create a mathematical equation to describe metaphors. A machine differs from a human in that it can understand only those concepts and procedures that can be programmed. Within the confluent community of linguists and computer scientists, learning models are mathematical equations that generate a representation of the input by recognizing patterns, and then utilize those patterns to perform classification or prediction tasks.

It won't be like our high-school mathematics, wherein we pass a value of x into f(x) to generate the value of y, or solve systems of equations of the form Ax = B. Rather, it will be simple math looped over a function that assesses our direction of progress. Such a routine is called a loss function. This function compares the output of the learning model with the actual output (the labels of a labeled dataset) to see whether our model is learning correctly. Once we define the loss function, we describe a function that relates the input to the output. Such a mapping is called an activation function (e.g., a sigmoid function or a linear function); it takes the numerical value of the input and generates an output label for the loss function. Now the question is how to produce numerical values for the words in an English sentence. This can be done in multiple ways (a small sketch tying these pieces together follows the list below):

1. Generate the frequency of words. Create a list of frequency values for each sentence (which may or may not contain a metaphor), and pass it to the activation function of the learning model.
2. Following the popularity of word embeddings within the NLP community, train a Word2Vec model over a large corpus, then generate real-number representations of words. Aggregating the numerical representations of the words produces a numerical value for the sentence, which is passed as input to the model.
But the approaches enumerated so far do not capture the semantic aspect of sentences.
3. We improve upon them with contextual similarity: sentences which are semantically proximal should have similar numerical representations.
4. We refine further by adding syntactic features. For instance:
a. the presence and count of verbs, nouns, and pronouns in a sentence;
b. whether a noun follows a verb or a verb follows a noun in a sentence;
c. triplets of POS tags (e.g., Verb-Determiner-Pronoun or Adjective-Verb-Noun) for each sentence;
d. a numerical (the common term is "vector") representation of the dependency-parsed version of the original sentence. I suspect that metaphors within a sentence affect its syntactic structure, making it differ from sentences lacking metaphors.
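Here is a minimal end-to-end sketch in Python, with invented data, tying together approach (1) above and the loss/activation machinery described earlier: word-frequency features, a sigmoid activation, a log loss, and a few steps of gradient descent. A real metaphor detector would need a large annotated corpus and richer features (embeddings, POS triplets, dependency parses).

```python
import math

# Tiny invented dataset: (sentence, label), 1 = metaphorical, 0 = literal.
sentences = [
    ("you will have to taste the failure", 1),
    ("time is a thief", 1),
    ("this food tastes good", 0),
    ("the cat sat on the mat", 0),
]

vocab = sorted({w for s, _ in sentences for w in s.split()})

def featurize(sentence):
    """Approach (1): a vector of word frequencies over the vocabulary."""
    words = sentence.split()
    return [words.count(v) for v in vocab]

def sigmoid(x):
    """Activation: maps a raw score to a probability of 'metaphor'."""
    return 1.0 / (1.0 + math.exp(-x))

weights = [0.0] * len(vocab)
bias = 0.0
lr = 0.5  # learning rate

for _ in range(200):  # gradient descent on the log loss
    for sent, label in sentences:
        x = featurize(sent)
        pred = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
        error = pred - label  # gradient of the log loss w.r.t. the score
        weights = [w - lr * error * xi for w, xi in zip(weights, x)]
        bias -= lr * error

test = "you must taste the defeat"
print(sigmoid(sum(w * xi for w, xi in zip(weights, featurize(test))) + bias))
# a probability near 1 suggests the model calls the sentence metaphorical
```

The loss function compares the model's output with the label and drives the weight updates, exactly the loop-over-a-progress-measure described above.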

I genuinely consider a metaphor in a sentence to be a frame-based knowledge container, which calls for consistent pattern conjectures. I support this statement with a note of caution: a metaphor is a short figure of speech comprising an implied comparison, whereas an analogy is more complicated and provides a logical argument. Metaphors form an essential aspect of the societal sphere, as they show how humans conceive of and act toward social issues. Opinion-mining tasks in domains such as drug abuse and gun violence can make headway with the unmasking of metaphors in sentences.

[2] The ρ operator: discovering and ranking associations on the Semantic Web: https://dl.acm.org/citation.cfm?id=637418
[3] http://www.edvisees.cs.cmu.edu/Edvisees.html

And lastly, if you were to explain NLP to a complete newbie, where would you start?

Aahhh, explaining natural language processing to a newbie depends on his/her background.

Consider the worst-case scenario, in which the person does not have any knowledge of linguistics or computer science. It is essential to have information about both, as NLP is an interdisciplinary field arising at the intersection of computer science and linguistics; I would categorize people who have knowledge of both domains as computational linguists.

I would start my explanation by breaking down the term "natural language processing." "Natural" describes an activity or act that is not voluntary; in other words, it is humankind's innate gift for solving problems in a specific domain. For instance: an apple is red in color, a ladder is for climbing, water is for drinking. "Language" is the medium of conversation between two human beings (rather, between two brains!). It is perfectly possible for two human beings to talk to each other in different languages without understanding a single word -- and such is the scenario when a human interacts with a computer. A human is better at understanding the "natural language" he/she has learned through childhood; but how do we make a computer, whose natural language is bits and bytes, understand? The "natural languages" of a computer are Bash, shell script, C, C++, Java, Python, etc. Now, how can you write your knowledge into a program to make a computer understand? This is where natural language processing (NLP) plays a pivotal role.
A note of caution: NLP does not make a computer understand American English as opposed to UK English; it only teaches a computer to comprehend English text.
Latching onto the basics of the English paradigm would be an appropriate place to start teaching. It is essential to understand "words" for semantic preservation. Words are formed from phonemes and syllables; this is how a toddler learns to babble "papa," "mamma," and "water." It is true that I can teach a human novice NLP using the notion of phonemes and syllables, but how can we teach a computer? We teach a computer using the notions of parts of speech, lemmatization, stemming, parsing, and grammar rules. Parts of speech are how we learn the roles different words play as they come together to form an English sentence; there are nine main parts of speech (POS) in English: nouns, pronouns, verbs, adjectives, articles, adverbs, prepositions, conjunctions, and interjections. People also use different morphological forms of words: "This food is tasting good" is a morphological form of "This food tastes good," and for a computer it is sufficient to understand that the two are the same. The goal of both stemming and lemmatization is to reduce inflectional forms, and sometimes derivationally related forms, of a word to a common base form. Then we move to parsing a sentence, which is important for making a computer (or a newbie) learn about NLP.
Let's understand the chain: a sentence is composed of phrases, which are in turn composed of words. We have already learned about POS tags, which can be used to label each word; a combination of two or more POS tags forms a phrase, which gets its own tag within the sentence. For example, a noun phrase: "The only evidence that Abraham Lincoln abolished slavery."
Beyond these basics, there are more fine-grained tags in English, such as determiners, proper nouns, plural nouns, and superlative adjectives.
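A minimal sketch of these basics in Python, using NLTK (one common toolkit; the example sentence is from above, and the tokenizer, tagger, and WordNet models must be downloaded once via nltk.download):

```python
import nltk
from nltk.stem import PorterStemmer, WordNetLemmatizer

# One-time downloads (uncomment on first run):
# nltk.download("punkt"); nltk.download("averaged_perceptron_tagger"); nltk.download("wordnet")

sentence = "This food is tasting good"

tokens = nltk.word_tokenize(sentence)  # break the sentence into words
pos_tags = nltk.pos_tag(tokens)        # label each word with its POS tag
print(pos_tags)                        # e.g. [('This', 'DT'), ('food', 'NN'), ...]

stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()
print(stemmer.stem("tasting"))                   # crude suffix stripping -> 'tast'
print(lemmatizer.lemmatize("tasting", pos="v"))  # dictionary base form -> 'taste'
```

Stemming is a blunt instrument (it may not return a real word), while lemmatization uses a dictionary to recover the base form; both reduce inflected forms to a common base, as described above.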

To summarize, a beginner needs to tokenize (break) a sentence into words, do POS tagging of the words, define grammatical rules for combining words into phrases, and tag those phrases; a combination of these phrase tags represents the sentence.
There are multiple combinations of POS tags and phrase tags in English, and learning one example does not make you an expert. We give you 1,000 sentences as exercises to master these principles; once you have done them, you have learned how sentences are formed and can teach anyone. This is how NLP works: you provide multiple examples to the computer, and it learns how to assign POS tags and phrase tags after filtering the text using lemmatization and stemming. Give it 1,000 sentences, and the computer will tag each sentence and provide you a parse tree. This can be extended with "articles," "verb conjugation," "paraphrasing," "mood," "modality," and "sentiment," which are advanced tasks built on top of parse trees and POS tagging.
Automating these tasks requires machine learning or deep learning algorithms.
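The grammar-rule step in this summary can also be sketched concretely with NLTK's RegexpParser, using a single hand-written rule (an illustrative toy grammar, not a full grammar of English) that groups POS tags into noun phrases:

```python
import nltk

sentence = "The quick brown fox jumps over the lazy dog"
pos_tags = nltk.pos_tag(nltk.word_tokenize(sentence))

# Toy grammar rule: a noun phrase (NP) is an optional determiner,
# any number of adjectives, then one or more nouns.
grammar = "NP: {<DT>?<JJ>*<NN>+}"
chunker = nltk.RegexpParser(grammar)

tree = chunker.parse(pos_tags)
print(tree)  # a parse tree in which matched word spans are grouped under NP
```

This is exactly the chain described above: words get POS tags, a grammar rule combines POS tags into a tagged phrase, and the phrases together form the parse tree of the sentence.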

I hope I have sufficiently motivated you about NLP.

Thank you Manas for a wonderful chat!

If you liked the interview and would like to stay tuned for more #semtechtalks, follow our collection Semantic Technology Talks at http://bit.ly/2EmuPHq
