We haven’t gotten into any Natural Language Processing yet (and won’t until the end of this class, if we do at all), but I wanted to talk about it anyway.
For my project, I was thinking of doing something along the lines of Natural Language Generation with a functional language, namely Clojure. I feel like the expressive tendencies of functional languages fit well with human languages, and this could be something interesting to explore.
What is Natural Language Generation? To start, it is a sub-field of Natural Language Processing, but where NLP as a whole is concerned with understanding human language, NLG is concerned with sounding human. An example we have already seen in class is Siri: it needs to parse the information out of the question, which is the understanding side, and then produce a response, which is the generation side of the problem. So some NLG projects might be getting a robot to narrate what it is doing to a human, or having a kiosk at a mall interact with a customer via a generated voice.
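To make the understanding/generation split concrete, here is a toy sketch (in Python rather than Clojure, just for brevity). The regex pattern, intent names, and the canned forecast are all invented for illustration; a real assistant does far more at both steps.

```python
import re

def understand(question):
    """Toy 'understanding' step: pull an intent and a slot out of a question.
    The pattern and slot names here are made up for illustration."""
    match = re.search(r"weather in (\w+)", question.lower())
    if match:
        return {"intent": "weather", "city": match.group(1)}
    return {"intent": "unknown"}

def generate(parsed, forecast="sunny"):
    """Toy 'generation' step: turn the parsed intent back into a sentence."""
    if parsed["intent"] == "weather":
        return f"It looks {forecast} in {parsed['city'].title()} today."
    return "Sorry, I didn't catch that."

print(generate(understand("What's the weather in Boston?")))
```

The point is just that the two halves are separable: everything up to the intent dictionary is NLP/understanding, and everything after it is NLG.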
More interesting than a one-to-one mapping from actions to phrases would be a program that summarizes data. One of the big problems with the large amounts of data companies have nowadays is that it is hard to draw conclusions from raw data, because there is no story that frames the data for the end user. Problems like creating summaries for videos and sound clips are common topics for computer scientists to tackle.
So my idea is this: make a program that takes all the news from an aggregator on one very specific topic, over a period of time, and condenses the most important ideas into a summary of about one page. To start, I would limit the domain to a single very specific topic and use six months as my time period.
This is mildly ambitious, a problem most of my ideas suffer from. But I think it would touch on many areas of this class. Going beyond simple template-driven text generation will require lots of filtering, along with the intelligence to pick out the important parts. I think, though, that if I can get away with using libraries for some of the heavy lifting in text processing (breaking everything into n-grams) and sentiment analysis, that would let me focus on the text generation part. The problem would be choosing the right algorithms at this step.
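For reference, the n-gram step is the simplest part of the pipeline; even without a library it is just a sliding window over tokens. A minimal sketch (in Python for illustration; the project language would be Clojure):

```python
def ngrams(tokens, n):
    """Slide a window of size n over a token list, yielding each n-gram."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "the stock fell sharply after the report".split()
print(ngrams(tokens, 2))
# bigrams: ('the', 'stock'), ('stock', 'fell'), ('fell', 'sharply'), ...
```

A library earns its keep on the messier pieces around this, like tokenization, stemming, and sentence splitting.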
I know people have had success using convolutional neural networks to pick out sentiment from text, and that would give the program enough information to decide what is important and what isn’t. The only problem is that a neural network might become too heavy for this. I was talking to a developer at a company that offers sentiment analysis APIs, and he said that sometimes a simple model making snap decisions can work better. It’s tough to tell what to use because I personally don’t know enough about this part of the system yet. The frequency with which something is mentioned, and the sentiment attached to it, would probably be the most important signals for judging whether that piece of information makes it into the summary.
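The frequency-plus-sentiment idea can be sketched very simply. Everything below is an assumption about how I might combine the two signals, not a settled design: the tiny sentiment lexicon is invented (a real system would use a library or a trained model), and simply adding the two scores is a placeholder for a proper weighting.

```python
from collections import Counter

# Hypothetical sentiment lexicon, invented for illustration only.
SENTIMENT = {"plunged": -1.0, "record": 0.5, "fraud": -1.0, "growth": 0.8}

def score_sentence(sentence, topic_counts):
    """Score a sentence by how often its words appear across the corpus,
    plus the magnitude of any sentiment-bearing words it contains."""
    words = sentence.lower().split()
    frequency = sum(topic_counts.get(w, 0) for w in words)
    sentiment = sum(abs(SENTIMENT.get(w, 0.0)) for w in words)
    return frequency + sentiment

corpus = [
    "the company reported record growth",
    "shares plunged after the fraud report",
    "the weather was mild",
]
counts = Counter(w for s in corpus for w in s.lower().split())
best = max(corpus, key=lambda s: score_sentence(s, counts))
print(best)  # the sentence with frequent, sentiment-heavy words wins
```

Even this naive scorer pushes the off-topic sentence to the bottom, which is roughly the behavior I want before any neural network enters the picture.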
Another challenge would be keeping the summary from becoming a fill-in-the-blanks exercise and instead giving it a more organic feel. Most of what I have read so far on this topic seems to be template-based, so there is more research to be done here.
In any case, I think making something from nothing is always interesting. One of my goals in taking this class was to be able to take on projects like this on my own!