Natural Language Processing


rhlsthrm

Is there a way to make Alfred use more natural language processing for queries, so that there doesn't have to be a keyword and the intent can instead be determined by parsing the query? I think this could be a great feature for Alfred, which could then turn into a voice engine. Controlled by Alfred Remote on the phone, it would become quite powerful.


No, it is not possible.

The vast majority of workflows are user-built, and you can't really derive context from them automatically. They're built in so many different ways, and in so many languages, that it's not feasible. Also, considering the power of what Alfred workflows can do (anything a script can), I'm glad that's the case. You really don't want to be bitten by a misinterpretation.

Just look at the companies with the most money and the best engineers. They can barely do it acceptably! Siri's capabilities were only just opened to third-party developers, and those developers have to plug into Apple's API; Siri doesn't derive intent from their apps automatically.


Natural language processing is something you can add to a specific workflow, but it’s not something you can make Alfred do automatically. Unless of course you’re only referring to Alfred’s default capabilities, but that doesn’t seem to be the case.
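As an illustration of that per-workflow approach, here's a minimal, hand-rolled sketch of keyword-free intent matching that a single workflow's script could apply to its own query. Everything here (the intent table, the action names, the function name) is invented for illustration, not part of any Alfred API:

```python
# Minimal sketch: naive intent matching one workflow could apply to its
# own query. The keyword-to-action table is invented for illustration;
# a real workflow would define its own.
INTENTS = [
    ({"remind", "reminder"}, "create-reminder"),
    ({"open", "launch"}, "open-app"),
    ({"search", "find"}, "web-search"),
]

def intent_for(query):
    """Return the first action whose keywords appear in the query."""
    words = set(query.lower().split())
    for keywords, action in INTENTS:
        if keywords & words:
            return action
    return "fallback"

print(intent_for("remind me to call Sam"))  # → create-reminder
```

The point of the sketch is that the interpretation lives inside one workflow, where the author controls the vocabulary; nothing here generalizes across arbitrary workflows, which is exactly the limitation described above.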


6 hours ago, vitor said:

You really don’t want to be bitten by a misinterpretation

I hadn't thought about this before, but now that you mention it, having something as broken as Siri connected to Alfred or the Terminal is a scary scenario.

I am thinking more in a "futuristic" sense: a computing world where lay users can essentially tell the computer what to do, and the computer knows what they want and does it. This translates to voice as well, mimicking an Iron Man-like situation (with his "Jarvis" computer). I believe an app like Alfred is a great jumping-off point for this kind of vision. If Alfred could keep getting smarter and learn what the user wants to do, that would start to realize it. Would love to hear thoughts and discussion on this topic.


12 hours ago, deanishe said:

I hadn't thought about this before, but now that you mention it, having something as broken as Siri connected to Alfred or the Terminal is a scary scenario.

Sure, it's "broken" now, but this is the future, and there is no doubt that these technologies will become smarter and better as time goes on.


@rhlsthrm Your vision is still far from reality, and, as stated above, Alfred is not ideal for it because it is too versatile. Your vision requires what Apple is doing with Siri: limited capabilities that expand over time. Also, Alfred has a team of two; Apple, Google, and other companies have immense teams working on just this, not to mention the abundance of independent researchers.

To get a sense of the current state of the technology, have a look at Two Minute Papers. As things stand, there's no way it would make sense for Alfred to go down this path.

If you want to shout commands at your computer for Alfred to execute, you can do that now and have been able to for years: you can add dictation commands that run things on your machine. But having the machine correctly interpret what you want from freeform speech is, as it stands, not a feature request; it's a mere concept.


Is there an API for Data Detectors (the feature that highlights dates and phone numbers in Mail.app)?

 

Might be an interesting feature if Alfred pre-processed the query with Data Detectors and exposed any parsed results as workflow variables.

 

For example, if the query contains "noon tomorrow", Alfred might set the variable DATETIME to 2017-01-20T12:00+0100 (or similar).
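On Apple platforms, Foundation does expose Data Detectors publicly as NSDataDetector. Just to illustrate the variable-export idea itself, here's a hand-rolled, language-agnostic sketch in Python; the phrase table and function name are hypothetical stand-ins, and real Data Detectors handle vastly more phrasings than this toy:

```python
from datetime import datetime, timedelta

def detect_datetime(query, now=None):
    """Toy stand-in for a date detector: map a few fixed phrases to a
    datetime that a workflow could export as its DATETIME variable."""
    now = now or datetime.now()
    q = query.lower()
    if "noon tomorrow" in q:
        return (now + timedelta(days=1)).replace(hour=12, minute=0,
                                                 second=0, microsecond=0)
    if "tomorrow" in q:
        # Arbitrary default of 9:00 when no time of day is given.
        return (now + timedelta(days=1)).replace(hour=9, minute=0,
                                                 second=0, microsecond=0)
    return None

# The pre-processor would then serialize any match, e.g.:
# detect_datetime("lunch noon tomorrow").isoformat()
```

The interesting part is the contract, not the parsing: the query is scanned once up front, and any recognized values are handed to the workflow as ready-made variables, exactly as suggested above.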


@vitor, completely understand. Alfred is powerful and has the potential to at least demonstrate this functionality. Voice aside, I think it would be great to be able to do some small bit of NLP on the queries.

 

From your first response, it seems like you can add some bit of NLP into the workflows, which might be a good starting point. I'll explore further.

