Allow more complex conversations (context) #76
The solution proposed for context is simple and well thought out, but I think it puts too much responsibility into one single function (a view). Also, while it holds the flow, the view takes all control over it, and all the error handling, security, and whatever core machinery bottery implements could be bypassed inside the view, possibly leading to bad implementations. The "head_set" implementation could avoid these problems; it needs to be studied.

I've been thinking about a solution where the view returns a second parameter (True or False) meaning "I want a hook" / "I want a conversation". The Pattern associated with the view would then process that parameter and flag itself to a special Pattern that manages the hook. It looks more complicated, but the implementation of the view would be simpler. In this case, the context could easily be passed to another view, as in web apps, without having to bypass or disable the central routing controller.

I have written some suggestion code. But since the Pattern does not call the view (it only stores it), this approach would not work in the current architecture. It needs a simple change: when the bottery core retrieves the view from the Pattern, it becomes the Pattern object's responsibility to "run" the view.

```python
class HookableFuncPattern(Pattern):
    '''Receives a function to pre-process the incoming message text
    before comparing it to the pattern.

    Allows the use of regular expressions, selecting partial words for
    routing, etc.

    pre_process: a function applied to the message text in check(),
        before comparing it with the pattern; returns a tuple whose
        first item is the processed text
    context: string with the history of messages
    conversation: HookPattern object that will hook any following
        messages to this pattern (see HookPattern)
    '''

    def __init__(self, pattern, view, pre_process,
                 hook_pattern=None, save_context=True):
        self.pre_process = pre_process
        self.context = ""
        self.conversation = hook_pattern
        self.save_context = save_context
        Pattern.__init__(self, pattern, view)

    def call_view(self, message):
        '''Calls the view and normalizes its return value, handling the
        case where the view returns only a response instead of a
        (response, hook) tuple.'''
        result = self.view(message)
        if isinstance(result, tuple):
            response, hook = result
        else:
            response, hook = result, False
        return response, hook

    def check(self, message):
        '''If a view wants to begin a conversation, it returns hook=True
        (the default is False).

        First we see whether the context has to be set, then we run the
        view. While the view keeps returning hook=True, the hook remains
        active.
        '''
        # If hooked, go directly to the view
        if self.conversation is not None and self.conversation.has_hook:
            if self.save_context:
                message.text = self.context + message.text
            response, hook = self.call_view(message)
            if not hook:
                self.conversation.end_hook()
            return response

        # Else, begin a normal check
        text, _ = self.pre_process(message.text)
        if text == self.pattern:
            response, hook = self.call_view(message)
            if hook:
                self.context += text
                if self.conversation is not None and not self.conversation.has_hook:
                    self.conversation.begin_hook(self)
            return response
        return False


class HookPattern(Pattern):
    '''First Pattern to be checked. Allows a Pattern to "capture" and
    release the flow if it receives an incomplete message.

    _pattern: the Pattern object currently holding the hook

    Usage:
        Put it as the first pattern.
        A view returns hook=True so that begin_hook(pattern) is called
        and the next message goes to that Pattern.
        A view returns hook=False so that end_hook() is called and the
        hook is released.
    '''

    def __init__(self):
        self._pattern = None
        Pattern.__init__(self, "", None)

    def check(self, message):
        if self._pattern is None:
            return False
        return self._pattern.check(message)

    def begin_hook(self, apattern):
        '''Stores the pattern that is beginning a conversation'''
        self._pattern = apattern

    def end_hook(self):
        '''Releases the pointer to the Pattern, ending the conversation'''
        self._pattern = None

    @property
    def has_hook(self):
        '''Returns whether a hook is active'''
        return self._pattern is not None


conversation = HookPattern()

patterns = [
    conversation,
    # ...
]
```
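To make the intended use concrete, here is a small hypothetical sketch (the `echo_mode` view and `identity` pre-processor are mine, not part of the proposal) of a view returning the `(response, hook)` tuple, and of how the truncated `patterns` list above could be completed:

```python
def identity(text):
    # Trivial pre_process: return the text unchanged, plus a dummy second
    # value, matching the (text, extra) tuple that check() unpacks.
    return text, None


def echo_mode(message):
    '''Hypothetical view: enters an "echo mode" and stays hooked
    until the incoming text ends with "stop".'''
    if message.text.strip().endswith('stop'):
        return 'Leaving echo mode.', False    # hook=False releases the conversation
    return 'You said: ' + message.text, True  # hook=True keeps the conversation


patterns = [
    conversation,  # the HookPattern must be checked first
    HookableFuncPattern('echo', echo_mode, identity,
                        hook_pattern=conversation),
]
```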
This is the part of the core code that would need to change:

```python
async def message_handler(self, data):
    # ...
```
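For illustration only, here is a rough sketch of how that handler might look under this proposal; the helper names (`build_message`, `send_response`) and the `self.patterns` attribute are assumptions, not bottery's actual internals. The point is that the core asks each Pattern to run its own view through `check()` instead of retrieving the view and calling it itself:

```python
async def message_handler(self, data):
    # Assumed helper that wraps the raw platform payload into a Message.
    message = self.build_message(data)

    for pattern in self.patterns:  # assumed attribute holding the registered patterns
        # Under this proposal, check() runs the view itself and returns
        # either the response or False, instead of handing the view back
        # to the core.
        response = pattern.check(message)
        if response:
            await self.send_response(message, response)  # assumed send helper
            break
```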
Although the "hook" may seem complicated at first, it allows more complex interactions. Let's say we build a bot that has a general mode and can enter a command mode, two or more NLP modes, a query mode, and so on. The "command" mode should also act like a menu, having levels and/or asking for completion of the parameters passed. Soon it would be impossible to decide which patterns go to which side. All the logic would live in the active view, and all control with it. This view could use NLP and other functions from bottery and other libraries of the app, but the code could easily become a chain of crossing calls. With "hooks", maybe this would be simpler, and we could even switch from one mode to another if the user wants, saving context for every "mode". And we would have a central point to see whether the user wants to stay in this view/mode or not. It seems more organized at first glance.

Sorry that part of the example code became badly formatted on this forum. All the code is on my GitHub if it helps.
What do you think about a class-based approach, something like this?

```python
class PurcharseConversation(ConversationView):

    def show_product_categories(self):
        self.categories = get_product_categories()
        return f'Thanks for shopping! Here are our product categories: {self.categories}'

    def validate_category(self, resp):
        return resp in self.categories

    def bail_out(self, resp):
        return resp == 'I want to bail out'

    def reject_category(self):
        return f'Sorry, I do not recognize this category. Please select from: {self.categories}'

    def end(self):
        return super().end('OK, sorry to see you go!')

    async def start(self):
        # `while` and `if` are reserved words in Python, so the proposed
        # chain methods are spelled `while_` / `if_` here.
        final_resp = await (super().chain()
                            .ask(self.show_product_categories)
                            .while_(self.validate_category)
                            .if_(self.bail_out, self.end)
                            .do(self.reject_category))
        return final_resp


async def purchase(message):
    p = PurcharseConversation()
    finished = await p.start()
    return finished


patterns = [
    Pattern('I want to buy stuff', purchase),
    Pattern('Give me goods!', purchase),
]
```
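Just to show that such a chained API is implementable, here is a rough, purely hypothetical sketch of a `ConversationView` base class and chain builder; none of this exists in bottery, `while_`/`if_` stand in for the keyword-colliding `while`/`if`, and plain `print()`/`input()` stand in for the chat platform:

```python
class ConversationView:
    '''Hypothetical base class sketch for the chained API above.'''

    def chain(self):
        return _Chain(self)

    def end(self, text):
        print(text)          # stand-in for sending a final message
        return None

    def send(self, text):
        print(text)          # stand-in for sending a message...
        return input('> ')   # ...and for awaiting the user's reply


class _Chain:
    '''Collects the steps, then runs them when awaited.'''

    def __init__(self, view):
        self.view = view

    def ask(self, question_fn):
        self._question = question_fn
        return self

    def while_(self, validator):
        self._validator = validator
        return self

    def if_(self, predicate, action):
        self._predicate, self._action = predicate, action
        return self

    def do(self, on_reject):
        self._on_reject = on_reject
        return self

    def __await__(self):
        return self._run().__await__()

    async def _run(self):
        # Ask the opening question, then loop until the answer validates,
        # bailing out early when the bail-out predicate matches.
        resp = self.view.send(self._question())
        while not self._validator(resp):
            if self._predicate(resp):
                return self._action()
            resp = self.view.send(self._on_reject())
        return resp
```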
A guy from work recommended RasaHQ: https://github.com/RasaHQ/rasa_core.
I wrote code that can handle a CLI conversation following a simple dict configuration and make requests to a JSON API. I made tests with an application on my site and with a CEP web service. The patterns.py code would be as simple as the following. A running example is at https://github.com/IvanBrasilico/bottery/tree/rules_tests; you just need to add a bot of yours to settings.py to test it.

```python
rules = {'tec': {'rank': 'http://brasilico.pythonanywhere.com/_rank?words=',
                 # ...

rules_cep = {'cep': {'busca': 'http://api.postmon.com.br/v1/cep/',
                     # ...

conversation = HookPattern(END_HOOK_LIST)
```
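To make the idea concrete, here is a rough, hypothetical sketch (not the code from that branch) of how a dict of rules mapping a command to a query URL could drive a simple handler:

```python
import json
from urllib.request import urlopen

# Hypothetical rules dict: first-level keys are "modes", second-level keys
# are commands, and the values are the base URLs to query.
rules = {
    'tec': {'rank': 'http://brasilico.pythonanywhere.com/_rank?words='},
    'cep': {'busca': 'http://api.postmon.com.br/v1/cep/'},
}


def handle(mode, command, argument):
    '''Looks up the URL for (mode, command), appends the user's argument
    and returns the decoded JSON response (synchronous for simplicity).'''
    base_url = rules[mode][command]
    with urlopen(base_url + argument) as response:
        return json.loads(response.read().decode())


# Example: handle('cep', 'busca', '01001000') would query the Postmon API.
```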
Using a class (or set of classes) is a perfectly valid approach. My example was meant just to illustrate the concept.
Wow, that is awesome. But I now realize that perhaps my example came across as a proposal to implement full-blown conversations in the bot, which was not my intent. My bad! Here is a more concrete example of what I meant by this idea:

```python
async def trigger_job(message, conversation):
    mask = await conversation.send('Got it. Which branch do you want? (you can use wildcards)')
    jobs = await fetch_jenkins_jobs(mask)
    question = ['I found these:']
    question += [f'{index} - {name}' for index, name in jobs]
    question.append('Which one do you want to trigger?')
    selected_index = await conversation.send('\n'.join(question))
    selected_name = jobs[selected_index]
    eta = trigger_jenkins_job(selected_name)
    return f'Job {selected_name} has started! ETA: {eta}, I will let you know when it finishes.'


patterns = [
    Pattern('trigger job', trigger_job),
]
```

This conversation would go like this:
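Roughly like the following hypothetical exchange (branch names and ETA invented for illustration):

```
User: trigger job
Bot:  Got it. Which branch do you want? (you can use wildcards)
User: feature/*
Bot:  I found these:
      1 - feature/add-login
      2 - feature/fix-build
      Which one do you want to trigger?
User: 1
Bot:  Job feature/add-login has started! ETA: 12 min, I will let you know when it finishes.
```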
Without the ability to send new messages in the middle of the conversation, I have to remember some context myself, somewhere, because I will need to implement separate views. This can be done as:

```python
async def trigger_job(message):
    index = parse_job_mask(message)
    if index is None:
        return 'Missing branch index. Use the "search <mask>" command.'
    jobs = await fetch_last_search()
    selected_name = jobs[index]
    eta = trigger_jenkins_job(selected_name)
    return f'Job {selected_name} has started! ETA: {eta}, I will let you know when it finishes.'


async def search_mask(message):
    mask = parse_job_mask(message)
    if mask is None:
        return 'Missing mask'
    jobs = await fetch_jenkins_jobs(mask)
    save_last_search(jobs)
    found = ['I found these:']
    found += [f'{index} - {name}' for index, name in jobs]
    return '\n'.join(found)


patterns = [
    Pattern('trigger job', trigger_job),
    Pattern('search', search_mask),
]
```

The conversation:
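Again hypothetically, with the same invented values:

```
User: search feature/*
Bot:  I found these:
      1 - feature/add-login
      2 - feature/fix-build
User: trigger job 1
Bot:  Job feature/add-login has started! ETA: 12 min, I will let you know when it finishes.
```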
It works, of course, but it is less natural. It would also be awkward to try to get the user to confirm the job before triggering it in this case (we don't know how long it has been since the last search). So my point is that having the conversation available inside a single view keeps the whole flow, and its context, in one place.
Hi. As a suggestion, I implemented three different bottery "extensions". All of them use a "Hang" to capture the messages and a ContextHandler to maintain context information and user inputs, and all of them include working examples.

https://github.com/IvanBrasilico/bcontext - just the ContextHandler; the view does all flow control.

And also an app: https://github.com/IvanBrasilico/alfbot2 - just mapping some JSON from pet-app tests on my site.

My choice to use a "Hang" is to keep view communication on the main loop, and not start another request to Telegram (or another platform) outside the main async event loop. Since we don't have control of the loop, I think starting another request may lead to conflicts, and even if it doesn't, another pattern on the main loop could consume the message the view is waiting for.

Although operational, the code needs some improvement: refactoring, better ergonomics, making the requests async, etc. But I think it can be a starting point, especially the "extension" approach, which does not bloat the core.
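For readers who have not opened those repositories, here is a minimal, purely hypothetical sketch of the idea behind a ContextHandler (this is not the actual bcontext code): it stores, per user, the history of inputs and the currently active "mode", so a hooked view can resume where the last message left off.

```python
from collections import defaultdict


class ContextHandler:
    '''Hypothetical sketch: keeps per-user conversation state.'''

    def __init__(self):
        self._history = defaultdict(list)  # user_id -> list of message texts
        self._active_mode = {}             # user_id -> name of the active "mode"/view

    def add_input(self, user_id, text):
        self._history[user_id].append(text)

    def history(self, user_id):
        return list(self._history[user_id])

    def set_mode(self, user_id, mode):
        self._active_mode[user_id] = mode

    def mode(self, user_id):
        return self._active_mode.get(user_id)

    def clear(self, user_id):
        self._history.pop(user_id, None)
        self._active_mode.pop(user_id, None)
```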
As we talked about last time, I think it would be nice for bottery to support more complex conversations by allowing multiple replies from within a message handler. (head_set is a little joke of course 🎧)

The main advantage of this approach is that we keep the context of the conversation inside the same function. Without this, the user will have to maintain that context themselves, which is hard to do and doesn't scale very well. Pairing that with some natural language processing (#4) would make bottery well suited for complex interactions.