tf: Scope model under a model identifier (WIP) #54

julien-c wants to merge 1 commit into Conchylicultor:master from
Conversation
Actually, there is a `scope` argument to the `embedding_rnn_seq2seq` method, so this might be what you need to scope your two models properly. See here in the TF code: A specific model scope could thus be added here in DeepQA: I haven't tried it, but it seems like the right place to look. Also note that the current implementation of loading pre-trained embeddings relies on a fixed scope name. This should be solved if you change the scoping:
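The reason a per-model scope avoids collisions is that TF prefixes every variable name with the enclosing scope name. Here is a plain-Python mock of that prefixing behavior, just to illustrate the idea; `make_var` and `build_model` are hypothetical stand-ins for illustration, not real TF or DeepQA code:

```python
def make_var(scope, name):
    """Return a fully qualified variable name, the way a scope would ("scope/name")."""
    return f"{scope}/{name}" if scope else name

def build_model(scope):
    # A seq2seq model creates many variables; two are enough to show
    # that all of them end up under the model's own prefix.
    return [make_var(scope, "embedding"), make_var(scope, "rnn/weights")]

foo_vars = build_model("foo")
bar_vars = build_model("bar")

# No collisions: the two models share no variable names.
assert set(foo_vars).isdisjoint(bar_vars)
print(foo_vars)  # ['foo/embedding', 'foo/rnn/weights']
```

With a fixed (or absent) scope, both models would try to create the same fully qualified names, which is exactly the collision described below.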
Thanks @eschnou! I will look into it tomorrow and hopefully have a mergeable branch by then.
As both chatbots would use different sessions, I think they should be on different graphs. If you simply change the scope, since the scope is global, the second session would also allocate memory for the first chatbot. I didn't try it, but it would be something more like:

```python
g1 = tf.Graph()
with g1.as_default():
    botFoo = chatbot.Chatbot()
    botFoo.main(['--modelTag', 'foo', '--test', 'daemon'])

g2 = tf.Graph()
with g2.as_default():
    botBar = chatbot.Chatbot()
    botBar.main(['--modelTag', 'bar', '--test', 'daemon'])
```
So what I want to be able to do is:
This does not work right now, as there are collisions between variable names in the TF symbolic graph. This first commit is, I think, a right step towards fixing this, but it is not enough: in the `model.py` file, where we rely on the `tf.nn.seq2seq.embedding_rnn_seq2seq` implementation, something must not be properly scoped. What do you guys think?