ValueError: Shape must be rank 0 but is rank 1 for ‘Adam/update_weight/ApplyAdam‘ (op: ‘ApplyAdam‘)

Original code:

self.lr = tf.placeholder(shape=[1], dtype=tf.float32, name="learning_rate")
...
...

optimizer = tf.train.AdamOptimizer(learning_rate=self.lr)
self.trainops = optimizer.minimize(self.cost)

Error message:

InvalidArgumentError: Shape must be rank 0 but is rank 1 for 'Adam/update_weight/ApplyAdam' (op: 'ApplyAdam') with input shapes: [39,2], [39,2], [39,2], [], [], [1], [], [], [], [39,2].
During handling of the above exception, another exception occurred:
ValueError                                Traceback (most recent call last)
 in __init__(self)
     29 
     30         optimizer = tf.train.AdamOptimizer(learning_rate=self.lr)
---> 31         self.trainops = optimizer.minimize(self.cost)

Cause: the custom placeholder self.lr was declared with shape [1], which makes it a vector (rank 1 tensor), while the learning_rate argument of tf.train.AdamOptimizer only accepts a scalar (rank 0 tensor).
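The rank distinction is easy to see outside of TensorFlow. A minimal sketch using NumPy (used here only to illustrate shapes; it is not part of the original code): a scalar has the empty shape (), while shape [1] produces a one-element vector with one axis.

```python
import numpy as np

# Rank 0 tensor (scalar): empty shape, zero axes.
scalar_lr = np.array(0.001, dtype=np.float32)

# Rank 1 tensor (vector): shape (1,), one axis -- even with a single element,
# this is NOT a scalar, which is exactly why ApplyAdam rejects it.
vector_lr = np.array([0.001], dtype=np.float32)

print(scalar_lr.shape, scalar_lr.ndim)  # () 0
print(vector_lr.shape, vector_lr.ndim)  # (1,) 1
```

This is why the fix below simply drops the shape argument: tf.placeholder with no shape (or shape=[]) can be fed a plain Python float, which TensorFlow treats as rank 0.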

Corrected code:

self.lr = tf.placeholder(dtype=tf.float32, name="learning_rate")
...
...

optimizer = tf.train.AdamOptimizer(learning_rate=self.lr)
self.trainops = optimizer.minimize(self.cost)

Reference: https://stackoverflow.com/questions/45733581/tensorflow-optimizer-minimize-valueerror-shape-must-be-rank-0-but-is-rank
