TensorFlow pip installation issue: cannot import name 'descriptor'

I'm seeing the following error when installing TensorFlow:

ImportError: Traceback (most recent call last):
File ".../graph_pb2.py", line 6, in 
from google.protobuf import descriptor as _descriptor
ImportError: cannot import name 'descriptor'

Solutions/Answers:

Answer 1:

I faced a similar issue; after some trial and error, the command below fixed it for me:

$ pip install --upgrade --no-deps --force-reinstall tensorflow

This forces a clean reinstall of the package. It works!

Answer 2:

This error signals a mismatch between protobuf and TensorFlow versions.

Take the following steps to fix this error:

  1. Uninstall TensorFlow.
  2. Uninstall protobuf (if protobuf is installed).
  3. Reinstall TensorFlow, which will also install the correct protobuf dependency.
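The steps above can be sketched as shell commands (assuming the standard `tensorflow` and `protobuf` pip distributions):

```shell
# 1. Remove TensorFlow
pip uninstall -y tensorflow

# 2. Remove protobuf, if it is installed
pip uninstall -y protobuf

# 3. Reinstall TensorFlow; pip pulls in a compatible protobuf automatically
pip install tensorflow
```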

Answer 3:

Try this:

  1. pip uninstall protobuf

  2. brew install protobuf

  3. mkdir -p
    /Users/alexeibendebury/Library/Python/2.7/lib/python/site-packages

  4. echo 'import site;
    site.addsitedir("/usr/local/lib/python2.7/site-packages")' >>
    /Users/alexeibendebury/Library/Python/2.7/lib/python/site-packages/homebrew.pth

Answer 4:

I would be extra careful before uninstalling/reinstalling other packages such as protobuf. The most likely cause is a version mismatch: as of this writing, the most recent release of Python is 3.7, while TensorFlow is only compatible up to 3.6.

If you're using a third-party distribution like Anaconda, this mismatch can be hidden from you. In that case I would recommend creating a new environment in Anaconda with Python 3.6 and then installing TensorFlow: https://conda.io/projects/conda/en/latest/user-guide/getting-started.html#managing-python
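A minimal sketch of that approach, assuming conda is available and `tf-env` is an arbitrary environment name (not from the original answer):

```shell
# Create an isolated environment pinned to Python 3.6
conda create -n tf-env python=3.6

# Activate it and install TensorFlow inside it
conda activate tf-env
pip install tensorflow
```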

Sharing weights in Tensorflow between two subgraphs


I have the following setup, where each input consists of two trajectories. I want the left graph to have the same weights as the right graph.

I tried to follow the approach for sharing variables described here: https://www.tensorflow.org/versions/r1.0/how_tos/variable_scope/. However, it is not working; two different sets of variables are created. What am I doing wrong?
def build_t_model(trajectories):
    """
    Function to build a subgraph
    """
    with tf.name_scope('h1_t'):
        weights = tf.Variable(tf.truncated_normal([150, h1_t_units], stddev=1.0/math.sqrt(float(150))), name='weights')
        biases = tf.Variable(tf.zeros([h1_t_units]), name='biases')
        h1_t = tf.nn.relu(tf.matmul(trajectories, weights) + biases)

    with tf.name_scope('h2_t'):
        weights = tf.Variable(tf.truncated_normal([h1_t_units, h2_t_units], stddev=1.0/math.sqrt(float(h1_t_units))), name='weights')
        biases = tf.Variable(tf.zeros([h2_t_units]), name='biases')
        h2_t = tf.nn.relu(tf.matmul(h1_t, weights) + biases)

    with tf.name_scope('h3_t'):
        weights = tf.Variable(tf.truncated_normal([h2_t_units, M], stddev=1.0/math.sqrt(float(h2_t_units))), name='weights')
        biases = tf.Variable(tf.zeros([M]), name='biases')
        h3_t = tf.nn.relu(tf.matmul(h2_t, weights) + biases)

    return h3_t


# We build two trajectory networks. The weights should be shared
with tf.variable_scope('traj_embedding') as scope:        
    self.embeddings_left = build_t_model(self.input_traj)
    scope.reuse_variables()
    self.embeddings_right = build_t_model(self.input_traj_mv)

Solutions/Answers:

Answer 1:

Okay, use tf.get_variable instead of tf.Variable for this: tf.get_variable respects scope.reuse_variables(), whereas tf.Variable always creates a fresh variable. The following works:

def build_t_model(trajectories):
    """
    Build the trajectory network
    """
    with tf.name_scope('h1_t'):
        weights = tf.get_variable(
            'weights1',
            shape=[150, h1_t_units],
            initializer=tf.truncated_normal_initializer(
                stddev=1.0/math.sqrt(float(150))))
        biases = tf.get_variable(
            'biases1',
            shape=[h1_t_units],
            initializer=tf.zeros_initializer())
        h1_t = tf.nn.relu(tf.matmul(trajectories, weights) + biases)

    with tf.name_scope('h2_t'):
        weights = tf.get_variable(
            'weights2',
            shape=[h1_t_units, h2_t_units],
            initializer=tf.truncated_normal_initializer(
                stddev=1.0/math.sqrt(float(h1_t_units))))
        biases = tf.get_variable(
            'biases2',
            shape=[h2_t_units],
            initializer=tf.zeros_initializer())
        h2_t = tf.nn.relu(tf.matmul(h1_t, weights) + biases)

    with tf.name_scope('h3_t'):
        weights = tf.get_variable(
            'weights3',
            shape=[h2_t_units, M],
            initializer=tf.truncated_normal_initializer(
                stddev=1.0/math.sqrt(float(h2_t_units))))
        biases = tf.get_variable(
            'biases3',
            shape=[M],
            initializer=tf.zeros_initializer())
        h3_t = tf.nn.relu(tf.matmul(h2_t, weights) + biases)
    return h3_t
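To see why this fixes the problem, here is a minimal self-contained sketch of the same mechanism (TF 1.x graph-mode API assumed; the layer function, placeholder shapes, and scope name below are illustrative, not from the original post):

```python
import tensorflow as tf  # TF 1.x graph API assumed (tf.compat.v1 under TF 2)

def dense(x, units, name):
    # tf.get_variable honors the enclosing variable_scope's reuse flag;
    # tf.Variable would ignore it and create new variables on every call
    w = tf.get_variable(name + '_w', shape=[int(x.shape[1]), units])
    b = tf.get_variable(name + '_b', shape=[units],
                        initializer=tf.zeros_initializer())
    return tf.nn.relu(tf.matmul(x, w) + b)

left_in = tf.placeholder(tf.float32, [None, 4])
right_in = tf.placeholder(tf.float32, [None, 4])

with tf.variable_scope('demo') as scope:
    left = dense(left_in, 3, 'h')
    scope.reuse_variables()           # switch the scope into reuse mode
    right = dense(right_in, 3, 'h')   # reuses demo/h_w and demo/h_b

# Only one weight/bias pair exists, shared by both branches
print(len(tf.trainable_variables()))  # 2
```

With tf.Variable instead of tf.get_variable, the same two calls would produce four trainable variables, which is exactly the "two different graphs" behavior described in the question.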
