Two models (MXNet and TensorFlow) in one file

Hi, I have a problem. My project has a script that uses both TensorFlow and MXNet models. Everything is fine when only TensorFlow uses the GPU, but when I switch MXNet to the GPU I get this error:

    E tensorflow/stream_executor/cuda/] Could not create cudnn handle: CUDNN_STATUS_INTERNAL_ERROR

That looks like a GPU out-of-memory error.
If I understand correctly, you use TF and MXNet within the same script.
I had a similar problem: when used together, different frameworks fight over GPU memory (and in my experience, TensorFlow is by far the greedier one).
Depending on the models (how much GPU memory they consume), I was able to use MXNet and TF models together by running them in different scripts/processes.

My method to stop TF from grabbing all the GPU memory up front (TF 1.x-style session config; note that the config only takes effect if you actually create a session with it):

    import tensorflow as tf

    def _prevent_tf_gpu_crash(fraction=.5):
        # Cap TF at a fraction of GPU memory so MXNet can use the rest
        config = tf.compat.v1.ConfigProto()
        config.gpu_options.per_process_gpu_memory_fraction = fraction
        return tf.compat.v1.Session(config=config)
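If you are on a newer TF, a lighter-weight alternative (a sketch, assuming TensorFlow's `TF_FORCE_GPU_ALLOW_GROWTH` environment variable, which must be set before TensorFlow is imported) is to let TF allocate GPU memory on demand rather than all at once:

```python
import os

# Ask TensorFlow to grow its GPU memory pool on demand instead of
# reserving the whole GPU at startup, leaving room for MXNet.
# This must be set BEFORE `import tensorflow` runs.
os.environ["TF_FORCE_GPU_ALLOW_GROWTH"] = "true"

# import tensorflow as tf  # import TF only after the variable is set
```

This doesn't put a hard cap on TF's usage the way `per_process_gpu_memory_fraction` does, so if your TF model is large you may still need the fraction-based approach.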

If you share more details (TF/MXNet versions, GPU memory, model sizes), I may be able to tell you more.