As the title says, is there any network suited to this kind of task? I checked YOLO9000, and it looks like it needs more than 3 GB of RAM to run inference. Do you have any recommendations for this kind of task? Thanks.
PS: The last layer of YOLO9000 shown on this page only costs about 110.43 MB (28947456 × 4 / 1024 / 1024), so I'm not sure why it would consume more than 3 GB during inference.
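For reference, here is the arithmetic above as a small snippet. It assumes the weights are stored as float32 (4 bytes per parameter), which is the usual default; the parameter count is the one quoted in the post:

```python
# Weight memory of the last layer, assuming float32 storage (4 bytes/param).
params = 28947456            # parameter count quoted above
bytes_per_param = 4          # float32
mb = params * bytes_per_param / 1024 / 1024
print(f"{mb:.2f} MB")        # -> 110.43 MB
```

Note this only counts the layer's weights; at inference time the intermediate activation tensors also occupy memory, so total usage can be much larger than the weight size alone.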