
(video) DeepRoute’s Inference Engine

DeepRoute-Engine is an inference engine that speeds up neural network computation, allowing algorithms to run on energy-efficient computing platforms. This makes inference roughly 6x faster than widely used open-source deep learning frameworks (e.g. TensorFlow, PyTorch, Caffe), and the engine is compatible with GPUs from multiple vendors, including NVIDIA, Intel, and AMD.
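
The article gives no API details for DeepRoute-Engine, but a speedup figure like "6x" is typically obtained by timing the same model under a stock framework and under an optimized runtime. The sketch below is purely illustrative: it benchmarks a toy PyTorch model in eager mode against a torch.compile-optimized version as a stand-in for a dedicated inference engine, since DeepRoute-Engine itself is not publicly available.

```python
import time
import torch
import torch.nn as nn

def benchmark(fn, x, warmup=10, iters=100):
    """Return the mean latency of fn(x) in milliseconds."""
    with torch.no_grad():
        for _ in range(warmup):
            fn(x)
        if x.is_cuda:
            torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(iters):
            fn(x)
        if x.is_cuda:
            torch.cuda.synchronize()
        return (time.perf_counter() - start) / iters * 1e3

device = "cuda" if torch.cuda.is_available() else "cpu"

# A small stand-in CNN; a real perception model would be far larger.
model = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 10),
).eval().to(device)

x = torch.randn(1, 3, 224, 224, device=device)

baseline_ms = benchmark(model, x)

# Stand-in for an optimized inference runtime: DeepRoute-Engine's API is
# not public, so torch.compile is used here only to illustrate the comparison.
optimized = torch.compile(model)
optimized_ms = benchmark(optimized, x)

print(f"framework baseline: {baseline_ms:.2f} ms/inference")
print(f"optimized runtime:  {optimized_ms:.2f} ms/inference")
print(f"speedup:            {baseline_ms / optimized_ms:.1f}x")
```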

