MLPerf Inference v0.5 Results

November 6th, 2019

Any use of the MLPerf results and site must comply with the MLPerf Terms of Use.

You may wish to read the Inference Overview to better understand the results.

MLPerf Inference v0.5 Results Table Explanation
The MLPerf results table is organized first by Division and then by Category. MLPerf has two divisions. The Closed division is intended to compare hardware platforms or software frameworks “apples-to-apples” and requires using the same model as the reference implementation. The Open division is intended to foster innovation and allows any ML approach that can reach the target quality. MLPerf divides benchmark results into four Categories based on availability; a brief code sketch of the divisions and categories follows the list below.
  • Available In Cloud systems are available for rent in the cloud.
  • Available On Premise systems contain only components that are available for purchase.
  • Preview systems must be submittable as Available In Cloud or Available On Premise in the next submission round.
  • Research systems contain either experimental hardware or software, or available components at an experimentally large scale.
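For readers who want a concrete handle on these terms, the sketch below models the two divisions and four categories as plain Python enumerations. This is purely illustrative; the class and member names are invented for this example and are not part of any MLPerf schema or tooling.

    from enum import Enum

    class Division(Enum):
        """The two MLPerf divisions (illustrative names, not an official schema)."""
        CLOSED = "Closed"  # requires the same model as the reference implementation
        OPEN = "Open"      # allows any ML approach that reaches the target quality

    class Category(Enum):
        """The four availability-based categories (illustrative names)."""
        AVAILABLE_IN_CLOUD = "Available In Cloud"      # rentable in the cloud
        AVAILABLE_ON_PREMISE = "Available On Premise"  # only purchasable components
        PREVIEW = "Preview"                            # must be Available in the next submission round
        RESEARCH = "Research"                          # experimental hardware/software or experimental scale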
Each row in the results table is a set of results produced by a single submitter using the same software stack and hardware platform. Each Closed division row contains the following information (a code sketch of the row structure follows this list):
  • Submitter: The organization that submitted the results.
  • System: General system description.
  • Benchmark Results: Results for each benchmark and scenario: latency for the single-stream scenario, number of supported streams for the multiple-stream scenario, queries per second (QPS) for the server scenario, and throughput for the offline scenario.
  • Processor and count: The type and number of CPUs used, if CPUs perform the majority of ML compute.
  • Accelerator and count: The type and number of accelerators used, if accelerators perform the majority of ML compute.
  • Software: The ML framework and primary ML hardware library used.
  • Form factor: Submitter-declared form factor(s) of the system or its intended use: mobile, desktop, server, or embedded.
  • Details: Link to metadata for the submission.
  • Code: Link to the code for the submission.
  • Notes: Arbitrary notes from the submitter.
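The Closed-division row described above can be pictured as a simple record. The following is a minimal sketch using invented class and field names (it is not an official MLPerf data model); the benchmark-result metrics mirror the scenario metrics listed above.

    from dataclasses import dataclass, field
    from typing import Dict, Optional, Tuple

    @dataclass
    class ClosedDivisionRow:
        """Illustrative shape of one Closed-division results row (not an official schema)."""
        submitter: str                     # organization that submitted the results
        system: str                        # general system description
        # One metric per (benchmark, scenario) pair: latency for single stream,
        # number of streams for multiple stream, QPS for server, throughput for offline.
        benchmark_results: Dict[Tuple[str, str], float] = field(default_factory=dict)
        processor: Optional[str] = None    # CPU type, if CPUs perform most of the ML compute
        processor_count: int = 0
        accelerator: Optional[str] = None  # accelerator type, if accelerators perform most of the ML compute
        accelerator_count: int = 0
        software: str = ""                 # ML framework and primary ML hardware library
        form_factor: str = ""              # mobile, desktop, server, or embedded
        details_url: str = ""              # link to submission metadata
        code_url: str = ""                 # link to submission code
        notes: str = ""                    # arbitrary submitter notes

The meaning of each number in benchmark_results depends on the scenario, as described in the Benchmark Results bullet above.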
Each Open division row adds the following information (sketched in code after this list):
  • Model used: The model used to produce the results, which may or may not match the Closed division requirement.
  • Accuracy: The accuracy achieved, which may be lower or higher than the Closed division requirement.
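Continuing the sketch above, an Open-division row carries the same fields plus the two additions just listed. Again, the names are invented for illustration only.

    from dataclasses import dataclass

    @dataclass
    class OpenDivisionExtras:
        """Extra fields an Open-division row adds (illustrative, not an official schema)."""
        model_used: str   # model used to produce the results; may differ from the Closed-division model
        accuracy: float   # accuracy achieved; may be lower or higher than the Closed-division requirement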