Benchmarking LLMs #3028

@compellingbytes

Description

Is there a way to benchmark runs, i.e., a readout for something like tokens/second?

I looked at the OpenVINO documentation and saw that there is a benchmark app, but when I tried to run it, it required a shape, I guess for the input tensor, which I didn't know how to supply.

Then there's the openvino_genai.PerfMetrics object, which I couldn't get to work, if it works at all. I'm not the best at object-oriented programming, but my understanding is that this is a separate object from openvino_genai.LLMPipeline, so I don't know whether it is actually recording metrics when something is run through the pipeline.
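For what it's worth, my reading of the docs (which I haven't been able to verify, so the openvino_genai calls in the comments below are assumptions on my part) is that generate() returns a result object that carries the performance metrics, and that a rough tokens/second can also be computed by hand from a token count and wall-clock time:

```python
def tokens_per_second(num_tokens: int, elapsed_seconds: float) -> float:
    """Plain tokens/second from a token count and wall-clock time."""
    if elapsed_seconds <= 0:
        raise ValueError("elapsed time must be positive")
    return num_tokens / elapsed_seconds

# Manual timing should work with any pipeline. Something like the
# following is what I had in mind (untested, and the openvino_genai
# API usage is my assumption from the docs; model_dir, prompt, and
# number_of_generated_tokens are placeholders):
#
#     import time
#     import openvino_genai
#     pipe = openvino_genai.LLMPipeline(model_dir, "CPU")
#     start = time.perf_counter()
#     result = pipe.generate(prompt, max_new_tokens=128)
#     elapsed = time.perf_counter() - start
#     print(tokens_per_second(number_of_generated_tokens, elapsed))
#
# My understanding is that the result object may also expose a
# perf_metrics attribute (a PerfMetrics) with built-in readouts such
# as a mean throughput, but again, I couldn't confirm this.

print(tokens_per_second(256, 2.0))  # 128.0 tokens/second
```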

I'm on a different machine than the one I was experimenting with LLMs on, but if you want to look at the code I wrote while trying to get the metrics to display, say so and I'll log in on that machine and paste it here.
