Thank you for creating and maintaining this awesome project!
OpenAI recently introduced a seed parameter that makes text generation and chat completion outputs (more) reproducible (see https://cookbook.openai.com/examples/reproducible_outputs_with_the_seed_parameter).
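For context, this is roughly how the parameter is used with the official openai Python client (the model name and seed value below are just illustrative):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Using the same seed together with an identical prompt, model, and parameters
# makes the sampled output reproducible on a best-effort basis.
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Classify: 'Great product!'"}],
    temperature=0.0,
    seed=42,
)
print(completion.choices[0].message.content)
# system_fingerprint can be logged to detect backend changes that break determinism.
print(completion.system_fingerprint)
```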
I think it would be great if you could enable users of your package to control this parameter when using OpenAI models as a backend (i.e., in the estimators under https://github.com/iryna-kondr/scikit-llm/tree/main/skllm/models/gpt).
The seed parameter could be hard-coded at https://github.com/iryna-kondr/scikit-llm/blob/0bdea940fd369cdd5c5a0e625d3eea8f2b512208/skllm/llm/gpt/clients/openai/completion.py#L50, similar to how temperature=0.0 is already set there.
Alternatively, users could pass seed=<SEED> via **kwargs.
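To illustrate the second option, here is a rough sketch of how a seed could be forwarded to the OpenAI client from the linked completion helper. The function name and signature below are my assumptions for illustration only, not the actual scikit-llm code:

```python
# Hypothetical sketch of skllm/llm/gpt/clients/openai/completion.py;
# the helper's real name and arguments may differ.
def get_chat_completion(messages, client, model, seed=None, **kwargs):
    """Run a chat completion, optionally with a fixed seed for reproducibility."""
    params = {
        "model": model,
        "messages": messages,
        "temperature": 0.0,  # already hard-coded today
        **kwargs,
    }
    if seed is not None:
        params["seed"] = seed  # forwarded only when the user requests it
    return client.chat.completions.create(**params)
```

The estimators in skllm/models/gpt could then expose seed (or accept it via **kwargs) and pass it down to this helper, so existing code without a seed keeps its current behavior.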