diff --git a/docs/source/_notebooks/explain_llm_logprobs.rst b/docs/source/_notebooks/explain_llm_logprobs.rst
index 812fbc5..9cdde71 100644
--- a/docs/source/_notebooks/explain_llm_logprobs.rst
+++ b/docs/source/_notebooks/explain_llm_logprobs.rst
@@ -669,3 +669,11 @@ Then you can explain predictions with a custom client:
 
 
 
+
+Consider checking other libraries which support explaining predictions
+of open source LLMs:
+
+- `optillm <https://github.com/codelion/optillm>`__, e.g. see
+  `codelion/optillm#168 <https://github.com/codelion/optillm/discussions/168#discussioncomment-12399569>`__
+- you can also visualize the outputs using the `logprobs
+  visualizer <https://huggingface.co/spaces/codelion/LogProbsVisualizer>`__
diff --git a/docs/source/libraries/openai.rst b/docs/source/libraries/openai.rst
index 7c87e1d..13771a1 100644
--- a/docs/source/libraries/openai.rst
+++ b/docs/source/libraries/openai.rst
@@ -57,11 +57,20 @@ you may call :func:`eli5.explain_prediction` with
 
 See the :ref:`tutorial ` for a more detailed usage example.
 
+Consider also checking other libraries which support explaining predictions of open source LLMs:
+
+- `optillm <https://github.com/codelion/optillm>`_, e.g. see
+  `codelion/optillm#168 <https://github.com/codelion/optillm/discussions/168#discussioncomment-12399569>`_
+- you can also visualize the outputs using the
+  `logprobs visualizer <https://huggingface.co/spaces/codelion/LogProbsVisualizer>`_
+
 .. note::
 
     While token probabilities reflect model uncertainty in many cases, they are not always
    indicative, e.g. in case of `Chain of Thought `_ preceding the final response.
+    See the :ref:`tutorial's limitations section `
+    for an example of that.
 
 .. note::
 
     Top-level :func:`eli5.explain_prediction` calls are dispatched
diff --git a/notebooks/explain_llm_logprobs.ipynb b/notebooks/explain_llm_logprobs.ipynb
index 5deb1ff..6a5a8ae 100644
--- a/notebooks/explain_llm_logprobs.ipynb
+++ b/notebooks/explain_llm_logprobs.ipynb
@@ -751,6 +751,17 @@
     "    model=\"mlx-community/Mistral-7B-Instruct-v0.3-4bit\",\n",
     ")"
    ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "cb6fce33-edb1-42ab-ac93-7bd4472075ca",
+   "metadata": {},
+   "source": [
+    "Consider checking other libraries which support explaining predictions of open source LLMs:\n",
+    "\n",
+    "- [optillm](https://github.com/codelion/optillm), e.g. see [codelion/optillm#168](https://github.com/codelion/optillm/discussions/168#discussioncomment-12399569)\n",
+    "- you can also visualize the outputs using the [logprobs visualizer](https://huggingface.co/spaces/codelion/LogProbsVisualizer)"
+   ]
  }
 ],
 "metadata": {
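Both the eli5 explanations and the logprobs visualizer linked in this diff work from the same raw material: per-token logprobs returned by OpenAI-compatible APIs. As background, here is a minimal sketch of turning those logprobs into the probabilities that such tools display; the token/logprob pairs below are made up for illustration, not real model output:

```python
import math

# Hypothetical per-token logprobs, shaped like the (token, logprob)
# pairs found in the `logprobs` field of a chat completion response.
token_logprobs = [("The", -0.01), ("answer", -0.35), ("is", -0.02), ("42", -1.20)]

def to_probabilities(pairs):
    """Convert per-token logprobs to probabilities in (0, 1]."""
    return [(token, math.exp(lp)) for token, lp in pairs]

for token, p in to_probabilities(token_logprobs):
    # Low-probability tokens are the ones worth inspecting first.
    flag = " <-- uncertain" if p < 0.5 else ""
    print(f"{token!r}: {p:.2f}{flag}")
```

Keep in mind the caveat from the note above: a confident-looking token probability is not always meaningful, e.g. when Chain of Thought precedes the final answer.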