```diff
 {
-    ".": "0.6.1-alpha.1"
+    ".": "0.7.0-alpha.1"
 }
```
```diff
 # Changelog
 
+## 0.7.0-alpha.1 (2026-03-28)
+
+Full Changelog: [v0.6.1-alpha.1...v0.7.0-alpha.1](https://github.com/llamastack/llama-stack-client-python/compare/v0.6.1-alpha.1...v0.7.0-alpha.1)
+
+### ⚠ BREAKING CHANGES
+
+* eliminate /files/{file_id} GET differences
+
+### Features
+
+* Add stream_options parameter support ([b4c2f15](https://github.com/llamastack/llama-stack-client-python/commit/b4c2f15b16872730a9c254b1b2dfc02aba223a71))
+* eliminate /files/{file_id} GET differences ([1f28d73](https://github.com/llamastack/llama-stack-client-python/commit/1f28d730824b6cb721415985194c5f4567e42ea7))
+
+
+### Bug Fixes
+
+* **deps:** bump minimum typing-extensions version ([50ea4d7](https://github.com/llamastack/llama-stack-client-python/commit/50ea4d7fd98a86726f6825d911507b7fc96e2e60))
+* **inference:** improve chat completions OpenAI conformance ([147b88b](https://github.com/llamastack/llama-stack-client-python/commit/147b88b44eb83bceb7cd6204cd79d8dafe8f8e7a))
+* **pydantic:** do not pass `by_alias` unless set ([f6836f9](https://github.com/llamastack/llama-stack-client-python/commit/f6836f9dacef1b9b26774fcfaf82689ae00f374a))
+* remove duplicate dataset_id parameter in append-rows endpoint ([d6a79d0](https://github.com/llamastack/llama-stack-client-python/commit/d6a79d0a830bad4e82b70d7ab9e007ebc16e0f05))
+* sanitize endpoint path params ([9b288d5](https://github.com/llamastack/llama-stack-client-python/commit/9b288d553ae83860fbe1d8ee9352532ed04ddd9b))
+
+
+### Chores
+
+* **internal:** tweak CI branches ([1df7e26](https://github.com/llamastack/llama-stack-client-python/commit/1df7e2605e78572eccc53aa8db1e44d987106a9b))
+* **internal:** version bump ([f468096](https://github.com/llamastack/llama-stack-client-python/commit/f46809696ddf1f179cc26984facfcbb7f9264730))
+
+
+### Refactors
+
+* remove fine_tuning API ([021bd5e](https://github.com/llamastack/llama-stack-client-python/commit/021bd5e6138574884befe6f20ba86ceeefee1767))
+* rename rag-runtime provider to file-search ([94a14da](https://github.com/llamastack/llama-stack-client-python/commit/94a14dad88ed55d3f2baf1de8eb30ba529fb9818))
+
 ## 0.6.1-alpha.1 (2026-03-13)
 
 Full Changelog: [v0.5.0-alpha.2...v0.6.1-alpha.1](https://github.com/llamastack/llama-stack-client-python/compare/v0.5.0-alpha.2...v0.6.1-alpha.1)
```
```diff
 [project]
 name = "llama_stack_client"
-version = "0.6.1-alpha.1"
+version = "0.7.0-alpha.1"
 description = "The official Python library for the llama-stack-client API"
 dynamic = ["readme"]
 license = "MIT"
```
```diff
 # File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.
 
 __title__ = "llama_stack_client"
-__version__ = "0.6.1-alpha.1"  # x-release-please-version
+__version__ = "0.7.0-alpha.1"  # x-release-please-version
```
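The bumped version strings above follow the SemVer prerelease format that release-please emits (`MAJOR.MINOR.PATCH-PRERELEASE`, e.g. `0.7.0-alpha.1`). A minimal sketch of splitting such a string into its components; the regex and `parse_version` helper are illustrative, not part of the library:

```python
import re

# SemVer core version plus an optional prerelease tag, e.g. "0.7.0-alpha.1"
SEMVER_RE = re.compile(r"^(\d+)\.(\d+)\.(\d+)(?:-([0-9A-Za-z.-]+))?$")

def parse_version(version: str):
    """Return (major, minor, patch, prerelease) for a version string."""
    m = SEMVER_RE.match(version)
    if m is None:
        raise ValueError(f"not a valid version string: {version!r}")
    major, minor, patch, prerelease = m.groups()
    return int(major), int(minor), int(patch), prerelease

print(parse_version("0.7.0-alpha.1"))  # (0, 7, 0, 'alpha.1')
```

A check like this can be useful in CI to confirm that the manifest, `pyproject.toml`, and `_version.py` all carry the same version after a release PR.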