# Changelog

## 0.7.0-alpha.1 (2026-03-28)

Full Changelog: [v0.6.1-alpha.1...v0.7.0-alpha.1](https://github.com/llamastack/llama-stack-client-python/compare/v0.6.1-alpha.1...v0.7.0-alpha.1)

### ⚠ BREAKING CHANGES

* eliminate /files/{file_id} GET differences

### Features

* Add stream_options parameter support ([b4c2f15](https://github.com/llamastack/llama-stack-client-python/commit/b4c2f15b16872730a9c254b1b2dfc02aba223a71))
* eliminate /files/{file_id} GET differences ([1f28d73](https://github.com/llamastack/llama-stack-client-python/commit/1f28d730824b6cb721415985194c5f4567e42ea7))


### Bug Fixes

* **deps:** bump minimum typing-extensions version ([50ea4d7](https://github.com/llamastack/llama-stack-client-python/commit/50ea4d7fd98a86726f6825d911507b7fc96e2e60))
* **inference:** improve chat completions OpenAI conformance ([147b88b](https://github.com/llamastack/llama-stack-client-python/commit/147b88b44eb83bceb7cd6204cd79d8dafe8f8e7a))
* **pydantic:** do not pass `by_alias` unless set ([f6836f9](https://github.com/llamastack/llama-stack-client-python/commit/f6836f9dacef1b9b26774fcfaf82689ae00f374a))
* remove duplicate dataset_id parameter in append-rows endpoint ([d6a79d0](https://github.com/llamastack/llama-stack-client-python/commit/d6a79d0a830bad4e82b70d7ab9e007ebc16e0f05))
* sanitize endpoint path params ([9b288d5](https://github.com/llamastack/llama-stack-client-python/commit/9b288d553ae83860fbe1d8ee9352532ed04ddd9b))


### Chores

* **internal:** tweak CI branches ([1df7e26](https://github.com/llamastack/llama-stack-client-python/commit/1df7e2605e78572eccc53aa8db1e44d987106a9b))
* **internal:** update gitignore ([0e98cfd](https://github.com/llamastack/llama-stack-client-python/commit/0e98cfdcf7779ca24ef4dbd7e9e8d9c75fa2a751))
* **internal:** version bump ([f468096](https://github.com/llamastack/llama-stack-client-python/commit/f46809696ddf1f179cc26984facfcbb7f9264730))
* **tests:** bump steady to v0.19.4 ([f5ad8f8](https://github.com/llamastack/llama-stack-client-python/commit/f5ad8f801078d79c03ec7723cd64b1c9895def2d))
* **tests:** bump steady to v0.19.5 ([55689e1](https://github.com/llamastack/llama-stack-client-python/commit/55689e1ddee55d81efff681dbb3523b0ed09d658))


### Refactors

* remove fine_tuning API ([021bd5e](https://github.com/llamastack/llama-stack-client-python/commit/021bd5e6138574884befe6f20ba86ceeefee1767))
* remove tool_groups from public API and auto-register from provider specs ([c0df2dc](https://github.com/llamastack/llama-stack-client-python/commit/c0df2dcf9bb38600f73db746dc38d3277e74e7b9))
* rename rag-runtime provider to file-search ([94a14da](https://github.com/llamastack/llama-stack-client-python/commit/94a14dad88ed55d3f2baf1de8eb30ba529fb9818))
* **tests:** switch from prism to steady ([23d591c](https://github.com/llamastack/llama-stack-client-python/commit/23d591c70549c7f00b7be136a19893dbdd65f43c))

## 0.6.1-alpha.1 (2026-03-13)

Full Changelog: [v0.5.0-alpha.2...v0.6.1-alpha.1](https://github.com/llamastack/llama-stack-client-python/compare/v0.5.0-alpha.2...v0.6.1-alpha.1)