Problem statement
Purpose-specific triggers and conditions have been in Home Assistant Labs long enough to collect meaningful feedback and validate the direction. The feature is mature enough to ship to all users, but we have no established process for graduating a feature from Labs. This is the first time we've done this, which means two things need to happen in parallel: we need to decide what "out of labs" actually means and how we handle it operationally, and we need to execute that for this specific feature.
There's also a communication challenge. Moving something out of Labs could be read as us forcing a change on users who have had the feature available but may have deliberately not enabled it, or who tried it and didn't like it. Backlash is a real risk, even though Labs is explicitly a feedback and preview mechanism. We need to be clear and confident in our messaging.
Community signals
Not applicable.
Scope & Boundaries
In scope
- Defining what it means for a Labs feature to graduate, including any criteria or checklist we want to apply going forward.
- Deciding how the UI transition happens (e.g. removing the Labs toggle, auto-enabling for users who hadn't opted in, handling users who had opted out).
- Communicating the graduation to users, including release notes and any blog or community post.
Not in scope
- Further UX or functional changes to purpose-specific triggers and conditions as part of this effort. Those belong in their own issues.
- Rethinking the Labs concept or what features go into Labs in the future.
Foreseen solution
We define a lightweight Labs graduation checklist (likely a short internal doc or issue template) and apply it to purpose-specific triggers and conditions as the first real-world test. The checklist would cover things like: known blocking issues resolved, migration path for existing automations confirmed, release note drafted, and community communication prepared.
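If the checklist ends up as an issue template, one option is GitHub's issue forms syntax. A minimal sketch (the template name, labels, and exact wording here are hypothetical, not decided):

```yaml
# .github/ISSUE_TEMPLATE/labs-graduation.yml (hypothetical path/name)
name: Labs graduation checklist
description: Track graduation of a Labs feature to general availability
title: "[Labs graduation]: "
body:
  - type: checkboxes
    attributes:
      label: Graduation criteria
      options:
        - label: Known blocking issues resolved
        - label: Migration path for existing automations confirmed
        - label: Release note drafted
        - label: Community communication prepared
```

An internal doc with the same four items would work just as well; the template form mainly makes each graduation trackable as its own issue.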
The feature is then enabled for all users in a named release, the Labs toggle is removed, and we monitor feedback in the first week post-release.
Risks & open questions
- No precedent. Whatever we do here sets the bar for all future Labs graduations.
- What does "out of labs" mean to users? Some users may interpret it as the feature being locked in and unchangeable. We should be clear that graduating from Labs doesn't mean we stop iterating.
- Backlash from users who dislike the feature. Labs gave users the ability to opt in. Graduating means opting everyone in. Some users who deliberately avoided the feature, or who tried it and didn't like parts of it, may push back. We should be prepared with a clear rationale and hold the line.
- Is there any fallback needed, or do we graduate without a "disable" option from day one?
Appetite
Small, 1-2 weeks. The feature itself is done. The work here is process definition and communication, not engineering (aside from doing the actual rollout).
Execution issues
No response
Decision log