Implement generateText built-in
#2010
Conversation
1 issue found across 9 files
Prompt for AI agents (1 issue)
Understand the root cause of the following issue and fix it.
<file name="packages/runner/src/builtins/llm.ts">
<violation number="1" location="packages/runner/src/builtins/llm.ts:331">
If generateText hits this catch path, previousCallHash still equals the request hash. The next run sees the guard at the top and bails, so the same prompt can never retry after a failure until the input changes. Please reset previousCallHash (or otherwise clear the guard) when handling errors.</violation>
</file>
React with 👍 or 👎 to teach cubic. Mention @cubic-dev-ai to give feedback, ask questions, or re-run the review.
.catch(async (error) => {
  if (thisRun !== currentRun) return;

  console.error("Error generating text", error);
If generateText hits this catch path, previousCallHash still equals the request hash. The next run sees the guard at the top and bails, so the same prompt can never retry after a failure until the input changes. Please reset previousCallHash (or otherwise clear the guard) when handling errors.
Prompt for AI agents
Address the following comment on packages/runner/src/builtins/llm.ts at line 331:
<comment>If generateText hits this catch path, previousCallHash still equals the request hash. The next run sees the guard at the top and bails, so the same prompt can never retry after a failure until the input changes. Please reset previousCallHash (or otherwise clear the guard) when handling errors.</comment>
<file context>
@@ -182,6 +208,143 @@ export function llm(
+ .catch(async (error) => {
+ if (thisRun !== currentRun) return;
+
+ console.error("Error generating text", error);
+
+ await runtime.idle();
</file context>
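The fix the reviewer asks for can be sketched as follows. This is a minimal, self-contained approximation, not the actual runner code: `previousCallHash`, the `thisRun`/`currentRun` check, and the hash guard come from the review above, while the `LLMCall` type, the `runGenerateText` wrapper, and the hash stand-in are hypothetical.

```typescript
// Minimal sketch of the retry-after-failure fix. The guard skips a run when
// the request hash matches the previous call; on error the stored hash must
// be cleared so the same prompt can be retried.
type LLMCall = (prompt: string) => Promise<string>;

let previousCallHash: string | undefined;

async function runGenerateText(
  prompt: string,
  call: LLMCall,
): Promise<string | undefined> {
  const requestHash = `hash:${prompt}`; // stand-in for the real request hash
  if (previousCallHash === requestHash) return undefined; // guard: already ran
  previousCallHash = requestHash;
  try {
    return await call(prompt);
  } catch (error) {
    console.error("Error generating text", error);
    // The suggested fix: reset the guard so a later run with the same
    // prompt is not suppressed by the stale hash.
    previousCallHash = undefined;
    return undefined;
  }
}
```

Without the reset in the `catch`, the second call with the same prompt would hit the guard and silently bail.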
Probably only useful for me... for now
1 issue found across 9 files
Prompt for AI agents (1 issue)
Understand the root cause of the following issue and fix it.
<file name="packages/runner/src/builtins/llm.ts">
<violation number="1" location="packages/runner/src/builtins/llm.ts:273">
If `prompt` is empty we return before clearing `result`/`partial`, so the previous LLM output sticks around after the prompt is cleared. Please reset those cells before this early return to avoid stale data.</violation>
</file>
inputsCell.getAsQueryResult([], tx) ?? {};

// If no prompt is provided, don't make a request
if (!prompt) {
If prompt is empty we return before clearing result/partial, so the previous LLM output sticks around after the prompt is cleared. Please reset those cells before this early return to avoid stale data.
Prompt for AI agents
Address the following comment on packages/runner/src/builtins/llm.ts at line 273:
<comment>If `prompt` is empty we return before clearing `result`/`partial`, so the previous LLM output sticks around after the prompt is cleared. Please reset those cells before this early return to avoid stale data.</comment>
<file context>
@@ -182,6 +208,143 @@ export function llm(
+ inputsCell.getAsQueryResult([], tx) ?? {};
+
+ // If no prompt is provided, don't make a request
+ if (!prompt) {
+ pendingWithLog.set(false);
+ return;
</file context>
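The stale-output fix can be sketched like this. It is an approximation under stated assumptions: the names `result`, `partial`, and `pendingWithLog` come from the review comment, but the `Cell` interface, `makeCell` helper, and `startRequest` wrapper are hypothetical stand-ins for the runner's real cell machinery.

```typescript
// Minimal sketch: clear previous LLM output before the empty-prompt early
// return, so stale text does not linger after the prompt is cleared.
interface Cell<T> {
  set(value: T): void;
  get(): T;
}

// Tiny in-memory cell, standing in for the runner's reactive cells.
function makeCell<T>(initial: T): Cell<T> {
  let value = initial;
  return {
    set(v: T) {
      value = v;
    },
    get: () => value,
  };
}

function startRequest(
  prompt: string | undefined,
  pendingWithLog: Cell<boolean>,
  result: Cell<string | undefined>,
  partial: Cell<string | undefined>,
): boolean {
  // If no prompt is provided, don't make a request...
  if (!prompt) {
    pendingWithLog.set(false);
    // ...but reset the output cells first, per the review comment, so the
    // previous run's text is not left visible.
    result.set(undefined);
    partial.set(undefined);
    return false;
  }
  return true; // caller proceeds with the LLM request
}
```

The key point is that the resets happen before the early return; leaving them only on the request path is what produced the stale data.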
Force-pushed from e40a4b6 to e8d3671 (Compare)
generateText built-in
note.tsx
mise.toml to specify deno version

Summary by cubic
Adds a generateText built-in for simple prompt-to-text LLM calls with streaming partial results. Wires it through the API, runner, and updates the Note pattern; addresses Linear CT-947.
New Features
Refactors and Fixes
Written for commit 5a6694a. Summary will update automatically on new commits.