Add AI-powered documentation generation (core) #3302
mjwolf wants to merge 16 commits into elastic:main
Conversation
Co-authored-by: Jonathan Molinatto <jonathan.molinatto@gmail.com>
Co-authored-by: Quan Nguyen <quan.nguyen@elastic.co>
Co-authored-by: Cursor <cursoragent@cursor.com>
Add broader inclusive-language checks (disability-defining terms, gendered job titles, execute/abort variants), Git conflict marker detection, and new static style checks (banned words, latinisms, exclamation points, ellipses, version terms, article misuse). Extend style-rule prompts with wordiness, negation, and device-agnostic language guidance. Remove unused parallel generation helpers and trim the readme template. Made-with: Cursor
Move stylerules/ into prompts/ (now a dependency-free leaf package), relocate sectioninstructions.go to docagent/ (its only consumer), drop the _static/ subdirectory, compose AgentInstructions from FullFormattingRules to eliminate duplication, and extract shared validator JSON output suffix into prompts.ValidatorOutputSuffix. Made-with: Cursor
# Conflicts:
#	internal/docs/readme_test.go
⏳ Build in-progress, with failures
jsoriano left a comment:
Thanks for the work here!
For now I'm reviewing only the code outside internal/llmagent. Regarding the parts under cmd, I would try to keep them minimal, moving LLM-specific parts to internal packages. Also, try to reuse the existing documentation helpers, for consistency with other commands.
internal/stack/versions.go @elastic/ecosystem

# LLM based functionality owned by integration-experience
internal/llmagent/ @elastic/integration-experience
- LLM_PROVIDER / llm.provider: Provider name (only the Gemini provider is currently supported).
- Gemini: GOOGLE_API_KEY / llm.gemini.api_key, GEMINI_MODEL / llm.gemini.model, GEMINI_THINKING_BUDGET / llm.gemini.thinking_budget
We usually prefix environment variables supported by elastic-package with ELASTIC_PACKAGE_, using environment.WithElasticPackagePrefix. Maybe we don't need to do this for the provider-specific variables, but we should probably use it for LLM_PROVIDER.
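For illustration, the prefixed lookup the comment suggests could be sketched as follows; the `withElasticPackagePrefix` helper here is only a stand-in for the real `environment.WithElasticPackagePrefix`, whose exact signature may differ:

```go
package main

import (
	"fmt"
	"os"
)

// Stand-in for environment.WithElasticPackagePrefix: the real helper
// lives in elastic-package internals; this only mirrors the idea of
// namespacing all supported variables under ELASTIC_PACKAGE_.
func withElasticPackagePrefix(name string) string {
	return "ELASTIC_PACKAGE_" + name
}

func main() {
	// The provider variable would then be read with the standard prefix.
	os.Setenv(withElasticPackagePrefix("LLM_PROVIDER"), "gemini")
	fmt.Println(os.Getenv("ELASTIC_PACKAGE_LLM_PROVIDER")) // prints "gemini"
}
```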
}

// getLLMConfig returns provider, api key, model ID, and optional thinking budget.
func getLLMConfig(profile *profile.Profile) (provider, apiKey, modelID string, thinkingBudget *int32) {
Nit. Consider returning a struct instead of so many values.
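As a sketch, the four return values could be grouped into a single struct along these lines (the type and field names are hypothetical, not from the PR):

```go
package main

import "fmt"

// llmConfig groups the values getLLMConfig currently returns
// separately (names are hypothetical, not from the PR).
type llmConfig struct {
	Provider       string
	APIKey         string
	ModelID        string
	ThinkingBudget *int32 // optional: nil when not configured
}

func main() {
	budget := int32(1024)
	cfg := llmConfig{
		Provider:       "gemini",
		ModelID:        "gemini-1.5-pro",
		ThinkingBudget: &budget,
	}
	fmt.Printf("%s %s %d\n", cfg.Provider, cfg.ModelID, *cfg.ThinkingBudget)
}
```

A single struct return also makes it cheap to add fields later without touching every call site.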
}

// getLLMConfig returns provider, api key, model ID, and optional thinking budget.
func getLLMConfig(profile *profile.Profile) (provider, apiKey, modelID string, thinkingBudget *int32) {
Nit. Consider moving LLM-specific helpers to some internal package.
	provider = defaultProvider
}
if provider != defaultProvider {
	apiKey = getConfigValue(profile, "", "llm."+provider+".api_key", "")
profile.Config could be used directly here, right?
- apiKey = getConfigValue(profile, "", "llm."+provider+".api_key", "")
+ apiKey = profile.Config("llm."+provider+".api_key", "")
FieldsCacheName = "fields"
KibanaConfigCacheName = "kibana_config"

llm = "llm_config"
Use the Dir suffix as in the other constants here. And the _config suffix may be redundant in this context.
- llm = "llm_config"
+ llmDir = "llm"
KibanaConfigCacheName = "kibana_config"

llm = "llm_config"
mcpJson = "mcp.json"
- mcpJson = "mcp.json"
+ mcpJsonName = "mcp.json"
// MCPJson returns the file location for the MCP server configuration
func (loc LocationManager) MCPJson() string {
	return filepath.Join(loc.LLMDir(), mcpJson)
}
Nit. Maybe we don't need to be so specific as to have a method for the mcp.json file. Other internal packages manage their files inside the directories given by locations.
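A sketch of the alternative: callers join the file name onto the directory the LocationManager already exposes, instead of going through a dedicated accessor (the path below is made up for illustration):

```go
package main

import (
	"fmt"
	"path/filepath"
)

// mcpPath joins the MCP config file name onto a directory, the way a
// caller could use the LLM directory exposed by LocationManager
// directly, without a dedicated MCPJson() method.
func mcpPath(llmDir string) string {
	return filepath.Join(llmDir, "mcp.json")
}

func main() {
	// Hypothetical profile directory, for illustration only.
	fmt.Println(mcpPath("/home/user/.elastic-package/llm"))
}
```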
* `stack.fleet_auto_install_task_interval` sets the interval for the Fleet auto-install content packages task.
  Supported in Kibana 9.2 and later. Defaults to "10m".

### AI-powered Documentation Configuration
Consider moving this section to a file under docs/howto.
.cursor/

batch_results/
The CI failures are not related; we have skipped them for now. You can update your branch to get a green build.
Add the `elastic-package update documentation` command that uses an LLM (Gemini) to generate package documentation via a section-based generate-validate workflow.

Core components: