Compiled AI: Deterministic Code Generation for LLM-Based Workflow Automation. This paper explores how Compiled AI generates deterministic code from LLMs for reliable, cost-efficient enterprise workflow automation, especially in healthcare. Commercial viability score: 8/10 in LLM Workflow Automation.
6mo ROI: 1-2x
3yr ROI: 10-25x
Automation tools have long sales cycles but high retention. Expect $5K MRR by 6mo, accelerating to $500K+ ARR at 3yr as enterprises adopt.
Geert Trooskens, Aaron Karlsberg, Anmol Sharma, Lamara De Brouwer (XY.AI Labs, Palo Alto, CA)
High Potential: 1/4 signals
Quick Build: 4/4 signals
Series A Potential: 2/4 signals
Sources used for this analysis:
arXiv Paper: full-text PDF analysis of the research paper
GitHub Repository: code availability, stars, and contributor activity
Citation Network: Semantic Scholar citations and co-citation patterns
Community Predictions: crowd-sourced unicorn probability assessments
Analysis model: GPT-4o · Last scored: 4/8/2026
This research introduces a cost-effective, efficient, and reliable method to automate workflows using LLMs by compiling them into deterministic code, addressing common issues of runtime variability and high inference costs.
The product could be a software tool that allows enterprises to specify workflows in YAML, automatically compiling them into deterministic code for scalable execution.
This approach replaces continual AI-driven workflow execution with a more stable and predictable system, potentially reducing reliance on cloud-based inference services.
There is significant demand in sectors like healthcare and finance for reliable, regulatory-compliant solutions that streamline administrative tasks without incurring high costs from continuous LLM usage.
This can be commercialized to automate compliance-heavy workflows in industries like healthcare, where auditability and cost efficiency are crucial.
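The auditability requirement mentioned above suggests one concrete mechanism: fingerprint the compiled workflow artifact so that operators can verify the deployed code matches an approved build before every run. This is a minimal sketch of that idea, not the paper's actual design; the function names, the placeholder source text, and the loading flow are all assumptions.

```python
import hashlib

def fingerprint(source: str) -> str:
    """SHA-256 hash of the compiled workflow's source text."""
    return hashlib.sha256(source.encode("utf-8")).hexdigest()

# Placeholder for a compiled workflow; in practice this would be the
# LLM-generated, human-reviewed source produced at compile time.
approved_source = "def claims_intake(record):\n    return record\n"
approved_hash = fingerprint(approved_source)  # recorded at approval time

def load_if_approved(source: str, expected_hash: str):
    """Refuse to load any workflow whose hash differs from the audited one."""
    if fingerprint(source) != expected_hash:
        raise RuntimeError("compiled workflow does not match the audited version")
    namespace = {}
    exec(source, namespace)
    return namespace["claims_intake"]

run = load_if_approved(approved_source, approved_hash)
```

Because the artifact is plain deterministic code rather than a live model call, the hash check gives regulators a stable object to audit: the same input always takes the same path through the same reviewed source.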
The paper presents an architecture that uses LLMs in a one-time compilation phase to generate executable code for workflows, enabling deterministic execution without repeated LLM calls. The approach focuses on compiling business logic into validated templates, ensuring auditability and reducing operational costs.
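The one-time compilation step described above can be sketched as follows. The YAML-derived spec, the step vocabulary (`check_fields`, `assign_queue`), and the emitted code shape are illustrative assumptions, not the paper's actual schema or templates; in practice the LLM would generate and validate this source once, and only the generated code runs thereafter.

```python
# Hypothetical workflow spec, as it might look after parsing a YAML file
# (e.g. with PyYAML); field names are illustrative, not the paper's schema.
spec = {
    "name": "claims_intake",
    "steps": [
        {"op": "check_fields", "args": {"required": ["patient_id", "code"]}},
        {"op": "assign_queue", "args": {"queue": "review"}},
    ],
}

def compile_workflow(spec):
    """One-time 'compilation': emit plain Python source that executes the
    steps deterministically, with no LLM calls at runtime."""
    lines = [f"def {spec['name']}(record):"]
    for step in spec["steps"]:
        if step["op"] == "check_fields":
            required = step["args"]["required"]
            lines.append(f"    missing = [f for f in {required!r} if f not in record]")
            lines.append("    if missing:")
            lines.append("        return {'status': 'rejected', 'missing': missing}")
        elif step["op"] == "assign_queue":
            lines.append(f"    record['queue'] = {step['args']['queue']!r}")
    lines.append("    return {'status': 'accepted', 'record': record}")
    return "\n".join(lines)

source = compile_workflow(spec)   # generated and validated once
namespace = {}
exec(source, namespace)           # load the compiled function
run_claims_intake = namespace["claims_intake"]
```

After compilation, every execution is an ordinary function call: identical inputs produce identical outputs, and the emitted source itself is the reviewable artifact.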
The method was evaluated on function-calling tasks and document processing, showing substantial token savings and execution efficiency compared to traditional LLM architectures. Benchmarks demonstrated successful task completion and high accuracy on security evaluations.
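The token-savings argument comes down to simple arithmetic: a runtime-LLM architecture pays prompt tokens on every execution, while the compiled approach pays once. A back-of-envelope sketch with purely illustrative numbers (none of these figures come from the paper):

```python
# Illustrative assumptions, not figures reported in the paper.
prompt_tokens_per_run = 2_000   # tokens an LLM-driven run would consume
runs_per_month = 100_000        # monthly workflow executions
compile_tokens = 50_000         # one-time LLM cost to compile the workflow

llm_runtime_total = prompt_tokens_per_run * runs_per_month  # paid every month
compiled_total = compile_tokens                             # paid once

print(llm_runtime_total // compiled_total)  # → 4000
```

Under these assumptions the compiled approach uses four thousand times fewer tokens in the first month alone, and the gap widens every month since the compiled artifact incurs no further inference cost.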
The system's effectiveness depends on the quality of the initial workflow specification and may not apply well to workflows requiring dynamic decision-making or creative input. Initial setup and specification may require considerable effort.