Your quick reference for building multimodal conversational experiences with Alan AI. Covers script concepts, client API, handlers, and best practices.

Basic Intent

intent('hello world', p => { ... });

Matches simple phrases like ‘hello world’.
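Filled in, the basic intent might look like this (the reply text is illustrative); this runs inside an Alan AI Studio dialog script, not in the browser:

```javascript
// Matches the exact phrase "hello world"
intent('hello world', p => {
    p.play('Hello! How can I help you today?');
});
```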
Intent with Wildcards
Matches phrases like ‘I want coffee’, ‘I want pizza’, etc.
Intent with Entities
Matches ‘I want coffee’ and captures ‘coffee’ as an entity ‘item’.
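A sketch using Alan's user-defined slot syntax (the slot values listed here are examples):

```javascript
// $(ITEM ...) defines a slot; the matched value is exposed on the handler param
intent('I want $(ITEM coffee|tea|pizza)', p => {
    p.play(`One ${p.ITEM.value}, coming right up!`);
});
```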
Multiple Phrases
Matches either phrase.
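Alternatives can be expressed inside a single pattern; a sketch:

```javascript
// Either phrase triggers the same handler
intent('(hello world|hi there)', p => {
    p.play('Hi!');
});
```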
Contextual Intent
Only matches ‘yes’ when in the ‘confirm_order’ context.
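A sketch of a context-scoped ‘yes’, assuming a context defined with context() and entered with p.then() (names and replies are illustrative):

```javascript
const confirmOrder = context(() => {
    // Only reachable while the confirmOrder context is active
    intent('yes', p => {
        p.play('Great, placing your order now.');
        p.resolve(); // leave the context
    });
});

intent('order a pizza', p => {
    p.play('You want a pizza, correct?');
    p.then(confirmOrder); // activate the context for the next turn
});
```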
Command (No NLU)
Triggered directly from the client (for example, through the project API) rather than by a spoken phrase.
Capturing Entity Value
Inside the intent handler, access the recognized value of the ‘item’ entity (for example, p.ITEM.value).
Responding with Text
Alan says the text.
Responding with Sound/SSML
Plays a sound file; Alan also supports SSML markup.
Chaining Plays
Plays sequentially.
Playing from List
Randomly picks one response.
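The response patterns above, sketched together (passing an array for a random pick is an assumption about p.play; successive calls definitely queue sequentially):

```javascript
intent('tell me something', p => {
    p.play('Here we go.');                          // plain text
    p.play('And one more thing.');                  // successive plays are queued sequentially
    p.play(['Sure!', 'Of course!', 'You got it!']); // one is picked at random (assumed)
});
```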
Setting Visual State
Updates the visual state on the client application.
Calling Client Function
Triggers a handler (the client's onCommand callback) on the client side.
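Commands sent from the script arrive in the client's onCommand handler; a sketch from the script side (the command name and payload are illustrative):

```javascript
// Dialog script (Alan AI Studio)
intent('open the cart', p => {
    p.play({ command: 'navigate', screen: 'cart' }); // delivered to the client
    p.play('Opening your cart.');
});
```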
Adding Follow-up Question
Sets up a follow-up intent block.
Ending Conversation
Alan finishes speaking and resolves the current intent processing.
Handling No Match Use the fallback() handler to respond when no intent matches the user's utterance.
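A minimal no-match sketch using fallback() with a reply string (a handler function can also be used):

```javascript
// Spoken when no intent pattern matches the utterance
fallback('Sorry, I did not catch that. Try asking for help.');
```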
Best Practice: Be Specific Define specific intents before broad or fallback ones to avoid misinterpretation.
Best Practice: User Testing Test with real users speaking naturally to refine your intents.
Basic Follow-up
Sets up a temporary context for the next user utterance (the follow-up is only active for the next turn).
Named Contexts
Use context() to define a named, reusable block of intents that only matches while that context is active.
Entering Context on Match
The p.then() call activates the target context when the intent matches.
Context Lifecycle
A context is activated when entered (for example, via p.then()), stays active while the conversation remains within it, and is deactivated when resolved or replaced.
Tips for Contexts
Keep contexts small and focused, give each one a clear exit, and avoid nesting contexts more deeply than necessary.
Avoiding Context Loops Ensure there are paths to exit contexts, either by matching a specific intent within the context that calls p.resolve(), or by handling phrases that should leave the context.
Accessing Context Name
The name of the active context can be checked while debugging to confirm the dialog is where you expect.
Debug Tool: Context View Use the Alan AI Studio Debugger’s ‘Contexts’ tab to see which contexts are active and how they change during a conversation. |
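Putting the context pieces together: a follow-up defined with follow() inside a context, with an explicit exit (a sketch; names and phrasing are illustrative):

```javascript
const sizeContext = context(() => {
    // follow() handlers are only matched while this context is active
    follow('(small|medium|large)', p => {
        p.play('Got it.');
        p.resolve(); // exit the context so normal intents match again
    });
});

intent('order a coffee', p => {
    p.play('What size would you like?');
    p.then(sizeContext);
});
```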
p.play(): Makes Alan say the given text.
follow(): Sets up a follow-up intent handler for the next turn.
p.then(): Sets the active context.
p.resolve(): Indicates that intent handling is complete for this turn.
Dynamically adds new script code. Use with caution.
Calls a method on the client side via the client's onCommand handler.
Updates the visual state object on the client side.
The raw text recognized by the ASR/NLU engine.
An array of recognized tokens.
The name of the matched intent.
An object containing recognized entities and their values.
An object containing recognized slots and their values.
p.userData: An object to store and retrieve user-specific data across turns. Persists across sessions if enabled.
Similar to the user data store, but scoped to the current session.
Selects a random element from the given array.
console.log(): Logs a message to the Alan AI Studio Debugger console.
Initializing Alan Button
Replace the key placeholder with your project key from Alan AI Studio.
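A sketch of a web-client setup with the Alan Web SDK (the key is a placeholder, and the command name is illustrative):

```javascript
// Requires the Alan Web SDK (e.g. the @alan-ai/alan-sdk-web package or script tag)
const alanBtnInstance = alanBtn({
    key: 'YOUR_PROJECT_KEY_FROM_ALAN_STUDIO',
    onCommand: (commandData) => {
        // Commands sent from the script with p.play({command: ...}) land here
        if (commandData.command === 'navigate') {
            // e.g. route to commandData.screen
        }
    },
});
```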
Key Handlers
The alanBtn() options include onCommand, which receives commands sent from the script, and onEvent, which receives button and recognition events.
Sending Commands to Script Use the callProjectApi() method to call functions defined on the script's projectAPI object.
This is useful for triggering script logic from UI actions.
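A sketch of the client-to-script round trip (the method name and payload are illustrative):

```javascript
// Dialog script side: define a project API method
projectAPI.setUserName = (p, param, callback) => {
    p.userData.name = param.name;
    callback();
};

// Client side: invoke it from UI code
alanBtnInstance.callProjectApi('setUserName', { name: 'Ada' }, (err) => {
    if (err) console.error(err);
});
```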
Setting Visual State from Client While dialog logic lives primarily in the script, the client shares its UI state using setVisualState().
This updates the visual state object that the script can read (for example, as p.visual).
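A sketch of the visual-state round trip (the state fields are illustrative):

```javascript
// Client side: tell the script what the user is looking at
alanBtnInstance.setVisualState({ screen: 'cart', itemsInCart: 3 });

// Dialog script side: read it when handling an intent
intent('what is on my screen', p => {
    p.play(`You are on the ${p.visual.screen} screen.`);
});
```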
Sending Text to Alan Allows sending text to Alan as if the user spoke it:
Useful for initial prompts or integrating with chat interfaces.
Activating/Deactivating Alan Control the button's listening state programmatically with activate() and deactivate().
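A sketch using the button instance's activation methods:

```javascript
// Programmatically start listening (e.g. from a custom UI control)
alanBtnInstance.activate();

// ...and stop listening
alanBtnInstance.deactivate();
```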
Working with Visual State The visual state describes what the user currently sees, so the script can tailor its responses to it.
Best Practice: Design your visual state carefully to represent relevant UI information the script might need.
Troubleshooting Client Integration
Check that the project key is correct and the SDK script has loaded, and watch both the browser console and the Alan AI Studio logs for errors.
Global Variables Declare variables outside of intent handlers to share data between handlers.
Global variables persist throughout the script's lifecycle on the server.
User Data (p.userData) Use p.userData to store user-specific data that persists across turns and, if enabled, across sessions.
Session State Use session-scoped state for data that should live only for the current dialog and reset when the conversation ends.
Choosing Data Storage
Use global variables for data shared by all users, p.userData for per-user data, and session state for data that should reset with each conversation.
Working with External APIs Use an HTTP client (the axios module is available in dialog scripts) to call external services from the script.
Remember to configure API keys securely in project settings.
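A sketch of an external call from the dialog script, assuming axios and async handlers are available (the endpoint and response fields are placeholders); fetch first, then speak:

```javascript
intent('what is the weather', async p => {
    try {
        const res = await axios.get('https://api.example.com/weather'); // placeholder URL
        p.play(`It is ${res.data.temperature} degrees right now.`);
    } catch (e) {
        console.error(e);
        p.play('Sorry, I could not reach the weather service.');
    }
});
```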
Asynchronous Operations External calls are asynchronous: fetch the data first, then respond with p.play() via a callback, promise, or async/await.
Debugging Tips
Use console.log() to trace script execution (output appears in the Studio Debugger), test phrasings in the Studio's debugging chat, and watch which contexts are active as you talk.
Error Handling Wrap API calls and other potentially failing operations in try/catch blocks, and give the user a spoken fallback when something fails.
Best Practice: Modularize Script Break down large scripts into smaller, manageable functions or files where possible (Alan AI Studio projects can contain multiple script files).
Best Practice: Keep Responses Concise Alan AI works best for short, direct interactions. Avoid long monologues; use the visual interface to display detailed information.
Best Practice: Progressive Disclosure Don't ask for too much information at once. Use follow-ups and contexts to gather the necessary details step by step.
Best Practice: Provide Help/Examples Include intents for ‘help’ or ‘what can I say’ that give examples of valid commands, especially within specific contexts.
Best Practice: Multimodal Design Always consider visual feedback alongside the voice response. Use commands and visual state to keep the UI in sync with the dialog.
Best Practice: Testing Edge Cases Test how your voice assistant handles unexpected inputs, misrecognitions, and errors from external services.