Understanding the 3-Layer Structure
Keywords AI parameters are organized into three distinct layers, each serving a specific purpose in your LLM observability stack:

- Layer 1: Required fields - Essential data for basic logging
- Layer 2: Telemetry - Performance and cost metrics
- Layer 3: Metadata - Custom tracking and identification
Layer 1: Required fields
These are the essential parameters needed for basic LLM request logging.

Core required fields
| Parameter | Type | Description | Required |
|---|---|---|---|
| model | string | The LLM model used | ✅ |
| prompt_messages | array | Input messages to the model | ✅ |
| completion_message | object | Model’s response message | ✅ |
Basic implementation
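A minimal sketch of a Layer 1 log payload containing only the three required fields. The model name and message contents are illustrative, and the endpoint URL and header shown in the trailing comment are assumptions, not confirmed values from this document:

```python
# Layer 1 payload: only the required fields for basic logging.
payload = {
    "model": "gpt-4o",  # the LLM model used (illustrative value)
    "prompt_messages": [
        {"role": "user", "content": "What is LLM observability?"}
    ],
    "completion_message": {
        "role": "assistant",
        "content": "LLM observability is the practice of monitoring...",
    },
}

# To submit the log you would POST this payload to the Keywords AI
# logging endpoint with your API key, roughly (assumed URL and header):
#
#   import requests
#   requests.post(
#       "https://api.keywordsai.co/api/request-logs/create",  # assumed
#       headers={"Authorization": "Bearer YOUR_KEYWORDSAI_API_KEY"},
#       json=payload,
#   )
```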
Message structure
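A sketch of the message structure as it would appear in a log payload. The roles follow the standard chat schema (`system`, `user`, `assistant`); all contents are illustrative:

```python
# prompt_messages: an array of chat messages, each with a role and content.
prompt_messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize this article."},
]

# completion_message: a single message object holding the model's response.
completion_message = {
    "role": "assistant",
    "content": "Here is a summary of the article...",
}
```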
Prompt messages form an array of chat messages, each with a `role` and a `content` field; `completion_message` is a single message object from the model.

Layer 2: Telemetry
Performance metrics and cost tracking for monitoring LLM efficiency.

Telemetry parameters
| Parameter | Type | Description | Unit |
|---|---|---|---|
| prompt_tokens | integer | Number of tokens in prompt | tokens |
| completion_tokens | integer | Number of tokens in completion | tokens |
| cost | float | Cost of the request | USD |
| latency | float | Total request latency | seconds |
| ttft | float | Time to first token | seconds |
| generation_time | float | Time to generate response | seconds |
Implementation with telemetry
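A sketch of a payload extended with Layer 2 telemetry. All numbers are illustrative; units follow the table above (token counts as integers, cost in USD, timings in seconds):

```python
# Required fields plus Layer 2 telemetry metrics.
payload = {
    "model": "gpt-4o",  # illustrative model name
    "prompt_messages": [{"role": "user", "content": "Hello"}],
    "completion_message": {"role": "assistant", "content": "Hi there!"},
    # Layer 2: telemetry
    "prompt_tokens": 12,        # tokens in the prompt
    "completion_tokens": 48,    # tokens in the completion
    "cost": 0.00081,            # request cost in USD
    "latency": 1.42,            # total request latency, seconds
    "ttft": 0.31,               # time to first token, seconds
    "generation_time": 1.11,    # time spent generating, seconds
}
```

Note that `ttft` plus `generation_time` should roughly account for the total `latency`, which is a useful sanity check when instrumenting your own timers.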
Layer 3: Metadata
Custom tracking and identification parameters for advanced analytics and filtering.

Metadata parameters
| Parameter | Type | Description | Purpose |
|---|---|---|---|
| metadata | object | General metadata | Custom properties |
| customer_params | object | Customer information | User tracking |
| group_identifier | string | Group/organization ID | Group analytics |
| thread_identifier | string | Conversation thread ID | Thread tracking |
| custom_identifier | string | Custom tracking ID | Custom analytics |
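A sketch of a payload carrying Layer 3 metadata alongside the required fields. All identifier values and the keys inside `metadata` and `customer_params` are hypothetical, chosen only to illustrate each parameter's purpose from the table above:

```python
# Required fields plus Layer 3 metadata for analytics and filtering.
payload = {
    "model": "gpt-4o",
    "prompt_messages": [{"role": "user", "content": "Hi"}],
    "completion_message": {"role": "assistant", "content": "Hello!"},
    # Layer 3: metadata (all keys and values below are illustrative)
    "metadata": {"feature": "chat", "env": "production"},
    "customer_params": {"customer_identifier": "user_123"},  # assumed inner key
    "group_identifier": "org_acme",        # group/organization analytics
    "thread_identifier": "thread_42",      # ties logs to one conversation
    "custom_identifier": "ab_test_variant_b",  # custom analytics dimension
}
```

Metadata fields are optional, so you can start with Layer 1 logging and add identifiers incrementally as your filtering needs grow.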