Ready-to-use models, queries, and semantic layers for Aampe's shared data tables. Works with BigQuery, Snowflake, and Databricks.
| What | Where | What it does |
|---|---|---|
| dbt package | `dbt/` | Staging models flatten nested structs (copy assignments, propensities, timing). Mart models aggregate reward lift, daily summaries, contact-level views, and label convergence. Cross-dialect macros handle BigQuery/Snowflake/Databricks differences. |
| SQL cookbook | `cookbook/` | 12 copy-pasteable recipes: propensity ranking, reward trends, channel comparison, agent vs. controlled analysis, diversity, timing, label convergence, label trends, co-occurrence, messaging cadence, learning velocity, and event attribution. |
| Semantic layer | `airlayer/` | Airlayer view definitions with dimensions and measures for message events, labels, contact profiles, and propensities. Includes a reward lift motif and a weekly channel review query. |
| Schema & domain docs | `docs/` | Field-level schema reference, domain concepts (agents, labels, Beta posteriors, reward functions), and FAQ. |
| Agent context | `AGENTS.md`, `llms.txt` | Schema and domain knowledge formatted for AI coding agents (Claude Code, Cursor, Copilot, etc.). |
| Hex guides | `hex/guides/` | Schema guide, query patterns, and domain glossary for Hex Context Studio. |
| Snowflake Cortex | `snowflake/` | Cortex Analyst semantic model and sample natural-language questions. |
| Sample data | `sample_data/` | Synthetic parquet files from a fictional food delivery app (FeastFleet). Load into DuckDB or pandas to explore without warehouse credentials. |
- Clone this repo
- Edit `dbt/models/staging/_stg_aampe__sources.yml`, replacing `YOUR_DATABASE` and `YOUR_SCHEMA` with where the shared tables live in your warehouse
- Run `cd dbt && dbt run`
This gives you flattened staging tables and pre-built aggregations. See the dbt model docs for what each model produces.
Open any recipe in `cookbook/` and run the SQL directly against your shared tables. Set your default schema first:

```sql
-- BigQuery: select project + dataset in the console, or qualify table names
-- Snowflake: USE DATABASE your_db; USE SCHEMA your_schema;
-- Databricks: USE CATALOG your_catalog; USE SCHEMA your_schema;
```

See the cookbook README for the full recipe list.
Add the files in `hex/guides/` as workspace context in Hex's Context Studio. You can also add `AGENTS.md` or any of the `docs/` files — Context Studio accepts any markdown. The Hex AI agent will then understand the Aampe schema, common query patterns, and domain terminology when helping you build notebooks.
Copy `AGENTS.md` or `llms.txt` into your agent's context to give it full schema and domain knowledge for writing queries against the shared tables.
| Table | Grain | Description |
|---|---|---|
| `AAMPE_CONTACT_PROFILES` | One row per contact | Propensity scores across 5 label dimensions (offering, value proposition, tone, timing, channel). Updated daily. |
| `AAMPE_MESSAGE_EVENTS` | One row per delivered message | Copy label assignments (with Beta posterior parameters), reward metrics, timing/channel decisions, message content. |
| `AAMPE_MESSAGE_ATTEMPTS` | One row per queued message | All messages including delivery failures. Use for delivery rate analysis. |
| `AAMPE_CONTACT_EVENTS_PARTIAL` | One row per contact event | Aampe-specific events (`aampe_*` prefix) with timestamps and JSON metadata. |
For full column-level documentation, see `docs/schema_reference.md`. For domain concepts (how propensities work, what the Beta parameters mean, how reward is calculated), see `docs/domain_concepts.md`.
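As an illustrative sketch (not Aampe's exact implementation; function and variable names are hypothetical), here is how a Beta posterior summarizes the evidence behind a copy label, as described in the domain docs:

```python
def beta_posterior(successes: int, trials: int,
                   alpha_prior: float = 1.0, beta_prior: float = 1.0):
    """Update a Beta(alpha, beta) prior with binomial evidence."""
    alpha = alpha_prior + successes
    beta = beta_prior + (trials - successes)
    return alpha, beta

def posterior_mean(alpha: float, beta: float) -> float:
    # Expected success rate under a Beta(alpha, beta) distribution
    return alpha / (alpha + beta)

# Example: a label rewarded on 7 of 20 delivered messages,
# starting from a uniform Beta(1, 1) prior
a, b = beta_posterior(successes=7, trials=20)
print(a, b, round(posterior_mean(a, b), 4))  # 8.0 14.0 0.3636
```

The posterior parameters stored alongside each label assignment play this role: larger `alpha + beta` means more evidence, and the mean is the agent's current estimate of the label's success rate.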
This repo is organized by integration target — each top-level directory is a self-contained module for a specific tool or platform. To add support for a new tool (e.g., Cube, Looker, Metabase):
- Create a new top-level directory named after the tool (e.g., `cube/`)
- Use `docs/schema_reference.md` as the source of truth for table and column definitions
- Add a row to the "What's in here" table above
- Follow the authoring guidelines: direct, precise, no marketing language, copy-pasteable examples
For new cookbook recipes, follow the numbering convention (`09_descriptive_name.md`) and include Snowflake/Databricks variants where syntax differs. See the cookbook README for the pattern.
- Data share overview — How data shares work and how to set one up
- How agents work — Per-user agents and what they optimize
- Labels — What labels are and how agents learn from them
- Reward functions — How Aampe measures message success