phidea
Plain series · page 6 / 7

# 06 — Safe patterns


Five rules. Apply from day one → ~80% of the 20-risk list is handled.

1. **AI narrates; your nano owns the facts.** The LLM never sources numbers, prices, or clauses; those come from your database. The LLM only presents what your tools returned.
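A minimal sketch of this split, assuming a hypothetical product database and helper names (`get_price`, `answer` are illustrative, not from the original): the app fetches the authoritative figure, and the model's only job is to phrase it.

```python
# Sketch: the model formats facts the app fetched; it never invents them.
def get_price(product_code: str) -> dict:
    """Fetch the authoritative price from your own database (stubbed here)."""
    db = {"HOME-01": {"price_eur": 12.90, "source": "tariff_2024_v3"}}
    return db[product_code]

def answer(question: str, product_code: str) -> str:
    facts = get_price(product_code)
    # In a real system these facts go into the LLM prompt as the ONLY
    # permitted numbers; here we template the reply directly.
    return f"The monthly premium is EUR {facts['price_eur']:.2f} (source: {facts['source']})."
```

The key design choice: the number exists before the model is ever called, so the model cannot hallucinate it.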

2. **Every answer cites a source.** An IPID reference, a product code, or an article from a curated library. No source → no answer.
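The "no source → no answer" gate can be sketched as a wrapper around the reply path. The source IDs and function name below are hypothetical:

```python
# Sketch: refuse any answer that cannot cite at least one curated source.
CURATED_SOURCES = {"IPID-HOME-2024", "ART-CLAIMS-017"}

def guarded_answer(text: str, cited: list[str]) -> str:
    valid = [s for s in cited if s in CURATED_SOURCES]
    if not valid:
        # No recognised source: decline instead of answering.
        return "I can't answer that from my documentation."
    return f"{text} [sources: {', '.join(valid)}]"
```

The gate runs after generation, so even a fluent but unsourced reply never reaches the user.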

3. **Confirm before anything irreversible.** Quote generated ≠ quote sent. Every sensitive action is a separate button the user clicks.
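One way to enforce "generated ≠ sent" is to make them two distinct calls, where the second requires an explicit confirmation flag that only a user click can set. The types and names here are illustrative:

```python
# Sketch: sending is a separate call that hard-fails without user confirmation.
from dataclasses import dataclass

@dataclass
class Quote:
    quote_id: str
    sent: bool = False

def generate_quote() -> Quote:
    return Quote(quote_id="Q-001")   # draft only; nothing leaves the system

def send_quote(quote: Quote, user_confirmed: bool) -> Quote:
    if not user_confirmed:
        raise PermissionError("Sending requires an explicit user click.")
    quote.sent = True
    return quote
```

Because `send_quote` is never callable by the model itself, generating a draft can be fully automated while dispatch stays behind the button.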

4. **Human in the loop on anything binding.** No AI-signed documents. No automatic cancellations. Advice-shaped questions get routed to a human advisor.
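A minimal routing sketch, assuming you already classify requests into intents (the intent labels below are made up for illustration):

```python
# Sketch: binding or advice-shaped intents go to a human, everything else to the nano.
BINDING_INTENTS = {"sign_document", "cancel_policy", "personal_advice"}

def route(intent: str) -> str:
    return "human_advisor" if intent in BINDING_INTENTS else "nano"
```

The allowlist is deliberately on the "binding" side: an unknown new intent defaults to the nano, so the binding set must be reviewed whenever capabilities are added.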

5. **Log everything, retain per regulation.** Every interaction must be reconstructable years later: user ID, input, output, sources cited, app version.
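The audit record only needs the five fields above plus a timestamp. A sketch of one append-only record as JSON (field names are an assumption, not a schema from the original):

```python
# Sketch: one self-describing audit record per interaction.
import json
from datetime import datetime, timezone

def audit_record(user_id: str, user_input: str, output: str,
                 sources: list[str], app_version: str) -> str:
    return json.dumps({
        "ts": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "input": user_input,
        "output": output,
        "sources": sources,       # every citation from rule 2
        "app_version": app_version,  # which code produced this answer
    })
```

Storing the app version alongside each record is what makes "reconstructable years later" real: you can replay the exact behaviour that produced an answer, not just read the answer.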