Rule 3.3 — Confidentiality
“A lawyer at all times shall hold in strict confidence all information concerning the business and affairs of a client acquired in the course of the professional relationship and shall not divulge any such information unless expressly or impliedly authorized by the client or required by law to do so.”
— Rule 3.3-1, LSO Rules of Professional Conduct
Rule 3.3 is the most relevant provision when using AI tools in legal practice. When you upload a client document to an AI tool, you are sharing confidential client information with a third-party system. The key questions to ask:
- Where is client data stored? Is it in Canada?
- Who can access the data? Can the AI provider view it?
- Is the data used to train AI models?
- Is there a data processing agreement with the AI provider?
- Have you disclosed AI use to the client where appropriate?
Consumer AI tools (ChatGPT free tier, Claude.ai free tier) are generally not appropriate for processing confidential client information. Enterprise or legal-specific tools with appropriate data processing agreements and Canadian data residency are much better positioned.
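The Rule 3.3 questions above can be encoded as a simple vendor-screening checklist. The following is an illustrative sketch only — the criteria names and structure are this guide's shorthand, not an official LSO test:

```python
# Illustrative sketch: screening an AI vendor against the Rule 3.3
# confidentiality questions. Criteria names are hypothetical shorthand.

RULE_3_3_CRITERIA = [
    "data_stored_in_canada",
    "provider_cannot_access_data",
    "no_training_on_client_data",
    "data_processing_agreement_signed",
    "client_disclosure_made",
]

def screen_vendor(answers: dict) -> list:
    """Return the Rule 3.3 criteria a vendor fails to meet."""
    return [c for c in RULE_3_3_CRITERIA if not answers.get(c, False)]

# Example: a free consumer chatbot tier typically fails every criterion
consumer_tool = {c: False for c in RULE_3_3_CRITERIA}
gaps = screen_vendor(consumer_tool)
print(f"{len(gaps)} unmet criteria")  # → "5 unmet criteria"
```

A tool that clears all five criteria is not automatically compliant, but one that fails any of them deserves close scrutiny before client data goes anywhere near it.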
Rule 3.1 — Competence
“A lawyer shall maintain, and strive to improve, competence in the areas in which the lawyer practises. Competence includes... being current with developments in the law, including technology relevant to the lawyer's practice.”
— Rule 3.1 Commentary, LSO Rules of Professional Conduct
The LSO has stated that technological competence includes understanding AI tools relevant to legal practice. This does not require lawyers to be AI experts — but it does require:
- Understanding the limitations and risks of AI-generated output
- Reviewing and verifying any AI-generated work before submitting to clients or courts
- Staying current with how AI is being used in your area of practice
- Understanding what AI tools you use actually do with your data
AI-generated legal drafts, summaries, and analyses must be reviewed by the lawyer before use. AI output is a first draft and a research aid — not a final product.
Rule 5.1 — Supervision
Rule 5.1 governs the supervision of non-lawyers. When an AI tool performs work that would otherwise be done by a legal assistant or law clerk, the same supervision obligations may apply. This means:
- The lawyer is responsible for all work product, regardless of how it was generated
- AI-generated documents must be reviewed and approved by the supervising lawyer
- Errors in AI output are the lawyer's professional responsibility to catch
- The lawyer cannot delegate final judgment or legal advice to an AI tool
Practical Compliance Checklist
Before using any AI tool with client data in your Ontario practice:

- Confirm where the tool stores client data and whether the provider can access it
- Verify that client data is not used to train AI models
- Put a data processing agreement in place with the AI provider
- Disclose AI use to the client where appropriate
- Review and verify all AI-generated output before it reaches a client or court
- Keep final judgment and legal advice with the lawyer
How Atticus Addresses LSO Requirements
Rule 3.3 — Confidentiality
- ✓ All data stored in Canada (Railway Canadian infrastructure)
- ✓ Data processing agreements with Anthropic, OpenAI, Voyage AI, Clerk, Resend
- ✓ Client data is never used to train AI models
- ✓ PIPEDA-compliant data handling
- ✓ Built-in AI consent disclosure on first login (consent timestamped and stored)
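A timestamped consent record of the kind described above can be sketched as follows. The field names here are assumptions for illustration, not Atticus's actual schema:

```python
# Illustrative consent-record sketch; field names are hypothetical,
# not the actual Atticus data model.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AIConsentRecord:
    user_id: str
    disclosure_version: str   # which disclosure text the user saw
    accepted: bool
    timestamp: datetime       # UTC timestamp of the consent action

def record_consent(user_id: str, disclosure_version: str) -> AIConsentRecord:
    """Create an immutable, timestamped record of AI-use consent."""
    return AIConsentRecord(
        user_id=user_id,
        disclosure_version=disclosure_version,
        accepted=True,
        timestamp=datetime.now(timezone.utc),
    )
```

Recording which version of the disclosure the user accepted, not just that they clicked through, is what makes a record like this useful if AI use is ever questioned later.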
Rule 3.1 — Competence
- ✓ All AI-generated output is presented as a draft for lawyer review
- ✓ Clear labeling of AI-generated content
- ✓ AI summaries supplement but do not replace lawyer review of source documents
- ✓ Lawyer can always access the original document
Rule 5.1 — Supervision
- ✓ No AI output is sent to clients without lawyer action
- ✓ Draft letters and documents must be manually sent by the lawyer
- ✓ All AI actions (add deadline, log time) are logged in the activity timeline
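An activity timeline that records AI actions for supervision purposes can be sketched as below. The structure and field names are hypothetical, offered only to show what "logged in the activity timeline" might mean in practice:

```python
# Illustrative append-only activity log; structure is hypothetical,
# not Atticus's actual implementation.
from datetime import datetime, timezone

activity_log: list = []  # append-only timeline, oldest entry first

def log_ai_action(matter_id: str, action: str, detail: str) -> dict:
    """Append a timestamped record of an action performed by the AI."""
    entry = {
        "matter_id": matter_id,
        "action": action,      # e.g. "add_deadline", "log_time"
        "detail": detail,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": "ai",         # distinguishes AI actions from lawyer actions
    }
    activity_log.append(entry)
    return entry
```

Tagging each entry with an explicit actor is the detail that matters for Rule 5.1: the supervising lawyer can see at a glance which steps the tool took on its own.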
Common Questions
Can Ontario lawyers use AI tools like Claude or ChatGPT?
Yes. The LSO has not prohibited AI use. Lawyers may use AI tools subject to their existing professional obligations. For consumer tools like free ChatGPT, the key concern is confidentiality — uploading client documents to a tool that may train on that data is risky under Rule 3.3. Enterprise tools with appropriate data handling (stored in Canada, DPAs with providers, no training on client data) are generally safer.
Do I need to tell my clients I'm using AI?
The LSO has not issued a blanket disclosure requirement, but disclosure is strongly recommended. Many lawyers include an AI disclosure clause in their retainer agreement. The more client-specific information you feed to an AI tool, the stronger the case for disclosure. Atticus includes a built-in consent disclosure flow on first login to document your practice's AI use.
Can I use AI-generated court documents in Ontario?
Yes, but with mandatory lawyer review. Courts are increasingly adopting AI disclosure requirements. All AI-generated documents must be reviewed, verified, and certified as accurate by the lawyer of record before submission. An AI-generated document that contains errors is the lawyer's professional responsibility, not the AI's.
Should I contact LAWPRO before using AI tools?
It's advisable. LAWPRO (Lawyers' Professional Indemnity Company) has published guidance on AI tools and malpractice coverage. Contact LAWPRO to understand how AI tool use affects your coverage, particularly for tasks like legal research, drafting, and advice.
This guide is informational only and does not constitute legal advice about your professional obligations to the Law Society of Ontario. Always refer to the current Rules of Professional Conduct and consult the LSO directly for authoritative guidance.