Sentiint Data Promise

Your data trains your educator. Not our algorithms.

Sentiint is built privacy-first for education. We don’t fine-tune AI on student work. We don’t aggregate behavior across customers. We don’t build a global model of any learner. Educators see their classroom. Sentiint sees nothing.

FERPA-aligned · Zero-training · Educator-controlled
1. We never train AI models on your data.

Not on student responses. Not on educator content. Not on dashboard interactions. No fine-tuning. No RAG corpus built from user activity. No prompt examples drawn from real classrooms.

When an LLM responds in Sentiint, it works from your course content and the immediate context — not from a learned profile of any student or class. Our models stay our models. Your data stays yours.
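The stateless design described above can be sketched as follows. This is an illustrative assumption, not Sentiint's actual code: the names (`CourseContent`, `build_prompt`) are hypothetical, and the point is only that the prompt is assembled from course content plus the immediate question, with no learner profile anywhere in it.

```python
from dataclasses import dataclass

@dataclass
class CourseContent:
    title: str
    materials: list[str]  # educator-provided course materials

def build_prompt(course: CourseContent, question: str) -> str:
    """Assemble an LLM prompt from course content and the immediate
    question only -- no learned profile, no cross-session history."""
    context = "\n".join(course.materials)
    return (
        f"Course: {course.title}\n"
        f"Reference material:\n{context}\n\n"
        f"Student question: {question}"
    )

prompt = build_prompt(
    CourseContent("Intro to Biology",
                  ["Chapter 1: Cells are the basic unit of life."]),
    "What is a cell?",
)
```

Everything the model sees is reconstructed fresh from the course on each request; nothing about the student persists into the prompt.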

2. Activity logs serve educators — not us.

Quiz scores, time on task, discussion posts, and chat interactions are logged so the educator who owns the course can see how their students are doing.

That’s it. We don’t analyze them, we don’t aggregate them across customers, and we don’t surface them to anyone outside the course’s owner. The dashboard is a delivery mechanism — not a data mine.

3. Educators can clear student activity at any time.

In any course’s settings, educators can delete all student activity — quiz answers, chat messages, slide views, session timing — with one confirmation click. Per-course, on demand, fully audit-logged.

What stays: the course itself, the enrollment roster, the student account records (those belong to the student). What gets cleared: every behavioral data point Sentiint captured on your behalf.

This isn’t a deletion request that takes 30 days. It’s a button.
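In spirit, the operation behind that button looks something like the sketch below. This is a hypothetical illustration, not Sentiint's implementation: the record shapes and names are assumptions. It shows the promise in miniature: behavioral records for one course are deleted, the course, roster, and accounts are untouched, and the action lands in an audit log.

```python
# Behavioral record kinds that the "clear activity" action removes.
ACTIVITY_KINDS = {"quiz_answer", "chat_message", "slide_view", "session_timing"}

def clear_course_activity(db: dict, course_id: str, educator_id: str) -> int:
    """Delete every behavioral record scoped to one course; return the count.
    Courses, enrollments, and student accounts are never touched here."""
    before = len(db["activity"])
    db["activity"] = [
        r for r in db["activity"]
        if not (r["course_id"] == course_id and r["kind"] in ACTIVITY_KINDS)
    ]
    cleared = before - len(db["activity"])
    # The deletion itself is audit-logged: who, which course, how many records.
    db["audit_log"].append({
        "action": "clear_activity",
        "course_id": course_id,
        "actor": educator_id,
        "records_cleared": cleared,
    })
    return cleared

db = {
    "activity": [
        {"course_id": "bio101", "kind": "quiz_answer"},
        {"course_id": "bio101", "kind": "chat_message"},
        {"course_id": "chem200", "kind": "quiz_answer"},  # another course: kept
    ],
    "audit_log": [],
}
n = clear_course_activity(db, "bio101", "prof-ada")
```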

4. Course boundaries are real boundaries.

Data scoped to Course A doesn’t flow to Course B. Data inside your institution doesn’t flow to another institution’s tenant. Cross-course intelligence, when it ships, will be an explicit institutional opt-in feature — never a default behavior we enable behind your back.

If a feature would require crossing a boundary, we ask first. Always.
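Architectural enforcement means the boundary check lives in the data path itself, not in a policy document. A minimal sketch, with hypothetical names (nothing here is Sentiint's real query layer): every read is filtered by tenant and course before any row becomes visible.

```python
def fetch_activity(rows: list[dict], tenant_id: str, course_id: str) -> list[dict]:
    """Return only rows inside the caller's tenant and course boundary.
    Rows from other courses or other tenants are simply never visible."""
    return [
        r for r in rows
        if r["tenant_id"] == tenant_id and r["course_id"] == course_id
    ]

rows = [
    {"tenant_id": "u-state", "course_id": "course-a", "data": "quiz score"},
    {"tenant_id": "u-state", "course_id": "course-b", "data": "chat log"},
    {"tenant_id": "other-college", "course_id": "course-a", "data": "post"},
]
visible = fetch_activity(rows, "u-state", "course-a")
```

Because the scope filter is unconditional, a cross-course feature would require new code with an explicit opt-in path, which is exactly the guarantee above.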

5. Sentiint, the company, never owns a model of your students.

Some platforms accumulate a global behavioral model of every learner who touches them. They get smarter as more students use them, putting student data to work in ways those students and their educators never see.

We don’t do that. We won’t do that. Even at Levels 4 and 5 of our framework, the longitudinal intelligence lives with educators and institutions — not with us. The platform is the surface. Your data is yours.

What this means in practice

For students

Your work in Sentiint is for your educator’s eyes — not a corporate AI’s training set. When you take a quiz, write a discussion post, or chat with an AI tutor in a course, that data flows to your professor’s dashboard and stops there. It doesn’t follow you to other courses. It doesn’t get used to train the next version of Sentiint. It’s yours and your educator’s — by design.

For educators

You’re the data controller. Sentiint is the data processor. You decide what to keep, what to clear, what to share. Activity dashboards are tools we deliver back to you. The “clear activity” button is real and works on demand. We’re the platform. You’re the owner.

For parents and guardians

Sentiint is built for educator-mediated access. K-8 deployments don’t ask children to create direct accounts — students access classroom content via educator-shared links, scoped to a single course. We don’t aggregate behavior across courses, we don’t build profiles of children, and educators retain full control of activity records.

For institutions

Your tenant boundary is enforced architecturally — not just contractually. We build for FERPA alignment, GDPR rights (export, deletion), and a clear separation between processor and controller responsibilities. Cross-course intelligence is an opt-in capability when it ships, not a default. Audit trails for administrative actions are persistent and queryable.

What we do use data for

  • Operating the service: API calls to LLM providers (Anthropic, OpenAI, Google). Per their standard commercial API terms, API traffic is not used to train their models. Sentiint uses paid API tiers exclusively to ensure this protection applies across all three providers.

  • Educator dashboards: Activity logs power the views the course owner sees.

  • Billing and usage tracking: Token usage is metered per-user for billing accuracy. Metadata only — no message content stored in usage logs.

  • System reliability: Standard infrastructure metrics (request latency, error rates) — no user content, no behavioral data.

  • Security and compliance: Audit logs of administrative actions. Required for FERPA-aligned operation.

That’s the full list. If we add a use case, we update this page first.
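The "metadata only" claim for billing can be made concrete with a sketch of what such a usage record might contain. The field names are assumptions for illustration; the invariant is that token counts and identifiers are stored, and message content never is.

```python
def usage_record(user_id: str, model: str,
                 input_tokens: int, output_tokens: int) -> dict:
    """Build a billing usage record: identifiers and token counts only.
    There is deliberately no field for message content."""
    return {
        "user_id": user_id,
        "model": model,
        "input_tokens": input_tokens,
        "output_tokens": output_tokens,
    }

rec = usage_record("user-42", "example-model", 120, 80)
```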

What we don’t do

  • Train AI models on student work, educator content, or platform interactions

  • Aggregate user behavior across customers, courses, or tenants

  • Build behavioral profiles of any learner

  • Sell, share, or license user data to third parties

  • Use student data for product improvement, marketing, or analytics

  • Retain data after an educator clears it

  • Cross course or institutional boundaries without explicit opt-in

Compliance foundations

  • FERPA-aligned by design — educators are data controllers, Sentiint is the data processor.
  • GDPR-ready — data export and deletion supported.
  • COPPA-aware — K-8 deployments use educator-mediated access, no direct student account collection for under-13 users.

Questions

If you have specific questions about how Sentiint handles your institution’s data, contact us at support@sentiint.ai.

Privacy-first AI for education isn’t a feature. It’s the foundation.