The client sentiment analysis tool provides a unified, AI-powered view of client health by aggregating data across all major communication and project channels. It surfaces issues before they escalate and gives account managers a high-level picture of every account without manually reviewing each channel.
As Mark described in the [1]:
"I think you guys are going to find that sentiment analysis — you could run the whole freaking business from that. It basically tells you, here's what they're saying and here's what's happening."
The tool ingests and cross-references data from four channels:
| Source | What It Watches |
|---|---|
| Email | All emails sent or received by anyone to/from a client, including tone and sentiment, not just content |
| Slack | Client-related channel activity |
| ClickUp | Client project folders — task status, due dates, overdue items |
| Google Drive | New documents added; reads content to understand what's being discussed |
Each client receives a sentiment rating (positive, neutral, declining) based on aggregated signals. The tool flags patterns like frustrated email tone, repeated rescheduling, or stalled deliverables — even if no single signal would raise an alarm on its own.
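The aggregation step can be pictured as follows. This is a minimal illustrative sketch, not the tool's actual logic: the `Signal` shape, score scale, and thresholds are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    channel: str   # "email", "slack", "clickup", or "drive" (assumed labels)
    score: float   # -1.0 (very negative) .. 1.0 (very positive)

def rate_client(signals: list[Signal]) -> str:
    """Collapse per-channel signals into one rating.

    Averaging is what lets several mildly negative signals trip the
    "declining" flag even when no single one would on its own.
    """
    if not signals:
        return "neutral"
    avg = sum(s.score for s in signals) / len(signals)
    if avg >= 0.2:      # threshold values are illustrative
        return "positive"
    if avg <= -0.2:
        return "declining"
    return "neutral"
```

A real implementation would likely weight channels differently and decay older signals, but the averaging idea is the core of cross-channel detection.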
From the client page, an executive summary can be generated for the last 7 or 30 days, covering:
- Recent email activity and tone
- ClickUp task status (overdue, backlogged, completed)
- Slack activity
- Any new documents or deliverables in Google Drive
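The windowed summary amounts to filtering a client's ingested records to the lookback period and grouping them by channel before handing them to the model. A minimal sketch, assuming a flat record shape (the `client`, `channel`, `headline`, and `timestamp` field names are hypothetical):

```python
from datetime import datetime, timedelta

def summarize(records: list[dict], client: str, days: int = 7) -> dict:
    """Group one client's recent records by channel for an executive summary."""
    cutoff = datetime.now() - timedelta(days=days)
    recent = [r for r in records
              if r["client"] == client and r["timestamp"] >= cutoff]
    summary: dict[str, list[str]] = {}
    for r in recent:
        summary.setdefault(r["channel"], []).append(r["headline"])
    return summary
```

Passing `days=30` instead of `7` gives the longer lookback; note a wider window is also what lets already-resolved issues resurface in the report.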
A search function allows querying across all ingested data, with filters by client, channel, or date range. This makes it possible to locate a specific email, decision, or commitment quickly, without digging through the individual tools.
An AI interface allows natural-language questions against the ingested data only — it does not query the open internet or hallucinate from general LLM knowledge. Filters can be scoped to a single client or channel.
Every Sunday morning, the tool automatically runs the sentiment analysis across all clients and posts the report to Slack. The team can review it Monday morning to identify any accounts that need attention before the week begins.
"What I'm doing now is I'm writing this thing up where every Sunday it's going to take this report and run it and put it in Slack. So you can come in Monday morning and see what it says about the last seven days."
The tool is intentionally direct and can read as alarmist. The team noted it tends to heavily weight negative signals and may flag issues that have already been resolved (especially on a 30-day lookback). Calibration notes:
"I don't really want to be stroked here. I want it to tell us how it is."
The first team-wide review of the tool during the [1] surfaced several actionable issues:
| Client | Issue Flagged | Status |
|---|---|---|
| [3] | Website errors found by client; trust deficit | Resolved; white-glove approach adopted |
| [4] | Google Ads account suspended; $2,500 balance | Under investigation |
| [5] | Blog content approval bottleneck; repeated rejections | AI training workspace proposed |
| [6] | Overdue ClickUp tasks; scope creep on website guidance | Time tracking initiated |
| [7] | Blog tasks backlogged since December | Needs attention during AM transition |
| [8] | Blog tasks backlogged; empty meta descriptions | Meta descriptions queued for fix |
| [9] | Neutral/declining; Google Ads blocked; multiple reschedulings | New proposals being prepared |
| [10] | Flagged "unreliable commitment" — likely false positive | Flyer finalized; situation stable |