How do Crisp Analytics work?

Crisp Analytics gives you a big picture of how your customer support is doing. In a snap, you can see what you're doing right and what you can improve, over a selected period of time. This article explains each Analytics metric you can find in Crisp Analytics.

Analytics are processed every hour, for the last elapsed hour. This means that Analytics show up in your Crisp dashboard with a delay of 1 hour. Do not expect Analytics to appear in real-time, as the Analytics consolidation process only kicks in once per hour.

Please remember to save your Analytics data at least once per year. Crisp Analytics stores your data for 1 year; after that, it is removed from the Analytics plugin.

Our analytics are part of our team inbox and will help you to better understand how your team is doing in terms of efficiency and satisfaction.

Activity Last Week Compared to The Week Before

This metric shows you quick numbers on how your customer support performed in general last week as compared to the week before.

The following numbers are shown:

Conversations: number of new and re-activated / existing conversation threads that occurred;
Mean response time: average time it takes an operator from your team to reply to a user message (measured from when the user sends their message until an operator replies or resolves the conversation);
Website visitors: number of visitors that browsed your website or app;

As the current week is not over yet, we cannot show reliable numbers for it. Thus, Crisp uses last week's numbers instead and compares them to the numbers from 2 weeks before.
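As a rough sketch of how a mean response time could be computed (this is an illustration with made-up delays, not Crisp's actual implementation), you average the per-message delays between a user message and the operator reply, then compare the two weekly means:

```python
from datetime import timedelta

def mean_response_time(delays: list[timedelta]) -> timedelta:
    """Average of per-message response delays (user message -> operator reply)."""
    total = sum(delays, timedelta())
    return total / len(delays)

# Hypothetical per-conversation delays for two consecutive weeks:
last_week = [timedelta(minutes=m) for m in (2, 3, 1, 5)]
week_before = [timedelta(minutes=m) for m in (4, 6, 2, 8)]

print(mean_response_time(last_week))   # 0:02:45
print(mean_response_time(week_before)) # 0:05:00
```

In this example, last week's mean (2 minutes 45 seconds) would show as an improvement over the week before (5 minutes).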

Conversation Activity

A conversation thread groups one or multiple interactions within a single day. No matter how many chat messages are exchanged that day, it counts as one conversation.

This bar chart tells you how many conversation threads occurred, either:

New conversation: the conversation did not exist before and has been created;
Existing conversation: the conversation existed before and has been re-activated as it was inactive for more than 1 day (thus, we consider it as a new conversation thread);

The conversation activity statistics are not calculated from created conversations, but from threads that occurred over time. Thus, if you count by hand the conversations that occurred eg. yesterday and compare them to the metrics given by Crisp Analytics, the numbers may not match. You need to account for our method of counting threaded conversations: a thread is either a new conversation, or one re-activated after 1 day of inactivity.
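The thread-counting rule above can be sketched as follows (an illustration under the stated 1-day inactivity rule, not Crisp's actual code): the first message opens a thread, and any message arriving after more than 1 day of silence opens a new one.

```python
from datetime import datetime, timedelta

REACTIVATION_GAP = timedelta(days=1)  # inactivity threshold from the article

def count_threads(message_times: list[datetime]) -> int:
    """Count conversation threads: the first message opens a thread, and any
    message after more than 1 day of inactivity re-activates (opens) a new one."""
    threads = 0
    last = None
    for t in sorted(message_times):
        if last is None or t - last > REACTIVATION_GAP:
            threads += 1
        last = t
    return threads

messages = [
    datetime(2023, 11, 1, 9, 0),   # new thread
    datetime(2023, 11, 1, 9, 5),   # same thread (same day)
    datetime(2023, 11, 3, 14, 0),  # > 1 day later: re-activated thread
]
print(count_threads(messages))  # 2
```

This is why a hand count of "conversations yesterday" can differ from the Analytics number: one long-running conversation can produce several threads.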


The leaderboard view shows how many conversations each of your operators was assigned over the last day, compared to 2 days before. It also lets you compare operators to each other and see who's handling the most conversations. This lets you spot the most active operators on your team.

Support Responsiveness

This calendar view helps you see when you are falling short on customer support availability; in other words, when you do receive user messages but your support is not as responsive as it usually is. You can use this calendar to decide whether you should hire someone (eg. for evening hours) or move someone's shift.

For instance, at Crisp we noticed that our evening response time was not as good as our daytime response time (a 30-minute mean vs a 1-2 minute mean). We decided to hire someone to handle those evening hours, which brought our evening response time down to the usual 1-2 minute mean. You can see this evening red spots pattern on the screenshot above.

How is the support responsiveness calculated?

We count the time delay between a user message and the reply from an operator, then calculate the mean of all those delays. Note that if you don't reply to the last message and directly resolve the conversation, it counts just as if an operator had answered, and stops the timer.
Automated responses are considered responses from support. If your bot scenario answers and resolves the conversation, it is counted as an operator response as well.

The spots are calculated on a relative basis, not an absolute one. This means a red spot may not indicate a super-slow response time; rather, it indicates that it's not as good as usual (eg. if your mean response time is 1 second, a 10-second response time would show as red although it is still a good one). This helps spot slower-than-usual support hours.
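A minimal sketch of this relative coloring (the threshold factor here is an assumption for illustration; Crisp's actual heuristic is not documented): a spot turns red when its mean delay is notably worse than your overall mean, regardless of the absolute value.

```python
def spot_color(spot_mean: float, overall_mean: float, factor: float = 2.0) -> str:
    """Flag a time spot red when its mean delay is notably worse than the
    overall mean. The comparison is relative, not an absolute threshold."""
    return "red" if spot_mean > overall_mean * factor else "green"

# Even a fast absolute response time shows red if it is far from your usual mean:
print(spot_color(spot_mean=10.0, overall_mean=1.0))  # red
print(spot_color(spot_mean=1.5, overall_mean=1.0))   # green
```

This is why a team with a 1-second usual mean can see red spots at 10 seconds, while a team with a 30-minute usual mean would not.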

Segments Saved Over Time

This bar chart view shows you how many, and which, segments were saved over time. The segments can come from Crisp Chatbox (ie. JS SDK), Crisp Apps, Crisp Plugins or the REST API.

Note that if a given segment gets removed, it will still be visible on this chart, as removing it does not change the fact that it was added earlier on. Crisp Analytics only looks forward: already-consolidated past Analytics points are never removed.

Shortcut Usage Statistics

This bar chart helps you see which shortcuts your team uses the most, and which ones are never used. This helps you remove useless shortcuts from your shortcuts database and keep it uncluttered.

Assigned Conversations Per Operator

This graph shows how many conversations were assigned to all operators, or a selected operator, for each timespan over time.

As a conversation can be dynamically re-assigned to another operator, a given conversation may be counted as assigned multiple times, to the same operator, to different operators, or both. This metric counts one assigned conversation each time any conversation gets assigned or re-assigned.
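The counting rule can be sketched like this (an illustration with hypothetical conversation IDs and operator names, not Crisp's internal model): every assignment event increments the target operator's counter, so a re-assigned conversation is counted more than once.

```python
from collections import Counter

def count_assignments(events: list[tuple[str, str]]) -> Counter:
    """Count one assignment per (re-)assignment event, per operator.
    `events` is an ordered list of (conversation_id, operator) pairs."""
    counts: Counter = Counter()
    for _conversation, operator in events:
        counts[operator] += 1  # every assign or re-assign counts
    return counts

events = [
    ("conv_1", "alice"),  # conv_1 assigned to alice
    ("conv_1", "bob"),    # conv_1 re-assigned to bob: counted again
    ("conv_2", "alice"),  # conv_2 assigned to alice
]
print(count_assignments(events))  # Counter({'alice': 2, 'bob': 1})
```

Note that the 3 events above yield 3 counted assignments even though only 2 conversations exist.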

Fired Triggers

This bar chart shows how many triggers have fired over time from your Crisp Chatbox. It also provides an indication of which triggers did fire.


Visitor Location Heatmap

The visitor location heatmap shows where your visitors come from on a world map. The green color is brighter for countries with relatively more visitors than other countries. If a country has no green color at all, we have not seen any visitor from that country on your website.

In some countries, eg. Iran, Internet users rely heavily on VPNs to circumvent government-backed Internet censorship. Those users are usually seen as accessing your website from Western IP addresses (eg. in the US, Canada or Europe), so they will be shown as originating from the country their VPN IP address is located in. There is, unfortunately, no way around this to detect the user's real country.

Note that storing heatmap data is super-heavy and generates a lot of points, so we've had to limit point storage to 1 month for all Crisp websites. You can therefore see map data up to 1 month back (this lets you view data for the last 4 weeks, on a per-week basis). Also, we don't pre-collect heatmap data for Basic and Pro plans before they upgrade to Unlimited; you need to be using Crisp Unlimited for the data to start showing up. All this is for technical reasons at our scale (100,000s of Crisp websites).

Website Visit Calendar (Per Day And Hour)

The website visit calendar shows you when your website receives the most visits, per 2-hour spot. If a given spot is red (eg. from 10am to 12pm on Wednesday, as seen on the screenshot above), that time spot received more visits than average.

This calendar works in a similar way to the support responsiveness calendar described above.

Contacts Created Over Time

This graph shows how many contacts have been created in your Crisp CRM over time. Contacts created from Crisp Chatbox using the emails of users who chat with your support are accounted for in this graph.


Ratings Compared to Last Week

This metric gives you quick numbers on how your support is doing as ranked / rated by your users.

The following numbers are shown:

Mean score: average score from all scores;
Comments: number of comments you've received with your scores;
Satisfaction counters: number of satisfied, okay and unsatisfied users (as estimated from the scores you got);

General Rating Change Over Time

This graph shows how your mean score evolved over time, per time frame. Satisfied scores show as green, okay scores as yellow, and unsatisfied scores as red.

If there are no ratings for a given period of time, the rating will default to 5 (ie. maximum rating).
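Combining the two rules above, the per-period mean can be sketched as follows (an illustration of the stated behavior, not Crisp's actual code): average the scores when there are any, otherwise fall back to the maximum rating of 5.

```python
def mean_rating(scores: list[int]) -> float:
    """Mean score for a period; defaults to 5 (the maximum rating)
    when no ratings were received in that period."""
    if not scores:
        return 5.0
    return sum(scores) / len(scores)

print(mean_rating([5, 4, 3]))  # 4.0
print(mean_rating([]))         # 5.0 (no ratings for the period)
```

Keep this default in mind when reading the graph: a period with no ratings at all looks identical to a period of perfect scores.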

Latest Ratings & Comments

This data-table lists each individual score you've received, with the comment that was left (if any), the author's user identity, and the name of the agent who handled the conversation. You can contact the user in a snap to ask further questions via the "preview" button, then "open conversation".


Sent Campaign Emails

This graph shows how many individual campaign emails were sent over time, as well as which type of email was sent (either one-shot or automated).

Campaign Email Activity

This bar chart shows user email activity on the emails you've sent, split over the following activity types:

Link Clicked: a link has been clicked by a user who received an email from your website (can be counted multiple times);
Email Opened: an email you've sent has been opened;
Unsubscribed From Emails: a user who received an email you sent has unsubscribed from your emails;
Email Bounced: an email you've sent was bounced by the receiving SMTP server (ie. the email could not be delivered);
Email Delivered: an email you've sent was delivered to the target user;


You need to be using Crisp Helpdesk and run an active knowledge base for data to show up in this section.

Helpdesk Feedback For Articles

This data-table lists each individual feedback you've received on your knowledge base articles, with the comment that was left (if any), and the author user identity. You can contact the user in a snap to ask more questions with the "Message User" button.

Helpdesk Articles Read Statistics

This graph shows the number of visits on all Helpdesk articles, or on a selected article. These numbers can be treated as page views.


You need to be using Crisp Status and run an active status page for data to show up in this section.

Node Downtime Statistics

This bar chart shows how many times all Status Page nodes, or a given node, were reported as DEAD (ie. down, offline).

Pull Nodes Latency

This graph shows the average latency seen from our probes for all Status Page nodes, or for a given node.

Updated on: 02/11/2023
