“Aggregate reports tell you what is happening across your user base. Activity reports tell you what is happening for a specific person - and that is where the real answers live.”
Activity reports show you the complete timeline of an individual user’s interactions with your product - every page viewed, every feature used, every email opened, every payment made - in chronological order. It is the difference between reading a census report and reading a biography.
This individual-level view is essential for several use cases that aggregate data cannot serve. Customer support teams need to understand what a specific user experienced before they submitted a ticket. Product managers need to see the exact sequence of actions that led a user to encounter a bug. Growth teams need to study the behavior patterns of their best customers to understand what makes them different. Sales teams need to know what a prospect has done on the site before getting on a call.
This guide covers how to use KISSmetrics activity reports effectively: navigating the user timeline, searching for specific users, inspecting event details, applying activity data to support and sales, identifying power user patterns, and debugging conversion issues by examining individual journeys.
The User Timeline View
The user timeline is the core of the activity report. It presents every tracked event for a specific user in chronological order, creating a complete narrative of their relationship with your product from the first visit to the most recent action.
What the Timeline Shows
Each entry in the timeline includes the event name, the timestamp, and any properties associated with the event. A sign-up event might show the user’s email, the acquisition source, and the referral URL. A purchase event might show the amount, the plan type, and the payment method. A feature usage event might show which feature was used, how long the session lasted, and what the user accomplished.
The timeline also shows marketing interactions: email opens, email clicks, campaign assignments, and ad impressions. This creates a complete picture of the user’s journey across all touchpoints, not just the moments when they were inside your product. The KISSmetrics reports platform connects all of these events to a single user record, even when the user switches between devices or channels.
Cross-Device Identity
One of the most valuable aspects of KISSmetrics activity reports is cross-device identity resolution. When a user browses your site on their phone, signs up on their laptop, and uses the product on their tablet, all three sessions appear on the same timeline. This is critical for understanding the actual user experience, which rarely plays out on a single device in a single session. Without identity resolution, you would see three separate, disconnected stories instead of one coherent journey.
Navigating Long Timelines
Active users can generate thousands of events over weeks or months. Navigating a long timeline effectively requires filtering by event type (show only purchase events, only support interactions, only feature usage), filtering by date range (show only events from this week), and searching for specific events within the timeline. KISSmetrics provides these filtering capabilities so you can quickly find the events relevant to your current question without scrolling through hundreds of unrelated entries.
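KISSmetrics exposes these filters in its interface. To make the logic concrete, here is a minimal sketch of the same filtering over a hypothetical in-memory timeline; the event names and the `filter_timeline` helper are illustrative, not part of any KISSmetrics API:

```python
from datetime import datetime

# Hypothetical timeline: each entry has an event name and a timestamp.
timeline = [
    {"name": "Visited Site", "ts": datetime(2024, 3, 1, 9, 0)},
    {"name": "Signed Up", "ts": datetime(2024, 3, 1, 9, 5)},
    {"name": "Purchased", "ts": datetime(2024, 3, 8, 14, 30)},
    {"name": "Used Feature", "ts": datetime(2024, 3, 9, 10, 0)},
]

def filter_timeline(events, names=None, start=None, end=None):
    """Keep only events matching an event-name set and/or a date range."""
    out = []
    for e in events:
        if names and e["name"] not in names:
            continue
        if start and e["ts"] < start:
            continue
        if end and e["ts"] > end:
            continue
        out.append(e)
    return out

purchases = filter_timeline(timeline, names={"Purchased"})
this_week = filter_timeline(timeline, start=datetime(2024, 3, 8))
```

Combining both filters (an event-name set plus a date range) is how you answer a question like "show me this user's purchase events from this week" without scrolling.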
Searching for Specific Users
Before you can view a user’s timeline, you need to find them. KISSmetrics provides several ways to search for specific users depending on what information you have.
Search by Identity
The most direct search is by email address or user ID. If a customer writes to support and you know their email, you can pull up their timeline instantly. This is the most common search method for support and customer success use cases, where you start with a known user and want to understand their experience.
Search by Property
You can also search for users by any tracked property: plan type, company name, acquisition source, geographic region, or any custom property. This is useful when you want to find examples of a specific user type. “Show me enterprise users who signed up from the webinar campaign last month” returns a list of users matching those criteria, and you can view the timeline for any of them.
Search by Behavior
The most powerful search method is by behavior: find users who performed (or did not perform) specific events. “Show me users who reached the checkout page but did not complete a purchase in the last seven days” returns the exact users who abandoned checkout, and their timelines show exactly what happened. This behavioral search bridges the gap between aggregate analytics (you know your checkout abandonment rate is 35%) and individual understanding (you can see exactly what each of those users experienced).
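The query above ("did A recently, never did B") runs inside KISSmetrics, but the underlying logic is simple enough to sketch. The following is an illustrative implementation over hypothetical per-user event logs; the event names and `did_but_not` helper are assumptions, not a KISSmetrics API:

```python
from datetime import datetime

# Hypothetical per-user event logs: lists of (event_name, timestamp).
events_by_user = {
    "ana@example.com": [("Viewed Checkout", datetime(2024, 3, 10)),
                        ("Completed Purchase", datetime(2024, 3, 10))],
    "ben@example.com": [("Viewed Checkout", datetime(2024, 3, 11))],
    "cho@example.com": [("Viewed Checkout", datetime(2024, 2, 1))],  # too old
}

def did_but_not(events_by_user, did, did_not, since):
    """Users who performed `did` on/after `since` but never performed `did_not`."""
    matches = []
    for user, events in events_by_user.items():
        names_recent = {name for name, ts in events if ts >= since}
        names_all = {name for name, ts in events}
        if did in names_recent and did_not not in names_all:
            matches.append(user)
    return matches

abandoners = did_but_not(events_by_user, "Viewed Checkout",
                         "Completed Purchase", since=datetime(2024, 3, 8))
```

Each user the query returns is a timeline you can open and read, which is exactly the bridge from the aggregate abandonment rate to individual stories.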
Navigating from Aggregate to Individual
One of the most effective workflows is to start with an aggregate report that identifies a pattern, then drill down to individual activity reports to understand why the pattern exists. Your funnel report shows a 40% drop-off at step three. You click into the users who dropped off and view several of their timelines. You notice that many of them triggered an error event immediately before abandoning. The aggregate report identified the problem. The activity reports explained it.
Event Detail Inspection
Each event in a user’s timeline carries detailed information beyond the event name and timestamp. Inspecting these details reveals the context of each action, which is often essential for understanding why the user behaved a certain way.
Event Properties
Every tracked event can include custom properties that provide additional context. A “Search Performed” event might include the search query, the number of results returned, and whether the user clicked a result. A “Feature Used” event might include the feature name, the time spent, and the outcome (success or failure). These properties transform a simple timeline entry into a rich description of what actually happened.
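To make the shape of such an event concrete, here is an illustrative record for the "Search Performed" example, along with a small helper that renders it as a readable timeline line. The property names are hypothetical, not a fixed KISSmetrics schema:

```python
# Illustrative shape of one tracked event with custom properties.
event = {
    "name": "Search Performed",
    "timestamp": "2024-03-12T14:03:22Z",
    "properties": {
        "query": "funnel report",
        "results_count": 14,
        "clicked_result": True,
    },
}

def describe(event):
    """Render a timeline entry as a one-line human-readable description."""
    props = ", ".join(f"{k}={v}" for k, v in event["properties"].items())
    return f'{event["timestamp"]} {event["name"]} ({props})'

line = describe(event)
```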
User Properties at Event Time
In addition to event properties, you can see the user’s properties at the time of each event. What plan were they on when they contacted support? What was their engagement level when they viewed the cancellation page? These time-stamped properties provide context that is lost in current-state views. A user might now be on a premium plan, but at the time they submitted the angry support ticket, they were on a free trial.
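The mechanism behind this is a snapshot: the user's properties are captured alongside each event rather than referenced live. A minimal sketch of the idea, with illustrative names:

```python
# Snapshot pattern: copy the user's current properties onto each event,
# so later analysis sees what the user looked like at event time.
user_props = {"plan": "free_trial"}
timeline = []

def record(name, user_props, timeline):
    # Copy, don't reference: the snapshot must not change when the user does.
    timeline.append({"name": name, "user_props": dict(user_props)})

record("Submitted Support Ticket", user_props, timeline)
user_props["plan"] = "premium"          # the user later upgrades
record("Renewed Subscription", user_props, timeline)

# The ticket still shows "free_trial", even though the user is now premium.
plan_at_ticket = timeline[0]["user_props"]["plan"]
```

Without the copy, every historical event would retroactively show the user's current state, which is exactly the information loss the text describes.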
Session Context
Events also carry session-level context: the device type, browser, operating system, geographic location, and referral source. This technical context is invaluable for debugging. If a user reports a problem, seeing that they were on an older version of Safari on an iPhone might immediately explain the issue. If a user’s behavior seems erratic, noticing that they switched devices mid-session might clarify what happened.
Using Activity Data for Customer Support
Activity reports transform customer support from reactive guesswork into informed problem-solving. When a support agent can see exactly what the customer experienced before reaching out, the interaction is faster, more accurate, and more satisfying for both parties.
Pre-Conversation Preparation
Before responding to a ticket or getting on a call, pull up the user’s activity timeline. Look at their recent events: what were they doing before they contacted you? What errors did they encounter? What features have they been using? This preparation lets the agent start the conversation with context instead of asking the customer to explain everything from scratch. “I can see you were setting up a funnel report when you ran into an issue” is a much better opening than “can you describe what you were trying to do?”
Replicating Issues
When a customer reports a bug, the activity timeline shows the exact sequence of actions that led to it. Which page were they on? What did they click? What happened next? This information lets the support team replicate the issue reliably, which dramatically speeds up the debugging process. Instead of going back and forth with the customer asking for screenshots and reproduction steps, the agent already has the complete picture.
Proactive Support
Activity data also enables proactive support. If you see a user repeatedly trying and failing to perform a specific action, you can reach out before they file a ticket. “I noticed you’ve been working on setting up your first campaign. Would you like some help?” This proactive approach prevents frustration and demonstrates a level of care that customers remember. Building automated campaigns that trigger on struggle patterns can scale this approach without requiring manual monitoring.
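A "struggle" trigger of this kind reduces to a simple rule over the timeline: the same action failing several times in a row with no success in between. The threshold and event names below are assumptions for illustration:

```python
def is_struggling(events, action, threshold=3):
    """events: list of (event_name, succeeded) tuples in chronological order.
    Flag the user once `action` fails `threshold` times with no success between."""
    streak = 0
    for name, succeeded in events:
        if name != action:
            continue
        if succeeded:
            streak = 0          # a success resets the failure streak
        else:
            streak += 1
            if streak >= threshold:
                return True
    return False

events = [("Create Campaign", False),
          ("Create Campaign", False),
          ("Create Campaign", False)]
needs_help = is_struggling(events, "Create Campaign")
```

Wiring a rule like this to an outreach email or an in-app message is what turns manual timeline review into the scalable proactive support the text describes.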
Understanding Customer Context
Beyond specific issues, activity data helps support teams understand the customer’s overall relationship with the product. Is this a power user who knows the product deeply and is encountering an edge case? Or a new user who is struggling with basic functionality? The timeline makes this immediately clear, allowing the agent to adjust their communication style and the level of detail in their guidance.
Identifying Power User Patterns
Your power users are a goldmine of behavioral intelligence. They have figured out how to get the most value from your product, and their activity patterns reveal what that looks like in practice. Studying individual power user timelines uncovers patterns that can be replicated across your broader user base.
The Power User Journey
Look at the timelines of your most engaged, longest-tenured, highest-paying customers. What did they do in their first week? What features did they adopt first? How quickly did they reach the activation event? What is the sequence of features they use in a typical session? These patterns define the ideal user journey - the path that, if followed by more users, would produce more power users.
Feature Usage Sequences
Power users often use features in specific combinations and sequences that are not obvious from aggregate data. Maybe they always check the dashboard before building a report. Maybe they use the export function after every analysis. Maybe they have developed a weekly workflow that involves four features in a specific order. Understanding these sequences informs both product design (make the transitions between these features smoother) and onboarding (guide new users toward these sequences earlier).
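One simple way to surface these sequences from reviewed timelines is to count consecutive feature pairs across sessions. The sessions and feature names below are hypothetical:

```python
from collections import Counter

# Hypothetical sessions from several power users.
sessions = [
    ["Dashboard", "Build Report", "Export"],
    ["Dashboard", "Build Report", "Share"],
    ["Dashboard", "Build Report", "Export"],
]

def common_transitions(sessions):
    """Count how often each consecutive pair of features occurs."""
    pairs = Counter()
    for s in sessions:
        pairs.update(zip(s, s[1:]))  # consecutive (from, to) pairs
    return pairs

transitions = common_transitions(sessions)
top_pair, top_count = transitions.most_common(1)[0]
```

The most frequent transitions are candidates for smoothing in the product and for emphasizing in onboarding.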
From Individual to Pattern
The process of identifying power user patterns starts with individual timelines but ultimately becomes a generalized model. Review five to ten power user timelines. Note the commonalities: which actions do most of them take in the first week? Which features do most of them use regularly? Which behaviors distinguish them from average users? These commonalities form the basis of your activation criteria, your onboarding design, and your engagement scoring model. Use KISSmetrics Populations to define these behavioral segments and track them over time.
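The generalization step can be sketched as a simple tally: which first-week actions appear in most of the reviewed timelines? The users, events, and 80% threshold below are illustrative assumptions:

```python
# First-week event sets from a handful of reviewed power-user timelines.
first_week_events = {
    "user_a": {"Signed Up", "Invited Teammate", "Built Report"},
    "user_b": {"Signed Up", "Built Report", "Connected Data"},
    "user_c": {"Signed Up", "Invited Teammate", "Built Report"},
}

def common_actions(event_sets, min_share=0.8):
    """Actions performed by at least `min_share` of the reviewed users."""
    n = len(event_sets)
    counts = {}
    for events in event_sets.values():
        for e in events:
            counts[e] = counts.get(e, 0) + 1
    return {e for e, c in counts.items() if c / n >= min_share}

activation_candidates = common_actions(first_week_events)
```

Actions that survive the threshold are candidates for your activation criteria; actions that only some power users share may still matter but need a larger sample.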
Debugging Conversion Issues
When your conversion funnel shows a problem but the aggregate data does not explain why, individual activity reports are the debugging tool of choice. They let you examine exactly what happened for specific users who did not convert, revealing the root causes that aggregate data hides.
The Debugging Workflow
Start with the funnel report to identify where the drop-off occurs. Then search for users who dropped off at that step within a recent time period. Pull up five to ten of their activity timelines and look for patterns. Did they encounter an error? Did they navigate to an unexpected page? Did they spend an unusually long time on a specific step (suggesting confusion)? Did they return later and complete the step (suggesting it was not a dead end but a delay)?
Common Findings
The most common findings from conversion debugging include: technical errors that prevent form submission or page loading, confusing copy or design that causes users to hesitate, unexpected navigation patterns where users leave the funnel to find information and do not return, price shock where users view pricing and immediately leave, and mobile usability issues where specific elements do not work correctly on certain devices. Each finding points to a specific, addressable problem.
Quantifying the Problem
After identifying a potential cause from individual timelines, return to aggregate data to quantify it. If you noticed that several non-converters encountered an error at step three, check how many total users experienced that error. If 15% of all step-three visitors hit the error, and your step-three conversion rate is 60%, fixing the error could lift conversion to over 69%, assuming the affected users would then convert at roughly the same rate as error-free visitors. This quantification helps you prioritize fixes based on their expected impact.
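The arithmetic behind a projection like this can be made explicit. The key assumption, stated in the comments, is that error-affected users would otherwise convert at the same rate as error-free users; the visitor count is illustrative:

```python
# The projection from the text, made explicit.
visitors = 1000                  # step-three visitors (illustrative count)
conversion_rate = 0.60           # current step-three conversion
error_share = 0.15               # share of visitors who hit the error

# Assumption: current converters are all error-free users, and fixing the
# error lets affected users convert at the error-free rate.
converters = visitors * conversion_rate            # 600
error_free = visitors * (1 - error_share)          # 850
error_free_rate = converters / error_free          # about 0.706

projected = (converters + visitors * error_share * error_free_rate) / visitors
```

Under these assumptions the projected conversion is roughly 70.6%, which is where the "over 69%" estimate comes from; a more conservative model (error users converting below the error-free rate) would land lower.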
Validating Fixes
After implementing a fix, use activity reports to verify that it works. Find users who recently went through the repaired step and review their timelines. Did they encounter the error again? Did they navigate differently? Did they convert? This individual-level validation complements aggregate metrics, which might take days or weeks to reflect the improvement due to sample size requirements.
Privacy and Ethical Considerations
Activity reports provide a detailed view of individual user behavior, which carries responsibility. Using this data ethically and in compliance with privacy regulations is essential.
Access Control
Not everyone in your organization needs access to individual user activity. Limit access to roles that have a legitimate business need: customer support, product management, and customer success. Marketing and sales teams might need access to aggregated or anonymized data but not to individual timelines. Configure role-based permissions to enforce these boundaries.
Purpose Limitation
Use activity data for the purposes for which it was collected: improving the product, providing customer support, and understanding user behavior patterns. Do not use individual user data for purposes that the user would not reasonably expect, such as targeted advertising based on in-product behavior or sharing individual user activity with third parties.
Data Minimization
Track only the events and properties you actually need. Collecting everything “just in case” creates unnecessary privacy risk and data management burden. Review your tracking plan periodically and remove events that are not being used for any active analysis or operational purpose.
Transparency
Your privacy policy should clearly describe what data you collect and how you use it. Users should understand that their interactions with your product are tracked for the purposes of product improvement, customer support, and analytics. Transparency builds trust and ensures compliance with regulations like GDPR and CCPA.
Key Takeaways
Activity reports provide the individual-level perspective that aggregate analytics cannot. They are essential for customer support, product debugging, power user analysis, and conversion optimization.
Aggregate analytics tells you what is happening. Individual activity reports tell you why. The most effective analytics teams use both in combination: aggregate data to identify patterns and priorities, individual data to understand causes and validate solutions. Build the practice of moving fluidly between these two levels, and your analytics will produce insights that no amount of dashboard-watching alone could deliver.
Continue Reading
Metrics Dashboard Setup: Design KISSmetrics Dashboards That Drive Daily Decisions
A well-designed dashboard is glanced at daily and drives weekly decisions. This guide shows you how to configure KISSmetrics dashboards that surface the right numbers for your role.
Funnel Reports: The Complete Guide to Building and Analyzing Conversion Funnels
Funnel reports show you exactly where customers drop off in your conversion process. This guide covers how to build effective funnels, interpret the data, and take action on what you find.
A/B Test Reports: Measure Experiment Impact Beyond the Landing Page
Most A/B test reports stop at conversion rate. KISSmetrics A/B test reports track the downstream impact on revenue and retention, showing whether your winner actually wins where it counts.