What data points would you like to track to know whether a Zoom meeting was successful or not?

  Google
Answers (2)

Before answering this Google metrics question, I want to clarify what exactly is meant by a “successful” meeting. In my opinion, a meeting is successful when it is attended by a large share of the invited participants (some may be absent as they might be on PTO, in other meetings, etc.), knowledge is shared during the meeting, it is interactive in that participants get to talk to each other, and there are minimal nuisances like background noise. Apart from user-related issues like those mentioned above, a meeting is only successful when the tech doesn’t fail (the Zoom application crashing, users not being able to join a call, users not being able to share their screen, etc.) and when external factors like a poor internet connection don’t get in the way. Am I right in my understanding of what a successful meeting is? – Yeah

I’ll start by defining what Zoom is. Zoom is an application for real-time audio and video communication. On Zoom’s Basic (free) plan, up to 100 users can join a call of up to 40 minutes. These limits can be removed by upgrading to the Pro plan. Zoom’s major competitors are Cisco WebEx, Microsoft Teams, and Google Meet.

Now the different metrics can be defined under two broad categories –

  1. User metrics, and
  2. Technical metrics

These two categories of metrics will capture how well the meetings went for the users.

 

User metrics – These are the metrics that identify whether meetings are successful or not by measuring users’ actions. These metrics are –

 

  1. Duration metric: The weekly/monthly average percentage of the duration a meeting actually lasted vs the duration it was scheduled for.
  2. Attendance metric: The average weekly/monthly attendance percentage – Number of users who attended/Number of users invited
  3. Efficiency metric: The average weekly/monthly percentage of participants who join a meeting and sit through at least 75% of its total duration.
  4. Knowledge metric – Some metrics to measure the amount of knowledge imparted in meetings –
    1. Duration of screen sharing per Zoom call
    2. Number of different participants who shared the screen per Zoom call
    3. Average number of links (docs, sheets, Confluence, Metabase, GitHub) shared
    4. Percentage of meetings recorded as compared to total number of meetings held
    5. Percentage of users (except the one sharing the screen) who switch away from the Zoom window for more than 10% of the entire call duration
  5. Interaction metrics – This includes the numbers to determine how interactive the participants were during these meetings –
    1. Average number of messages sent on the Zoom chat per week/month
    2. Average number of emojis sent on the Zoom call
    3. Average number of participants who turn on their videos at least for 50% of the entire call duration
    4. Average weekly/monthly ratio of participants who speak in the call – Number of participants who speak at least once in the call/Total number of participants
    5. Average number of breakout rooms created per call per month
  6. Nuisance metrics – Multiple nuisances, like participants speaking when not asked to, can cause a bad experience for the other participants, and such a meeting will not be as successful as it should be –
    1. Average number of times the participants were removed from the meeting by the host.
    2. Average number of participants who were removed from the meeting by the host.
    3. Average number of times the host had to mute the participants due to some background noise or other such reasons.
    4. Average number of participants the host had to mute due to some background noise or other such reasons.
    5. The number of times the host had to turn off the chat for the participants.
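To make the attendance and efficiency metrics above concrete, here is a minimal Python sketch of how they could be rolled up across meetings. The `Attendance` record and its fields are hypothetical; the actual shape of Zoom’s telemetry would differ.

```python
from dataclasses import dataclass

@dataclass
class Attendance:
    invited: int          # number of users invited to the meeting
    joined: int           # number who actually joined
    stay_fractions: list  # fraction of the meeting each joiner stayed for (0.0-1.0)

def attendance_pct(m: Attendance) -> float:
    """Attendance metric: users who attended / users invited, as a percentage."""
    return 100.0 * m.joined / m.invited if m.invited else 0.0

def efficiency_pct(m: Attendance, threshold: float = 0.75) -> float:
    """Efficiency metric: % of joiners who sat through at least 75% of the call."""
    if not m.stay_fractions:
        return 0.0
    engaged = sum(1 for f in m.stay_fractions if f >= threshold)
    return 100.0 * engaged / len(m.stay_fractions)

# Weekly average across all meetings held that week (made-up numbers)
meetings = [
    Attendance(invited=10, joined=8,
               stay_fractions=[1.0, 0.9, 0.8, 0.8, 0.6, 0.5, 1.0, 0.95]),
    Attendance(invited=5, joined=5,
               stay_fractions=[1.0, 1.0, 0.7, 0.9, 0.3]),
]
avg_attendance = sum(attendance_pct(m) for m in meetings) / len(meetings)  # 90.0
avg_efficiency = sum(efficiency_pct(m) for m in meetings) / len(meetings)  # 67.5
```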

 

Technical metrics – These are the metrics that identify whether meetings are successful or not by measuring how Zoom’s tech performed. These metrics are –

  1. Average number of times the Zoom app crashed during a call per week/month
  2. Average number of times the users had to rejoin the call (very inaccurate as this can be due to poor internet connection)
  3. Average number of times the user wanted to do an action (share screen, unmute/mute themselves, start/stop video) but weren’t able to
  4. Average number of times users were not able to join a Zoom call
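As a rough sketch of how these technical metrics could be aggregated, assuming a simple (hypothetical) failure log where each incident is recorded as a `(call_id, event_type)` pair:

```python
# Hypothetical per-call failure log: (call_id, event_type) pairs.
events = [
    ("c1", "crash"), ("c1", "rejoin"), ("c2", "rejoin"),
    ("c2", "rejoin"), ("c3", "join_failed"),
]

def avg_per_call(events, event_type, total_calls):
    """Average number of occurrences of one failure type per call."""
    count = sum(1 for _, e in events if e == event_type)
    return count / total_calls if total_calls else 0.0

total_calls = 3
avg_crashes = avg_per_call(events, "crash", total_calls)   # crashes per call
avg_rejoins = avg_per_call(events, "rejoin", total_calls)  # rejoins per call
```

As the answer notes, the rejoin count is noisy: without extra signals it cannot distinguish a Zoom failure from a user’s poor internet connection.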

These are all the metrics that I think Zoom can track to understand users’ sentiment about different features and the quality of the Zoom calls taking place on the platform. These can obviously be prioritised according to the features PMs are focusing on.

Understanding the Problem

First off, let’s establish a common definition of what a Zoom meeting is and how users use it. Zoom meetings are virtual meetings that can have multiple participants. They can be accessed via web browser, desktop app, or mobile app. Users can speak with each other, use text chat, share their screens, and turn on their cameras. I’d say the majority of Zoom calls occur in professional settings, but it is used by people in their personal lives as well.

In order to set the stage for this problem, let’s examine it through the lens of Zoom the company and its mission statement. Zoom’s mission is to make video communication frictionless. While that is succinct and straightforward, I think it’s worth teasing out what makes a Zoom meeting frictionless and successful:

  • Logistics (scheduling, inviting, joining) go smoothly
  • Audio and video quality is clear
  • No connectivity issues
  • Users participate in the meeting

Metrics

In order to track whether or not a Zoom meeting was successful, let’s brainstorm some metrics that correspond to the above aspects of what makes a Zoom meeting successful:

  • Logistics
    • # of times a meeting was rescheduled
    • % of invitees who accepted the invite
    • % of participants who joined within 2 minutes of the scheduled start time
    • # of additional participants who weren’t invited
  • Call Quality
    • Instances of background noise (dog barking, phone ringing, etc)
    • # of times a speaker was asked to repeat something
    • Average volume level of speakers
  • Connectivity
    • # of times a participant was fully dropped from the meeting
    • # of instances where a participant’s screen froze or their audio cut out
    • Average # of attempts before a user could reconnect
  • Participation
    • Average % of time Zoom is the main window
    • Average time spent speaking per participant
    • Average # of chat messages sent per participant
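The logistics metric “% of participants who joined within 2 minutes of the scheduled start time” could be computed along these lines (the join-timestamp input format is an assumption for illustration):

```python
from datetime import datetime, timedelta

def pct_on_time(scheduled_start, join_times, window_minutes=2):
    """% of participants who joined within `window_minutes` of the scheduled start."""
    if not join_times:
        return 0.0
    cutoff = scheduled_start + timedelta(minutes=window_minutes)
    on_time = sum(1 for t in join_times if t <= cutoff)
    return 100.0 * on_time / len(join_times)

# Example: five participants, three of whom join within the 2-minute window.
start = datetime(2023, 5, 1, 10, 0)
joins = [start + timedelta(minutes=m) for m in (0, 1, 1.5, 3, 5)]
on_time_pct = pct_on_time(start, joins)  # 60.0
```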

Prioritization

While I think each of the above metrics has its own place, there are a lot of them, and it might be worth focusing on just a few:
  1. % of participants who joined within 2 minutes of the meeting starting
    • This is somewhat downstream and envelops a lot of the logistical issues around scheduling, joining, and starting a Zoom meeting
  2. Average # of times a participant’s video or audio froze
    • Want to measure the average instead of the absolute count, as the absolute count would be skewed upwards by larger meetings
  3. Average % of the time Zoom is the main window
    • I imagine the majority of participants not paying attention are doing something else in another tab
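The reasoning in point 2 – averaging freezes per participant rather than counting absolute freezes – can be illustrated with a tiny sketch (the numbers are made up):

```python
def avg_freezes_per_participant(freeze_count, participant_count):
    """Normalize total freeze events by meeting size so big meetings don't skew the metric."""
    return freeze_count / participant_count if participant_count else 0.0

# A 100-person meeting with 20 freezes looks worse than an 8-person meeting
# with 4 freezes on absolute counts, but better once normalized.
small_meeting = avg_freezes_per_participant(4, 8)     # 0.5 freezes per participant
large_meeting = avg_freezes_per_participant(20, 100)  # 0.2 freezes per participant
```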

Downsides

I am confident that our chosen metrics will do a good job of tracking whether or not a Zoom meeting was successful. With that said, no metric is perfect, and I think it’s worth calling out their potential downsides:
  1. % of participants who joined within 2 minutes of the meeting starting
    • As this is downstream from some of the other logistics actions, it doesn’t give us insight into what the process was like for preparing the meeting. For example, it could be a really painful process to create and invite users to a scheduled Zoom meeting.
  2. Average # of times a participant’s video or audio froze
    • This lets us know a problem is happening, but doesn’t give us much insight into why. For example, their Zoom version might not be playing nice with their device’s hardware, or they could be on a poor mobile connection; we wouldn’t know.
  3. Average % of the time Zoom is the main window
    • If the meeting is purely audio without video or screensharing I could see this metric not being particularly accurate. Additionally, if external links are being shared in the chat or users are actively Googling the topics being discussed we could be miscounting their active participation as inactive.

Summary

In order to assess whether or not a Zoom meeting is successful – meaning proper handling of scheduling logistics, call quality, and participation – we’ll monitor the following metrics:
  1. % of participants who joined within 2 minutes of the meeting starting
  2. Average # of times a participant’s video or audio froze
  3. Average % of the time Zoom is the main window