Product Management interview question [Metrics]: What data points would you like to track to know if a Zoom meeting was successful or not?
I want to clarify what exactly is meant by a “successful” meeting. In my opinion, a meeting is successful when most of the invited participants attend (some may be absent because they are on PTO, in other meetings, etc.), knowledge is shared during the meeting, the meeting is interactive and participants get to talk to each other, and there are minimal nuisances like background noise. Apart from user-related issues like the ones mentioned above, a meeting can only be successful when the tech does not fail in ways like the Zoom application crashing, users not being able to join the call, or users not being able to share their screen, and when there are no external issues like a poor internet connection. Am I right in understanding what a successful meeting is? (Interviewer: Yes.)
I’ll start by defining what Zoom is. Zoom is an application for real-time audio and video communication. In Zoom’s Basic (free) plan, up to 100 users can join a call of up to 40 minutes. These limits can be removed by upgrading to the Pro plan. Zoom’s major competitors are Cisco WebEx, Microsoft Teams, and Google Meet.
Now, the metrics can be defined under two broad categories -
- User metrics, and
- Technical metrics
Together, these two categories of metrics will capture how well the meetings went for the users.
User metrics: These are the metrics to identify whether the meetings are successful or not by measuring users’ actions. These metrics are -
1. Duration metric: The weekly/monthly average of actual meeting duration as a percentage of the duration the meeting was scheduled for.
2. Attendance metric: The average weekly/monthly attendance percentage, i.e. Number of users who attended / Number of users invited (a small computation sketch follows this list).
3. Efficiency metric: The average weekly/monthly percentage of participants who join the meeting and sit through at least 75% of the total duration for which the meeting was held.
4. Knowledge metric: Some metrics to measure the amount of knowledge imparted in meetings -
a. Duration of screen sharing per Zoom call
b. Number of different participants who shared the screen per Zoom call
c. Average number of links (docs, sheets, Confluence, Metabase, GitHub) shared
d. Percentage of meetings recorded as compared to total number of meetings held
e. Percentage of users (excluding the one sharing the screen) who switch away from the Zoom window for more than 10% of the entire call duration
5. Interaction metrics: These are the numbers that determine how interactive the participants were during these meetings -
a. Average number of messages sent on the Zoom chat per week/month
b. Average number of emojis sent on the Zoom call
c. Average number of participants who turn on their videos at least for 50% of the entire call duration
d. Average weekly/monthly ratio of participants who speak in the call: Number of participants who speak at least once in the call / Total number of participants
e. Average number of breakout rooms created per call per month
6. Nuisance metrics: Nuisances, like participants speaking unnecessarily when not asked to, can cause a bad experience for other participants, and such a meeting will not be called as successful as it should be -
a. Average number of times the participants were removed from the meeting by the host.
b. Average number of participants who were removed from the meeting by the host.
c. Average number of times the host had to mute the participants due to some background noise or other such reasons.
d. Average number of participants the host had to mute due to some background noise or other such reasons.
e. The number of times the host had to turn off the chat for the participants.
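To make the ratio-based user metrics above concrete, here is a minimal Python sketch of how they could be computed over a week or month of meetings. Everything in it is an assumption for illustration: the Meeting record, its field names, and the sample numbers are hypothetical and do not reflect Zoom’s actual data model.

```python
# Minimal sketch (hypothetical data model, not Zoom's actual schema) of how a
# few of the user metrics above could be computed for a batch of meetings.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Meeting:
    scheduled_minutes: float  # how long the meeting was booked for
    actual_minutes: float     # how long it actually ran
    invited: int              # number of invitees
    attended_minutes: List[float] = field(default_factory=list)  # minutes each attendee stayed
    spoke: List[bool] = field(default_factory=list)              # whether each attendee spoke at least once

def duration_metric(meetings: List[Meeting]) -> float:
    """Average actual duration as a percentage of scheduled duration."""
    return 100 * sum(m.actual_minutes / m.scheduled_minutes for m in meetings) / len(meetings)

def attendance_metric(meetings: List[Meeting]) -> float:
    """Average percentage of invitees who actually joined."""
    return 100 * sum(len(m.attended_minutes) / m.invited for m in meetings) / len(meetings)

def efficiency_metric(meetings: List[Meeting]) -> float:
    """Average percentage of attendees who stayed for at least 75% of the meeting."""
    shares = [
        sum(1 for t in m.attended_minutes if t >= 0.75 * m.actual_minutes) / len(m.attended_minutes)
        for m in meetings if m.attended_minutes
    ]
    return 100 * sum(shares) / len(shares)

def speaking_ratio(meetings: List[Meeting]) -> float:
    """Average ratio of participants who spoke at least once to total participants."""
    ratios = [sum(m.spoke) / len(m.spoke) for m in meetings if m.spoke]
    return sum(ratios) / len(ratios)

# Example: one week of made-up meetings
week = [
    Meeting(30, 28, invited=6, attended_minutes=[28, 28, 25, 10], spoke=[True, True, False, False]),
    Meeting(60, 45, invited=10, attended_minutes=[45, 45, 40, 40, 5], spoke=[True, False, True, False, False]),
]
print(f"Duration:   {duration_metric(week):.0f}%")   # 84%
print(f"Attendance: {attendance_metric(week):.0f}%")  # 58%
print(f"Efficiency: {efficiency_metric(week):.0f}%")  # 78%
print(f"Speaking:   {speaking_ratio(week):.2f}")      # 0.45
```

Each function returns a single weekly/monthly aggregate, which is how the metrics above are framed; in practice these would be computed from call telemetry rather than hand-built records.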
Technical metrics: These are the metrics to identify whether the meetings were successful or not by measuring how Zoom’s technology performed (a small aggregation sketch follows this list). These metrics are -
- Average number of times the Zoom app crashed during a call per week/month
- Average number of times users had to rejoin a call (a noisy signal, since rejoins can be caused by a poor internet connection rather than by Zoom itself)
- Average number of times users wanted to perform an action (share screen, mute/unmute themselves, start/stop video) but were not able to
- Average number of times users were not able to join a Zoom call
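Similarly, here is a minimal sketch of how these technical metrics could be aggregated from client event logs. The event types and log schema are assumptions for illustration, not Zoom’s real telemetry.

```python
# Minimal sketch (assumed event schema) of aggregating technical failure
# metrics into per-call averages for a given week/month.
from collections import Counter
from typing import Dict, List

# Hypothetical failure event types emitted by the client during a call
EVENTS = ["app_crash", "rejoin", "action_failed", "join_failed"]

def technical_metrics(event_logs: List[Dict[str, str]], total_calls: int) -> Dict[str, float]:
    """Average number of each failure event per call over the reporting period."""
    counts = Counter(e["type"] for e in event_logs if e["type"] in EVENTS)
    return {event: counts[event] / total_calls for event in EVENTS}

# Example: a made-up week with 200 calls
logs = [
    {"call_id": "a1", "type": "app_crash"},
    {"call_id": "a1", "type": "rejoin"},
    {"call_id": "b2", "type": "action_failed"},  # e.g. screen share button did nothing
    {"call_id": "c3", "type": "join_failed"},
    {"call_id": "c3", "type": "rejoin"},
]
print(technical_metrics(logs, total_calls=200))
# {'app_crash': 0.005, 'rejoin': 0.01, 'action_failed': 0.005, 'join_failed': 0.005}
```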
These are all the metrics that I think Zoom can track to understand users’ sentiment about different features and the quality of Zoom calls taking place on the platform. These can obviously be prioritised according to the features PMs are focusing on, to understand their impact on call quality.