CapCut Web app

Usability Testing

Why CapCut?

CapCut is the most popular free online video editing software available right now, with over 500 million users worldwide. With a wide array of templates and powerful editing tools, creators can quickly craft unique videos to share with their audience.

Video editing has surged in popularity with the rise of the short-form video format popularized by TikTok, Instagram Reels, and YouTube Shorts.

CapCut could see even more growth if its tools prove easy to use and accessible to users.

I created and conducted a usability test with 5 participants, covering 6 tasks based on potential issues I identified while exploring CapCut's web application. After analyzing the results, I summarized the issues I found and provided 7 recommendations.

Overview

I created a 10-minute walkthrough of my findings and recommendations for stakeholders.
View it here, or read on to see my full process.

For closed captions, timestamps, and speed controls, you can watch the video on Loom: Watch on Loom
The handoff document mentioned in the conclusion can be viewed here: View Handoff Page

Setup

PHASE ONE

CapCut provides users with hundreds of customizable templates covering a wide array of topics that drastically lower the skill floor required to create videos.

They also provide powerful editing tools, such as text-to-speech and keyframe manipulation, which may be too advanced for novice users.

I created 6 scenarios in total. The first three were designed to test how intuitive manipulating these templates was for inexperienced users. The final three explored whether users could locate and implement popular features.

Research Goals

Implementation

Are users able to locate and implement popular features?

Personalize

Are users able to combine templates and their own content?

Customize

Are users able to manipulate templates to suit their needs?

PHASE TWO

Testing

Five participants with varying skill levels in video editing software were selected.

The tests took place over Zoom; each participant was given remote control of my mouse in order to take the test.

I introduced scenarios, encouraged the user to voice their thoughts and actions aloud, and gathered data through interviews after each task.

PHASE THREE

Analysis

With testing complete, I identified CapCut's pain points and strengths by reviewing each recording and the actions participants took.

  • Task success rates and difficulty ratings were compared.

  • Similar actions and comments were noted using a rainbow spreadsheet.

  • Impactful quotes and suggestions were pulled.
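The quantitative side of this analysis is straightforward to aggregate. As a minimal sketch, the snippet below shows how per-task success rates and average difficulty ratings might be computed; all participant data in it is hypothetical, for illustration only.

```python
from statistics import mean

# Hypothetical rows: (task, participant, completed?, difficulty 1-5, 5 = hardest)
results = [
    ("Task 1", "P1", True, 2), ("Task 1", "P2", True, 1),
    ("Task 1", "P3", False, 4), ("Task 1", "P4", True, 2),
    ("Task 1", "P5", True, 3),
    ("Task 2", "P1", False, 5), ("Task 2", "P2", True, 3),
    ("Task 2", "P3", False, 4), ("Task 2", "P4", True, 4),
    ("Task 2", "P5", False, 5),
]

def summarize(results):
    """Return {task: (success_rate, mean_difficulty)}."""
    tasks = {}
    for task, _participant, completed, difficulty in results:
        tasks.setdefault(task, []).append((completed, difficulty))
    return {
        task: (
            sum(c for c, _ in rows) / len(rows),  # fraction who completed
            mean(d for _, d in rows),             # average difficulty rating
        )
        for task, rows in tasks.items()
    }

for task, (rate, difficulty) in summarize(results).items():
    print(f"{task}: {rate:.0%} success, avg difficulty {difficulty:.1f}")
```

Comparing these two numbers side by side is useful: a task can have a high success rate yet a high difficulty rating, which usually signals a workaround rather than a smooth path.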


Recommendations

Using the data gathered from my user tests, as well as knowledge of typical UX conventions and patterns, I provided 7 recommendations.

Improve User Efficiency

Communicate Consistently

Tasks should be either completely impossible or easily accessible. Workarounds are never a solution.

Encourage Intuitive Interaction

Clearly communicate the ability to replace images by dragging and dropping.

Accessibility + Power User Options

Add additional shortcuts to the contextual menu for quick access to frequently used tools.

Enhance Interface Clarity

Improve Visual Cues

Remove the “uneditable” overlay from parts of the template that are editable.

Reduce Guesswork

Conduct competitive analysis to identify popular features and opportunities for improvement.

Streamline User Support

Actionable Help

Replace generic options with problem-solving focused content like tutorials and contextual tooltips.

In-Context Learning

Implement on-demand tooltips for specific functionalities within the editor, eliminating the need to exit the editing flow.


CONCLUSION

Key takeaways and learnings

Preparation before conducting a study is key.

I actually conducted six interviews, and the first one ended up serving as a “usability test” of my usability test! After the adjustments I made based on what I learned from that first session, it was too different to compare accurately with the rest of the interviews.

It was tempting to make more changes as I discovered more potential confusion, but I knew this would cause too many problems during analysis. I resisted the temptation and kept the level of instruction and verbiage consistent.

Providing Recommendations

The recommendations I made were based on the qualitative data I gathered. Their presentation could have been much more convincing if I had provided examples and mock-ups showing what implementing my suggestions would look like.

Creating an asynchronous deliverable in the form of a video was an extremely long process, but it was rewarding in the end. Taking the time to present my work in a 10-minute video was a good move, as colleagues are more likely to watch a video than to read written recommendations in a report format.
