Redesigning the Remote Critique Process

By Rochelle Dai, Sienna Gonzalez, Morgan Creek, and Gennifer Hom

Design Interactive

--

Problem Statement

For the 2020–2021 school year, UC Davis Design is offering only remote courses. Design students still need a comfortable, productive way to receive feedback on their projects, but remote critique sessions bring challenges that in-person sessions do not.

Our problem statement was, “How might we create a comfortable and beneficial critique environment for design students to receive feedback on their projects in a remote setting?”

Our solution is a platform that increases transparency between participants and centralizes feedback for future reference.

Final Product

Video by Rochelle Dai

Our Team

Timeline

Here’s what our design process looked like. We spent the majority of the five-week sprint conducting research and prototyping.

Our user research continued into the ideation phase to allow adequate time studying our main user groups. A key component of our process was conducting user testing between our lo-fi iterations.

User Research

We knew the design critique process involved two groups of users: professors and students. To get rapid student feedback, we created a survey to gather data about their critique experience. For professors, we conducted 4 one-on-one interviews to learn more about their experience hosting critiques. Our goal was to find the strengths and weaknesses of both in-person and online environments so we could compare the two.

We surveyed and interviewed 20 students and professors about their experience with remote critique and found 3 common pain points:

A lack of personal connections between students and between professors and students

“I don’t feel very connected to my professor and peers…We can hide behind our screens.” — Student

Frequent awkward silences where students were waiting for one another to speak

“It feels awkward sitting there in silence, especially when my camera is on while waiting for at least one person to give me a critique.” —Student

Disorganized documentation: feedback was scattered across different documents and locations, making it difficult to reference and implement in future work

“Students never read the written critiques and there is no change in work. It is a wasted opportunity.” —Design Professor

Brainstorming

From our research, we used affinity mapping to identify common themes within our data and organized them into distinct pain points.

Affinity mapping exercise done through Figma

User Research: Synthesis

We recognized that the remote critique experience is a multi-step process, which we called the critique journey: what happens Before critique, During critique, and After critique. We wanted our solution to account for each step of the experience.

Dissecting the Data

From our surveys and interviews, we received many contrasting responses about the experience during critique.

Example 1: Students have varying comfort levels regarding microphone use.

One student felt that using the mic was nerve-wracking, whereas another disliked when the audience muted themselves.

“More people can contribute by typing in the chat. It’s not as nerve wracking as speaking in front of everyone.” — Student

“I dislike that people just mute themselves and turn their cameras off.” — Student

Example 2: Each professor structures their class differently, which affects the efficiency of critique. Students reported varying experiences with how efficiently critique sessions ran.

“It’s a lot quicker because we do it in breakout sessions and get 1-on-1 critiques with the professor.” — Student

“It would be great if professors had a time limit because these design critiques take way too long.” — Student

Example 3: We also found that students felt the text-based format of critique had both pros and cons.

Some students found this format beneficial when receiving feedback, whereas others found it more time-consuming when providing feedback.

“It’s in written form so you can easily refer back to them and keep them in your notes.” — Student

“It can take more […]. I have to type things out and circle them.”—Student

Ideation: Sketches

After categorizing our user research, we moved onto the ideation phase.

Examples of a few sketches we did to solve some of the problems from our data and brainstorming

After brainstorming problem statements for each step of the critique journey, we created rough sketches of possible solutions.

Each team member explored solutions to a couple of different problem statements, ideating designs to alleviate the critical pain points found during our research and depicting them with digital sketches.

Before Critique

Low Fidelity Prototype

We first wanted to increase personal connections and ease anxiety during the Before critique stage to address the feedback from our survey and interviews.

As a result, we implemented an icebreaker question to create a friendly atmosphere and foster communication before critique, similar to how we used to chat with our peers before in-person critiques.

We also let users choose their own presentation time to reduce ambiguity and anxiety.

User Testing

We then conducted user testing and learned that our lobby screens lacked hierarchy—users did not know where to look or what actions to take.

“I’m not sure where to look or what to do when I see this screen?”—User

High Fidelity Prototype

We simplified the Before critique process by separating it into 2 distinct steps:

  1. Selecting a presentation time
  2. Answering the icebreaker question

High fidelity prototype of the user flow for entering a critique session

During Critique

Low Fidelity Prototype

We wanted to minimize the feeling of awkward silence during critique. A memorable quote from a user stated:

“It feels awkward sitting there in silence, especially when my camera is on while waiting for at least one person to give me a critique.”

First iteration of the emotes interaction during our low fidelity prototype phase

To combat this, we decided to implement emotes—this would allow the audience to quickly show the presenter that they are actively engaged during the critique process. The emotes would be visible to all attendees, similar to Instagram Live. We hoped this would help offset longer periods of silence in a fun and encouraging way.

User Testing Round 1

From user testing, we learned that public emotes would actually make students more self-conscious about their work.

“I would feel bad if emotes were visible to everyone. I would compare myself to other students and it would make me self-conscious of my work.”—User

Mid-Fidelity Prototype

Second iteration of our emotes feature after attempting to address user feedback

From here, we wanted to shift emotes from a performative action into a static tool: a sticker that students could place onto the artwork to mark their favorite areas.

Only the presenter would be able to view these stickers—we felt this would be a better way for students to receive more specific feedback without having to compare emotes between peers.

User Testing Round 2

After another round of user testing, we discovered our emotes were still not functioning quite the way we wanted them to. Users mentioned that the sticker function was redundant and unintuitive because it was essentially another form of annotation.

“I saw the emotes as a reaction tool and it wasn’t transparent to me that I can click and drag them as stickers.”—User

High Fidelity Prototype

Final iteration of the emotes feature

We simplified the emotes so that students react to the overall piece rather than to specific areas.

To address user feedback, we made the emotes more subtle and provided the option to hide the chat and emotes to make them less distracting. We limited emotes to positive options because we wanted to encourage users to provide any constructive feedback in the form of written critique, and because the emotes are intended to show positive engagement with the students’ work.
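To make the final behavior concrete, here is a minimal sketch of how the emote logic could be modeled. This is purely illustrative: the names (PositiveEmote, CritiqueSession) and the specific emote set are our own assumptions, not part of the actual prototype.

```typescript
// Illustrative model of the final emote behavior: reactions apply to the
// whole piece, only positive options exist, and the overlay can be hidden.
type PositiveEmote = "clap" | "heart" | "star";

interface EmoteEvent {
  emote: PositiveEmote;
  senderId: string;
  timestamp: number;
}

class CritiqueSession {
  private emotes: EmoteEvent[] = [];
  // Per-viewer preference: hide the chat/emote overlay to reduce distraction.
  overlayHidden = false;

  // Reactions target the overall piece (no coordinates), unlike the earlier
  // sticker iteration that pinned emotes to specific areas of the artwork.
  react(senderId: string, emote: PositiveEmote): void {
    this.emotes.push({ emote, senderId, timestamp: Date.now() });
  }

  toggleOverlay(): void {
    this.overlayHidden = !this.overlayHidden;
  }

  // Tally for display, e.g. { clap: 3, heart: 1 }.
  counts(): Record<string, number> {
    const tally: Record<string, number> = {};
    for (const e of this.emotes) {
      tally[e.emote] = (tally[e.emote] ?? 0) + 1;
    }
    return tally;
  }
}
```

Restricting the union type to positive emotes enforces the design decision in code: constructive criticism has to go through written critique instead.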

After Critique

Low Fidelity Prototype

We wanted to make referencing the feedback after critique easier for students.

We centralized all the feedback a student received (annotations, chat messages, and the professor’s critique) into one document that students could easily reference after class.

Low fidelity prototype of what the final documentation of feedback could look like
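As a rough illustration, the centralized document can be thought of as one record that merges the three feedback streams. The types below are hypothetical and only meant to show the shape of the data, not a real schema.

```typescript
// Hypothetical shape of the centralized feedback document.
interface ChatMessage {
  author: string;
  text: string;
  timestamp: number;
}

interface Annotation {
  kind: "text" | "drawing";
  x: number; // position on the artwork, in pixels
  y: number;
  content: string;
}

interface FeedbackDocument {
  presenter: string;
  annotations: Annotation[];
  chat: ChatMessage[];
  professorCritique: string;
}

// Merge the separate feedback sources into a single reference document.
function buildFeedbackDocument(
  presenter: string,
  annotations: Annotation[],
  chat: ChatMessage[],
  professorCritique: string
): FeedbackDocument {
  return { presenter, annotations, chat, professorCritique };
}
```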

User Testing

User testing brought up an edge case where there might be multiple annotations in the same area.

“What if there are multiple annotations in the same area?”—User

High Fidelity Prototype

To address this, we combined annotations from the same area and added an option for users to filter between text and hand-drawn annotations. This is a feature that could benefit from more user testing, especially for larger class sizes.
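One possible way to handle that edge case is to group annotations whose positions fall within a small radius and let viewers filter a group by type. The sketch below is our own assumption (including the clustering radius), not the behavior of a shipped implementation.

```typescript
// Illustrative sketch: cluster annotations that land in the same area,
// then filter a cluster by annotation type.
interface Annotation {
  kind: "text" | "drawing";
  x: number; // position on the artwork, in pixels
  y: number;
  content: string;
}

const CLUSTER_RADIUS = 40; // assumed threshold for "same area"

function clusterAnnotations(annotations: Annotation[]): Annotation[][] {
  const clusters: Annotation[][] = [];
  for (const a of annotations) {
    // Join the first cluster whose anchor lies within the radius.
    const home = clusters.find(
      (c) => Math.hypot(c[0].x - a.x, c[0].y - a.y) <= CLUSTER_RADIUS
    );
    if (home) {
      home.push(a);
    } else {
      clusters.push([a]);
    }
  }
  return clusters;
}

// Viewer-side filter: show only text or only hand-drawn annotations.
function filterByKind(
  cluster: Annotation[],
  kind: Annotation["kind"]
): Annotation[] {
  return cluster.filter((a) => a.kind === kind);
}
```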

Final Product

Having seen each step of the process broken down, here is how it all comes together to form a user-centered remote critique session.

Reflection

Future Changes

We would like to accommodate various learning styles, improve the professors’ experience with conducting in-class critique, and further refine the emotes by testing them with a larger audience.

What We Learned

  • How to conduct face-to-face research interviews in a remote setting
  • How to design with the user flow in mind
  • How to incorporate diverse user feedback into a design solution

Conclusion

Through user surveys and interviews, we learned that users feel a range of complex emotions before, during, and after a remote design critique session. These responses often conflict with one another, which is part of the challenge of creating a product that fits a diverse audience.

By integrating user testing in between lo-fi, mid-fi, and hi-fi prototypes, we were able to further refine our solution and gain a better understanding of how the user feels during the critique process.

Finally, if we had more time, we would want to further test our product and explore the professors’ experience with conducting in-class critique.

Overall, our team learned how to design with the user flow and emotions in mind and incorporate diverse feedback. Our team won the Most Innovative UX award and the Audience Choice award on our final presentation day.
