Design Process
The second round of BVW is the “naive user round” and is notoriously the most difficult project in the course. Built around the concept of indirect control, the prompt for this round was to design a world in which the guest can complete a task or move through a narrative without any explicit direction from the team. The guest must also have agency and the ability to make choices. The main challenge for the team is designing a world that guides the guest toward the correct choices so that they complete the experience; part of the grade for the assignment depended on correctly predicting the choices the guest would make.
Brainstorming
For this round, we used Notion to brainstorm ideas, with every team member contributing at least two concepts. We focused on interactions that are intuitive and can be easily conveyed through visual cues. We ultimately settled on making a photography game, since the idea would be easy to convey – when the guest enters the world and sees a camera in their hand (the controller), their instinct will be to look for something to photograph. In our initial conception, the guest was a researcher in a magical forest filled with different types of plants and animals they could photograph, each with its own behavior patterns and interactions. The experience would end with an encounter with a dangerous monster that resides in the forest. However, as we began breaking the concept down into tasks and reflected on our experience from round 1, we realized that the scope was too large for a two-week project. We decided to start with just one creature – a papier-mâché-style butterfly we decided to call “Kiki-flies” – and expand if time permitted.
Playtesting & Iteration
After one week, we had a solid version of the game complete, with nearly all the art and sound assets finished as well as most gameplay mechanics. This made the playtest feedback much more constructive, since guests got a very good sense of the intended experience: the feedback centered on things testers wished they could do, rather than on things that weren’t working or were confusing because they weren’t finished. For this round, in addition to the playtest session conducted with classmates and instructors after the first week, we were also required to playtest with at least 3 people outside of the course. I was responsible for taking notes on guests’ behavior and feedback during the experience. Getting guests to think out loud about what they were doing and feeling as they went through the game was especially important, because we needed to catch anything that caused confusion or was difficult to navigate. Feedback on the initial prototype was overwhelmingly positive: guests enjoyed the environment and music and were interested in exploring. They looked at their hands, saw that they had been given a camera and a notebook filled with drawings of butterfly creatures, and easily pieced together their goal. Reviewing their feedback gave us a clear set of tasks to complete in order to refine and improve the experience for the final.
| Feedback | Resolution |
|---|---|
| The first 3-4 butterflies were easy to find. The remaining 4 were more difficult, and after about a minute of searching without finding anything new, guests would snap pictures at random. | We designed the experience so that guests can choose to end at will once they have collected at least 4 butterfly photos. We didn’t want it to be too easy for guests to get all the photos, as the purpose of the game is to explore. For the remaining 4 butterflies, we tweaked and tested location, color, and size until guests could find all 8 without it being too easy and making the experience too short or flat. |
| Guests were not interested in taking good photos; they just wanted to win. | Programmers implemented a photo-grading system, with each photo receiving a “perfect”, “great”, “good”, or “ok” score (a sketch of one possible grading check follows this table). This incentivized guests to retake bad pictures. |
| Guests wanted more of a challenge and more interesting behavior from the butterflies. | During the initial playtests, all the butterflies that moved followed the same basic path. After finalizing the locations for each butterfly, the artists added variation to their paths. We had initially planned to add more complex behavior, but this ended up being out of scope. |
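
The grading logic itself isn’t detailed here, but the core idea is to score a shot by how close the Kiki-fly is and how well it is centered in the frame. Below is a minimal Unity-style C# sketch of that kind of check – the class name, weights, and thresholds are illustrative assumptions, not our actual code.

```csharp
using UnityEngine;

// Illustrative photo-grading sketch: the grade improves the closer the subject
// is to the camera and the closer it sits to the center of the frame.
// All names, weights, and thresholds here are hypothetical.
public static class PhotoGrader
{
    public static string Grade(Camera camera, Transform subject)
    {
        // Where does the subject land in viewport space? (0.5, 0.5) is dead center.
        Vector3 viewportPos = camera.WorldToViewportPoint(subject.position);

        // Behind the camera or outside the frame: no usable photo.
        if (viewportPos.z <= 0f ||
            viewportPos.x < 0f || viewportPos.x > 1f ||
            viewportPos.y < 0f || viewportPos.y > 1f)
        {
            return "miss";
        }

        // Centering: 1 at the center of the frame, 0 at the edges.
        float offCenter = Vector2.Distance(
            new Vector2(viewportPos.x, viewportPos.y), new Vector2(0.5f, 0.5f));
        float centering = 1f - Mathf.Clamp01(offCenter / 0.5f);

        // Proximity: 1 right next to the subject, 0 at 20+ meters away.
        float distance = Vector3.Distance(camera.transform.position, subject.position);
        float proximity = 1f - Mathf.Clamp01(distance / 20f);

        // Weight centering slightly more than proximity, then bucket into grades.
        float score = 0.6f * centering + 0.4f * proximity;
        if (score > 0.85f) return "perfect";
        if (score > 0.65f) return "great";
        if (score > 0.4f)  return "good";
        return "ok";
    }
}
```

A real version would also need something like a line-of-sight check so a butterfly isn’t graded through a tree – exactly the kind of edge case playtesting tends to surface.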
Challenges
VR - For me and most of my teammates, this was our first time using VR as a platform. There was a learning curve there, and an extra challenge for me personally, since I had only used Unity in very limited capacities before this. We also ran into a last-minute problem: most of our testing was done with the headset connected directly to a PC, but for the final the game needed to run standalone on the headset. Our environment ended up being too large to run without lag, and we needed to scale it down.
Prioritizing Feedback - In our eagerness to improve based on the playtest feedback, we made a list of updates and tried to tackle everything at once instead of prioritizing, which left us very little time to playtest again before the final.
Designing for Biases and Experience - We found that people who were familiar with games like Pokemon Snap or other puzzle/exploration games picked up on the goal very quickly and were invested, while other players struggled with searching and with figuring out how to end the experience. One of the most consistent pieces of feedback we received was that players loved the environment and how peaceful it was; however, some players who were less familiar with VR were hesitant to move around and explore. Because we had no idea who would be playing the game at the final, or what their preferences and experience would be, we had to anticipate these behaviors and design safeguards for them.