In the past, solar eclipse viewing has been inaccessible to those who have vision impairments. By focusing on the sounds of eclipses, or eclipse soundscapes, we’ll create a new way for users to contribute to a Citizen Science research project. We aim to create a platform where people can easily upload, search for, and analyze sound data regardless of visual ability.
Goals
Our goal for the Eclipse Soundscapes site as a whole is to make participating in citizen science research (in this case, uploading, analyzing, and sorting through audio clips) easy to learn and easy to do.
Success would look like a high number of users completing tasks at a high success rate (clicking through the pages all the way), and a high percentage of users returning to the site after their initial visit.
Due to the project's time constraints, we had to prioritize which pages would be prototyped first and thus have a chance to be user tested and iterated on. Of the subpages beyond the Home page, we decided to start with the Analysis page, where users would actually analyze sound clips and contribute to research.
We started with a low-fidelity wireframe of the home page. First and foremost, we wanted the buttons to be large and clearly labeled, making the site easy to navigate.
Next we made low- and high-fidelity wireframes of the Analysis page. We wanted a high-fidelity prototype for user testing since color contrast is incredibly important for users with low vision.
For our final prototype, we included wireframes for the Updates and Badges pages.
The following clip shows a brief demo of the file navigation on the Analyze page.
The next video is a demo of how a user would analyze an individual sound clip using our interface.
This final clip is a demo of how a user would browse the existing soundscapes database for clips to analyze.
We also included a few mobile mockups:
We were under severe time constraints for this project, so if we were to do this again we'd definitely run more rounds of user testing and iteration, not just for the Analysis page but for the other secondary pages as well. The user testing phase was crucial to helping us adjust our first round of wireframes, and I feel we missed out on similar opportunities for the other pages simply because we couldn't prototype and test an entire site within the time frame of the class.
If given more time, I'd also love to begin developing some of these pages and interactions into a higher-fidelity prototype using something like React.js, in order to start wiring up back-end features (like the database search). Our current prototype is built in Figma, with the primary goal of having something lower fidelity to test the design in its initial stages.
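As a rough sketch of what that clip-search feature could look like in code, here's a minimal, hypothetical search function in TypeScript. The `Clip` shape and its field names are assumptions for illustration, not part of the actual Eclipse Soundscapes data model, and in a real build this filtering would likely happen server-side against the database:

```typescript
// Hypothetical shape for a soundscape clip record; field names are
// assumed for illustration, not taken from the real data model.
interface Clip {
  id: string;
  title: string;
  location: string;
  analyzed: boolean;
}

// Case-insensitive search over clip titles and locations.
// An empty query returns everything, so the browse view stays populated.
function searchClips(clips: Clip[], query: string): Clip[] {
  const q = query.trim().toLowerCase();
  if (q === "") return clips;
  return clips.filter(
    (c) =>
      c.title.toLowerCase().includes(q) ||
      c.location.toLowerCase().includes(q)
  );
}
```

A React component could call a function like this as the user types in the search box, and announce the result count through an ARIA live region so that screen-reader users hear the results update.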
Overall, I feel like this project gave me great insight and practice in the research and design process. I really enjoyed speaking with our user testers and our user interviewees in this process - it almost felt like a collaborative experience with them. Hearing about their experiences with navigating the web was incredibly insightful and helped inform our design decisions - like I mentioned above, I wish we had more time to consult with them further and really develop all aspects of the site.
This project was also special to me because I am personally passionate about accessibility within education, and this was my first opportunity to participate in a project like this as a designer. It taught me how to work in a group to ideate and split up research and design tasks, and it challenged me to frame the task in a way that puts the user first. I'm really excited to see what ARISA Lab continues to do with the project, and I only wish that our time on the project was longer so we could further develop it.