A Return to Best Practices
Client Overview
Development Associate International is a non-profit fundraising organization that endeavors to empower local leaders through education grounded in Christian values, building up individual communities to become self-sustaining in a world structured around fostering foreign dependency.
DAI team members focus on finding donors who share their views. The donations received go to communities around the world that have often experienced some form of disaster, whether economic, natural, or caused by human violence. They believe that educated, passionate leaders sculpt their own societies to mirror these unassailably noble values.
The Details
Duration
2-Month Research Audit
The Gang
Myself
Abby Amstutz
Mikel Detmer
The Tools
Zoom
Google Docs
Gmail
The Almighty Internet
The Methods
Business Analysis
Heuristic Evaluation
Competitive and Comparative Analyses
Usability Tests
The Problem
“By designing for everyone, you have designed for no one.”
DAI requested an extensive analysis of their website to find flaws, inconsistencies, and failures to adhere to best practices. The C-Suite had expressed frustration with the website as it stood and requested that it be updated in a way that inspired confidence in what the business achieves.
The website was, quite frankly, an elaborately constructed maze. Links often opened new tabs to affiliated websites that had been created for regions outside the United States, where the vast majority of their donors reside. The same buttons appeared inconsistently across different pages even when they led to the same destination. The list was extensive, but I will focus on some of the more egregious or frequent design flaws.
The Solution
Discovering the point and leveraging their purpose.
Abby, the Communications Director, and Mikel, the IT Systems Administrator, explained that the primary purpose of the website was to educate users about their methodology and efficacy. Their users are often potential donors and local leaders they hope to work with, or have previously worked with.
These insights manifested in several design decisions that we believed would address the problem. The first was to add visuals that would more clearly inform target users of the application process: their progress through it, the general timeline for completing the form and for next steps, and a new guide section providing relevant information. We would also adjust the information architecture of the form itself, consolidating and reorganizing sections, restructuring the layout, and generally adhering to best practices in form construction.
Research and Testing
Our plan of attack.
User Interviews
Finding those moments that were hurdles for our users.
As the metrics did not reveal the point of drop-off, or offer any other hint as to where frustrations might arise, we had to approach our client's concerns from a different angle first. We began our research by interviewing past, present, and potential students. In total we interviewed 10 students to gain the insights the metrics could not shine a light on, and to learn where points of frustration arose in the application.
Affinity Mapping
Uniting the users.
After conducting our interviews we sought to decipher the main points and recognize the outliers through affinity mapping. We found that many students were confused by some of the larger open-text questions asked of them, not really knowing how to answer. There was also a common desire among our users to know what happens next, and when. These details could be found in several places, but they were buried in large chunks of text which, as we discovered through our interviews, few of the students took the time to read.
User Persona
Our champion.
The synthesis of the affinity mapping insights resulted in the persona, our target user. This served as our guiding light as we moved into the early stages of design ideation. We knew our user was motivated to create a mentor-guided passion project, but wanted to know how best to fill out the form so that their chances were maximized. We also knew that they were impatient, so there was little chance of them reading large blocks of text that would otherwise answer their questions.
Competitive Analysis
Scouting the field for the opposition’s strengths.
We next conducted competitive and comparative analyses to further bolster our research. First we looked at how Polygence's competitors, like Pioneer Academics and Horizon Inspires, approached the process of applying for their services. Specifically, we examined how they laid out their form fields, addressed error and help messages, and utilized visuals to explain processes.
Comparative Analysis
Learning tactics from the best.
While the information gleaned from these competitors was useful, it was the comparative analyses that proved truly bountiful. TurboTax struck us as perhaps unrivaled in the world of form design, using images and encouraging text to help the user complete something that is not a particularly enjoyable task. Equally inspiring was Airbnb, which offers its users a dedicated help/explanation section.
Sketches
Practicing our new formations.
The research complete, we began to ideate the design. Here I must give credit to my compatriot, Leila. She created these rough sketches of possible design routes, which we agreed as a group to follow as we moved further ahead with our design. The sketches addressed the major design features we wanted to implement to resolve the issues our users had with the application.
Prior to moving ahead with our plans, we wanted to make sure our client was comfortable with these changes, which were considerable. We held a meeting in which we presented all of our research findings to the Polygence CEO, briefly explaining our processes and results and showing these early-stage design plans. The message well received and understood, we moved into the digital realm.
Usability Testing
Putting the tactics to the test.
Guided by our research and personas, we created a prototype of the application process, beginning with the initial signup from the home page. Usability testing followed, which validated many of our design decisions. Feedback from student testers was largely positive: all of them said the sections felt more fluid and that they really appreciated the help section we created. There were a few changes we had to make, but they were minimal.
Delivering the users a path of least resistance.
Visualization of processes.
By presenting clear and easily digestible images to explain processes, our users will know where they are and what they will have to do at any point in the application. An update to the progress bar of the application serves the users in the same capacity, constantly keeping them abreast of their status.
Encouragement and guidance.
The student application utilizes space on the page to explain how the information will be used, as well as offering help via examples. This will eliminate, or at the very least reduce, the need for our target user to leave the application in search of answers. We also updated the form's help message system, making sure that students will know what doesn't fit a field's requirements, and why.
Best practices in form design.
Through our competitive analysis, as well as a round of card sorting, we were able to determine where we could consolidate sections to reduce the total number our user has to move through. Their positioning in the application was informed by industry best practices, ensuring that the fields that are easiest to answer come first and the more complex fields later.
Results and reflections
Student considerations.
My colleagues and I had several long discussions about two different aspects of the application: 1. how best to present the financial aid information and the cost of the program, and 2. the purpose behind asking for the gender of the students. These two things resulted in hours of conversation, and thus design iterations to address them. Ultimately we felt it was best, from both an ethical and a business standpoint, that the financial considerations be presented in some fashion early on. Students needed to know the rough cost before beginning the application so that they wouldn't feel tricked and abandon it partway, further hurting the completion rate; they also needed to know that financial aid options exist, which opens up the opportunity for students who are fiscally constrained.
The question of gender next gave us pause. In the end, we decided it was best to simply omit this section from the form. This decision was influenced by Lyft's research into the topic, which showed that the question failed in its attempt to be inclusive, and instead potentially put people who are targets of discrimination at risk.
Mobile...primacy?
The user interviews informed us that there were some major things to take into account for mobile. None of the interviewees filled out the application on their phone. 80% of the students said they wouldn't use their phones even if the application were available in the dedicated mobile app, because they simply wouldn't download something just to fill out an application. Too many obstacles on that path. We therefore spoke with our client about developing a fully responsive design for mobile users, as there was still a possibility that some students might not have access to a computer to complete the application. Creating this design was outside the scope of our 3-week contract, so it would be a next step in any future collaboration.
More research always.
I believe there is more research to be done. It would be beneficial to interview the students who failed to complete the application and discover why. Polygence provided many of their students for us to interview; however, 70% were students who had already completed the program, which meant that regardless of any frustration they might have had with the application, they still completed it.