Designed an Infographics Toolkit Web Application
Overview
During the spring 2015 semester, I completed a User Experience course in which the semester group project involved designing a prototype for a local industry client. The goal was to design a tool that designers could use to make infographics while generating SAS reports. My team designed and built a low fidelity wireframe prototype and a high fidelity prototype. We conducted formative and summative evaluations using the prototypes to collect data on hedonic and usability measures. We presented our prototypes and testing results to the client, a UX team at SAS Institute, and we created a design specification for the toolkit. Visit http://uxclass.csc.ncsu.edu/2015/05/project-sas-infographics.html to view reports, design specifications, screencasts, etc.
My Contributions
My roles on the project included participating in team design ideation, developing the protocol for the formative and summative evaluations of the tool, moderating the user tests, analyzing the evaluation data, and helping create the low fidelity prototypes.
Tools and Methods
paper sketching, low fidelity prototyping, Balsamiq, moderated lab testing, think aloud protocol, semi-structured interviewing, task/scenario-based testing, survey, Qualtrics, descriptive statistics
Team Members and Collaborators
Ankita Pise (written documentation), Jesseca Taylor (co-team leader, UX researcher, low fidelity prototype), Nitish Pandey (co-team leader, low fidelity prototype), Sharan Gopalan (high fidelity prototype), Yashwanth Nallabothu (high fidelity prototype); Rajiv Ramarajan and Riley Benson at SAS (clients)
Process
Design Iteration 1
We sketched design ideas, and my teammate and I used Balsamiq to turn the paper sketches into a low fidelity prototype. We used that prototype in the formative evaluation to identify the concepts and features that needed improvement.
Screenshots of paper sketches and first low fidelity prototype
Testing Iteration 1
I conducted a walkthrough of the low fidelity prototype to identify potential problems to investigate with user testing. The user test consisted of semi-structured interviewing and scenario-based testing. I administered a survey to obtain participants’ feedback on what they found visually pleasing and displeasing about the prototype, and I included questions with ease-of-use scales. The data I collected substantiated my prediction that the language used on our controls and screens was unclear to users.
Design Iteration 2
We modified the Balsamiq prototype based on the first testing iteration and client feedback. Changes included the terminology used on labels, the color scheme, the layout, and additional functionality to support use cases requested by the client. We also developed a high fidelity prototype that allowed more direct manipulation and supported one of the more complex use cases we had proposed.
Testing Iteration 2
I developed the task scenarios, test script, and survey for the user test.
I created task scenarios based on the following use cases: replacing slices in a pie chart with images; connecting an image or visualization to an element in a visualization; customizing the visualization, images, and text in a tooltip; and overlaying or masking images to provide decoration or context. Visual examples of each use case are shown below.
I included the following tasks in this iteration of testing:
onboarding (select dataset and data, change background, add report title)
add visualizations and an image
add infographics (formatted text tooltip, query tooltip, row chart tooltip, connect images to line chart markers)
change infographics’ properties (replace pie slices with images, change tooltip property)
For post-task questions, I asked participants to give a Single Ease Question (SEQ) rating and an explanation of their rating.
I also presented post-test rating questions consisting of the System Usability Scale (SUS), to measure the overall usability of the prototype, and hedonic measures (pleasantness, attractiveness, originality, innovativeness).
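For reference, the sketch below shows how SUS responses are conventionally converted to the 0-100 score reported in the Outcome section, and how SEQ ratings can be averaged per task. The function names and response values are illustrative only; they are not our participants’ data.

# Minimal sketch of standard SUS and SEQ scoring (values are hypothetical, not study data).
def sus_score(responses):
    """Convert ten 1-5 SUS responses into a single 0-100 score.

    Odd-numbered items are positively worded (score - 1); even-numbered
    items are negatively worded (5 - score). The sum is scaled by 2.5.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

def mean_seq(ratings):
    """Average the 1-7 Single Ease Question ratings collected after one task."""
    return sum(ratings) / len(ratings)

# Example with made-up responses from one participant:
print(sus_score([2, 4, 2, 4, 3, 4, 2, 4, 2, 4]))  # -> 27.5
print(mean_seq([3, 2, 4]))                         # -> 3.0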


Outcome
Participants thought onboarding and adding visualizations were relatively easy; however, adding and modifying infographics received lower SEQ ratings. Additionally, the average SUS score was 24 on a 0-100 scale (100 indicating the best usability). These data suggested that some tasks were not easy to perform. Based on observations and participants’ comments, some of the difficulty arose from the limited functionality of the prototype: participants appeared frustrated that they could not press certain controls (e.g., toggle between options) and that some transitions were not visible.


Reflections
I enjoyed working on this project because it resembled a project assigned in a professional setting rather than a typical class project. Our industry client provided the team with real deliverables. Our team also functioned as a cross-functional team: my teammates were computer science majors with little user experience knowledge, while I was a psychology major with experience in user research. I encouraged the team to evaluate early, and as I recall, we were the only team that conducted an evaluation early in the process. We received positive feedback from our client and professor because our team incorporated user feedback early and repeatedly throughout the process.