Shortly after defending my PhD in psychology, I accepted a contract role as a Design User Researcher within the Interactive Entertainment Business (IEB) division at Microsoft. Following this role, I joined Quidsi, an Amazon subsidiary, as a User Experience Analyst.
At Quidsi, I query and analyze web and customer data to better understand the user experience across Quidsi's ten eCommerce websites (e.g., Diapers.com, Wag.com, and Soap.com). I also conduct A/B and time series studies to help make informed improvements to these sites.
At Microsoft, I designed studies to understand and improve entertainment experiences for video and music consumption on the computer (PC and web) and TV (Xbox and Xbox-Kinect). I also participated in mobile and gaming research.
In my role at Quidsi, I support data-driven design decisions. Below is a list of typical tasks I undertake.
A/B tests. I conduct A/B tests to assess how changes to our sites' user interface might facilitate, or hinder, usability. Some tests are design-driven (e.g., I run A/B tests to understand the effect of new features). Other tests are data-driven (e.g., I have designed A/B tests based on potential pain points uncovered in web analytics or user studies).
Time series tests. When changes to a site are released without A/B testing, I conduct time series analyses to assess whether users' behaviors covary in time with the design change.
Tagging user interactions. For every design spec, I author a tagging document outlining the coding requirements to collect users' behavioral data. I also test all tags once they are released in testing and production environments. This ensures that critical user interactions are recorded within our web analytics and backend databases.
Improving data completeness. Prior to my hire, tagging of user behavior was not done systematically, so there are gaps in the data. I provide developers with coding requirements for important on-site interactions that are missing tags, and I file tagging bugs.
Funnel analyses. For user goals that require completing a known series of steps (e.g., the purchase flow or setting up a registry), I conduct funnel analyses to identify and address the steps at which users experience friction.
Ad hoc data pulls. Whenever a change to our sites is under consideration, I pull and analyze data to inform design decisions. When possible and appropriate, I also provide data support to teams outside UX (e.g., retail, product, and operations).
Solving data mysteries. Because I work so closely with Quidsi's web analytics data, I am often the first to detect data aberrations. This means I am also usually the one responsible for identifying and addressing the cause of these aberrations. Most often the causes are bots or bugs.
Report and dashboard building. I build reports and dashboards and publish them to an internal UX analytics and research reporting site. The internal site was built by an intern under my supervision, and it makes my work accessible to everyone in the company.
Survey design and analysis. I work with my team's qualitative user researcher to conduct surveys to collect customers' opinions and attitudes about our sites' user experience.
Subsidiary integration. I assist in preparing our sites for integration onto Amazon's various internal platforms. In particular, I play a large role in making our sites compatible with new analytics tools.
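To illustrate the kind of A/B analysis described above, here is a minimal sketch of a two-proportion z-test comparing conversion rates between a control and a variant. This is a generic textbook approach, not Quidsi's actual tooling, and all counts are hypothetical.

```python
from math import sqrt, erfc

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing the conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis that both rates are equal
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal CDF
    return z, p_value

# Hypothetical counts: 1,200 of 24,000 sessions converted on A; 1,320 of 24,000 on B
z, p = two_proportion_ztest(1200, 24000, 1320, 24000)
```

In practice a test like this would be wrapped in guardrails (minimum sample size, pre-registered stopping rules) before informing a design decision.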
UX Research Projects
Although my contract at Microsoft was only for one year, it was a very productive year in which I had the opportunity to use many core usability research methodologies. Below is a representative sample of the types of UX projects I was responsible for designing, executing, and presenting.
Benchmarking. I measured users’ performance and satisfaction when completing key tasks on the Zune desktop client, and I built an internal website to track these metrics across versions.
Competitive analyses. I compared the user experience of (1) Zune on the PC to comparable entertainment applications for the PC and (2) Zune on Xbox to comparable entertainment consoles and applications for televisions.
Natural user interface research. I designed several iterative studies assessing the usability of Zune on Xbox with Kinect shortly before it shipped. Users gestured or spoke to control this application.
Prototype testing. Using a think-aloud protocol, I ran many studies evaluating users’ understanding of static and interactive prototypes for future entertainment products.
Card sort. I used this method to identify how users conceptually organize their digital entertainment activities. Results informed the organization of content for future products.
Movie attributes study. I developed a novel research approach to identify (1) which TV/movie attributes users report as most useful for deciding what to watch and (2) which attributes users actually use.
Heuristic evaluation. I helped develop a list of product-agnostic design principles for the purpose of auditing application usability. I used this list to evaluate and recommend changes for current and future products.
Purchase flow evaluation. I deconstructed the purchase flow for content on Zune Marketplace and made recommendations to remove barriers to users’ ability or desire to complete a transaction.
Information architecture study. I ran an experiment to test the role of the “flatness” of the organization of information (i.e., the number of vertically nested hierarchical levels) on usability.
Internal research document. I authored a widely circulated internal document outlining how to optimize user experience with movie recommendation systems.
Information transmission study. I developed a novel research paradigm to measure how well on-screen UI elements “transmitted” information about their functions. I used textual analyses and signal detection analyses to quantify the quality of information transmission.
Site visits. I assisted in the moderation of site visitation studies examining how users consumed entertainment in the home.
Survey research. I analyzed a dataset comprising a few hundred Xbox users’ responses to questions about their attitudes and behaviors regarding their Xbox Avatars.
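The signal detection analyses mentioned in the information transmission study above can be sketched with the standard sensitivity index d′, computed from hit and false-alarm rates. This is a generic illustration of the method, not the original analysis, and the trial counts are hypothetical.

```python
from statistics import NormalDist

def d_prime(hits, signal_trials, false_alarms, noise_trials):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    Assumes rates are strictly between 0 and 1; in practice, extreme
    rates are adjusted (e.g., a log-linear correction) before use.
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    hit_rate = hits / signal_trials
    fa_rate = false_alarms / noise_trials
    return z(hit_rate) - z(fa_rate)

# Hypothetical: users correctly identified a UI element's function on 45 of 50
# trials, and wrongly attributed that function to other elements on 10 of 50
dp = d_prime(45, 50, 10, 50)
```

A higher d′ indicates that a UI element transmits its function more reliably, separating correct identifications from lucky guesses.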
Academic Research Background
Modeling individual differences in response to complex situations
Advisor: Yuichi Shoda
I conducted many studies in which I (1) collected people's responses across many situations, (2) measured the presence of psychological properties embedded in these situations, and (3) used statistical computing to fit these data to unique predictive models for each person. My studies contributed to a growing body of research demonstrating the limitations of psychology's traditional one-size-fits-most approach to describing people's interactions with their environment.
Modeling individual differences in structures of emotions
Advisor: Yuichi Shoda
One of psychology's oldest unresolved debates asks whether positive and negative feelings fall along opposite ends of a bipolar dimension, or whether they are two separable, unipolar dimensions. For my thesis, I revealed a blind spot in this debate: the assumption that one structure describes all people. In many studies, I used mixed modeling to demonstrate that the relation between positive and negative feelings significantly varies from person to person.
Human Computer Interaction behavioral science consultant
I acted as an informal consultant for a few ubiquitous computing research projects at the University of Washington’s Computer Science and Engineering program. These projects involved development of technologies that passively track people’s health or environmental sustainability behaviors (e.g., mode of transportation). I provided feedback about how to use such data to index unique user profiles and to strategically change behavior.
Subliminal mere exposure and priming
Advisors: Anthony Greenwald and Vivian Zayas
As side projects to my main line of research, I conducted several experiments to identify the conditions necessary to produce subliminal mere exposure effects and subliminal face priming effects. Although demonstrations of both types of effects were published decades ago and are frequently cited in psychology textbooks, these effects are in fact extremely difficult to replicate. There are very few incentives for psychologists to conduct replication studies, but identifying the conditions necessary to reliably reproduce unusual psychological phenomena is an important step toward understanding the mechanisms underlying them.
Artificial Grammar Learning
Advisors: John Kihlstrom and Lillian Park
For my first independently executed study, I conducted an experiment in the area of artificial grammar learning (AGL). The AGL research paradigm typically yields results that support a theory that people can learn complex, rule-based knowledge outside of consciousness. My experiment provided support for an alternative explanation that did not require the unconscious learning of complex rules.