BFK: UX Strategy, Research, and Design
The challenge
How might an organization better support users of a mature product?
About the Project
Battelle for Kids (BFK) is a national not-for-profit organization that develops innovative educational products and services. Edward Stull Consulting led the efforts to research, design, and test a new iteration of their flagship learning management system.
The learning management system (LMS) was an already mature product, but it needed to better serve its audiences. Partnering with CNTXT, who managed the project and led the subsequent visual design, the team improved the LMS by thoroughly investigating the problem space and revealing and prioritizing the true jobs to be done within the system.
Schools & nonprofits
Research reconciliation
Workshops & facilitation
Prototyping & testing
Existing research
Within mature product lines, you can often find rich repositories of existing research data: previous studies, customer support tickets, call logs, and analytics. Battelle for Kids had a wealth of such research. The real effort, however, lay in making that data usable to its decision makers and product teams.
We collected and reconciled a vast trove of existing research data, including recent qualitative user interviews. The reconciled findings were then presented in a “research roundup,” where the team could review them and perform a gap analysis to identify the questions we still needed to answer through additional research.
Workshopping
Once the analysis was complete, the team workshopped the pains and gains of numerous stakeholders and users. For example, we considered not only students and teachers but also backstage curriculum authors and administrators. After all, the LMS would not be successful without the high-quality content created and maintained within the system. These needs were then translated into user stories and jobs-to-be-done (JTBD) statements.
Prototyping and testing
The user stories and JTBD statements underpinned our prototyping efforts, producing several rounds of building and testing. Successful approaches could then be introduced into the live system and measured quantitatively through the LMS’s analytics.
Students could quickly access their courses upon signing into the system. We emphasized the “Last Viewed” course, as it was the most frequently accessed.
Each primary screen was paired with a video describing how to use the on-screen information. This approach remedied many of the most common customer support requests.
With over 100 available courses, finding a specific course could prove challenging. To alleviate this issue, we first segmented the catalog using terminology already familiar to users: in-progress, recommended, and completed courses. Filtering provided further delineation.
"Administrators wished to see the 'big picture' of how an implementation performed across their school or district."
Administrators wished to see the “big picture” of how an implementation performed across their school or district. Here they could view the activity of their organization, filtered by individual course and date range.
For example, in the screenshot above, administrators could quickly see that 114 of 200 users had interacted with the system on December 28th. Each date displayed a corresponding list of links to individual user detail pages, thereby giving administrators both “big picture” and granular data about their programs.
Curriculum creators managed courses through a WYSIWYG (what-you-see-is-what-you-get) interface, mirroring how courses would appear to students and teachers. Consolidating multiple modals into a single screen helped reduce the rinse-and-repeat input processes of the prior system.