DBIDS
A retrospective
As an aspiring User Experience & Design specialist, I know, understand, and do my best to champion usability testing. In my current position with the Department of Defense, our work is not publicly available; the systems I design and implement are FOUO (For Official Use Only), which makes usability testing damn near impossible.
Even talking about the software I design requires security clearances and meetings with officials and law enforcement officers far above my pay grade, and those opportunities are few and far between.
When I first started with the D.O.D., I was tasked with auditing the current version of the software. I needed a deep understanding of what I was about to get into, so I spent nearly a year reading a 500-page manual, stumbling through high-level security measures, and interviewing all the engineers.
Getting feedback on the work I do is extremely rare. I rely on my ever-evolving knowledge and experience to make judgment calls based on what I know about my end users. I've studied their age brackets, levels of knowledge and competence, locations, daily tasks, and roles and duties, and on some rare occasions I've interviewed the LEOs (Law Enforcement Officers) and BSOs (Base Security Officers), who had nothing but bad things to say about the legacy version.
Methods
- Assess the current version of the software. (evaluation agenda)
- Learn it inside and out utilizing analytical and empirical evaluation methods, taking notes along the way. Identifying immediate issues with the UI was crucial. Right out of the gate, data was just dumped onto the screen in black and white with no thought or care given to task and flow. It took me months to figure out how to perform the most remedial of tasks.
- Get to know the customer, establish trust
- With my limited resources, I needed to reach out to those who used this software. In the public sector, communication is restricted by clearance level. I wasn't able to just strike up a conversation with a general; I had to jump through hoops for years before I was finally able to gain an audience.
- Organize & Educate: deliver a clear set of intended directives and tasks (Requirements Agenda)
- After auditing the current software's pitfalls and issues, and weighing the vague needs of the customer, I was able to put together a battle plan to bring this legacy software into the modern age. The months of writing (sans technical jargon) were a daunting task; however, the document helped illustrate the importance of user-friendly software, namely increasing efficiency and reducing human error.
- Dive right in, hop around each of the small goals adding bits and pieces of real-world data into a visual model – this allows a design to evolve both naturally and logically while mapping a potential path of least resistance.
- We were provided a very limited set of vague requirements for our tasks, which in turn required us to fill in the blanks and, in fact, write the requirements as we pressed forward. Ninety percent of the time, the client didn't know what they wanted, so it was up to us to generate and deliver items we thought would be beneficial to the product.
- Requirements gathering is a nightmare I wouldn't wish on my worst enemy...
- Create visual wires and prototypes presenting before-and-after models highlighting changes in the data model and task flows. (Design Agenda)
- Being able to show the new data model and structure before coding served as a road map for production. It also allowed my superiors to brief their superiors for approval.
- Limited usability testing based on internal sources
- Our internal team served as a limited testing ground for discovering errors in flows, as well as identifying problematic areas of concern: drop-off rates, task confusion and completion, data display readability, and hierarchy of data importance.
- Implementing the core heuristics: visibility of system status, error prevention, recognition rather than recall, and flexibility and efficiency of use.
- Shockingly, these core elements were entirely absent from the legacy version of our software.
- ISO 9241-11
- Effectiveness
- Efficiency
- Satisfaction
- Freedom from Risk
- Context coverage
- In Field Testing
- Without the ability to perform usability testing before rolling out the new version of our software, I was forced to deploy a completely revamped, remodeled, redesigned, untested experience. It was a bit like closing your eyes and squeezing the trigger of a firearm.
- I created a comprehensive help and video tutorial section to alleviate shock and educate users, set up a feedback portal, and waited for the complaints to roll in. It was a horrible yet rewarding experience. I found that, for the most part, just changing the high-contrast black-and-white data dump to a more colorful and usable flow model reduced user error and provided a more pleasing experience overall. Win!
- Iterative Removal of Usability Problems
- Over the next few years, our team made changes based on limited client feedback, and every version improved the product. It was a long, hard road to be sure. And even after years of hemming and hawing about how important usability testing is, and how much time and money it could have saved over the last six years, I still don't have the ability to properly user test.
- Utilizing Adaptation and Continuous Evaluation Practices
- While our software continues to evolve, we have found ways to adapt and work through the limitations set in place. We do not have a broad user spectrum, so we have studied and verbally communicated with our clients on a weekly basis, doing our best to fill in the blanks.
- Next Generation Versions, Thinking Ahead
- Pushing the envelope with design and function is always on our minds. Like a good game of chess, anticipating what the client needs before they need it serves as a vessel for consistently delivering modern tech. We explore and generate new features that were perhaps not asked for, but rather predicted and defined by our team in advance. Always think ahead.
These qualities result from multifactorial interactions between users, goals, contexts, and a software product.
Out of the gate, I had my work cut out for me, and I kept these core considerations at the forefront of my task. The legacy version of the software was designed and implemented by engineers with no consideration for HCI.
*I had to educate and sell my approach without offending, blaming, or belittling.* (See #3)