Three distinct yet related user types:
Freshly funded, the founding team needed an initial MVP delivered fast, with speed-to-market as the driving factor. Before I joined the team, the founders had validated their concept with both auditors and potential clients.
We discussed the risks involved in this research-light project, the many assumptions we'd need to make, and the likelihood that subsequent efforts would be major overhauls rather than iterations on a solid initial release. My concerns eased, however, when my own research revealed that the tools currently in use were extremely basic and inefficient.
Existing solutions were typically provided by the audit firm and consisted of crude spreadsheets tied to file shares like Google Drive or Dropbox, plus endless email and Slack threads. They were poorly organized, offered minimal instruction or assistance, and relied on an incredibly inefficient completion mechanism that required uploading and linking files, often multiple times for the same item.
Overall, existing solutions were confusing, inefficient, and laborious. Creating a SaaS tool that logically structured the effort, provided better requirements guidance, and removed duplicated work seemed a worthy and valuable MVP goal.
Creating a complex solution from scratch, at speed, is always a sobering exercise: there's little time for research, prototyping, or feedback loops. Having explained the caveats of this kind of MVP effort, I proceeded to interview the CEO, his co-founder, the CTO, and the head of customer success (a seasoned Deloitte auditor).
The interviews yielded an extensive list of ideas, which I organized into a Google Sheet. I then led the team through a MoSCoW exercise to group features into Must, Should, Could, and Would categories, and had the leadership team assign each item a value (1-5) and an effort (1-5), scored on opposing scales. We re-reviewed the Could and Would items through the lens of non-inclusion in the MVP, found a few strays, and moved those into the Must or Should categories.
The next exercise was to write each Must and Should idea onto a post-it note and have the team place each one in the appropriate value/effort quadrant, then discuss and adjust; a small illustration of the underlying scoring follows below. Once complete, I created a facsimile in Miro (formerly RealtimeBoard) and shared it for final validation ahead of the product and design efforts.
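For illustration, here is a minimal sketch of the kind of value-per-effort ranking that falls out of the scoring exercise. The item names, scores, and ratio heuristic are hypothetical, not the actual backlog:

```typescript
// Hypothetical backlog items and scores, for illustration only.
interface BacklogItem {
  name: string;
  moscow: "Must" | "Should" | "Could" | "Would";
  value: number;  // 1 (low) to 5 (high)
  effort: number; // 1 (low) to 5 (high)
}

const items: BacklogItem[] = [
  { name: "Evidence upload", moscow: "Must", value: 5, effort: 3 },
  { name: "Audit dashboard", moscow: "Must", value: 4, effort: 2 },
  { name: "Task comments", moscow: "Should", value: 3, effort: 2 },
];

// Rank Must/Should items by value-per-effort as a simple quadrant proxy:
// high ratios cluster in the "high value / low effort" quadrant.
const ranked = items
  .filter((i) => i.moscow === "Must" || i.moscow === "Should")
  .sort((a, b) => b.value / b.effort - a.value / a.effort);

for (const i of ranked) {
  console.log(`${i.name}: ${(i.value / i.effort).toFixed(2)}`);
}
```

The post-it version of this exercise does the same thing spatially; the ratio is just a convenient tiebreaker when the team debates placement.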
Prior to my joining, the CEO had modelled his vision in Jira. A little unusual, for sure, but it was an effective way of showing hierarchy, grouping, and actionable items requiring evidence uploads. Several 'friendly' client prospects had been shown the Jira board and liked the way it organized and supported the audit, an interesting signal.
It was a fast way to ramp up my industry knowledge and helped me understand the information architecture (IA) of the audit data and requirements in a practical and, more importantly, usable way. As a reference model, it accelerated my ideation process.
I spent time with our in-house audit specialist and, thankfully, a couple of client prospects who were close to the founders. I needed to understand what an audit involved procedurally and legally, and how each side (auditor and client) approached the effort. I then tried to understand and absorb the human-factors side of the audit process. What is an audit actually like, day to day, over months? What did auditors complain about most often, and likewise the clients? Where were the friction points, the confusion, the frustration, the needlessly repetitive work?
Clearly, a lot of immediate value could be delivered even in an MVP release.
There were no publicly accessible competitors in the space, so nothing valuable could be gleaned from competitive analysis.
Given the extreme time constraints ("a couple of weeks") and the light concept validation with prospective clients and audit firms alike, it was time to start thinking through the MVP.
High-level MVP scope:
Stretch goals:
Notes:
Getting Started Form
Current Audit Dashboard
The core user experience consisted of three main functional sections:
Overall audit progress with total tasks broken down by status, audit requirements grouped into common criteria, and the ability to view current and prior audits.
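As a rough sketch of how that dashboard content might be modeled (the types and field names are my own illustration, not the production schema):

```typescript
// Illustrative types only; not the production schema.
type TaskStatus = "todo" | "in_progress" | "complete";

interface AuditTask {
  id: string;
  title: string;
  status: TaskStatus;
  criterion: string; // the common criterion this requirement belongs to
}

interface AuditDashboard {
  auditId: string;
  isCurrent: boolean; // supports viewing current and prior audits
  tasks: AuditTask[];
}

// Overall progress: share of tasks complete.
function progress(d: AuditDashboard): number {
  const done = d.tasks.filter((t) => t.status === "complete").length;
  return d.tasks.length ? done / d.tasks.length : 0;
}

// Requirements grouped into common criteria.
function byCriterion(d: AuditDashboard): Map<string, AuditTask[]> {
  const groups = new Map<string, AuditTask[]>();
  for (const t of d.tasks) {
    const list = groups.get(t.criterion) ?? [];
    list.push(t);
    groups.set(t.criterion, list);
  }
  return groups;
}
```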
Working closely with stakeholders, product management, engineering, and our in-house audit expert, we launched our MVP on schedule.
Feedback from our early-adopter beta testers was almost entirely positive. Compared with existing tools, our 1.0 product was better organized, easier to use, and far more efficient in terms of evidence upload and reuse. We actually made a sale on the day of our launch!
The less positive feedback centered on wanting the platform to do more, e.g. task assignment to other team members with status checking, clearer explanations of tasks "in plain English", auditor access, and automated evidence gathering. All good requests and great input for future releases.
An interesting lesson was that our own engineers needed to spend time with our audit expert to gain context for how their work related to the overall project.
Overall, a very successful initial product launch.
Post-MVP launch, early clients, prospects, and investors were satisfied that we now had an in-market research vehicle and a fledgling source of revenue. With that startup stress handled, it was time to explore and define a platform that would better address current client needs, support audit firms and additional compliance standards, and be designed to accommodate known and predicted future features.
I spent time talking to our single existing client and attended as many sales calls as I could to understand points of resonance, desired features and capabilities, and the perceived level of pain/value associated with each. I also worked with customer success to understand bugs, unclear functionality, and feature requests coming through that channel. Lastly, we scoped and initiated an active campaign to add more design partners, which proved incredibly revealing and useful.
With more time and a mandate to create a scalable design approach, I started researching design patterns typically used in administration systems.
Given the features and capabilities I knew were on the future roadmap, plus a general desire to create a system that could expand to fit the unknown, I ended up with a modular solution.
To support a variety of personas, functionality, and use cases, I wanted flexible content zones that could accommodate differing requirements.
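One way to picture the idea is a role-driven layout configuration, sketched below. The role names echo the example layouts later in this section, but the zones, panel ids, and config shape are hypothetical, not the shipped system:

```typescript
// Hypothetical sketch of a role-driven, modular layout config.
// Zones, panel ids, and role names are illustrative only.
type Zone = "header" | "main" | "sidebar";

interface PanelPlacement {
  panelId: string; // e.g. "audit-progress", "task-list"
  zone: Zone;
}

interface RoleLayout {
  role: string;
  panels: PanelPlacement[];
}

const layouts: RoleLayout[] = [
  {
    role: "client-audit-manager",
    panels: [
      { panelId: "audit-progress", zone: "header" },
      { panelId: "task-list", zone: "main" },
      { panelId: "evidence-activity", zone: "sidebar" },
    ],
  },
  {
    // Evidence gatherers see only their task list, no other tabs.
    role: "evidence-gatherer",
    panels: [{ panelId: "task-list", zone: "main" }],
  },
];

// Resolve which panels a given role sees in a given zone.
function panelsFor(role: string, zone: Zone): string[] {
  return (
    layouts
      .find((l) => l.role === role)
      ?.panels.filter((p) => p.zone === zone)
      .map((p) => p.panelId) ?? []
  );
}
```

The appeal of this shape is that adding a persona or feature means adding config, not redesigning layouts.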
Highlights:
A selection of screens for user research with clients, prospects and design partners.
Example Layouts
Client Audit Manager Role: Showing ‘no content’ layout
Client Audit Manager Role: Empty State (No Audits)
Evidence Gatherer Role: Task List but no access to other tabs
When I walked this approach through with the same group that had previously been so vocal about the wireframe version, the feedback was even more positive.
Engineering
I set up a series of meetings with engineering leads and specialists to deep-dive into the proposed functionality, ask and answer questions, and, over time, develop ballpark effort estimates and very approximate production-ready dates.
This is almost always the most sobering phase of a project. Finishing design feels like a huge win, the end of a major effort, and it is, but it's really just the close of one phase before the biggest piece of all begins: engineering.
From discussions with leadership, engineering, customer support, product, and design, we agreed this approach was the right solution for the scope of the product design requirements.
However, due to recent competitive pressure, we needed to innovate at a faster pace than a full implementation would allow.
What to do? After some consideration, I devised a hybrid approach: blend the current and new designs in a way that could be implemented in phases, an iterative path rather than a complete overhaul.
Rather than replacing everything, I kept the MVP's existing inverted-L navigational layout and brought the new designs into it.
Using the existing framework while implementing the new modular content panels drastically reduced engineering time and shortened the production-ready timeline.
The overall responsive framework would then be scheduled as a separate, substantial effort at the appropriate time on the product roadmap.
The reporting system created as part of the Automated Evidence Collection feature posed some interesting challenges: report content is generated by separate processes, queued, and then rendered, and the data to be presented often ran very long.
Each report consisted of a custom cover page, like this:
Followed by one or more pages of unique sectional layouts:
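To make that flow concrete, here is a minimal sketch of a produce/queue/render pipeline with pagination for long data. The types and names are my own illustration, and the in-memory queue is a single-process stand-in for what ran as separate processes in production:

```typescript
// Hypothetical sketch of the queued report-rendering flow.
interface ReportSection {
  title: string;
  rows: string[]; // evidence data; often very long in practice
}

interface ReportJob {
  reportId: string;
  coverTitle: string;        // custom cover page
  sections: ReportSection[]; // one or more sectional layouts
}

const queue: ReportJob[] = [];

// Producer: a collection process enqueues finished content.
function enqueue(job: ReportJob): void {
  queue.push(job);
}

// Consumer: the renderer drains the queue, paginating long sections
// so a single oversized section can't break the layout.
function renderNext(pageSize = 40): string[] {
  const job = queue.shift();
  if (!job) return [];
  const pages: string[] = [`COVER: ${job.coverTitle}`];
  for (const s of job.sections) {
    for (let i = 0; i < s.rows.length; i += pageSize) {
      pages.push(`${s.title}\n` + s.rows.slice(i, i + pageSize).join("\n"));
    }
  }
  return pages;
}
```

Decoupling content generation from rendering is what made the queue necessary: the renderer can't assume content arrives in order or all at once.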
I created a Material Design-based style guide for the new portal system.
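As a hedged illustration of the kind of tokens such a style guide encodes (palette, type scale, spacing), here is a sketch; every value below is a placeholder, not the product's actual palette or typography:

```typescript
// Illustrative style-guide tokens in a Material-style shape.
// All values are placeholders, not the shipped design system.
const theme = {
  palette: {
    primary: { main: "#1565c0" },
    secondary: { main: "#00897b" },
    error: { main: "#c62828" },
  },
  typography: {
    fontFamily: "Roboto, sans-serif",
    h1: { fontSize: "2.125rem", fontWeight: 500 },
    body1: { fontSize: "1rem", lineHeight: 1.5 },
  },
  spacing: (n: number) => n * 8, // 8px baseline grid, per Material convention
};

console.log(theme.spacing(2)); // 16
```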
The Material Design-based modular design system was very positively received by our internal team, design partners, and audit firms.
While elements of the new design have been implemented into the production system, the complete implementation is scheduled as a future release effort.