Codebase Search Engine

Launching a new search engine to supercharge Ginkgo’s tech sales pipeline

Discovery interviews

User testing

Design thinking workshop

Survey

Wireframes

High-fidelity prototypes

Screenshots of the final product. (The information displayed is all dummy data.)

Design Challenge

A quick intro to Ginkgo Bioworks

  • Ginkgo runs R&D projects for customers (our customers outsource R&D projects to us)

  • Unlike most contract research organizations, we retain IP rights to the scientific assets we create on behalf of customers

  • We brand this biological IP as “codebase”

  • Codebase lets us execute future projects cheaper and faster, building off learnings from past experiments

  • The technical sales team leverages codebase to scope and sell new projects - they figure out what Ginkgo is good at and sell more of that

Access to biological codebase was limiting our technical sales team

From past user research, I knew the technical sales team was struggling to leverage codebase and sell deals:

  • Scientists share most insights via slide decks, spreadsheets, and in meetings (institutional knowledge)

  • Scientists weren’t capturing and recording insights at the summary level required by the technical sales team

  • Lack of access to codebase was slowing down the tech sales pipeline, hurting Ginkgo’s deal conversion rate

I joined a team committed to solving this problem

  • I teamed up with 4 engineers, a Product Manager, and an Engineering Manager

  • They had a proof of concept on how to make novel connections across Ginkgo data objects

  • They needed help understanding how to turn their proof of concept into a user-centered product

  • My starting point was this proof of concept Python script:

Users entered a search query and the script generated an HTML file (saved to their Downloads folder). Opening that file in a browser showed a list of helpful links and Ginkgo data IDs.
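As a rough illustration of how such a script might work, here is a hypothetical sketch; the record fields, search logic, and output path are all assumptions, not the actual Ginkgo code:

```python
# Hypothetical sketch of a query-to-HTML proof of concept.
# Record fields, URLs, and IDs are invented for illustration.
from pathlib import Path

# Stand-in for searchable Ginkgo data objects
RECORDS = [
    {"id": "GDO-001", "name": "Strain A", "url": "https://example.com/GDO-001"},
    {"id": "GDO-002", "name": "Enzyme B", "url": "https://example.com/GDO-002"},
]

def search(query: str) -> list[dict]:
    """Naive keyword match over record names."""
    q = query.lower()
    return [r for r in RECORDS if q in r["name"].lower()]

def write_results_html(query: str, out_dir: Path = Path.home() / "Downloads") -> Path:
    """Render matching records as an HTML list of links and data IDs."""
    rows = "".join(
        f'<li><a href="{r["url"]}">{r["name"]}</a> ({r["id"]})</li>'
        for r in search(query)
    )
    html = f"<html><body><h1>Results for {query!r}</h1><ul>{rows}</ul></body></html>"
    out = out_dir / "search_results.html"
    out.write_text(html)
    return out
```

Functional, but a long way from a product: no shared URL, no browsing, and results live in each user's Downloads folder.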


Actions taken

I documented my initial learning goals and got buy-in from the team

I wrote up a design brief proposing a UX project with the following learning goals:

  • What is the most valuable codebase to make searchable for tech scoping?

  • At what level of detail do users need to see information?

  • What are the organizing principles for this new software product?

  • When do they need to compare across information vs dive deep into a single thing?

  • What will users do with this information once they find it? (need specifics)

I took the team’s ideas and converted them into diagrams and wire-flows

  • I was joining a team who already had a proof of concept

  • I wanted to set the stage for a collaborative working relationship & demonstrate my approach to UX

  • I collected ideas from my teammates and converted them into a whiteboard full of low-fidelity diagrams

  • I facilitated a mini-workshop and encouraged the team to react to the whiteboard

  • This was a productive session that resulted in a tangible artifact we could all stand behind

Images of the artifacts coming out of our initial brainstorm. I channeled the team’s creativity and turned it into something concrete we could all react to

I made a clickable prototype featuring low-fidelity wireframes

  • I needed something I could put in front of users. I wanted an interactive prototype they could click through and react to

  • I opted for wireframes because I wanted participants to understand these were still rough ideas

Wireframes for user testing, made using Figma

I tested the wireframes with target users

  • I recruited 5 participants, all from the tech sales team, with a mix of tenure at the company

  • I gave participants an open-ended task and encouraged them to free-explore the wireframes

  • I encouraged them to think out loud as they worked through the wireframes

  • I prepared a series of questions to cover topics they may have missed during the free-explore

  • I encouraged engineers to join - at least 1 engineer sat in on every session (the PM joined each session or watched the recording)

A tech sales scientist giving feedback on the wireframes

I converted the research findings into actionable recommendations

  • I rewatched each recording, highlighting and tagging quotes (using Dovetail)

  • I clustered highlights and tags into themes

  • I labelled & wrote a brief synopsis for each theme

  • I typed this all up into a report to share with the team

  • I used the report to advocate for delivering the highest value features first

Table of contents from the research report

I delivered high-fidelity mocks over the course of several sprints

  • I kept my mockups ~2 weeks ahead of the developers’ implementation pace, designing for their next sprint’s epic

  • The mockups helped us uncover and fix edge cases as early as possible

  • They also helped the PM and devs clarify uncertainties around scope of epics & tickets

  • I annotated the mockups to highlight important decisions

  • I leveraged our new Ginkgo design system, built on top of MUI and Material Design

Screenshots from the developer handoff mockups. Our design system + Figma Dev Mode meant the engineers could automatically see which components I used and how I configured each one.

I used the designs to build consensus amongst the team

  • I held a bi-weekly design review meeting

  • The whole team engaged in async discussions via the Figma comments

  • These discussions helped us reach consensus around what we intended to release next

  • Good communication kept the team moving fast!

The Figma file became a place the team could come together and have important discussions asynchronously

I collaborated with the PM to create user adoption dashboards

  • The developers helped us integrate the application with Fullstory

  • We created metrics to track common task flows, app usage, and web vitals (e.g. First Contentful Paint, Cumulative Layout Shift)

  • We had a Fullstory dashboard ready to go prior to product launch

The team agreed upon a rubric for prioritizing design and tech debt

  • We wanted to leave sufficient room in each sprint to fix known technical and usability issues

  • We decided to spend 20% of each sprint fixing “product flaws”

  • We agreed upon the RICE model for scoring product flaws

  • The whole team was encouraged to write tickets and assign scores; top issues were pulled into the next sprint

  • This ended up being a great process for rapid iteration and staying on top of product debt

Our rubric for scoring product flaws. A developer proposed the idea and provided a draft; we all came together to refine it.
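For readers unfamiliar with RICE, the score is Reach × Impact × Confidence ÷ Effort. A minimal sketch of how ranking product flaws by RICE might look (the tickets and score values below are made up for illustration):

```python
# Minimal RICE-scoring sketch; ticket names and values are invented.
# RICE = (Reach * Impact * Confidence) / Effort

def rice_score(reach: float, impact: float, confidence: float, effort: float) -> float:
    return (reach * impact * confidence) / effort

flaws = [
    {"ticket": "Fix empty state on results page", "reach": 40, "impact": 2, "confidence": 0.8, "effort": 1},
    {"ticket": "Speed up slow search queries",    "reach": 60, "impact": 3, "confidence": 0.5, "effort": 4},
]

# Rank flaws so the top issues can be pulled into the next sprint
ranked = sorted(
    flaws,
    key=lambda f: rice_score(f["reach"], f["impact"], f["confidence"], f["effort"]),
    reverse=True,
)
```

Dividing by effort is what keeps a sprint's 20% "flaw budget" honest: cheap, high-reach fixes float to the top instead of whatever was reported most recently.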

We ran user testing on our initial set of features

  • I worked with Ginkgo’s dedicated user researcher on a quick round of user testing

  • I helped her identify tasks and participants

  • She ran the testing and hosted a collaborative synthesis session with the engineers & PM

  • We tested the app with 5 participants from the tech sales team

  • I got to observe the testing and take notes without the cognitive load of facilitating sessions (so nice!)

Output of the collaborative research synthesis session led by Ginkgo’s dedicated user researcher

We used the holiday quiet period to fix ALL the usability issues

  • We used the slow period leading up to the winter holidays to fix every pain point uncovered by the user testing

  • I worked 1:1 with the developers for small tweaks and made mockups for big fixes

I ran a survey to benchmark user satisfaction

  • I waited a few weeks to give people the chance to use the new app

  • I looked at our user adoption dashboard to target and recruit participants

  • The goal of the survey was to establish a baseline user satisfaction metric

  • 2 survey questions were from UX-Lite

  • 4 questions were from our UX team’s “product pulse” survey

  • ~1/5th of the tech sales team responded to the survey (n=12)
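To make the benchmark concrete: UX-Lite is a two-item questionnaire (perceived usefulness and ease of use, each on a 1–5 agreement scale) whose average is commonly rescaled onto 0–100. A sketch of that calculation, with invented responses (the exact rescaling convention is an assumption):

```python
# Sketch of a 0-100 UX-Lite score from its two 1-5 scale items.
# Responses below are invented; the rescaling is the standard
# (raw - min) / (max - min) interpolation onto 0-100.

def ux_lite_score(usefulness: int, ease: int) -> float:
    """Average the two items and rescale 1..5 onto 0..100."""
    raw = (usefulness + ease) / 2      # 1..5
    return (raw - 1) / 4 * 100         # 0..100

responses = [(5, 4), (4, 4), (3, 5)]   # (usefulness, ease) per respondent
baseline = sum(ux_lite_score(u, e) for u, e in responses) / len(responses)
```

The mean across respondents becomes the baseline to compare against after future releases.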


Outcomes

We received positive initial satisfaction scores 🎉

  • Those who used the application said it improved their efficiency

  • We have a small user population, but they scope and sell enormous deals. Ginkgo needs this team to run as efficiently as possible

We got confirmation on the highest value thing to work on next

  • The most cited issues from the survey confirmed feedback we received from other sources

  • Our MVP supported search for 1 type of codebase (small molecules). Users were loud and clear: they needed to search for proteins and enzymes as well

I was surprised to learn not many people had used the software “as part of their work” 😬

  • Only 3 of 12 respondents said they used the app “as part of their work”

  • All 3 had indeed visited the app (I verified their sessions using Fullstory)

  • Most non-users said they “know about the app, but haven’t tried it yet”

  • The PM and I used this information to increase our focus on driving user retention, finding ways to get users into the application and keep them coming back

I used the survey findings to advocate for getting rid of adoption blockers

  • We knew the sales team needed to search for more types of codebase than our initial MVP allowed

  • I advocated for the research findings and the PM shifted our strategy for the next handful of sprints

  • We spent the next quarter focusing on delivering protein and enzyme search (our biggest adoption blocker)

  • Early signs show an uptick in page views and retention, but it’s too soon to tell (at the time of this writing)

There’s no such thing as a “finished” product

  • I’m proud of our team’s accomplishments to date!

  • I plan to continue with incremental deliveries, fixing usability issues, and measuring user satisfaction over time

  • By the end of 2024, we want 80% of the sales team using the app at least 2 times per month

  • We also have an ambitious KPI around improving end user satisfaction (measured by survey scores)
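The adoption KPI above is easy to express as a metric query. A hypothetical sketch, assuming session logs like those exported from an analytics tool (team roster and sessions are invented):

```python
# Hypothetical check of the adoption KPI: share of the sales team
# with >= 2 app sessions in a given month. All data is invented.
from collections import Counter

SALES_TEAM = {"ana", "ben", "cara", "dev", "eve"}

def adoption_rate(sessions: list[tuple[str, str]], month: str) -> float:
    """sessions: (user, 'YYYY-MM') pairs; fraction of team with >= 2 visits."""
    counts = Counter(u for u, m in sessions if m == month and u in SALES_TEAM)
    active = sum(1 for c in counts.values() if c >= 2)
    return active / len(SALES_TEAM)

sample = [("ana", "2024-03"), ("ana", "2024-03"), ("ben", "2024-03")]
rate = adoption_rate(sample, "2024-03")  # only ana meets the 2-visit bar
```

Hitting the goal means this number reaching 0.8 for the real roster by the end of 2024.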

That’s it, thanks for reading 👋

More case studies: