Sharkzor is a web-based application for visually sorting and summarizing images, with a deep learning back end.

ROLES: UX Research, UX Design, UX Management & Visual Design
↘BRANDING
The Sharkzor branding is playful and powerful. Designs here are used for stickers, t-shirts, and app branding.
↘USER TASK HIERARCHY
The Sharkzor user experience is designed around three primary user tasks: Triage, Organize, and Automate. Rather than replacing the human analyst, the Sharkzor system augments them.
Triage: The user's initial task is to triage the images, done by selecting any number of images on the canvas and viewing them in grid mode. Grid mode shows the images in an unobstructed way.
Organize: The user then selects images they deem similar and creates groups. Very few (1-2) examples are needed to define a group within the Sharkzor system. Once the user has created groups, the machine learning underpinnings can take over.
Automate: Using the auto-group functionality, the user can quickly label many images with an arbitrary label defined within their own workflow (see the sketch after this list). This helps the user triage a large number of images without the typical burden of labeling hundreds of example images.
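The case study does not describe how the back end generalizes from only 1-2 examples per group, so the sketch below is purely illustrative: one plausible approach is nearest-centroid matching over precomputed image embeddings. The function and parameter names here are hypothetical, not taken from Sharkzor.

import numpy as np

def auto_group(embeddings, examples_by_group, threshold=0.6):
    # embeddings: (n_images, dim) array of precomputed image embeddings.
    # examples_by_group: {group_label: [indices of the 1-2 user-chosen example images]}.
    # threshold: minimum cosine similarity required to accept a label.
    # Returns {image_index: group_label, or None if the image stays ungrouped}.
    unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    labels = list(examples_by_group)
    # One centroid per group, built from the user's few example images.
    centroids = np.stack([unit[examples_by_group[g]].mean(axis=0) for g in labels])
    centroids /= np.linalg.norm(centroids, axis=1, keepdims=True)
    sims = unit @ centroids.T                      # (n_images, n_groups)
    best = sims.argmax(axis=1)
    return {i: (labels[best[i]] if sims[i, best[i]] >= threshold else None)
            for i in range(len(unit))}

In a scheme like this, the similarity threshold is what keeps uncertain images ungrouped and leaves them for the analyst to triage by hand, keeping the human in the loop.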
↘INTERFACE DESIGN
Sharkzor was a research project developed for a government client. The original concept brought deep learning to a user interface for a true human-in-the-loop system. Selected mockups are shown below.
To meet client needs, Sharkzor's earlier proof-of-concept interface needed to be updated to support large datasets.

PROBLEM: Update Sharkzor UX and UI to support use with datasets on the order of 1 million images.

PROCESS: High-level requirements were provided by the client. I organized team brainstorming meetings and ideation phases, ensuring that the whole team was invested in the redesign and understood the future direction. Until this point, Sharkzor had been designed (with browser limitations in mind) for use with only 500 images. Supporting the client's need for 1 million images would mean heavy lifts for all of our teams: design, development, engineering, and deep learning.

SOLUTION: To support real-time interaction with large datasets, some existing features would be removed and new features introduced. Key new features include: (1) Interface reorganization, (2) Visual sort, (3) Nested grouping, and (4) Project overview bar. Additionally, the entire existing app and its features needed to be updated to reflect the new interface design.
UPDATE: Interface reorganization
To accommodate new features and support interacting with large numbers of images, the Sharkzor interface needed a makeover. The old toolbar housed tools that were no longer part of the new interface, so it was removed and leftover tools were folded into logical places.
The new visual sort bar takes the place of the existing metadata sort bar, since the two features cannot be used at the same time.
To accommodate nested grouping, the groups bar, which was once horizontal, is now a vertical pane where nested groups can easily be displayed in a standard hierarchical list format.
The old status bar has transitioned into the project overview bar and moved to the top of the screen. This helps maximize vertical space for scrolling through large numbers of images.
NEW FEATURE: Visual sort
To add user-understandable context to large datasets during the triage phase, we allow the user to sort the dataset relative to one or two user-chosen example images. The images used for sorting are not intended to be images the user wants to group; rather, they are a way to superficially organize the large number of images in a way that makes sense to the user during the triage task.
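As an illustration of what such a sort could look like under the hood, the sketch below orders images by similarity to a single example, or along the axis between two examples in embedding space. This is an assumed mechanism, not a description of Sharkzor's actual implementation, and the names are hypothetical.

import numpy as np

def visual_sort(embeddings, anchor_a, anchor_b=None):
    # Order image indices for the visual sort bar.
    # With one anchor, images are ordered by cosine similarity to that example;
    # with two anchors, images are ordered along the axis from anchor_a to anchor_b,
    # so images resembling anchor_a land at one end and anchor_b at the other.
    unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    a = unit[anchor_a]
    if anchor_b is None:
        scores = unit @ a            # similarity to the single example image
        return np.argsort(-scores)   # most similar first
    axis = unit[anchor_b] - a
    axis /= np.linalg.norm(axis)
    scores = (unit - a) @ axis       # projection onto the a-to-b axis
    return np.argsort(scores)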
NEW FEATURE: Nested Grouping
Nested grouping is a logical extension of the current Sharkzor architecture, which allows only disjoint groups. This feature supports and extends the organize task by allowing the user to define more specific groups. With small numbers of images, disjoint groups are sufficient; to support organization of large numbers of images, nested groups are necessary.
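A minimal sketch of what a nested group structure could look like is shown below, assuming a simple tree of groups. The data model and names are hypothetical and not taken from the Sharkzor codebase.

from dataclasses import dataclass, field

@dataclass
class Group:
    # A group of images that may contain nested child groups.
    label: str
    color: str
    image_ids: set = field(default_factory=set)
    children: list = field(default_factory=list)

    def all_image_ids(self):
        # Images in this group plus, recursively, in all nested subgroups.
        ids = set(self.image_ids)
        for child in self.children:
            ids |= child.all_image_ids()
        return ids

# Example: a top-level "vehicles" group containing two nested subgroups.
vehicles = Group("vehicles", "#e63946", {1, 2},
                 children=[Group("trucks", "#457b9d", {3, 4}),
                           Group("sedans", "#2a9d8f", {5})])
assert vehicles.all_image_ids() == {1, 2, 3, 4, 5}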
NEW FEATURE: Project Overview Bar
The new Project Overview Bar takes the place of the old status bar, which showed only the number of images visible in the current view and the number of images in the entire dataset. The project overview bar instead provides a simple, visually intuitive representation of the makeup of the user's entire project, supporting all phases of Sharkzor use from triage to organize to automate.
Top level groups are shown as colored bars, the color corresponding to the group label color. After images are auto-grouped, groups that have had no images added are decorated with a hash pattern. This helps the user quickly determine on a superficial level whether or not their actions are having an effect, and that the deep learning backend is working as they expect.
We also visualize ungrouped images as two categories: seen and unseen. This helps provide context to the user about how many images they have already triaged and how many remain. The user can also view images they have added to their scratchpad via the project overview bar.
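To make the bar's makeup concrete, the sketch below computes the proportional segments described above (top-level groups, hash-patterned empty groups, seen/unseen ungrouped images, and the scratchpad), reusing the hypothetical Group structure from the nested-grouping sketch. This is an assumed implementation, not Sharkzor's.

def overview_segments(groups, all_ids, seen_ids, scratchpad_ids):
    # groups: top-level Group objects (see the nested-grouping sketch above).
    # all_ids / seen_ids / scratchpad_ids: sets of image ids.
    # Returns (label, fraction_of_project, hatched) tuples, one per bar segment.
    total = len(all_ids)
    segments, grouped = [], set()
    for g in groups:
        ids = g.all_image_ids()
        grouped |= ids
        # Groups that received no images after auto-grouping are drawn hatched.
        segments.append((g.label, len(ids) / total, len(ids) == 0))
    remaining = all_ids - grouped - scratchpad_ids
    segments.append(("seen, ungrouped", len(remaining & seen_ids) / total, False))
    segments.append(("unseen", len(remaining - seen_ids) / total, False))
    segments.append(("scratchpad", len(scratchpad_ids) / total, False))
    return segments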
UPDATES: Existing feature updates (metadata pane as example)
Existing features needed to be updated in line with the redesign while keeping familiar features consistent, so as not to confuse existing users.
Updated metadata screens
Sharkzor is an ongoing project with evolving needs based on client requirements and scientific advancements. Client response to the redesign and proposed features was overwhelmingly positive, and the team is currently moving forward with development. 
Future directions include user studies, usability assessments and experiments using the interface to test Interactive Machine Learning Heuristics. IML Heuristics were developed on this project and presented at the IEEE VIS 2018 Workshop on Machine Learning from User Interaction for Visualization and Analytics as well as at the Black in AI Workshop at NeurIPS 2018.
This work was funded by the US Government.