Hi, I'm Case
I design approachable mobile experiences for sophisticated digital problems.
Product Designer | Entrepreneur
Scroll down for Works
UX Case Study 2016-2019
Loki MVP: Creating the world's first multi-angle live video experience
A journey of how Loki reduced the cost of multi-camera live streaming coverage from ~$100k down to ~$1,000 by bridging difficult physical barriers with approachable user experiences.
myCigna App: UX Process Case Study 2019
Transforming the lives of 2M+ DAUs through extreme process improvement
Meet & Greet
Hi, I'm Case. I'm an experienced product designer (5+ years across corporate and startup environments) looking to make a difference in this world.
Who's with me?
Get in Touch
Multi-camera live video coverage requires a great deal of setup time, expensive and heavy equipment, complicated software, and the technical expertise to ensure all of the components work correctly together. After all of that, the result is a curated viewing experience in which what the audience sees is determined by the angle shown to them.
Loki aimed to simplify the experience across the board, from production to broadcasting to viewing.
Chief Design Officer (2016-2019): Responsible for overall art direction and end-to-end ideation and creation of Loki from a design perspective. Collaborated with all departments of the company to achieve success in the following areas:
• Mobile UX
• Web UX
• Branding & Marketing
• Product Vision
• Reduced the cost of multi-camera coverage from upwards of $100k down to ~$1,000
• Reduced the number of trained camera operators from 5-10+ (depending on the event size) down to 0
• Removed almost every barrier to entry for multi-camera live streaming
Draft Loki Marketing Video 2019
Process | Collaboration is a Must
Just as Loki aimed to show perspectives from around the world, I believe a diversity of perspectives is extremely valuable in creating any product. That is why from the beginning, I partnered with all facets of our team to produce the best experience for our users. This philosophy is what drove our success in launching our Loki MVP app.
The fastest, cheapest, and simplest multi-camera live streaming platform in the world
Early on, our problem statement focused on tackling “fake news,” and our vision boiled down to “Perspectives define reality.” The more perspectives of something you see, the easier it is to understand the truth of what you are seeing.
As our vision matured through user testing, our business problem became utilitarian in nature. “We are the fastest, cheapest, and simplest multi-camera live streaming platform in the world”.
Digging Into Research
I started with secondary research focused on trends inherent to the live streaming industry, beginning with these questions:
• Which industries are most viable for live streaming?
• What kinds of people live stream?
• What are the issues with current live streaming apps?
The results of this online research supported our initial hypothesis that perspectives were important for achieving authenticity and truth.
As we conducted field trials of our early alpha builds, we observed that there were immense opportunities to reduce the cost, time and resources of multi-camera live streaming, while enhancing the authenticity of the content broadcast through our platform.
I made a number of improvements over existing live streaming apps, such as speeding up the process of starting a live stream.
A number of different demographics were surveyed to learn deeper insights about our broadcaster and watcher personas.
Defining the Foundation
We developed two main personas from our initial research which evolved over time through field testing.
Our “Watchers” were millennial-aged digital natives. They valued creativity, choice and immediacy. As our target market evolved toward festivals, this persona continued to align with our company's direction.
The following features were designed for the Watchers persona:
• Ability to view all perspectives live, in real time or on-demand
• Ability to search for events based on a number of factors (location, people, etc.)
• Ability to explore events via a user's home feed
Our “Broadcasters” were organizations looking for less expensive and/or less complicated multi-camera live streaming coverage. This persona aligned toward music festivals as our company matured.
The following features were designed for the Broadcasters persona:
• Ability to easily join an event nearby in one tap
• Ability to view the broadcasting status of all other broadcasters within a live event (live or not live)
• Ability to see their own live streaming stats (views, stream health, etc.)
Our MVP consisted of five main features, all constructed in support of our company’s vision:
• A user must be able to Broadcast.
• A user must be able to Watch.
• A user must be able to Discover.
• A user must be able to Search.
• A user must be able to have an Identity.
After establishing our MVP features, I created a sitemap based on user journey exercises. This helped us understand the architecture of our MVP and the true scope of work to achieve it.
Whiteboard Concepts & Technical Team Ideation
In collaboration with my CEO and developers, I whiteboarded user flows for our MVP features while also discussing the technical implementations needed to support our designs. All of our discussions centered on “How can we build this feasibly, in a time- and resource-conscious way?” This collaboration ensured we could overcome difficult design considerations early on while remaining able to change course when needed due to technical limitations.
Wireframes were reserved for extremely complex features where we knew many iterations were needed upfront. This process allowed us to rapidly align on a direction as a team, and then refine the UI using high fidelity mockups. This also allowed development to get a head start on the bulk of the coding while giving me time to refine the look and feel of the smaller interactions.
High Fidelity Mockups
A combination of quick whiteboard sketches, wireframes and our in house design system allowed me to ideate new experiences, iterate on them as a team and build them efficiently for delivery to our developers.
Once final designs were established, I created multiple resolution variants of each asset for use on our iOS, Android and Web platforms.
Before passing off final designs, I would conduct team demos to review the feature, final visual design, interactions and animations. This helped to level set smaller details, and gave the devs an opportunity to ask questions before writing any code. Additionally, I utilized both Invision and Zeplin (each have their pros and cons) for collaboration between myself and our developers. I used these tools to increase efficiency for the delivery of features, and reduce QA time by allowing developers to “inspect” precise design details from the tools.
Once our developers had a rough cut of the feature coded, I would perform a series of QA rounds. I would start by raising issues via our QA spreadsheet and prioritize them based on customer experience, business needs, and whether it was a design discrepancy or a technical bug. This gave visibility to our entire team, and allowed the appropriate member to quickly find and squash their bug! As we worked on these items, we would roll the fixes along with new features into our beta app.
Testing our Hypothesis
Once we launched a new feature, we would observe our users in the real world using our app. We would attend their events, and even help run them, so that we could see first hand how our experience performed and what the immediate pain points to focus on were.
Loki being tested at an Award Show
Learn & Adjust
Lastly, due to the agile framework we worked within, it was extremely easy for us to learn from our testing and adapt based on our observations. Our custom agile-like practices allowed us to re-prioritize quickly based on our users' needs.
My process for developing our app was very cyclical (in the sense that some of the steps within the process above were iterated through before moving to the next) but flexible (enough to allow for the bumps in the road inherent to startups). This greatly increased the speed at which we worked together! All of this was achieved through incredible trust and collaboration within our team.
A massive thank you to Andrew, Brian & Jeffrey, my incredible team at Loki!
"Going-Live": Who knew it'd be so hard!
Most live streaming apps at the time had the same narrative for starting a broadcast (or at Loki, “Going live”). The narrative goes like this:
• You see something you want to broadcast (a concert, sports game, etc.), so you take out your phone and open your preferred live streaming app.
• You click broadcast.
• You name your event.
• Then you might have to adjust a bunch of event settings.

On top of all of this, users trying to watch broadcasts from the same event can't, because none of the streams are aggregated together.
This process, as you can tell, is cumbersome and slow! And the last thing you want to be doing is fumbling around on your phone instead of getting the perfect shot!
I created a design that removed the clutter. It focused on one thing:
“Going Live” as quickly as possible.
Within Loki all a user has to do is:
• Open the Loki app and click broadcast.
• If the user is at an event with active broadcasts, they will see the event in the “Broadcast Center” (step 2 in the image above) and can join in live with a single tap!

The added benefit of this was that broadcasters self-aggregated their content into the correct event, allowing us to easily display events with all the appropriate broadcasts within them for easy discovery.
Our 1-click "Go-Live" design performed very well in real-world user testing and contributed largely to Loki's success in becoming the easiest-to-use multi-camera platform in the world.
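The one-tap join flow described above can be sketched in code. This is a minimal, hypothetical illustration (function names, the event record shape, and the 500 m join radius are all my assumptions, not Loki's actual implementation): when a broadcaster opens the app, the nearest active event within range is surfaced in the Broadcast Center so they can join it in a single tap, which is also what makes streams self-aggregate into the correct event.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def resolve_event(user_lat, user_lon, active_events, radius_m=500):
    """Return the nearest active event within radius_m, or None.

    If an event is found, the broadcaster joins it in one tap and their
    stream is aggregated into it; if None, a new event would be created.
    """
    nearest = None
    best = radius_m
    for event in active_events:
        d = haversine_m(user_lat, user_lon, event["lat"], event["lon"])
        if d <= best:
            nearest, best = event, d
    return nearest
```

Resolving the event automatically, rather than asking the broadcaster to name and configure one, is what collapses the multi-step "go live" narrative into a single tap.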
I surveyed college aged students from a viewer perspective to better understand a few of the following factors:
• What type of content was important to them?
• How did they prefer to ingest live content? (Mobile, desktop, etc.)
• Would they prefer to have more control over the content they were watching?
When surveying organizations (clubs, bands, festivals) from a broadcaster perspective we focused on questions like:
• What were the key business drivers for starting to live stream?
• What were the organization's current blockers to entering live streaming?
• How could they see themselves expanding their use of live streaming in the future?
From our research, it was obvious that to make this work, we had to make it work well for the entire market we were entering. This meant we had to improve not just the broadcaster's experience, but the watcher's experience as well. By doing this we could effectively address the "chicken or the egg" scenario inherent to the two-sided market we were creating.
Additionally, as our product evolved, we continued to survey our changing demographics to remain aligned with the new direction we had pivoted to.
Enhancing Perspective Discovery
(This section is still a WIP)
Wireframes were used to re-design the perspectives carousel and control dock located on the Watch View. The overarching goal of this enhancement was to improve the discoverability of perspectives.
To accomplish our main goal three big changes were implemented:
• The perspective cards were augmented to show more of the thumbnails within the carousel.
• The information within the control dock was replaced with more relevant information demonstrated to us by our users, such as seeing the current broadcaster's information and being able to subscribe to them.
• Lastly, some experimental features that didn't test well were removed, such as our informational marquee.
Once these changes were published, we saw an increase in the ratio of perspectives watched within an event. This indicated to us that our changes improved viewers’ understanding of the perspective switching feature.
User Testing w/o an App
Before we had an alpha version of our MVP app, we would create mock experiences to detect real world pain points. For example, we started by building our broadcasting experience with an app that could only record. We gave that to our early users, gathered feedback from them and observed their recordings through a web player (featured above). This allowed us to iteratively build features, test and repeat before having a fully completed MVP experience. We were also able to test the technical viability of our platform alongside the user experience.
Stream Feed Creation
One problem I observed was that broadcasters did not understand who else was actively broadcasting with them at an event. This was important especially for broadcasters working alone with multiple cameras, because streams sometimes failed due to network coverage. They wanted to know which of their cameras were still streaming without constantly running around.
After noticing this problem, I designed a feature called the "Stream Feed", which allowed broadcasters of an event to know when others joined in and whether they were currently broadcasting. I accomplished this by showing a list of broadcasters within the event and signified whether they were actively broadcasting ("live") with a red dot next to their name.
With the "Stream Feed" implemented, we observed qualitative improvements to the broadcasting experience, mainly broadcasters feeling much more confident in what they were streaming. This was due to the transparency the stream feed offered, not only into a user's own cameras, but also between broadcasters who may not know each other personally.
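The Stream Feed's live/not-live indicator can be sketched as a simple presence model. This is an illustrative assumption, not Loki's actual code: each broadcasting client periodically sends a heartbeat, and a broadcaster is rendered with the red "live" dot only while those heartbeats are fresh (the 10-second timeout here is invented for the example).

```python
HEARTBEAT_TIMEOUT_S = 10  # assumed threshold; the real value isn't stated

def stream_feed(broadcasters, now):
    """Render one stream-feed entry per broadcaster in the event.

    broadcasters: list of {"name": str, "last_heartbeat": float (epoch s)}
    A broadcaster shows as live (the red dot) while heartbeats are fresh,
    so a camera whose stream silently fails drops off within seconds.
    """
    feed = []
    for b in broadcasters:
        live = (now - b["last_heartbeat"]) <= HEARTBEAT_TIMEOUT_S
        feed.append({"name": b["name"], "live": live})
    return feed
```

A presence model like this is what lets a solo broadcaster running several cameras see at a glance which ones are still streaming, instead of physically checking each phone.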
Designing Around Technical Limitations
Live streaming inherently has some technical limitations that a company can either choose to ignore or redefine. At Loki, we decided to tackle one of these technical limitations head on. We consistently saw that one of the biggest problems was network penetration, or simply put, how strong your cell service is. Live streaming is already a data-intensive activity to perform on your phone, so without a strong data connection, broadcasts are likely to fail. We observed that many of the industries we were looking to move into, such as music festivals, concerts and sports, had one thing in common: low network coverage due to extremely high traffic. We knew failing broadcasts would be frustrating for our users, so we looked to improve this part of the experience.
I decided to design a feature called "Post Live Uploading". This feature essentially allowed a broadcaster to lose connectivity to the network and continue to record locally without interruption. Once the user regained coverage, Loki would automatically upload the missing footage and merge the clip into the correct real world time within the event.
This vastly reduced frustration for broadcasters in low-network areas by keeping them focused on the content rather than the tool. Additionally, it ensured that content which would normally be lost entirely was still captured and uploaded to the platform.
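The merge step of Post Live Uploading can be illustrated with a small sketch. Assumptions of mine, not Loki's implementation: each stream is stored as segments stamped with real-world start/end times, so clips recorded offline can simply be uploaded later and slotted into chronological position within the event timeline.

```python
def merge_post_live(uploaded_segments, local_segments):
    """Merge locally recorded segments back into the event timeline.

    Each segment is {"start": float, "end": float, "source": str}, with
    real-world timestamps. Segments captured while offline are uploaded
    once connectivity returns, then ordered by start time so the footage
    lands at the correct real-world moment in the event.
    """
    return sorted(uploaded_segments + local_segments, key=lambda s: s["start"])

def find_gaps(segments):
    """Report any remaining coverage gaps between consecutive segments."""
    gaps = []
    for prev, cur in zip(segments, segments[1:]):
        if cur["start"] > prev["end"]:
            gaps.append((prev["end"], cur["start"]))
    return gaps
```

Because segments carry absolute timestamps rather than positions in an upload queue, a broadcaster can drop off the network mid-event and the recovered footage still merges into the right place, leaving no gap in the timeline.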