Envrmnt is a team built out to focus on AR/VR at Verizon. We were increasingly tasked with similar projects built on our custom immersive engine. Rather than delivering one-off projects over and over, we identified the opportunity and established a business goal: productize our player alongside the development of a web-based self-service content creator.
We had a mix of digital media companies (like Hearst and Time) and internal business units (like Verizon FiOS and RYOT) that wanted to introduce their existing mobile user base to immersive augmented (or virtual) reality content.
For most early experiences, this user base would rely not only on their mobile devices, but would also require an additional piece of physical material (like the cover of a magazine or the page of a user manual).
By offering a customizable SDK, our customers wouldn’t need to convince their audience to download a new mobile application.
As the principal designer on the team, and the only member devoted to mobile design, I worked directly with the Android and iOS developers, project managers, and QA testers to see the Player SDK become a reality. I also hired a designer for our desktop web application, who provided design feedback and helped translate the mobile UI into the web workspace preview.
My individual contributions, as alluded to above, included interaction and interface design (at all levels of fidelity), visual design, prototyping, user flows, and requirements documentation.
We had already launched a number of projects that formed our fundamental set of requirements. A core project was our work with Cosmopolitan magazine and Maybelline. It was here that we introduced a link carousel in addition to our standard video player. The Cosmo interface was very pink, to align with the visual styling of the associated brands.
Coming from this foundational work, my guiding principle was to provide a design that was usable, scalable, modern, and neutral enough that even if our customers didn’t want to (or lacked the resources to) modify the player, they would still be satisfied.
- - - - -
The first version of the resulting interface was essentially black and white with 2pt stroke icons.
I also relied on scrims and a collapsible carousel to provide more visibility into the camera view (as some of our early feedback confirmed that our UI was crowding the actual experience).
- - - - -
The next major addition to our product was evolving from marker-based (trigger) experiences to markerless experiences.
The customer driving the need for this feature was Sports Illustrated (Time, Inc.), which wanted to feature an experience for its swimsuit issue. We needed to make our player capable of placing 3D models in the user’s physical surroundings within the camera view. This included both 3D models of the swimsuit models themselves and 360° video portals.
Given the tight turnaround for the feature addition, I drew on best practices from other markerless apps, established an MVP, and relied on user flow diagrams to align with our stakeholders on the design.
- - - - -
While in the visual design phase, I worked on motion design in tandem, as I find it helps our devs to know where I’m heading. We also introduced a side-by-side UI to support photo and video snapshots.
I used furniture catalogs for the extended design documentation, as it would be more applicable to future customers.
- - - - -
Immediately after launching our first pass at the markerless feature set, I got to work on improving the onboarding and user guidance.
The second version included animated iconography (using Lottie), timed messaging, gesture interaction tips, and enhanced plane detection.
We also added multi-model support, so I created detailed views for models to serve the customers interested in leveraging 3D models in AR as a sales and marketing tool.
- - - - -
After the revisions were made to the markerless components of the player, we then extended some of the enhancements to the marker-based interface.
The changes were more minor in this case, but they improved on elements like the excessive white space in some of the carousel cards. I also provided an animated tip for users unfamiliar with trigger images, updated some of the directional copy, and created specifications for minimized video playback.
- - - - -
Our team was distributed, with a major presence in New Jersey and some members in France. Being in San Francisco myself, I worked to provide a wide breadth of preemptive design documentation and issue feedback.
We leveraged Confluence as a team, and I used the platform for documenting user requirements, attaching iconography, and providing key links to required resources.
I used Zeplin to provide interface design specifications.
For issue tracking, we used Jira, and whenever necessary I provided screen recordings for extra context to QA and devs.
From the time we got started until I left, our AR Player SDK was used by major media properties like Cosmopolitan, Time Inc., Entertainment Weekly, AccuWeather, and Sports Illustrated. It was also used in a range of internal Verizon applications, like turning otherwise static training content into immersive experiences for FiOS technicians.
Reception for Envrmnt’s broader self-service creator platform was positive, including a finalist nomination for Best Creator and Authoring Tool (the platform comprised a web app, a mobile preview app, and this Player SDK).
The Player SDK was constantly evolving, as was our immersive streaming platform, so I had ample opportunity to learn how to iterate rapidly with limited input.