At Immersiv.io, we’re truly design-focused, and we are committed to creating appealing, user-centric products. Discover here the work and discussions of our Design Lab as we create the most exciting cross-platform AR experience for sports fans. You can also go directly to the second part of this article: translating our experience from mobile to AR glasses.
Designing ARISE: The Future of Watching Sports with Augmented Reality
For almost a year now, Immersiv.io has been developing a solution that will change the way people watch sports. Whether at home or in stadiums, ARISE helps sports fans better understand the game and see how players are performing by providing real-time statistics and analysis in augmented reality. ARISE is available on the latest generation of smartphones (those supporting ARCore or ARKit), which are already widespread among young and tech-savvy sports fans.
As Immersiv.io is at the forefront of AR technology, it is also working on the ultimate live fan experience by creating a smartglasses version of ARISE. Today, smartglasses remain expensive, under-used products with few manufacturers and players in the market. Still, they provide the most immersive experiences possible, and their mass-market adoption is just a matter of time. In the meantime, few users means little user feedback and few guidelines for designing AR experiences on smartglasses: everything remains to be defined!
Challenges when designing in augmented reality
As designers, building a cross-platform AR product is quite a challenge. Now is the time to document the decisions we’ve made and the lessons we’ve learned throughout the process, and to share them with curious readers, designers or not, who want to know more about the way we design AR experiences.
How do you design the same product on devices with extremely different display and interaction systems?
First, it is important to recall, especially for AR newcomers, the differences between AR experiences on smartphones and on glasses.
What is augmented reality on smartphones?
Augmented reality means “augmenting your reality”, i.e. the real world overlaid with extra layers of digital information. This 2D or 3D virtual content is projected onto reality through the smartphone camera feed (see our complete definition of AR).
The smartphone’s screen works as a small window onto the augmented world that the user can move around. In the end, it is a 2D surface used to interact with a 3D world, which is why immersion on smartphones is quite limited and the user has to be fully focused to enjoy it.
©Ikea AR App
It also raises readability challenges, since the screen is relatively small and virtual objects can sometimes be far away from the user or, worse, outside the camera’s field of view. For this reason, AR experiences on mobile must be fully responsive: the size and rotation of the elements must be adjusted automatically in real time, and the user must be able to adjust them at any time.
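To make the idea concrete, here is a minimal sketch (in Python, with hypothetical names; ARISE’s actual implementation is not shown here) of one common way to keep a distant label readable: scale it up proportionally with its distance from the camera, clamped to sensible bounds.

```python
import math

def apparent_scale(camera_pos, object_pos, base_scale=1.0,
                   reference_distance=2.0, min_scale=0.5, max_scale=4.0):
    """Scale a virtual label so it keeps a roughly constant apparent size.

    At `reference_distance` meters the label is drawn at `base_scale`;
    farther away it is enlarged proportionally, clamped to avoid extremes.
    All positions are (x, y, z) tuples in meters.
    """
    dx, dy, dz = (o - c for o, c in zip(object_pos, camera_pos))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    scale = base_scale * distance / reference_distance
    return max(min_scale, min(scale, max_scale))

# A label 4 m away is drawn at twice the base scale (2 m reference):
print(apparent_scale((0, 0, 0), (0, 0, 4.0)))  # → 2.0
```

Running this adjustment every frame is what makes the layout feel responsive: labels neither shrink into illegibility at a distance nor fill the screen when the user walks up close.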
ARISE combines 3D spatialized objects (like a mini football field), 2D spatialized information (like players’ numbers or names), and 2D overlays as in traditional mobile apps (interactive 2D content, buttons). Users interact with these virtual objects via touch gestures similar to those of standard apps (tap, hold, swipe…). But as these virtual elements are anchored in the real world, the user can also move around to get different viewpoints or move virtual objects around the area.
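Because the user can walk around anchored content, 2D spatialized labels are typically “billboarded” so they always face the viewer. A minimal sketch of that rotation (Python, hypothetical names, yaw-only for simplicity; real engines expose this as a built-in billboard constraint):

```python
import math

def billboard_yaw(object_pos, camera_pos):
    """Yaw (radians) that rotates a 2D label about the vertical axis
    so its face points toward the camera. Positions are (x, y, z)."""
    dx = camera_pos[0] - object_pos[0]
    dz = camera_pos[2] - object_pos[2]
    return math.atan2(dx, dz)

# Camera straight ahead of the label on the +z axis: no rotation needed.
print(billboard_yaw((0, 0, 0), (0, 0, 5)))  # → 0.0
```

Applying this yaw each frame keeps a player’s name tag legible from any viewpoint as the fan circles the mini field.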
What is AR on mixed-reality glasses?
Although the technology and the principle of adding virtual content to the real world work much like AR on smartphones, AR glasses offer a very different user experience. The 5-inch window becomes a pair of glasses close to your eyes, so that you can (almost) no longer see its edges. Your whole body is inside the augmented world; you enjoy the ultimate immersion in a place where digital content and physical surfaces interact seamlessly.
While the visualization system is much the same across recent smartglasses, manufacturers offer different interaction systems. Magic Leap provides a controller with a touchpad and triggers, Nreal uses the smartphone screen as a large touchpad, and HoloLens offers a hand-gesture solution, allowing users to interact without any controller. This wide variety of quickly evolving technologies makes designing AR experiences for glasses very challenging. In addition, there are still few users and no established interaction models or standards, apart from manufacturers’ recommendations.
However, the AR glasses market is evolving rapidly, and it seems that the standard for mass-market AR devices over the next few years will be lightweight glasses with phones as controllers (pushed by tech giants like Qualcomm and by telecom companies). That’s how the Nreal Light works, using the power of the smartphone to compute AR graphics: you plug the glasses into your phone and use the phone as a controller. The phone is tracked by the glasses, so you can use it as a laser pointer to interact with holograms. The mobile screen becomes a large touchpad where the user can perform touch gestures for specific interactions. The main interaction is point-and-click, but you can also design much richer interactions that rely on 6DoF tracking, such as grabbing and moving objects.
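Under the hood, the point-and-click model comes down to casting a ray from the tracked phone’s pose and testing it against hologram bounds. A simplified sketch (Python, hypothetical names; real engines provide their own raycast APIs) using a bounding-sphere test:

```python
import math

def ray_hits_sphere(ray_origin, ray_dir, center, radius):
    """Test whether a pointer ray intersects a hologram's bounding sphere.

    ray_origin: phone position; ray_dir: unit vector along the phone's
    forward axis (both from 6DoF tracking); center/radius: hologram bounds.
    """
    # Vector from the ray origin to the sphere center
    oc = tuple(c - o for c, o in zip(center, ray_origin))
    # Distance along the ray to the point closest to the sphere center
    t = sum(a * b for a, b in zip(oc, ray_dir))
    if t < 0:
        return False  # the hologram is behind the pointer
    closest = tuple(o + t * d for o, d in zip(ray_origin, ray_dir))
    dist_sq = sum((c - p) ** 2 for c, p in zip(center, closest))
    return dist_sq <= radius * radius

# Pointing straight ahead at a hologram 3 m away, slightly off-axis:
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0.1, 0, 3), 0.25))  # → True
```

In practice the sphere radius doubles as a forgiveness margin: making it a bit larger than the visible hologram makes point-and-click far less frustrating for first-time users whose aim is shaky.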
There are still a lot of constraints related to experiences on glasses. First of all, almost all users who try our app today are AR novices, so they need more time and attention to understand and handle the device. Next, one must take into account the infinite variety of user environments. If designing a responsive app for mobile screens is already a challenge, designing for “physical worlds” is a totally different one because of the huge number of external parameters to take into account. How much space does the user have? Is there a flat surface nearby? Are they standing or sitting? Can they move around? Are their hands free? What about the brightness of the environment? There are also many technical challenges that we, as designers, must keep in mind: tracking glitches, eye fatigue, limited battery life, loss of connection… and many others that you meet during iterations.
Our design process: mobile-first, meeting sports consumers’ expectations
In 2020, there are more than 2.8 billion smartphone users. While not all smartphones are AR-enabled, it is still easier to launch a digital product on smartphones and to reach our audience there: a decade of smartphone use has built up the habits and feedback that make it easier for us to introduce new features.
The first step of our design process was to determine the added value of such a product. We’ve been working with different sports leagues, and one of them, the Bundesliga (the German football league), has helped us throughout the process with its experience and knowledge, and above all helped us prioritize the main needs of sports consumers. Today, fans no longer just watch sports, they live them: using sports apps to find out the latest news about their team, following their favorite players on social media, and sharing their feelings with others. We’re targeting this new generation, also called the Fluid Fans.
These analyses helped us ideate our first user journey (describe your journey in terms of human actions, not digital ones, to easily target user needs and meet their expectations): open the app for a sporting event, be informed of match events in real time with notifications, select a player on the field to see their live performance with AR tracking, replay your favorite moment, see the main post-match news, and share your experience on social media. What a journey!
At the same time, we worked on the user flow and the prototype to validate the product’s usability. The main challenge was to design the product as a white-label, multi-sport app compatible with many sports, rules, and durations… and to imagine how to process the live data we receive during a match, using augmented reality to display it with added value.
Once you have sequences, screens, placed elements, and a hierarchy of information, it’s time to translate it all to AR glasses (don’t fall into the trap of designing your smartphone interfaces first and only then thinking about your glasses product).
Now, discover the design translation from mobile to glasses by reading the second part of this article, with some interesting examples.
This article was written by Immersiv.io’s Design Lab, composed of UX/UI/3D designers experimenting and reinventing digital interactions with the advent of immersive technologies and AR devices.
Special thanks to Thomas and Erwan!