Rocksmith+ Mobile
My Role : UX/UI Designer
Team Size : 2 UX Designers | 2 UI Artists | 4 UI Engineers
Context
I joined the team in 2020. Rocksmith+ was in pre-production and Ubisoft wanted a global release on mobile, PC, and next-gen game consoles (at the time, the PlayStation 5 and Xbox Series X). Very few people on the UX team had prior experience building mobile user experiences.

This was the traditional Rocksmith experience: in front of a large screen. The MVP for Rocksmith+ was also built only for PC. My goal was to deliver it on mobile and tablets in addition to PC and console, and invite a whole new audience of guitar learners from around the globe.
Actions
The Rocksmith+ mobile experience was a collection of many, many small steps. The first piece of work started with extensive user testing, where we asked internal and external volunteers to play a crude, ported version of the game on their mobiles and tablets. My initial tests were designed to discover how people set up a portable device compared to playing on a PC or a TV (plugged into a console), to understand the context and environment in which the game would be experienced.
We gleaned insights from these tests using three primary methods
- Observational studies and contextual interviews with internal and external volunteers
- Surveys after each playtest to track key success signals: satisfaction with the learning experience, satisfaction with interactions, and NPS
- Diary studies with strictly internal, non-development staff who had no insight into development (Ubisoft had around 20,000 employees at the time, so recruiting internally was not a challenge) to see how needs evolved as users went from beginners to power users
Throughout the virtual and in-person testing sessions, we made notes of the various environments users learn guitar in, such as
- Placing their phone on various sized surfaces, from kitchen countertops to dedicated desks
- Plugging their guitars into a myriad of equipment, from pedals for advanced guitar players to headphones so mothers can practice without waking up the children
- Placing the device at different distances from themselves, so as to have room to play their instruments in different positions and with different techniques


What I quickly realised was that the UI could not be one-size-fits-all for this experience. Users needed a great deal of customisability to adapt the experience to their needs
- The ability to zoom in and out, and change the field of view, based on how far they had placed the device from themselves
- The ability to scale the UI so that buttons are bigger if the device is farther away and they have to operate it while holding a guitar
Not only did we need a responsive UI that adjusted to various form factors and aspect ratios; users also needed customisability.
So we began by working on a core responsive UI that would offer the base experience on all devices. It involved not only designing and validating layouts in Figma, but also working closely with UI engineers to build appropriate tools (sketched after this list), such as
- UI component anchoring, so that UI elements know exactly where to stay on the hundreds of devices this game would be played on
- Anchor-based scaling, so that UI components know which direction to scale in to remain on screen and maintain the layout
- Aspect-ratio-based scaling rules, so that UI elements know exactly how to scale in proportion to the screen. We opted for percentage-based scaling
- An aspect ratio breakpoint table, so we could create bespoke rules for brackets of devices (some UI rules that worked on phones broke on tablets, and some that worked on tablets broke on monitors and TVs)
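To make these rules concrete, here is a rough TypeScript sketch of how they might compose. The names, numbers, and breakpoints are illustrative; the actual tools were authored inside the game engine rather than in code like this.

```typescript
// Illustrative sketch; all names and numbers are hypothetical.
// The real rules were authored in the game engine's UI tools.

type Anchor = "topLeft" | "topRight" | "bottomLeft" | "bottomRight" | "center";

interface Viewport { width: number; height: number; }

interface ComponentLayout {
  anchor: Anchor;                       // which screen corner/edge the element is pinned to
  offsetPct: { x: number; y: number };  // offset from the anchor, as a % of the screen
  sizePct: { w: number; h: number };    // percentage-based scaling
}

// Aspect ratio breakpoint table: bespoke scale rules for brackets of devices.
const breakpoints: { maxAspect: number; scale: number }[] = [
  { maxAspect: 4 / 3, scale: 1.15 },    // near-square tablets (e.g. iPad)
  { maxAspect: 16 / 9, scale: 1.0 },    // monitors, TVs, most laptops
  { maxAspect: Infinity, scale: 0.85 }, // very wide (or very tall) phone screens
];

function resolveLayout(c: ComponentLayout, vp: Viewport) {
  const aspect = vp.width / vp.height;
  const bracket = breakpoints.find(b => aspect <= b.maxAspect)!;

  // Percentage-based size, adjusted by the bracket's bespoke rule.
  const w = (c.sizePct.w / 100) * vp.width * bracket.scale;
  const h = (c.sizePct.h / 100) * vp.height * bracket.scale;

  // Anchor-based position: offsets grow away from the anchored corner,
  // so elements scale "inwards" and stay on screen.
  const ox = (c.offsetPct.x / 100) * vp.width;
  const oy = (c.offsetPct.y / 100) * vp.height;
  const x = c.anchor.includes("Right") ? vp.width - ox - w
          : c.anchor === "center" ? (vp.width - w) / 2 : ox;
  const y = c.anchor.includes("bottom") ? vp.height - oy - h
          : c.anchor === "center" ? (vp.height - h) / 2 : oy;

  return { x, y, w, h };
}
```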

The app on an iPad Pro vs a Samsung Galaxy phone
The tricky part of making a responsive UI was handling the 3-dimensional objects in a virtual space. For that, I worked very closely with the UI engineers to develop a system where an anchor point was projected from 3D space into 2D space on the screen, and that anchor point was then subjected to the same bounds rules as any other UI element. Once the appropriate 2D anchor point was calculated for a given screen, the camera adjusted so that the 3D projection aligned with it as closely as possible. The result was a reactive camera system that could be configured like a 2D UI: the camera adapted automatically, knowing how much to zoom in or out and how far to move left or right so that all the 3D objects appeared at the correct places on the screen, no matter the aspect ratio or resolution of the device, whether a nearly square iPad or a very tall phone. The MVP of this solution achieved usable display results on over 8 aspect ratios across different devices, which broadly covered 90% of the devices and screens we expected the app to be used on.
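A simplified sketch of the idea, assuming a basic pinhole camera; the real implementation lived in the engine's camera code, so the names and maths here are only illustrative.

```typescript
// Illustrative sketch of driving a 3D camera from a 2D anchor.
// Assumes a simple pinhole camera looking down -Z; not the engine's actual code.

interface Vec3 { x: number; y: number; z: number; }   // point in camera space (z < 0 in front)
interface Screen { width: number; height: number; }

// Project a camera-space point to screen fractions (0..1, measured from the top-left).
function project(p: Vec3, fovY: number, screen: Screen) {
  const aspect = screen.width / screen.height;
  const tanHalf = Math.tan(fovY / 2);
  const ndcX = (p.x / -p.z) / (tanHalf * aspect);
  const ndcY = (p.y / -p.z) / tanHalf;
  return { u: (ndcX + 1) / 2, v: (1 - ndcY) / 2 };
}

// Given the clamped 2D anchor target (screen fractions), solve the lateral camera
// pan that places the 3D anchor's projection exactly on that target.
function solvePan(p: Vec3, targetU: number, targetV: number, fovY: number, screen: Screen) {
  const aspect = screen.width / screen.height;
  const tanHalf = Math.tan(fovY / 2);
  const ndcX = 2 * targetU - 1;
  const ndcY = 1 - 2 * targetV;
  // Panning the camera by (dx, dy) shifts the point to (p.x - dx, p.y - dy).
  return {
    dx: p.x + p.z * ndcX * tanHalf * aspect,
    dy: p.y + p.z * ndcY * tanHalf,
  };
}

// Zoom: the camera distance at which an object of height objH fills targetFraction
// of the screen height, independent of resolution and aspect ratio.
function solveDistance(objH: number, targetFraction: number, fovY: number) {
  return objH / (2 * targetFraction * Math.tan(fovY / 2));
}
```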



Layout was only half the story; interactions also had to feel native to each platform. For example, on a PC you scroll through a carousel by clicking left and right arrows. You do the same on a game console by using the bumper buttons. But on a touch device, do you remember the last time you clicked a button to scroll? Instead, your feed visibly extends beyond the screen, and you simply swipe to reach the items outside it. The UI is built to lean into and reinforce these natural behaviours, which Rocksmith's prototype version was missing because it was built PC-first. A large part of my role was making all these small changes to the global design system and its component libraries to improve intuitiveness on mobile, PC, and consoles.
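As a rough illustration (the shape below is hypothetical, not our actual component library API), the idea was that a component like a carousel declares its navigation affordances per platform rather than inheriting PC behaviour everywhere.

```typescript
// Hypothetical design-system shape: the same carousel component resolves its
// navigation affordance per platform instead of hard-coding PC behaviour.

type Platform = "pc" | "console" | "touch";

interface CarouselAffordance {
  showArrowButtons: boolean; // explicit prev/next buttons (PC)
  bumperNavigation: boolean; // LB/RB or L1/R1 paging (consoles)
  freeSwipe: boolean;        // content visibly bleeds off-screen; swipe to scroll (touch)
  peekNextItemPct: number;   // how much of the next card peeks in, hinting there is more
}

const carouselAffordances: Record<Platform, CarouselAffordance> = {
  pc:      { showArrowButtons: true,  bumperNavigation: false, freeSwipe: false, peekNextItemPct: 0 },
  console: { showArrowButtons: false, bumperNavigation: true,  freeSwipe: false, peekNextItemPct: 0 },
  touch:   { showArrowButtons: false, bumperNavigation: false, freeSwipe: true,  peekNextItemPct: 15 },
};
```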

Accessibility was a core tenet of the experience we set out to design, and that included the ability for users to customise the experience to their needs. One of the first accessibility features we developed allowed users to modify the scale of the UI to fine-tune their mobile and tablet experience based on how far away they kept their device. The distance between player and device changed heavily depending on whether they were advanced guitar players or beginners, whether they played easy or difficult songs, whether they played bass or guitar, whether they played a full 34" bass, some other size, or a different guitar, and whether they had equipment such as pedals, amps, or headphones attached.

This customisation system was easy to build because we had already created a dynamic system to make our 3D UI responsive. Since the original system was built on solid technical foundations, users could change their configuration at runtime, meaning they could change it whenever they wanted and as many times as they wanted.
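Conceptually (this is a hypothetical sketch, not the shipped code), the user's scale preference is just another multiplier applied after the responsive layout is resolved:

```typescript
// Hypothetical: the accessibility scale is one more multiplier on top of the
// responsive layout, so it can be re-applied at runtime whenever it changes.

interface Rect { x: number; y: number; w: number; h: number; }
interface Viewport { width: number; height: number; }

function applyUserScale(layout: Rect, vp: Viewport, uiScale: number): Rect {
  // Scale around the element's centre, then clamp it back onto the screen.
  const cx = layout.x + layout.w / 2;
  const cy = layout.y + layout.h / 2;
  const w = layout.w * uiScale;
  const h = layout.h * uiScale;
  return {
    x: Math.min(Math.max(cx - w / 2, 0), vp.width - w),
    y: Math.min(Math.max(cy - h / 2, 0), vp.height - h),
    w,
    h,
  };
}
```

Because layouts are resolved from rules rather than baked positions, moving the scale slider simply re-runs this step while the app is running.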
One of the final pieces was how players interact with the UI. Some players could be on a chair using a keyboard, some could be on a couch with a PS5 controller in one hand and a guitar in the other, and some could be leaning over a stool to touch a tablet screen on their kitchen table. The environments and contexts covered a wide spectrum.
This is why we needed a consistent interaction system, so that the same user actions produced the same outcomes over and over again. Since their minds and memories were already occupied with guitar chords and other complex techniques, it made no sense to burden them with learning how to operate the UI.
The first version of the unified interaction system was an interaction matrix covering the most important outcomes:
| Component ↓ \ Outcome → | Input (Confirmation, Cancellation, Selection) | See More in a Scroll Group | Get more information | Return to previous state (Not confirmation) | more actions… |
| --- | --- | --- | --- | --- | --- |
| Button | Left Mouse Click or Enter Key (PC); Single Tap (Mobile); Press (A) or (X) (Game Consoles) | | | | |
| Information Tooltip | | Mouse Wheel Scroll (PC); Scroll With Finger (Mobile); Highlight Out-of-Screen Component (Game Console) | Cursor Hover (PC); Tap and Hold (Mobile); Highlight (Game Consoles) | Escape hover region (PC); Leave hold (Mobile); Press | |
| Content Card | Left Mouse Click or Enter Key (PC); Single Tap (Mobile); Press (A) or (X) (Game Consoles) | Mouse Wheel Scroll (PC); Scroll With Finger (Mobile); Highlight Out-of-Screen Component (Game Console) | Cursor Hover (PC); Tap and Hold (Mobile); Highlight (Game Consoles) | Escape Key or Click Close (PC); Tap Close (Mobile); Press (B) or (O) (Game Console) | |
| more components… | | | | | |
Once we had a consistent mapping between the common outcomes users wanted and the actions that produced them, we made sure to map the underlying mechanics of each device to one another to build a cohesive system that was technically sound. For example, on PC there is a UI event called hover, which fires when a cursor enters a region and neither exits it nor performs an action like a click for 600 milliseconds. This was mapped to a player hovering on an item to get more information, a pattern common in most digital applications. For touch devices, we used an event where a finger was placed on an item and not lifted for 600 milliseconds. This registered as a tap-and-hold, which was also mapped to wanting more information about the item the player tapped and held. In both cases, the help tooltip appeared as a floating element next to the original element so it wasn't obscured by the cursor or the finger. Under the hood, the technical interaction system was just as consistent as the outcomes it produced for users, which made it easy to maintain and scale the system rapidly as the number of screens and instances of each component grew exponentially later in development.
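To show what that under-the-hood consistency looks like, here is a simplified sketch of platform events being normalised into the shared outcome vocabulary from the matrix above; the event names and threshold handling are illustrative.

```typescript
// Illustrative: platform-specific input events normalised into the shared
// "outcome" vocabulary from the matrix above. Names and timings are simplified.

type Outcome = "confirm" | "moreInfo" | "dismissInfo" | "back";

type PlatformEvent =
  | { kind: "mouseClick" } | { kind: "hoverStart" } | { kind: "hoverEnd" }
  | { kind: "tap" } | { kind: "touchHoldStart" } | { kind: "touchHoldEnd" }
  | { kind: "gamepadButton"; button: "A" | "B" }
  | { kind: "gamepadHighlight" };

// The same threshold sits behind both "hover" on PC and "tap & hold" on touch.
const HOLD_THRESHOLD_MS = 600;

function toOutcome(e: PlatformEvent): Outcome | null {
  switch (e.kind) {
    case "mouseClick":
    case "tap":
      return "confirm";
    case "hoverStart":      // cursor stayed in the region past HOLD_THRESHOLD_MS
    case "touchHoldStart":  // finger stayed down past the same threshold
    case "gamepadHighlight":
      return "moreInfo";
    case "hoverEnd":        // cursor escaped the hover region
    case "touchHoldEnd":    // finger lifted
      return "dismissInfo";
    case "gamepadButton":
      return e.button === "A" ? "confirm" : "back";
  }
  return null; // unmapped events are ignored
}
```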
Result
Rocksmith+ launched in open beta, swiftly followed by a global launch in 2022. The mobile version unlocked a completely new market of on-the-go learners who did not have the money, time, or convenience to learn guitar on a computer or a game console. The new and improved accessibility feature set also garnered critical acclaim when Rocksmith+ won the Game Accessibility Conference's award for best physical/mobility accessibility.
But don't take my word for its success on mobile; listen to real Rocksmith+ users talk about the mobile version.
Learning
Working in a global team based out of San Francisco, Osaka, and Pune, I quickly learnt that project-wide impact on accessibility and a multi-device philosophy would not come automatically from my personal contributions alone. A large part of team alignment came from rooting discussions in user needs and teaching the team what a truly multi-device experience means, which included numerous coaching and workshop sessions on mobile interactions and games. This was further challenged by language and cultural barriers during COVID-19, and meant we had to take an approach of overcommunication and actively building interpersonal relationships. While I am extremely proud of the product we put out, which helps millions of people learn guitar at their own pace at an affordable price point, I am even more proud of the team culture we built.
What my colleagues say about me
Hiroshi Ogawa, Lead UI Engineer, Ubisoft
"Bramha made the foundation of Multi-platform UI, which was one of the biggest challenges in the project. His knowledge and insight always pushed our discussion forward. A clear design-based dialogue was productive and helpful in the cross-studio project in Japan and India.
I respect his courage to accept the change, which is UX/UI design's most challenging part of improving the game while managing our resources"
Kaiwen Young, Director of User Experience, Ubisoft
"Bramha’s passion, knowledge and communication style contributed greatly to the quality of our project and team UX culture"
Rohit Suvarna, Senior Game Designer, Ubisoft
"Bramha has a deep understanding of how to design user interfaces that are intuitive, effective and visually appealing. He is also an expert in user experience research and knows how to use data to drive his designs."
Utkarsh Bagade, Senior Engineer, Ubisoft
"Bramha's approach to designing complex game systems always put UX at the highest priority. I’m very impressed by his ability to understand the tech (tools and engine) and designing features that take advantage of the current frameworks and help improve them."












