SBB Design System Mobile and ML frameworks are now open source
This has been in our pipeline for quite some time: we're very happy to announce that, starting today, we are sharing our work as open source so that everyone can benefit from it.
Today we start by sharing two of our most-used libraries for iOS:
- SBB Design System Mobile: A SwiftUI framework that makes it easy to integrate SBB theming into your app. All elements are optimized for Dynamic Type sizes, VoiceOver, light & dark mode as well as for different size classes.
- SBB ML: A Swift framework that simplifies the integration of Core ML (object detection) models into iOS apps using Combine and SwiftUI.
In this article, you'll learn why we chose to follow the open-source path. We'll also dig a bit deeper into both frameworks and our quality standards for libraries.
As the in-house agency for everything mobile-related at the Swiss Federal Railways, we started creating libraries for our internal and external mobile apps a long time ago. This allowed us to avoid code duplication, ensure consistent quality standards across all apps and, most importantly, develop apps faster.
We also have a strong history with hackathons. Events like LauzHack, HackaHealth or DreiländerHack allowed us to broaden our minds for possible future solutions. In fact, apps like SBB Inclusive were born this way. On the data front, providing the needed resources for a hackathon became easy thanks to SBB Open Data. However, we always struggled to provide in-house app code to participants. By sharing our libraries, we hope to give hackathon participants a solid basis on which to build their proofs of concept and allow them to focus on their challenge. Similarly, we also hope to attract talented and motivated techies to come and join our team.
Right now, our focus is to publish our mobile libraries as open-source code (under the MIT license). App-specific code will remain closed source for now. There will be more iOS, Android and Flutter libraries coming up in the next few months, so stay tuned.
SBB Design System Mobile (SwiftUI)
Since the introduction of SwiftUI (and Combine), we have been huge fans of its declarative syntax, its accessibility improvements and the clearer separation of concerns it enables on the architecture side (see SwiftUI App Architecture). At the same time, our existing internal Design System Mobile was due for an update, so we started working on a new version together with our friends from the UX team.
Our SBB Design System Mobile for SwiftUI (GitHub) is the result of this process; the Flutter version will follow shortly. It is already in use in customer apps like SBB Inclusive and a few internal apps. It contains all SBB-specific UI elements, a specific font as well as our color theming. Optionally, you can also define your own color theming. All elements are optimized for Dynamic Type sizes, VoiceOver, light & dark mode as well as for different size classes.
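To give a feel for what "SBB theming in a SwiftUI app" means in practice, here is a minimal sketch. The module and style names below are illustrative assumptions only, not the framework's actual API; please refer to the GitHub repository for the real interfaces.

```swift
// Illustrative sketch only: module, modifier and component names are
// assumptions, not the actual SBB Design System Mobile API.
import SwiftUI
// import SBBDesignSystemMobile  // hypothetical module name

struct DeparturesView: View {
    @State private var destination = ""

    var body: some View {
        VStack(spacing: 16) {
            // In a themed design system, text styles and fonts would
            // come from the framework instead of the system defaults.
            Text("Next departures")
                .font(.title)
            TextField("Destination", text: $destination)
                .textFieldStyle(.roundedBorder)
            // A design system typically ships ready-made button styles;
            // here a plain tint stands in for the SBB brand color.
            Button("Search connection") {}
                .buttonStyle(.borderedProminent)
                .tint(.red)
        }
        .padding()
    }
}
```

Because all of this is plain SwiftUI, the framework's accessibility support (Dynamic Type, VoiceOver, dark mode) comes along for free wherever the themed components are used.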
SBB ML (SwiftUI & Combine)
Our journey into Machine Learning started quite a few years ago with our first Image Classification models. Since then, we’ve come a long way. Currently, our dataset contains over 70'000 labelled images with bounding boxes for over 150 SBB-specific objects.
The application field for our models is quite large, spanning from helping visually impaired people navigate to automatically recognising and classifying defects. And since more and more apps want to benefit from our models, it seemed natural to develop a reusable framework.
The SBB ML framework makes it easy to integrate a camera livestream that is automatically fed into any Core ML model of your choice. You can then subscribe to a Combine publisher emitting all detected objects. SBB ML automatically handles device orientation and can optionally also track objects or measure the distance to them.
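The subscription side of this could look roughly like the sketch below. The publisher's element type and the surrounding names are hypothetical stand-ins, not SBB ML's actual API; only the Combine mechanics (`sink`, `store(in:)`) are standard.

```swift
// Illustrative sketch only: the publisher shape is an assumption,
// not the actual SBB ML API. Only the Combine usage is standard.
import Combine
import Foundation

final class DetectionConsumer {
    private var cancellables = Set<AnyCancellable>()

    // Subscribe to a publisher emitting the labels of detected objects
    // (in SBB ML this stream would be fed by the camera livestream).
    func start(detections: AnyPublisher<[String], Never>) {
        detections
            .receive(on: DispatchQueue.main)  // update UI on the main thread
            .sink { labels in
                print("Detected: \(labels.joined(separator: ", "))")
            }
            .store(in: &cancellables)
    }
}
```

The advantage of exposing detections as a publisher is that downstream apps can freely `filter`, `debounce` or `combineLatest` the stream without the framework having to anticipate every use case.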
Quality standards for OpenSource frameworks
The day we started working on our mobile libraries, we defined some quality standards that we wanted to meet. In fact, we believe that all libraries should embrace them:
- Testing: reasonable code coverage.
- Demo app: showcasing the framework's functionality.
- Documentation: clean documentation of all public interfaces.
- Automated CI pipeline: executing all tests on every branch and pushing to the stores automatically.
So let's dive in quickly and have a look at how we implemented those standards for the two frameworks at hand.
Testing
This one is quite straightforward for the SBB ML framework: we wrote unit tests for all classes and additionally test the object detection part by feeding in reference images (instead of using the camera livestream as a source).
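A reference-image test of this kind could look roughly like the following, using Apple's Vision framework to run a Core ML model on a still image. The model and asset names are hypothetical; the actual tests in the repository may be structured differently.

```swift
// Sketch of an object-detection test fed with a reference image
// instead of the camera livestream. Model and image names are
// hypothetical stand-ins.
import XCTest
import Vision
import CoreML

final class ObjectDetectionTests: XCTestCase {
    func testDetectsObjectInReferenceImage() throws {
        let bundle = Bundle(for: Self.self)

        // Load a bundled reference image (hypothetical asset name).
        let imageURL = try XCTUnwrap(
            bundle.url(forResource: "ticket_machine", withExtension: "jpg"))

        // Load the compiled Core ML model (hypothetical model name).
        let modelURL = try XCTUnwrap(
            bundle.url(forResource: "SBBObjectDetector", withExtension: "mlmodelc"))
        let model = try VNCoreMLModel(for: MLModel(contentsOf: modelURL))

        // Run the detection request on the still image.
        let request = VNCoreMLRequest(model: model)
        let handler = VNImageRequestHandler(url: imageURL, options: [:])
        try handler.perform([request])

        // Expect at least one recognized object with a bounding box.
        let observations = try XCTUnwrap(
            request.results as? [VNRecognizedObjectObservation])
        XCTAssertFalse(observations.isEmpty)
    }
}
```

Testing against fixed images keeps the results deterministic, which is exactly what a CI pipeline needs.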
However, unit tests are not applicable to UI views (and thus to the SBB Design System Mobile framework), so we chose a different route here: snapshot testing. We test every single View in all of its possible states and compare it to a reference image (captured automatically). This includes View-specific states (e.g. the selected/unselected state of a checkbox) as well as system-specific states (Dynamic Type sizes, light & dark mode and different size classes).
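With Point-Free's SnapshotTesting library (mentioned again in the CI section below), such a test can be as small as this sketch; the view under test and the test name are illustrative stand-ins for the framework's real Views.

```swift
// Snapshot-test sketch using the SnapshotTesting library.
// The view under test is an illustrative stand-in.
import XCTest
import SwiftUI
import SnapshotTesting

final class CheckboxSnapshotTests: XCTestCase {
    func testSelectedState() {
        // Host the SwiftUI view in a UIHostingController so it can
        // be rendered to an image.
        let view = Toggle("Option", isOn: .constant(true)).padding()
        let host = UIHostingController(rootView: view)

        // Compare against a reference image; on the first run the
        // library records the reference automatically.
        assertSnapshot(matching: host, as: .image(on: .iPhoneX))

        // Further variants can cover dark mode, Dynamic Type sizes
        // and size classes via the `traits` parameter, e.g.:
        // assertSnapshot(matching: host,
        //     as: .image(on: .iPhoneX,
        //                traits: .init(userInterfaceStyle: .dark)))
    }
}
```

One such assertion per state and trait combination is what lets a single test file cover a View exhaustively.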
Demo App
Providing a framework together with a demo app has two main advantages. First, it allows us to showcase the framework's functionality to our developers as well as to our colleagues on the business side (and to trigger their imagination as to what could be achieved with it). Second, and even more importantly, it lets us test how user-friendly our framework interfaces are. In our experience, we often modified and simplified interfaces after integrating them for the first time, so a demo app is a perfect way to collect first impressions.
Documentation
A few years ago, we typically had a huge ReadMe file explaining how to use a framework. With the introduction of the DocC compiler in Xcode 13, we were luckily able to simplify our ReadMe file and move the documentation of the framework's interfaces to where it belongs: directly into the code, from which the documentation is auto-generated. Right now, users can generate the documentation directly in Xcode (under Product → Build Documentation). However, we are planning to host all of our documentation online in the future.
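As an illustration, a DocC-style documentation comment on a public View might look like this. The type shown is a simplified stand-in for illustration, not the actual SBBTextField interface.

```swift
import SwiftUI

/// A text field in SBB style (simplified stand-in for illustration).
///
/// Use it like any other SwiftUI view, passing a label and a binding
/// to the edited text.
///
/// - Parameters:
///   - label: The placeholder text shown when the field is empty.
///   - text: A binding to the text the user edits.
public struct StyledTextField: View {
    private let label: String
    @Binding private var text: String

    public init(label: String, text: Binding<String>) {
        self.label = label
        self._text = text
    }

    public var body: some View {
        TextField(label, text: $text)
            .textFieldStyle(.roundedBorder)
    }
}
```

DocC picks up the triple-slash comments and the `- Parameters:` section directly, so the rendered documentation stays in sync with the code.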
If you look closely at the SBBTextField documentation above, you’ll notice that it also contains a visual representation of the View. This representation is auto-generated during the snapshot testing process.
Automated CI Pipeline
Our internal projects are pushed to an internal Bitbucket server set up with hooks to a Jenkins instance, which then builds, tests, archives and pushes new versions to the stores using Fastlane. This ensures that all tests are green before a pull request can be merged and that our projects remain buildable at all times. However, this internal Bitbucket server is only reachable via a secure VPN connection.
To provide our libraries as open source, we had to switch to a public GitHub repository, which cannot trigger Jenkins builds on our internal build servers, so we had to look for another solution. Luckily, Apple offers its own Xcode Cloud solution (currently in beta), which is directly built into Xcode. Our first impressions are mostly positive: it's very easy to set up workflows, and builds automatically connect to and display in GitHub. The only drawback so far concerns our snapshot tests in SBB Design System Mobile (we use the SnapshotTesting framework): Xcode Cloud does not copy the reference images to its servers (since they are not part of the target) and thus cannot find them when running the tests.
We're excited to see how our open-source libraries will evolve:
- Will they be used and forked by a large user base?
- Will people start contributing to them?
Next up, we have other libraries up our sleeves: we have a strong background in Flutter development, and the first Flutter libraries will start to appear on our GitHub account in the coming months.
Let us know what you think about our open-source libraries!