I want to provide an update on Cycling Guide, the mobile app that Zeitspace is working with CycleWR to create. One of the fun aspects of working on Cycling Guide has been something that software product teams don’t always enjoy: testing.
More specifically, as the team has worked on the app we’ve needed to assess the quality of the cycling routes it provides. As a bit of background, we’re building Cycling Guide on a foundation of geospatial data from OpenStreetMap, a fantastically detailed and accurate resource. Processing the underlying OSM data to better understand and visualize our routes has been a great starting point for our routing approach. But we’re also doing exploratory testing of our cycling routes, a practice familiar to software product teams.
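To give a flavour of what working with that data can look like, here’s a minimal sketch using the open-source osmnx library to pull the bikeable network for the area straight from OpenStreetMap. This isn’t our production pipeline, and the place query shown is just an assumption for illustration, but it shows how approachable the underlying data is.

```python
# A minimal sketch (not our production pipeline) of pulling a cycling
# network from OpenStreetMap with the osmnx library.
import osmnx as ox

# Download the bikeable street and path network.
# The place query here is an illustrative assumption; any query that
# OpenStreetMap's geocoder can resolve would work.
graph = ox.graph_from_place("Region of Waterloo, Ontario, Canada", network_type="bike")

# Convert edges to a GeoDataFrame so we can inspect attributes such as
# 'highway' and 'length' that feed into route quality. (Extra tags like
# 'cycleway' can be requested via osmnx settings.)
edges = ox.graph_to_gdfs(graph, nodes=False)
print(edges[["highway", "length"]].head())

# Quick visual check of the network being routed over.
ox.plot_graph(graph, node_size=0, edge_linewidth=0.5)
```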
In our case, it started by looking at the routes on screen. As Cycling Guide is focussed on Waterloo Region, we’re assessing routes on streets that we know at least a little. As work progressed we could easily identify anomalies in the routes we looked at and address them in development. We also defined some specific trips that we could review again and again, looking to see how updates to data or our routing engine affected the routes. For example, there’s a two-way protected cycling lane along the otherwise one-way Erb St between Peppler and Caroline in Waterloo that we’d expect to see used for some routes in either direction. And we have a few test trips where we would reasonably expect that routes would use the Iron Horse Trail, Spur Line Trail, or Laurel Trail. My own favourite test trips are to local breweries.
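For anyone curious, here’s a simplified sketch of how trips like these can be scripted and re-run against the OSM graph. It uses a plain shortest-path search as a stand-in rather than our actual routing engine, and the trip names and endpoints below are illustrative placeholders, but re-running something like this after every data or engine update captures the spirit of what we do.

```python
# A rough sketch of scripting repeatable test trips over the OSM graph.
# Plain shortest-path is used as a stand-in for a real routing engine,
# and the trips listed are illustrative placeholders only.
import networkx as nx
import osmnx as ox

test_trips = [
    ("Uptown Waterloo to St. Jacobs",
     "Waterloo Public Square, Waterloo, Ontario, Canada",
     "St. Jacobs, Ontario, Canada"),
    ("Uptown Waterloo to Victoria Park",
     "Waterloo Public Square, Waterloo, Ontario, Canada",
     "Victoria Park, Kitchener, Ontario, Canada"),
]

graph = ox.graph_from_place("Region of Waterloo, Ontario, Canada", network_type="bike")

for name, origin, destination in test_trips:
    # Geocode the endpoints and snap them to the nearest graph nodes.
    o_lat, o_lng = ox.geocode(origin)
    d_lat, d_lng = ox.geocode(destination)
    o_node = ox.distance.nearest_nodes(graph, o_lng, o_lat)
    d_node = ox.distance.nearest_nodes(graph, d_lng, d_lat)

    # Baseline route by distance; the interesting part is re-running this
    # after data or engine updates and seeing how the route changes.
    route = nx.shortest_path(graph, o_node, d_node, weight="length")
    length_m = nx.path_weight(graph, route, weight="length")
    print(f"{name}: {len(route)} nodes, {length_m / 1000:.1f} km")
```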
Beyond software and data updates, being able to tweak routing settings while testing, and to see the results on a familiar network of streets and paths, has made the work a lot of fun. We’ve also compared our routes to what some other mobile mapping apps provide, and the results are encouraging. In many cases we prefer the routes that Cycling Guide provides.
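As a toy illustration of why tweaking settings is so interesting, the sketch below reweights the OSM graph so that quieter, more bike-friendly ways cost less than busy roads, then compares the result to a plain shortest route. The multipliers and coordinates are made-up examples, not Cycling Guide’s actual settings, but the way a route shifts when preferences change is exactly the kind of thing we watch for.

```python
# A toy illustration (not Cycling Guide's actual settings mechanism) of how
# routing preferences can change a route: scale each edge's length by a
# made-up "bike-friendliness" multiplier and compare the resulting routes.
import networkx as nx
import osmnx as ox

# Hypothetical multipliers per OSM highway type (lower = more preferred).
PREFERENCE = {
    "cycleway": 0.6,
    "path": 0.8,
    "residential": 1.0,
    "tertiary": 1.3,
    "secondary": 1.6,
    "primary": 2.0,
}

def apply_preferences(graph, preference, default=1.2):
    """Attach a 'cost' attribute: edge length scaled by bike-friendliness."""
    for _, _, _, data in graph.edges(keys=True, data=True):
        highway = data.get("highway")
        if isinstance(highway, list):  # OSM ways can carry multiple values
            highway = highway[0]
        data["cost"] = data["length"] * preference.get(highway, default)

graph = ox.graph_from_place("Waterloo, Ontario, Canada", network_type="bike")
apply_preferences(graph, PREFERENCE)

# Roughly Uptown Waterloo to the north end of the city; coordinates are approximate.
orig = ox.distance.nearest_nodes(graph, -80.5225, 43.4650)
dest = ox.distance.nearest_nodes(graph, -80.5570, 43.4980)

by_distance = nx.shortest_path(graph, orig, dest, weight="length")
by_preference = nx.shortest_path(graph, orig, dest, weight="cost")
print(f"shortest: {len(by_distance)} nodes, preferred: {len(by_preference)} nodes")
```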
Exploratory testing on screen was a place to start and helped the team make great progress. Things got more fun when we started doing exploratory testing while on bikes in the real world. The earliest field testing that our team did was pretty simple, as the app was extremely limited in functionality. For example, riding along the Iron Horse Trail and periodically checking the current location marker on a map wasn’t really delivering much value, but the functionality worked. As Cycling Guide grew in capability, I started trying more ambitious rides myself. Two recent cycling trips provided unexpectedly strong validation that the app is already delivering value.
The first was a ride with my son from Uptown Waterloo to St. Jacobs and back. The route took us on paths that we didn’t know about, or hadn’t been able to find previously. And the app was great to use while riding, even in its still-limited form. The second was a ride I did from Uptown Waterloo to Cambridge and back. Again, Cycling Guide took me on paths that were new to me. Both rides were mostly on protected lanes or shared paths that were clear of cars. And both rides were quite scenic in places. The rides were unexpectedly exhilarating experiences. My typical rides are either running errands or longer recreational rides. Having a way to expand my cycling repertoire will provide great value for me.
As I like to cycle anyway, this exploratory testing phase has been great fun. In addition to testing our app, it has me exploring Waterloo Region by bicycle and making new discoveries. Even at this early stage we’ve been able to make some small improvements to OpenStreetMap data as well. So it’s been a win on multiple levels. Sometime soon we’ll be expanding our testing to include outside testers who aren’t part of the project team. We’re excited to see the results.