What if your mobile app could detect bugs, fix UI inconsistencies, and spot user frustration before a user ever reports it? In today's episode, I sit down with Kenny Johnston, Chief Product Officer at Instabug, to explore how AI is reshaping the way developers build, test, and maintain mobile apps.
Instabug is taking mobile observability to an entirely new level by developing what Kenny describes as "zero maintenance apps." Powered by on-device AI models, their platform can now detect subtle UX breakdowns, visual design flaws, and even frustration signals that wouldn't normally trigger crash reports. Whether it's an unresponsive button, a layout shift, or a broken navigation path, Instabug flags the issue, often before a user ever notices.
Kenny shares how Instabug's approach to AI is helping development teams move faster and smarter, particularly in high-stakes environments like retail and e-commerce where performance peaks during events like Black Friday or Valentine's Day. Through real-time crash reporting, automated UI analysis, and deep session insights, developers can spot and solve problems that would otherwise get lost in a backlog or surface in app store reviews.
We also explore the unique pressures of mobile development. With no quick rollbacks and high user expectations, developers need tools tailored to the realities of app store approvals, device fragmentation, and version-specific bugs. Instabug's platform brings together observability, feedback, and issue reproduction in a way that simplifies the mobile stack and accelerates release cycles.
Kenny draws on his experience at GitLab to reflect on the need to consolidate tools and workflows in mobile development. He offers valuable insights for product leaders and mobile engineers on how to navigate change, evolve their approach, and stay curious in the face of constant technical demands.
So how can your team shift from reactive debugging to proactive experience design? And are you really seeing all the issues your users encounter or just the ones they report? It's time to find out.
[00:00:04] Have you ever experienced an app crash on your mobile phone at the worst possible moment? Maybe while checking out an e-commerce site during a sale, or trying to book a last-minute ride. What if those issues could be fixed before you even notice them? Well, today we're going to be talking about AI-powered observability, but for mobile apps. A space where Instabug is leading the charge.
[00:00:30] With their AI driven crash reporting, frustration signal detection and self-healing software, Instabug is helping developers identify and fix issues before they impact their users, ensuring a seamless mobile experience. Well joining me today is Kenny Johnston, Chief Product Officer at Instabug, who also brings with him years of experience from GitLab
[00:00:55] and we'll talk about how to redefine how developers approach mobile app quality. Because today we're going to cover how AI is transforming mobile development by detecting hidden bugs and UI issues in real time. And the challenges unique to mobile apps, where there is no rollback button like there is in web development. And why Instabug's on-device AI models are enabling a new level of proactive issue detection.
[00:01:24] And of course, the ROI: the role of automated insights in improving app stability, user retention, and performance under pressure. So with businesses relying more than ever on mobile-first experiences, how can AI help ensure apps are always running at peak performance? Let's find out by getting Kenny onto the podcast now. So thank you for joining me on the show today, Kenny.
[00:01:50] Kenny, tell everyone listening a little about who you are and what you do. Sure. Yeah. My name is Kenny Johnston. I'm a product leader at a company called Instabug. Instabug is a mobile observability platform powered by AI. We have a long history of focusing really specifically on mobile developers. It's a passion of mine. I'm a former developer myself and dabbled with mobile development. And my whole career has been in developer tools. So before this, I worked at a company called GitLab that many of your listeners might be aware of.
[00:02:17] I've always been really intrigued as a former developer and also as a product person in the kind of really complex product problems that present themselves when you're building for a developer audience. You know, developers have very hard jobs that involve processing a lot of data and performing very complex tasks. So building products for them is challenging and I enjoy that challenge. Well, thank you so much for taking the time to sit down and join me today.
[00:02:44] One of the things that intrigued me about Instabug is these new AI-driven features that you've got there, including the ability to automatically detect and fix issues within mobile apps. And you're an ex-developer, so that must excite you, and it'll excite people listening. But can you tell me a little bit more about this self-healing software, how it works, and the potential impact it has on minimizing downtime for developers? It's a pretty big problem you're solving, it has to be said. Yeah, for sure.
[00:03:12] And let me start with the potential impact and the kind of vision for what we call zero maintenance apps. So in our view, you know, AI is already starting to show up in all sorts of places in a developer's workflow. Many developers listening to this podcast know they're probably using it in their IDE. There might be some scaffolding work that they do using AI to kind of build out common code, or they might use it to understand complex code that they're not familiar with.
[00:03:39] We believe that AI is going to continue to make life easier for developers, take away some of the toilsome work. And as a mobile company, we also understand that even more so than most other types of software, mobile developers spend a lot of their time in what they would call keep the lights on activities. Like if you hear the acronym KTLO, it's both the maintenance of the app, reprogramming to new UI frameworks or operating systems,
[00:04:04] or just generally keeping up with what is a really, really high bar for quality and experience that we as users of mobile phones and mobile apps expect. So Instabug has been investing really heavily in making it so that developers spend less time on those keep-the-lights-on activities and a lot more time on building amazing experiences for their users, which is where we think the real battleground for who's going to win in the mobile app world, in any given space, is going to be.
[00:04:32] So yeah, our belief is that of all the places where AI is going to rapidly produce this kind of agentic workforce, one that helps you as a developer progress things you would have normally had to do manually, mobile, and specifically mobile quality, is going to be the first. And we have a lot of rationale for why. And as I mentioned, that's our belief. That's where we're investing heavily. That's what we've been building. And we have a couple of really exciting capabilities that we've already shipped.
[00:05:01] But as a product leader, I'm also really focused on what we're planning on shipping. That's really exciting to me. And I must admit, as an ex-IT and long-recovering change manager, one of the standout capabilities of Instabug that caught my eye is how you're offering advanced AI-driven insights to evaluate the impact of new features before they release and go into a live environment. That's a big problem that needs solving as well.
[00:05:28] So how do those insights help developers maintain things like app stability and ultimately make more informed decisions during updates? As I said, as an ex-IT change manager, this is the stuff that excites me, sadly. Yeah. And of course, part of this is predicated on how mobile development is different than web development. So let's just remind ourselves: mobile development has, like you said, a change process, but there's a third-party agent in the middle of that process.
[00:05:57] That's the app stores. They review your app, not just for maliciousness, but also for quality. So they are checking to make sure that your app is of sufficient quality for them to want to publish it in their store. That affects where it's ranked. And what that means is that there's this variability to your deployment process. So what we find from mobile teams is that they're trying to circumvent that in a way. And they do that by shipping code that gets reviewed, but is not on for their users, and
then slowly rolling that out using feature flags. So you hear a lot about feature flags in web development as well, but more for testing and experimentation purposes. In mobile, it's really a very standard deployment mechanism. It's for controlled deployments, to make sure you have full knowledge about what might happen to that user's experience or the overall user experience, and to prevent anything bad from happening. Because there's no rollback in mobile.
[00:06:53] There's no like, oh, yikes, we need to ship the newest version of that. And we're going to do it in minutes. There's no CD. So it makes it very difficult for mobile teams to manage. What Instabug does is we connect with your feature flag tools to give you insights into how individual feature flags are performing, not just on their quality metrics, but their business impact, and then allow you to progress that feature rollout or App Store rollout based on some AI insights that we
[00:07:23] might have about what is kind of a good bar for what we know about your app and that feature's performance. So think of this as safety guardrails for a deployment process, specifically built for mobile development teams. You have very unique challenges that are different from web. And if we do have anyone from retailers listening, they will testify firsthand how mobile apps can face immense pressure during peak traffic times.
[00:07:49] And what immediately springs to mind is Black Friday and Cyber Monday, that kind of time of the year. So can you tell me a little bit more about how Instabug's real-time bug and crash reporting also helps ensure optimal app performance during these high-stress periods? Because again, it's a pretty big thing. Yeah. And I want to point out that, like, I think as outsiders, we think that generally there are these big moments that are maybe universal across apps.
[00:08:15] But for example, we have customers who might be running an ad in the Super Bowl, and they know that they're going to be getting a lot of traffic on their app at very specific moments. Or maybe they're a sports betting app, and there are specific games where there are lots of people involved, and every second of downtime costs hundreds of thousands of dollars. And it's not just downtime. Sometimes every extra second of app launch time costs hundreds of thousands of dollars.
[00:08:40] So these moments are unique to individual customers and are often very critical to the business impact. And so what Instabug does is, we are a real-time analytics processing platform. So you get real insight into how your users' experience is happening right now. And you can do things like flip a feature flag if a certain feature stops performing well. You can do things like slow, pause, or roll back a rollout.
[00:09:08] These are all the steps that a mobile team would normally take, but what we can do is capture those much quicker and go from three hours or four hours of downtime or impact to mere minutes of impact. And then you can compound the dollar amounts there. If minutes are hundreds of thousands of dollars, this can be saving millions of dollars for mobile app teams, especially those who are really focused on the business impact.
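For listeners who want to picture the guardrail Kenny is describing, here is a rough sketch of controlled-rollout logic: progress a feature flag while a quality metric looks healthy, and kill the feature the moment it degrades. All names, signatures, and thresholds here are illustrative assumptions, not Instabug's actual API.

```python
# Hypothetical sketch of a feature-flag "safety guardrail" for mobile rollouts.
# Not Instabug's real API: names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class FlagState:
    enabled: bool
    rollout_pct: int  # percentage of users who currently see the feature

def evaluate_rollout(state: FlagState, crash_free_rate: float,
                     threshold: float = 0.995) -> FlagState:
    """Progress, pause, or kill a rollout based on the crash-free rate."""
    if crash_free_rate < threshold:
        # Quality degraded: flip the flag off for everyone immediately,
        # since there is no rollback once a binary is in users' hands.
        return FlagState(enabled=False, rollout_pct=0)
    if state.rollout_pct < 100:
        # Healthy: advance the rollout in controlled increments.
        return FlagState(enabled=True,
                         rollout_pct=min(100, state.rollout_pct + 10))
    return state

state = FlagState(enabled=True, rollout_pct=20)
state = evaluate_rollout(state, crash_free_rate=0.999)  # healthy: advance
state = evaluate_rollout(state, crash_free_rate=0.990)  # degraded: kill
print(state)  # FlagState(enabled=False, rollout_pct=0)
```

The point of the sketch is the shape of the loop, not the numbers: the flag, not a new binary, is the deployment lever.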
[00:09:35] And one of the interesting insights for me is that every app is created for a reason. Not all of them were created for commercial reasons. Some of them are created just for having a better connection with a customer base. Typically, a mobile app is right there in your users' pockets. It's that close and that intimate of a relationship. So most brands consider the mobile app the most important digital relationship with their customers.
[00:10:00] And so they measure the impact of poor performance or a poor quality app in uninstall rates over time. If users are uninstalling your app, you're spending all this money to capture them and bring them into your app. But that's going to waste if you're leaking out 20 percent of the users because you're not delivering a quality app experience.
[00:10:21] And what Instabug really does is give you a quality management platform to help you organize your teams and hold your teams to a bar, so that the whole of the app is delivering on the experience that, you know, your business leaders decided was why you wanted to have the app in the first place. And I think we often forget that for many e-commerce apps, retaining users during those critical shopping days or those huge-scale events is absolutely vital.
[00:10:49] So can you tell me a bit more about how the platform can enhance user experiences and ultimately enable businesses to address issues swiftly, ensuring greater customer satisfaction and that seamless customer experience that they almost expect as standard from those apps? Yeah, exactly. And I think that last point is a really important one. No matter what kind of app you build, the expectation for your app's performance is not set by the other competitive apps. I always use the banking example.
[00:11:19] The expectation for the performance of the banking app on my phone is not set by all the other banking apps that I don't have on my phone. It's set by all the apps that I use frequently. It's set by the Spotifys and the YouTubes, and those types of apps are the ones that are going to set every user's bar for what a high-quality app is. So the bar is getting higher, and it's not just getting higher within your industry. It's getting higher universally, and you've got to keep up with that really high bar.
[00:11:44] So Instabug is really, as I mentioned, focused on the ability to spot quality issues once they're in production and respond to them rapidly, like we were talking about, to prevent downtime and user impact. It's also focused before release. So one of the great AI capabilities that we've been working on recently: we have a tool for reporting bugs, and typically in a mobile development environment
[00:12:13] there's a QA team who's testing the app and reporting bugs. And we have a tool that makes it really easy for those testers to report a bug that is informative enough for the developer to actually fix it before you ship it to the store. Our AI feature in this space is actually capturing a type of bug that we don't normally capture and that QA testers typically ignore: UI issues.
[00:12:39] So inconsistencies in font styles, or a slightly miscropped image, or an alignment issue in your app. These are all things that cause paper cuts for your users, who think, oh man, this isn't such a quality app after all, because I notice all these slight UI issues. But for a QA tester, it's hard to think it's worth taking even a couple of seconds to file a bug about.
[00:13:03] With AI, we can spot all of those and deliver them as part of your regular QA process while your testers are testing the app. We'll also report these UI inconsistencies to your team to make sure that you're not just covering, oh, does it functionally work, but is it the really polished, quality app that we want to see? And this is a really important behavior for the top-tier apps. If you talk to anybody who builds mobile apps in, you know, the top 10 apps in the world, they'll tell you they have a really heavy focus on these UI inconsistencies.
[00:13:32] Because after, you know, "does the app crash" and "is the app performant", this kind of polish is one of the next best indicators of user engagement in an app. When I was reading up on Instabug's AI-powered detection of these hidden visual and UI issues, it really felt like a game changer.
[00:13:49] So to bring to life some of what we're talking about here, can you just walk me through how these advanced AI models can identify and address UI inconsistencies across different devices and different operating systems? Because I think this would really help people both inside and outside the field understand what we're talking about. Yeah, and I think it's important to remember, like, this is actually something that AI is really good at.
[00:14:16] AI is a really good pattern matcher, and it is good at matching things that don't fit, or finding things that don't fit the pattern. So in this case, what our tool is doing, through the course of any app session, particularly in a QA or dev environment, is capturing regular screenshots of the user's interactions and putting those through a model. And the context is kind of, hey, here are all the screens in the app, and then for each individual one it asks, is there something strange about this screen?
[00:14:45] And we have some sophistication in how we augment that pipeline so that we can spot specific instances and label them, like, this is a text problem, or this is an alignment issue, or this is an image crop issue. But what that does is, as I mentioned, even the most sophisticated QA teams will typically have a hard time spotting these and taking the time to file them.
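To make that pipeline concrete, here is a minimal sketch of the loop Kenny outlines: capture screenshots during a QA session, ask a vision model whether each screen looks strange, and label what it flags. The model call is stubbed out, and every function name here is a hypothetical illustration, not Instabug's real pipeline.

```python
# Illustrative sketch of a screenshot-review pipeline for UI inconsistencies.
# `ask_model` stands in for a vision-model call; all names are hypothetical.

ISSUE_LABELS = ["text problem", "alignment issue", "image crop issue"]

def classify_screen(screenshot, ask_model):
    """Ask a vision model whether a screen has a visual defect."""
    verdict = ask_model(
        screenshot,
        prompt="Is there something strange about this screen? "
               f"If so, label it as one of: {', '.join(ISSUE_LABELS)}.")
    if verdict is None:
        return None  # model saw nothing strange
    return {"label": verdict, "screenshot": screenshot}

def review_session(screens, ask_model):
    """File a finding for every screen the model flags during a QA run."""
    return [issue for s in screens
            if (issue := classify_screen(s, ask_model)) is not None]

# Stubbed model: flags only the second screen as misaligned.
fake_model = lambda shot, prompt: ("alignment issue"
                                   if shot == b"screen2" else None)
issues = review_session([b"screen1", b"screen2", b"screen3"], fake_model)
print(issues)  # [{'label': 'alignment issue', 'screenshot': b'screen2'}]
```

The real system would add batching, de-duplication, and device context, but the core is just this: screens in, labeled findings out.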
[00:15:13] They're usually very focused on, I've got to make sure functionally all these things check out, because we're on a really quick road to get this out. So this is really about superpowering your QA process so that you're also capturing visual issues. And as I mentioned, the business impact is really about pushing your app to that highest bar of user expectation, so that people aren't saying, oh, the app sure doesn't crash on me and it's stable,
but there are all these strange parts of the app that seem like they don't fit together, or are incongruous, or have little minor issues that impact the user's perception of the quality of the app. Yeah. And on that, for the non-developers listening, I think we should highlight that the silent issues that don't trigger traditional error reports often go unnoticed but can frustrate users long term.
[00:16:05] So how does your on-device AI model detect those subtle yet frustrating user experience problems? And what role does it play in improving overall app quality as well? Yeah. And I think the word frustrating is a really great one. I'm glad you used it. We think of the progression of what we call mobile app quality as increasing levels of sophistication in how we collect frustration signals.
[00:16:33] So the obvious one is, if the app crashes, my user is going to be frustrated. So that's why, when we first had this Cambrian explosion of mobile apps, everyone was grading themselves on their crash-free rates. Next, we started saying, well, we know that app launch time is also frustrating: if I want something from an app and I have to wait three seconds, that's a really frustrating experience. We know if screen loads take a long time, that can be a really frustrating experience. Instabug pioneered an additional frustration signal that we call force restarts.
[00:17:02] You probably do this on your phone every day where an app is kind of slow. You're not really sure. Maybe something's buggy. You force quit it and then you immediately open it back up. It's a clear frustration signal that you're expressing, but most tools would not even present that to developers. We think the next bar is actually something we call broken functionality, which is where you can model what the user's regular behavior is. And then if something all of a sudden changes, there's probably something broken in that experience that's a user-frustrating event.
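The force-restart signal Kenny just described is mechanical enough to sketch: the SDK sees the app terminated and then relaunched almost immediately. A minimal illustration, where the 10-second window and the event shapes are assumptions for the example, not Instabug's actual heuristics:

```python
# Toy detector for the "force restart" frustration signal: the user kills
# the app and immediately reopens it. Window and event format are assumed.

FORCE_RESTART_WINDOW = 10  # seconds; an illustrative threshold

def detect_force_restarts(events):
    """events: list of (timestamp_seconds, kind), kind in {'terminated', 'launched'}.

    Returns the timestamps of launches that count as force restarts."""
    signals = []
    last_kill = None
    for ts, kind in events:
        if kind == "terminated":
            last_kill = ts
        elif kind == "launched" and last_kill is not None:
            if ts - last_kill <= FORCE_RESTART_WINDOW:
                signals.append(ts)  # relaunch right after a kill: frustration
            last_kill = None
    return signals

events = [(0, "launched"), (120, "terminated"), (123, "launched"),
          (500, "terminated"), (900, "launched")]
print(detect_force_restarts(events))  # [123]
```

Only the kill-then-relaunch at 120s/123s is flagged; the later relaunch at 900s is far outside the window, so it reads as a normal session start.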
[00:17:32] So the example I give is, we all use chat or comment-related apps. A typical behavior is like, oh, I'm expecting that there's a new comment, and I'm scrolling down to see if there's something new, or swiping down to refresh the app. That experience should yield some sort of feedback to the user, but maybe a user is doing it repeatedly and the app is not giving them any feedback, like, either there's no new comment or something like that.
[00:17:58] And that's an example of a frustrating user experience that might be kind of unique to that user's behavior. And so that's why we are deploying on-device models: because they're capturing that user's specific behavior around your app. By building a very small model of that behavior, the next time we ask the question, is what this user is doing normal or not, we can get a good answer to it.
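The pull-to-refresh example can be reduced to a toy "broken functionality" detector: flag a run of consecutive refreshes that each return no new content. The burst threshold here is an arbitrary assumption for illustration; a real on-device model would learn what "normal" looks like per user rather than hard-code it.

```python
# Toy "broken functionality" signal: repeated refreshes with no feedback.
# `burst` is an illustrative cutoff, not a learned per-user baseline.

def broken_refresh_signal(new_content_flags, burst=3):
    """new_content_flags: one bool per refresh, True if it showed new content.

    Returns True when `burst` consecutive refreshes each yield nothing."""
    streak = 0
    for got_content in new_content_flags:
        streak = 0 if got_content else streak + 1
        if streak >= burst:
            return True  # user keeps pulling, app keeps giving no feedback
    return False

print(broken_refresh_signal([True, False, True]))           # False: normal use
print(broken_refresh_signal([True, False, False, False]))   # True: likely broken
```

No crash, no error log, yet the second sequence is exactly the kind of silent frustration that would otherwise only surface in an app store review.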
[00:18:23] So our deployment of on-device models is really not focused on better performance, which is very common; lots of the other companies deploying on-device models are trying to push AI to the edge because they want to avoid a round trip to a large language model running in a cloud somewhere. Our focus is really on being able to deploy customized models for individual users, so we can spot frustrating experiences that are a result of that user's behavior.
[00:18:52] So, yeah, from our perspective, we think AI actually unlocks a new set of frustration signals that aren't deterministic. Today, most of the frustration signals we look at are deterministic: did the app crash? Was the screen load more than two seconds? Did the user force restart? These new non-deterministic frustration signals are going to be the next ones that matter, because, again, every app is competing to be the highest-quality, lowest-frustration app in the market.
[00:19:19] And we think it's going to be really critical for those leading-edge mobile teams to be able to get this new wave of frustration signals and make sure that their app is an even better experience for their users. And I also read before you came on the podcast today that Instabug provides developers with actionable analytics and feedback, all in real time, during those high-stress periods.
[00:19:42] So are you able to share any examples of how these insights can enable those quick iterations and improvements for app developers? Yeah. So a common one is crashes. So if you encounter a crash, if you're a mobile developer, sometimes it's, oh, my gosh, there's this crash. We need to get on it quickly. Sometimes there's this tricky crash that's been plaguing the app for a number of months. Can someone go spend some cycles, dig in and fix it?
[00:20:08] But as all mobile developers know, the first step in that experience is, can I reproduce this? And so what you're trying to answer, when you're asking can I reproduce this, is: what's unique about the demographics of the devices and the users that seemingly have a higher frequency of experiencing this crash? So you think, oh, it only happens on certain versions of iOS, or it only happens when the device is in portrait mode, not landscape mode.
[00:20:38] It only happens when the device is on Wi-Fi. Those kinds of patterns are what bread-and-butter debug tools give you. They show you, hey, here are all the occurrences, and here's some information about the different states and demographics of the user, which might help lead you to something that can get you to reproducing it. Instabug produces all those patterns, but also produces a pattern insights section that says, hey, very clearly, looking at all of this information about patterns:
[00:21:06] Seemingly this is isolated to iPhones running version 15, in portrait mode, with Wi-Fi disabled. Okay, that really helps the developer go from, there's a whole bunch of information I have to sift through to try to find that pattern, to, I know exactly where my starting point is. And then, even better, we give you reproduction steps. So we say, hey, of the 10,000 occurrences, the most common steps that a user took before this crash occurred were X.
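The pattern-insights step Kenny describes, boiled down to a toy: given crash occurrences tagged with device attributes, surface the attribute values shared by nearly all of them, so the developer knows where to start reproducing. The 90 percent cutoff and the attribute names are illustrative assumptions, not Instabug's actual analysis.

```python
# Toy "pattern insights": find attribute values overrepresented in crashes.
# Share cutoff and attributes are illustrative, not a real product heuristic.
from collections import Counter

def pattern_insights(occurrences, min_share=0.9):
    """Return attribute=value pairs present in >= min_share of crash reports."""
    total = len(occurrences)
    counts = Counter()
    for occ in occurrences:
        counts.update(occ.items())  # count each (attribute, value) pair
    return sorted(f"{k}={v}" for (k, v), n in counts.items()
                  if n / total >= min_share)

crashes = [
    {"os": "iOS 15", "orientation": "portrait", "network": "cellular"},
    {"os": "iOS 15", "orientation": "portrait", "network": "cellular"},
    {"os": "iOS 15", "orientation": "portrait", "network": "wifi"},
]
print(pattern_insights(crashes))
# ['orientation=portrait', 'os=iOS 15']
```

The network attribute is split across values, so it drops out; what survives is the common denominator a developer would feed into an emulator configuration.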
[00:21:34] So a developer can now, in their local environment, set up an emulator with all the specifications we just said and walk through the exact steps to try to reproduce that crash. Then they can really get into debugging: okay, what was happening in the stack and in the code that led to this crash occurring? We have another capability based on what we're looking at in the stack trace.
[00:21:57] So we have all the information from the logs that are spit out when a crash occurs about what was happening, process-wise, on the app when that happened. That capability says: given this information, we generally think that the problem is this. Think of it as something as simple as a division-by-zero error or an unhandled exception. Those are common, but those are kind of trivial crashes.
[00:22:21] What we typically find is that Instabug is really powerful for very complex, hard-to-figure-out crashes, in getting a developer to the point of being able to reproduce them in their local environment. And at the very beginning of our conversation today, you mentioned your experience at GitLab, where you focused on consolidating developer tools.
[00:22:40] And I'm curious, did that time inform your approach at Instabug in creating a more seamless platform for mobile observability, accelerating mobile app growth? One of the reasons I ask is the old Steve Jobs quote: you can't connect the dots looking forward; you can only connect them looking backwards. But has it impacted your approach at Instabug? Yeah, for sure. And I'll tell you, it impacted my decision to join Instabug in the first place.
[00:23:10] It was clear to me that there is a lot of drive toward consolidating developer tools. Developers spend a lot of their time jumping between various tools. And it was also clear to me that these platforms are built for a specific workflow, kind of like GitLab's: a very Git-focused, continuous-deployment workflow. Mobile doesn't work like that. And there are entire books written about how the mobile development workflow is very different from the web one.
[00:23:35] And so I definitely foresaw there being an opening. You know, people say the most important window to their users is mobile. We're all competing in a digital world, every business is, whether they like it or not. And that digital world is a mobile world, to be honest. But we're not building developer tools for mobile developers. I get in rooms with mobile developers every day; that's why I'm very passionate about it. And you hear them say, you know, people keep telling us that the mobile app is super important. The CEO is complaining about an app quality problem.
[00:24:04] It's something we talk about in all of our investor reports. But we're not given, as mobile developers, the tools we need to actually respond and drive a better-quality app. And it's actually one of the few places where developers will tell you: I believe that the quality of our app and the overall performance of our app are more important than any feature. But I can't convince my business that that's the case.
[00:24:32] They keep wanting to push us on features, and our quality keeps going down. Instabug really answers that question. So the comprehensive platform for a developer workflow that I experienced at GitLab definitely informed my approach. And the realization really was that there is something different about mobile, and mobile deserves its own set of tools to enable what we all know is a really important business outcome. I love that. And to many people listening, you're someone that's leading the way here.
[00:25:00] And there is also big pressure on us all to be in a state of continuous learning. So as someone leading the way, I've got to ask: any tips you could share on where or how you self-educate, how you keep up with the pace of technological change? Yeah, I mean, I think like many people, I read the news, I watch tech talks and things like that.
[00:25:22] But I think the most important thing that I would leave people with is that all of the peers you work with at your companies are a wealth of information that you can and should be learning from. So it's not always that you've got to find some external source. I had a mentor once tell me, you should be asking every single one of the peers you work with daily: what can I do to be a better partner for you? And that will really help you understand what motivates them, what their expertise is, what is involved in their day-to-day.
[00:25:51] Just being a curious person, particularly with those people who you interact with daily, will make you a better colleague, but it will also help you grow. So, you know, I think there are obviously answers that are very "read this, read that". But I think interacting with the people you spend your working days with, being curious, and trying to help them is going to be the best way to grow. Fantastic advice. Love that.
[00:26:16] And for anyone listening that just wants to find out more details on Instabug, maybe connect with you or your team, or discuss some of the things we talked about today a little bit deeper, where would you like to point everyone listening? Yeah, instabug.com is our website. We have a full-fledged sandbox. So if you're a mobile developer, or you know any mobile developers, you can point them to our sandbox to get a sense of exactly how our product works.
[00:26:38] Yeah, and we have all sorts of contact methods on our website to get a hold of either me, the product team, or anyone else at the company. Well, there's so much I loved about our conversation today. Hearing more about these new AI features and this self-healing software, and this new ability to automatically detect and fix issues within mobile apps, minimizing downtime, reducing the need for manual intervention by developers.
[00:27:06] There is the ROI right there for anybody listening. So I would urge anyone that is interested to check you out and contact you directly. But more than anything, just thank you for starting the conversation today. Yeah, absolutely. My pleasure. Thank you, Neil. From self-healing apps to AI-powered frustration detection, Instabug is redefining what mobile observability means. No more waiting for customers to report issues and then reacting to them.
[00:27:35] Now apps can detect and fix them before users even notice. Some of the big takeaways from my conversation with Kenny are that AI-driven crash reporting and frustration analysis allow developers to identify hidden UX issues, and that mobile-specific challenges require specialized observability tools, especially since App Store deployment can make quick fixes incredibly difficult.
[00:28:02] So the future of app development is automation, reducing developer toil and enhancing user experience, all through zero-maintenance apps. And as mobile usage continues to grow, the big question is how can developers stay ahead of performance issues before they impact users and revenue? If you've got any experience in this field, whether you're a developer or just managing a mobile app, I'd love to hear your thoughts on what you heard today.
[00:28:31] Please email me, techblogwriter at outlook.com, or find me on social channels at Neil C. Hughes. Let me know your thoughts. But that's it for today. I'll be back again tomorrow. You're all cordially invited to join me, so hopefully I will speak with you then. Bye for now.

