Ep. #1, A Closer Look at Product Testing with Ian Brillembourg of Plunk
In this debut episode of How It’s Tested, Eden Full Goh of Mobot speaks with Ian Brillembourg of Plunk. Together they explore product testing processes, insights on utilizing AI-driven data as part of quality assurance, and lessons on striking a balance between web teams and mobile teams within a modern organization.
Ian Brillembourg is Head Of Mobile Product Management at Plunk. He has been in product management for over a decade across games, mobile, and social products at places like Xbox, Zynga, and AssuranceIQ, focusing on features to delight and retain users.
Transcript
Eden Full Goh: Hey, Ian. Thank you so much for joining me on the How It's Tested podcast.
Ian Brillembourg: Thanks for having me. It's so good to see you.
Eden: Yeah. It's been really exciting to have this chance to connect with you more deeply about all sorts of topics that we're going to talk about today relating to product management, what it's like working with Mobot. I would love to learn more about what you're building at Plunk, and then also just I think there's a broader conversation we can have about all of the interesting products that you've built throughout your career.
Some context for anyone who's listening: Ian and I have been working closely together as our companies, Mobot and Plunk, have gotten to know each other over the last few weeks. It's been really exciting to learn more about the product that you guys are building, your users, and the vision that you have for your roadmap. So maybe you could spend a few seconds introducing us to Plunk, and then also your role at Plunk.
An Introduction to Plunk
Ian: Yeah, sure. So Plunk is the world's first real-time, AI-driven data platform for the real estate industry. Basically, the problem we're solving is how antiquated and unreliable the real estate industry's processes are for making decisions about how much houses are worth and what kinds of projects should be done on them. It's been a real fun adventure seeing our data scientists build all these amazing models, and the things they're able to do to tell you exactly how much a home is worth right now, maybe five minutes from now, and then tomorrow. Because it turns out, particularly in these turbulent times for the real estate market, having the latest price point is quite important. As for what I do there: our product is data, so there's a big chunk of it that is just data coming out of API pipelines, et cetera. But there's a side that we want to expose to the consumer, so there's a web app accessible through desktop web and mobile web, and there's going to be a companion native mobile app for Android and iOS. That's the part that I manage.
Eden: Yeah. What's been really interesting in working with you is just seeing how core the mobile interface, the tablet interface, the phone interface is to the user experience of the folks that are going to be using the Plunk product. Could you tell us a little bit more about why this isn't just your average, run of the mill web app with some data in it? This is a mobile experience, it sounds like users need to be able to use this on the go.
Ian: Yes, absolutely. Think about a real estate agent who is walking somebody through a house. They often need to carry a bunch of papers around and read off of things or memorize things. If they're asked randomly about a house in the middle of, "Hey, let's walk through this neighborhood," it's much easier for them to be able to look down at their palm and say, "Yeah, yeah. Blah, blah, blah," turn it around and show it in a way that looks beautiful and responsive, and therefore convince their clients, coming off as the authority and the expert in real estate that they are.
Eden: Yeah. I'm curious how your product team approaches doing that kind of user research to understand the ways the Plunk product might potentially be used. How do you use that to then drive features on the roadmap?
Ian: Well--
We spend a lot of time actually doing product discovery. That means talking to a bunch of potential clients and getting a sense of what their day-to-day work is like, what their hardest problems to solve are, and what kind of impact those have, either on their day-to-day life, in terms of how much time they spend doing their work, which determines how much work they can do and how productive they can be, or on how much value they get out of each piece of work they do and therefore how they can better affect the bottom line.

Once we have all that, basically it's a question of sitting down with design and engineering and coming up with the right solutions to the right product requirements, which are now sorted by what we've learned about what's more important. We have an amazing design team and an amazing engineering team, and I'm blown away by the solutions they come up with. They take a very humble list of product requirements, ideas, and outcomes and turn it into amazing products with amazing documentation. It's a real pleasure to work with a team like that.
Eden: Yeah. One of the things that's really impressed me about the Plunk product is there's the high level of data accuracy that you guys have, and there's a very smooth interface in the native mobile app on iOS and Android. Then as well the tablet experience on mobile web, which is really cool. I can see how that is going to be really useful for realtors, investors, home owners, to be able to access that on the go.
Yeah, what's also interesting is I know there's a home screen widget as well, and there are complex dependencies, whether it's around location, the right zip code, the current location of the device, or your search queries for the map API. Could you tell me a little bit about some of the features that are coming out that you're excited about?
Ian: Yeah. You're hitting it right on with the widget. I guess I should've brought that up earlier. One of the things that we came up with as we were thinking about improving the quality of life of an agent was: what if you could just look at your phone and see, wherever you are, how much real estate is worth right now in this location, how many houses are listed and available, what kind of inventory there is?

All these kinds of stats, so that in the use case of a homeowner with an eye out for new properties, or real estate agents and brokers, wherever they go, they just have to take a quick look as they're walking or driving to get a quick sense of the space they're in. We're talking about features like image detection: take a picture with your phone and update the value of your home by detecting what quality of furnishings you have, or maybe even the brand, sometime down the road.

All these little details that, up until now, are kind of hard to encapsulate, but that are really impactful in how well homes are going to sell and how much they're going to be valued. But there's also being able to register them in an accurate and reliable way that can be ingested in a standardized form. With all these different states, counties, and sometimes even smaller jurisdictions, each with different regulations and terminology, a huge part of the challenge for the data team was just translating, getting all of that into a single, cohesive data schema.
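As a rough illustration of the kind of schema translation Ian describes, here is a minimal sketch. The county names, local field names, and mappings are all invented for illustration; Plunk's actual pipeline is not public.

```python
# Hypothetical sketch: normalizing listing records that arrive with
# different field names per jurisdiction into one shared schema.
# All county names and field names below are made up.

# Per-county mapping from local field names to canonical schema fields.
FIELD_MAPS = {
    "king_county_wa": {"SqFtTotLiving": "sqft", "Bedrooms": "beds", "BathFullCount": "baths"},
    "travis_county_tx": {"living_area": "sqft", "num_bedrooms": "beds", "num_bathrooms": "baths"},
}

def normalize(county: str, record: dict) -> dict:
    """Translate one county-specific record into the shared schema,
    keeping only fields the mapping knows about."""
    mapping = FIELD_MAPS[county]
    return {canonical: record[local] for local, canonical in mapping.items() if local in record}

normalized = normalize("king_county_wa", {"SqFtTotLiving": 2100, "Bedrooms": 3, "BathFullCount": 2})
# normalized == {"sqft": 2100, "beds": 3, "baths": 2}
```

The design choice worth noting is that each jurisdiction only needs a mapping table, not custom code, which is one common way to keep ingestion maintainable as new regions are added.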
Eden: Got it. So I guess how does the current testing process at Plunk work? I know you have a product team, you have an engineering team, how does it all come together when you make a decision to release a new version of Plunk or get ready to release a new version of Plunk to the App Store and Google Play?
The Testing Process at Plunk
Ian: Sure.
Let's talk about how it is now and not how it started, because when you're a small startup you have to wear a bunch of hats. There are just not enough people and resources, so we've gone from a very hand-built, workshop-style QA with a lot of holes to something closer and closer to a sophisticated process that, surprisingly, is still run mostly by a one-man band.
Now, except it's a one-man band with a bunch of robots behind him. So there are a few things we have to do, and some of them I think are unique, or at least things that I never had to deal with when I was in games or anywhere else. We work with a machine learning model that is getting updated and learning on its own timetable. We also have a backend system that has to read off of that model, and it's also being updated on its own timeline. Then there are the frontend teams on both web and mobile, who are hopefully putting out builds at a more constant cadence, but who have to react to whatever happens on the other side.

Because we've worked so hard to share as much of the codebase as possible between mobile and web, we have to be very careful about the order of operations to get to a testable build we can be confident in, because what's being tested right now is what's going to go to production. Now I'll explain. You have the web team and the mobile team, and every time the backend and data teams do an update, they're changing how the API payload is structured or how the model responds to something. That means the clients have to adapt, have to make changes. But they have to wait until that update is done.

Now, further plot twist: the mobile team is offshore, several, several hours away, so basically they're releasing builds while we are sleeping. Before we can test on web, we also have to wait one night for the mobile release, so that the next day we can test on both, because it doesn't make sense for us to test on web if, when mobile comes out, it has a bug.
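The ordering constraint Ian describes, where a web test pass is only meaningful once the overnight mobile build against the same backend change has landed, could be sketched as a simple release gate. Everything here, the layer names and version strings, is hypothetical:

```python
# Hypothetical sketch of a release gate: a build is only worth a full QA
# pass once the model, backend, web, and mobile builds all target the
# same API contract version, so a web pass can't be invalidated by a
# late mobile build. Layer names and versions are invented.

def ready_for_qa(builds: dict, contract_version: str) -> bool:
    """True only when every layer has shipped a build against the
    same API contract version."""
    required = ("model", "backend", "web", "mobile")
    return all(builds.get(layer) == contract_version for layer in required)

builds = {"model": "v12", "backend": "v12", "web": "v12", "mobile": "v11"}
ready_for_qa(builds, "v12")   # False: the offshore mobile build lags one night behind
builds["mobile"] = "v12"
ready_for_qa(builds, "v12")   # True: now it makes sense to test both clients together
```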
So to get all that done and to save time, because very often when you ship a new build the bugs are very obvious, you don't want to wait until the mobile team releases a new staging-environment client just to find out, "Oh look, this chart doesn't have the right series." So we built this app that is kind of like a pre-development environment app. You install it and it asks you, "Point me at a branch, a PR." Then it connects dynamically to the local machine, wherever it is in the world, that is running a PR that's about to be merged. So before we merge it to stage, we at least get a very quick initial pass internally, which can be done at any time of day, whenever things come out, before we tell the team, "Okay, push to stage."

That's why whenever we're releasing on Monday, I'm like, "There's going to be a staging push coming out," and, "Okay, I'm going to have to test this test app." Something is going to come out that I'm going to have to push back on, so sometimes the staging and production environments don't get pushed as on schedule as we would like. But I've only just started answering the question, so I'm sorry if I'm rambling.
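A minimal sketch of the "point me at a PR" idea, assuming a tunnel-style URL scheme that routes a PR number to the developer's local machine. The host and URL convention are invented for illustration, not Plunk's actual setup (a real version might use ngrok or a similar tunneling service):

```python
# Hypothetical sketch: resolve an open PR number to the base URL the
# pre-development test app should hit, so an unmerged branch can get a
# quick smoke pass before anything is pushed to stage.
# The tunnel host and URL scheme are made up for illustration.

def pr_endpoint(pr_number: int, tunnel_host: str = "dev-tunnel.example.com") -> str:
    """Build the base URL for a given open PR's tunneled dev server."""
    if pr_number <= 0:
        raise ValueError("PR number must be positive")
    return f"https://pr-{pr_number}.{tunnel_host}"

pr_endpoint(482)  # "https://pr-482.dev-tunnel.example.com"
```

The payoff of this pattern is exactly what Ian describes: obvious bugs surface before the nightly handoff to the offshore mobile team, instead of one full release cycle later.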
Eden: It's very complicated and very important.
Ian: We've certainly learned that together, you guys and us. So now, after all that is done, because it's still a one-man band, we've worked with you guys to come up with all the different use cases and test scenarios that I think can be automated. And with you guys we run it for both the app and for mobile web, so you guys are doing both the actual client and going into the browser and doing a bunch of little actions.
I'm happy that I don't have to do it right now because you know how many houses you have to test it on, you have to test how many rooms, how many bathrooms, do you have square footage, what kind of furniture. All these little buttons that I used to have to press and make sure that they were getting sent and the model is reacting correctly. Now I don't have to do all that so you guys tell me now, "Okay. All of these standard things are good."
Then that just leads to us trying to break the model, because we've tested all this and then it's just, okay, the user walks into the bar, the user jumps into the bar, the user pushes a duck into the bar, trying to see what crazy things we can input into the model, or trying to stretch whatever settings are in the UI. Right now we're at the point where we haven't released publicly, so the impact of a bug slipping through is not ginormous.
Still, we are getting really close to launch, and we are working together to come up with ever more sophisticated measures, until such a time as we can add more QA resources.
But one of the most satisfying things for us is not just how many bugs we've been burning down, but how many bugs we've been finding. During the first weeks it was a barrage of obvious bugs, but we just didn't have the time to do all the permutations of actions, or all the devices to do all the permutations of devices and actions.
At first we found it a little overwhelming. Our developers were like, "Oh my god. It's dozens and dozens of bugs." As time went on, we kept finding more bugs and more bugs. Then one day we didn't find as many bugs, and we couldn't break the app after we adopted your processes. There were these tiny, little bugs that we kept working on, and then one day there were no bugs.
Now I'm just looking at the Slack report from you guys, I'm like, "Pass, pass, pass." And I'm just telling the entire team and the entire team is looking at it and we're all so happy because now we feel a lot better about shipping out a beta, which might hopefully be as early as next week. So what a strange, twisted trip it's been.
Eden: It's interesting, because what you just described is pretty similar to what I've personally experienced as a PM, and to what I've seen with a lot of the teams that Mobot has a chance to work with. There's sort of an initial clean-up period where you're auditing the whole app, getting a lay of the land of all the possible bugs in the universe, and you have to whittle it down to what the priorities are, what the features are that you actually care about testing.
Then there's a little bit of that tech debt you do have to pay down. But then you hopefully get to a point of, "Okay, this is the new foundation. This is what the engineering team has to build on." I'm curious, now that you've arrived at a point that you're happy with, how do you think about things as a team beyond testing? Just engineering best practices, product development best practices? How are you guys going to approach the way that you build and ship every week to keep the product in this really healthy state?
Best Practices to Maintain a Healthy State
Ian: Part of what we have worked on really hard over the last few months is getting to a place where we're doing continuous deployment, finding mechanisms to do continuous deployment with machine learning models that don't have a reliable timeline. So part of what we're finding best practices for is the process I just mentioned, and part of what we're learning is how to do those best practices. It's something you guys have been helping us a lot with because, like I said, we don't have anybody in-house who knows all those best practices.

It's been an interesting journey, just learning the basic things that probably a basic QA team knows, and now really starting on that even further learning journey of, "Okay, now we're going to go live. We're really going to have to be agile about responding to tickets in real time." We've worked really hard to make sure that we have a pipeline that goes straight from users reporting a bug to our Jira instance, to making, what is it? A flow that tells us, "Hey, we need to test this specific thing." Then we'll add it as a use case when that happens, which is something we've yet to really, really flesh out, but which we were actually talking about yesterday with our customer service manager. So yeah, I'm afraid I can't tell you best practices for how to do QA, because that's not what I do. I look to you guys to advise us and to lead us down the straight and narrow path.
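A hedged sketch of the report-to-Jira step Ian mentions: mapping a raw in-app bug report into a payload shaped like Jira's create-issue REST API. The project key, labels, and report field names are assumptions for illustration, not Plunk's actual configuration.

```python
# Hypothetical sketch: turn a user bug report into a Jira create-issue
# payload, tagged so QA can later fold it into the automated regression
# suite. "PLNK", the labels, and the report keys are all invented.

def report_to_jira_payload(report: dict, project_key: str = "PLNK") -> dict:
    """Map a raw user bug report to a Jira create-issue payload."""
    return {
        "fields": {
            "project": {"key": project_key},
            "issuetype": {"name": "Bug"},
            "summary": report["title"][:255],          # Jira caps summary length
            "description": report.get("details", ""),
            "labels": ["user-reported", "needs-regression-case"],
        }
    }

payload = report_to_jira_payload({"title": "Chart missing a series", "details": "Seen on iPad, staging"})
```

The `needs-regression-case` label is one way to realize the flow Ian describes: each live-user ticket becomes a candidate test scenario rather than a one-off fix.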
Eden: Yeah. It's been a really exciting partnership so far, and I think it's really interesting just getting a chance to work with you and your unique view as a product leader. We work with a lot of engineering leaders, we work with QA leaders and we also work with product leaders. But what I've found really special about our partnership is I have a similar experience as a product manager as you, where I've gotten tasked with having to just rope in QA as a part of my roles and responsibilities because no one else was doing it. But it's actually really important, and so the way that product people think about QA and testing and the role that it plays, the responsibility that that carries is sometimes different than the perception from other team members on the team.
Ian: Yeah. It's funny, when I talk with other PMs, particularly PMs that are a little younger, I often hear comments about, "Oh, the higher ups don't understand this or they don't know that." First, I guess we should clarify that all of this is with the proviso that everybody knows more or less about all kinds of different things.
But I guess my point is it's not so much about what people know or don't know. It's about how often history repeats itself, and how startups usually go through very similar convergent paths.
For example, how many startups don't find themselves looking on one side at their ever-diminishing runway, and on the other side at their ever-increasing scope to make the MVP? How many people find themselves in that situation? And the choices that leadership has to make usually come down to the same few similar choices.
What I've found about what people know or don't know, or what they ignore or don't ignore, is, on one side, it's about how much foresight you have to really assess what's coming up in the future against the resources that you have. Then it's about having the humility to accept that the scope you can bite off is not as big as you think it is. Lastly, it's really all about the specific variables in the universe, in your own specific region of spacetime. Luck is luck, and without it there's only so much you can do. Everybody needs just a little speck of it.
Eden: Yeah. I can relate a lot to that, given my own journey building Mobot and us being a startup as well. There is definitely a tension there, of there's so much I want to build still but you do have to balance that with what is realistic. We have to listen to our customers, we have to listen to the North Star and the product vision that we have and balance all of that and be somewhere in the middle of that.
Ian's Path to Product Leadership
Ian: Yeah. For example, an oft-repeated decision that gets made as you're nearing your deadlines and looking at your roadmap is around telemetry. Because I've spent most of my time in really early-stage startups, several times we were forced to release with very scant analytics telemetry or none at all. And it's not a question of whether we knew we had to do it. It's a question of, "Hey, we have only so many developers and only so much time, and trade-offs have to be made." It can be anything. Sometimes it's QA, which is where the great thing about Mobot actually comes in, and I'm sorry I'm sounding like a shill: you can do more with a lot less.
Eden: Yeah. Thanks for the plug. So you were touching on you've worked on a lot of different products, you've worked in gaming, you're working on a project related to blockchain, and you've managed a lot of products, you've founded companies before in your very long career. I'm curious if you could tell us a bit more about your personal background and what even led you down this path of being a product leader?
Ian: That's a good question. I love what I do, and I think I found my true calling in terms of my career. However, I had no expectation or idea that that's where I would land. I graduated from college with a degree in biology.
Eden: Wow.
Ian: Back in Venezuela, where I'm from. I quickly found out that biologists don't make a whole lot of money, and that I like making at least a little bit of money. So I started working at P&G in consumer market knowledge, where I spent a lot of time listening to consumers and just trying to figure out what makes them tick and how to solve their problems.
Eventually I decided that I really liked figuring out what consumers like and trying to come up with ideas for how to address those needs. But I also knew that I did not want to work on consumer packaged goods, so I decided to go to business school, and I went to UCLA. Being in LA, that's when I realized that I really like the games industry. That was my first love. That's what started it all.
I went into the games industry with the idea of, okay, we're going to figure out what games are going to be sold and how they're going to be sold. Eventually you get more and more involved in how the things are built, not how they're sold, and I fell in love with it. As I kept doing it more and more in the games industry, eventually I wanted to apply all these things that I'd learned, and what I love about doing product is choices.
It's how do we get users to go down the golden path and what kind of incentives do we give them, how are they going to react depending on the different use cases, and how can we delight them along the way. It's been an interesting move to now try to do that in more enterprise, more business, less entertaining products. But yeah, I'm still doing it after, well, I think it's for a decade now and, yeah, I love doing it.
Eden: Yeah. It's interesting how you've unconsciously or maybe consciously specialized your career in mobile. Maybe it's because of that first immersion into gaming, and so there's a very close relationship between gaming and mobile. But is there an intentional reason behind why it looks like your expertise or your preference is now developing from the mobile tech stack?
Ian: It was being not in the right place at the right time, but being present for a revolution. I was working at Zynga right around the time when mobile games started to go from being just this thing that gets packaged to sell a mobile phone to, okay, now mobile phones are a platform and you can build games on them, and mobile games were starting to become more and more prevalent.
So eventually we started noticing that we were losing our business to mobile phones, and we started looking into how we could replicate these experiences, how we could also get into this space. I was there for that, and I happened to find it really exciting, because there were all these new, different ways of interacting and all this empty space. Back then, what I thought was super interesting was location.
I worked at this location-based games studio, and back then it was the bee's knees, because all of a sudden the concept of, "Wow, you can play and have the game react to wherever you are," was novel. Ever since then, it's been, "Okay, what different experiences can be enabled with this new technology?" And I guess that's part of why, as you say, I'm a little bit into blockchain.
It's because now it's like, "Ooh, what kind of new and cool things can we build?" And finding cool tech that makes you feel a lot of passion, at least in my case, has been what has driven where my career went or at least the early and middle parts of it. Now I like to think that I am slightly more driven by, okay, let's understand the opportunity and how we can impact things in a real way. But yeah, for the longest time my biggest driver for where I was going to work was just passion.
Eden: That's fantastic, and it's really clear, just the thought and intention that you've put into the kinds of products that you choose to get involved in and the kinds of initiatives you choose to get involved in. It's amazing.
Ian: I've been incredibly fortunate, honestly. I'm incredibly grateful to have been exposed to so many interesting things and interesting people. Yeah, it's been a good experience so far.
Eden: We could honestly do a whole other episode around all the other products that you've built before Plunk, and even some of your blockchain work as well. We'll have to save that for a different episode.
Ian: Yeah, for sure.
Eden: Last question for you is what advice or insights do you want to share with other product leaders who might be listening to this podcast, or CEOs or founders that might be thinking about either mobile product development or mobile testing? Any insights that you'd like to share?
Advice to Product Leaders in the Mobile Space
Ian: Well, two things, I guess. The first thing, and it's probably something that might be obvious to others, but for me, having worked in mobile for the longest time without being physically present doing QA: really consider, and this is coming from somebody who thinks about how you use things, really consider where and how you're holding objects. I don't know if you remember, it took us forever to realize there was a bug, because when you're testing things you test them on a flat surface, or you're testing really briefly. There was this huge bug that took us, what, two weeks to figure out, and it was because the thing that made it happen was tilting the phone. Until we lifted it from the table and actually picked it up, we weren't reliably able to reproduce it.

So that's one thing: if you're thinking about getting into mobile and you're developing a mobile application for the first time, don't make all your testing static. Really take it out for a walk. That's not just for QA, it's in general when you're thinking about usability. I call it the walking test: just walk around, trying to use the app as you're supposed to, and if you walk into a telephone pole, you're not doing it right.

The second one is, if you're a small startup or you're building something in a particular category for the first time, rewind back to, what is it, the 20-minute mark, the 15-minute mark, whatever it is, and listen to the bit about history repeating itself. Really, whatever you think you're going to build, even once you scope down to the MVP, you're not going to get there. So plan almost for failure and you might be okay. And it's okay, really. That frequently said thing about being embarrassed by your product is true: if you're not embarrassed by your product, you're launching it too late.
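Ian's walking-test point, that static, flat-on-the-table testing misses posture-dependent bugs like his two-week tilt bug, can be sketched as a test-matrix expansion: add a device-posture axis and cross it with every action. The postures and actions below are illustrative, not a real test plan:

```python
# Hypothetical sketch: expand a device-posture axis into the test matrix
# so orientation- and motion-dependent bugs aren't missed by testing
# only on a flat surface. Posture and action names are invented.
from itertools import product

POSTURES = ["flat_on_table", "held_upright", "tilted_45_degrees", "walking_motion"]
ACTIONS = ["open_listing", "pan_map", "take_photo"]

def posture_matrix() -> list[tuple[str, str]]:
    """Cross every action with every posture, so each action is
    exercised beyond the default flat-on-the-table case."""
    return list(product(POSTURES, ACTIONS))

len(posture_matrix())  # 12 cases instead of 3
```

The tilt bug Ian describes would live in a cell like `("tilted_45_degrees", "pan_map")`, a combination a flat-surface-only plan never reaches.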
Eden: Yeah. There were a lot of good quotes in there that I feel like I could print on a T-shirt for anyone who's starting a startup, because, yeah, I remember in the early founder days your hopes are so high and you haven't gotten enough feedback from the market and customers and users yet. It's more exciting and more rewarding to be on this journey now and have real users and real engineers that you're working with, and to be putting together a real roadmap. But I can totally relate to that contrast and the resetting of expectations that needs to happen.
Ian: If you ever do put a quote of mine on a T-shirt, just make it my number one rule of product, and I've learned this from very many years ago, rule number one, "Users don't read." They will never read. Don't expect them to read, and you'll be better off. It's not necessarily related to QA, but honestly you will save yourself a lot of time and grief if you remember that your users will not read.
Eden: Yeah. It's a very interesting and very thoughtful way of approaching product and design, and thinking about UX. I can also relate to that from the QA side as well. Sometimes I think we send out great, thoughtful test reports, and the details are in the test reports.
Ian: Of course. Users don't read, and everybody is a user. It's a little cynical, but that's one of the ways I've found you actually love your users a little more.
Eden: Yeah. Designing for them, thinking about them, empathizing with their perspective is important, and it's okay to just say that. But users don't read.
Ian: Users don't read. For 99 cents you will get the other five.
Eden: Thank you, Ian, for taking the time. This was a fantastic conversation. I really loved getting to hear more about Plunk, your vision for the product and the team at Plunk, and what you guys are building. There were some really good bits in there about the progression and the journey that you guys have been through, of pivoting the company, building this new product, I really loved hearing about that and your personal background as well. So thanks for taking the time, I really enjoyed this conversation.
Ian: I'm glad you enjoyed it. I hope you find it useful. I've had a bunch of fun, so whenever you want to sit down and talk about development horror stories, happy to be here.