#76. Join me with Bert Wijns, a Global Solution Strategy Architect at Microsoft and co-founder of Power Accelerate, as we talk about the tool he created which allows you to take a screenshot of a legacy application or database schema and have a Power App built in a few minutes, with the legacy data migrated. Sound crazy? Listen on to hear more about Bert Wijns' Power Accelerate.
Make sure you stick around until the end of this episode to find out how you can get access to the bonus episode with Neil Benson and Bert Wijns' extended interview.
Our discussion covers:
If you'd like to hear that bonus extended interview, please make sure you subscribe to the Amazing Apps show in your podcast player and set it to download new episodes automatically.
Support the show (https://buymeacoffee.com/amazingapps)
Welcome to the Amazing Applications podcast for Microsoft business applications creators who want to build amazing applications that everyone will love.
Hi, I'm your host, Neil Benson. I've been practising Scrum since 2008, and I've been a Microsoft Business Applications MVP since 2010. My goal in this show is to help you slash your project budgets, reduce your delivery timelines, mitigate technical risks, and create amazing, agile Microsoft Dynamics 365 and Power Platform applications. Amazing Applications is a little different to most of the other podcast shows in the Dynamics 365 and Power Platform community. On this show, we believe that the approach we take is just as important as the technical skills of our teams, and so we focus on applying agility to our projects, our applications, our organisations and, quite frankly, to ourselves as well. I also love celebrating the success of Microsoft customers and partners who've built amazing agile applications. To find out how they did it, I try to get them to open up and reveal what worked for them and what didn't. My guest on this episode is a little different. It's the story of how a passionate Power Platform solution architect imagined a future where you could take a screenshot of your legacy application or its database schema and have a Power App built for you in a few minutes, with all your legacy data migrated into Dataverse as well.
If that sounds as crazy to you as it did to me before I met my guest, Bert Wijns, and saw Power Accelerate in action, then you're not going to want to miss this episode. Speaking of missing an episode, is there anyone else in your team who might be missing this episode, or maybe has never even heard of the Amazing Applications podcast? I'd really appreciate it if you could take a moment to recommend it to one person on your team and suggest that they listen to Bert's amazing story about his application that builds Power Platform applications. Tell them to go to AmazingApps.show to listen online and subscribe on their favourite podcast player. You can find show notes for this episode, including contact details for Bert and Power Accelerate, by visiting customery.com/026. And make sure you stick around until the end of this episode to find out how you can get access to the bonus episode with our extended interview. Here's Bert Wijns.
Welcome to the Amazing Applications show; it's great to have you on. I'm really looking forward to this interview. I think it's going to be something a little bit special. We'll get into that in a moment. But just for our audience, I wonder if you could take a second and introduce yourself. And my first question is going to be: what did you have for breakfast this morning?
Right, OK. Yeah, good morning, Neil. First of all, thanks for the opportunity. For breakfast this morning, I think I just had a bit of cereal and orange juice, so nothing too fancy. But it's a weekday, after all, when we're recording this. A quick introduction: I've been working in the Microsoft technology space for about 13 years now, most of the time with the Dynamics tech, and in the last four to five years deeply integrated into Power Platform. My current role is co-founder of the Power Accelerate solution, but I do have a main job as well: I work for Microsoft, inside Microsoft Consulting Services. So yeah, I like to be busy.
So it sounds like you've got two current roles: a consulting role within Microsoft Consulting Services, and you're starting up a new venture as well. What was your first role out of school or university, and what were you doing when you first got started in your career?
My first role was actually as a .NET developer, I think about 2008. That's a long time ago. It was for a partner company, EDS, which then got rebranded to HP and later DXC, and I think it's got yet another name by now. I started off as a .NET developer and did that for some time. Then after a year or so I rolled into Dynamics CRM, CRM 4.0 back in the day, and that got me hooked. Before I knew it I was doing CRM for many years.
It's strange to hear somebody from Microsoft criticise other people for changing names. There's been a little bit of consternation in the community about the renaming of things, but I'll leave that alone. I'll give you the benefit of the doubt: I'm going to assume you don't work in marketing and you aren't responsible for the other name changes.
So tell me a little bit about Power Accelerate. I understand it's a new venture for you. Tell me a little bit about the company that you've started and the application and what it does.
Yeah, so Power Accelerate. We started this journey about a year ago, when we were seeing that it was taking customers quite a bit of time to get started on their Power Platform journey. They got to the initial ideas and built some small little apps really fast, but to get fully into the platform and its true value was taking companies a long time. We also figured that a lot of the work in building these more enterprise applications, or moving legacy applications to the platform, is actually quite repetitive: each time you have to create a data model, do the data migration and create some screens. And we just hit on the idea that there must be a way of automating this, especially if you look at doing it at scale, at maybe 50 or 100 applications. From that idea we just got to work, built a small MVP and a pilot to test whether this was possible from a technology point of view. About six months ago that MVP was ready and we started the private preview, working with customers and partners, mainly a lot of partners, in a private preview mode. They get the benefit of not having to do the manual work, and we get the benefit of improving the product. In the five to six months since, we've released many new features and many new channels. It's nice to see it grow as well, where partners are really starting to see the benefits, but also giving us bigger challenges. It initially started with moving smaller, simpler applications to the platform, but now we're talking about datasets of 150 tables and thousands of records, so they give us new challenges which we can use to improve the product. But the true objective is that we try to accelerate the customer's journey by automating a bunch of the discovery tasks of moving applications, as well as the actual implementation work.
So I can take an existing data model. Is that the most common starting point? Maybe an Access database or a SQL Server database, and Power Accelerate can consume that. Is that where most people start?
Yes, so we support four channels at the moment. One is screenshots: that can be a screenshot from a screen in an application, but it could also be something you draw on a piece of paper or in a ... for the demos. It uses an AI services model to detect which fields are on the form, and then it translates that into a data model, and then we automate all of the tasks we can to move it into a Power App. Drawings and screenshots we squeeze into that one channel. Then we also support Excel, Access and Microsoft SQL Server. The benefit of Excel is that on top of detecting the form, we can also move the data. For Access and SQL we go even further: we have a lot more detail about the source, and we can also move relational data to the platform. And I think a good story to tell is from one of the first times we were showing this to a partner, in Spain I think it was. I remember we did the demo, and at the end of the session the lead developer went, wow, thank you for automating the most boring part of my job.
Because he had been sitting there building these data models field by field for the last year, and he's like, wow, I need this.
Yeah, I've got to admit, you know, creating tables and fields... sorry, what are they called now? Did I get it right? Tables? Yes, yes, tables and columns. Thank you. Yeah, it's never the most glamorous part of the role, and I've always used a little cheat method using the data import wizard from the old CRM days.
You can actually import into a data source that doesn't exist, so the target doesn't exist and it will create the columns and data types for you. Sometimes it's never quite right, but it was good for a demo, maybe not always for production. I saw a YouTube video a few months ago of, let's say, a solution architect or a business analyst in a workshop with a bunch of users, drawing fields on a whiteboard, you know, UX design on a whiteboard. And there was a camera there hooked up to a computer, with some kind of cognitive service translating that into a user interface design. Is Power Accelerate there yet? It sounds like you're pretty close, if you can take an MS Paint user interface and consume that and build me something in Power Apps.
If they take a picture of the form which they've built, we will take the data model and auto-generate everything in the Power Platform stack, going data first: the tables, the fields, the data types. We have a bunch of logic there as well to detect common data fields depending on the label, because otherwise everything would be a text field. Then we also create a model-driven app and populate forms and views with those fields, and then we can also generate a canvas app.
The thing we cannot do, because you said it was a user experience design session, is that they probably put fields in a certain position to make it more accessible or usable. We don't have that advanced logic where we can recreate exact screen designs. What we do is create the data model and the fields and put those fields on the form, but someone still has to put them in the right order or the right format. We're halfway there, I would say.
I would say the optimum layout for any given form is a very tricky balancing act for any UX designer. Do you group all the most commonly used fields together? Well, at least in Australia, a middle name is often not populated, so you put first name and last name together and put middle name somewhere else on the form. Those are very difficult design decisions for any team to make. How does your application decide whether to build a model-driven application or a canvas application? I get the sense you're doing both.
Yeah, so by default we create everything in Dataverse. To add a comment on the name changes: Dataverse, Dataflex, Common Data Service, I've known all three of those names.
So by default we create everything in Dataverse, and then we always create a model-driven app, and then we have the possibility to create a canvas app on top of that data structure which we've created.
For the canvas app, the way we initially did it was to use robotic process automation with Power Automate Desktop: we had a certain template, and then we automated the manual actions to replace the data source and the fields in that template using RPA. With the release of the experimental tool Microsoft released, I think, two or three weeks ago, with the YAML language where you can export an MSApp into YAML, we are looking at going that route, because it's more reliable and it requires fewer prerequisites on our side. We've seen that Power Automate Desktop here and there isn't always performing as we expected, and there are some difficult cases, like entities with the same name, which make it hard. So we are definitely looking at that new route, the experimental feature, I think they called it.
Tell me that Power Accelerate is making some pretty intelligent design decisions. You're not using new underscore, for example, as the prefix for your custom tables? And how do you decide, for example, the data type for any given field? I could maybe infer from a label that it's a date field, for example, and I guess if you had a screenshot of a legacy application that might be easier if there's a date picker. But for a choices field, how do you know that it's a choices field, and can you ever populate the set of choices that would appear in the dropdown?
Having worked in consultancy for Dynamics for more than 10 years, we don't use the new underscore naming, and we use proper solution files. We only populate the additional fields and follow all the best practices. That is the benefit of knowing the platform, because Dataverse comes from Dynamics under the hood, so a lot of those best practices we, of course, leverage.
Around the common field logic, the way it works is twofold. One is we have a set of global common fields. Our Power Accelerate tool is a SaaS tool, an online hosted tool built in .NET, and we have a set of global fields which apply for everyone. But then, in your own Power Accelerate setup, as a partner or a customer, you can add fields to that in a way that works for you. It uses labels, and we use fuzzy matching based on those labels to detect, for example, that a date of birth field is a date-time field. Based on the label we detect, we will flag it as a date-time.
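The label-based fuzzy matching described here can be sketched roughly as follows. This is an editorial illustration, not Power Accelerate's actual code: the field dictionary, the similarity threshold and the fallback to a text type are all assumptions.

```python
from difflib import SequenceMatcher

# Hypothetical global common-field dictionary: label -> inferred data type.
# The entries and type names are assumptions for illustration only.
COMMON_FIELDS = {
    "date of birth": "DateTime",
    "email address": "Email",
    "phone number": "Phone",
    "first name": "Text",
}

def similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio between two labels (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def detect_type(label: str, threshold: float = 0.8) -> str:
    """Fuzzy-match a detected form label against known common fields.

    Falls back to a plain text type when nothing matches closely enough,
    mirroring the 'otherwise everything would be a text field' remark above.
    """
    best_type, best_score = "Text", 0.0
    for known_label, dtype in COMMON_FIELDS.items():
        score = similarity(label, known_label)
        if score > best_score:
            best_type, best_score = dtype, score
    return best_type if best_score >= threshold else "Text"
```

A partner- or customer-specific setup would then just merge extra entries into the dictionary, which matches the "you can add fields to that" behaviour described in the interview.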
The second layer of intelligence we added after that is Common Data Model mapping. In the Common Data Model, from the collaboration between Microsoft, SAP and some other companies, they have common entities like person records, contacts, organisations... and there we go a little further. We check whether we detect, through the common fields logic, more than three fields which would map to a Common Data Model entity, like first name, last name, date of birth.
If so, we will automatically suggest, hey, we're seeing this could potentially be a person record, and we will allow you to map it to that Common Data Model entity. It will then automatically map the fields to the Common Data Model, enforcing an actual best practice, because another thing we're seeing is that when citizen developers create, say, 100 apps, they also create 100 different data sources and 100 different person record types, which creates a nightmare for someone trying to do something with that data.
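The "suggest a Common Data Model entity when several fields match" step can be sketched like this. The entity shapes and field lists below are assumptions; only the three-field trigger comes from the interview.

```python
# Hypothetical, simplified Common Data Model entity shapes.
# Real CDM entities (contact, account, ...) have many more attributes.
CDM_ENTITIES = {
    "contact": {"first name", "last name", "date of birth", "email", "phone"},
    "account": {"account name", "website", "industry", "phone"},
}

def suggest_entity(detected_labels, min_matches=3):
    """Suggest CDM entities that share at least min_matches fields
    with the labels detected in the source, as described above."""
    labels = {label.lower() for label in detected_labels}
    suggestions = {}
    for entity, fields in CDM_ENTITIES.items():
        matches = labels & fields
        if len(matches) >= min_matches:
            suggestions[entity] = sorted(matches)
    return suggestions
```

The result would then feed the review step Bert describes later, where the user, not the tool, decides whether to accept the mapping.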
Yeah, that opens up a whole raft of very difficult design decisions. You know, do I reuse a Common Data Model entity? A lot of the time that would be a good idea; you gave a great example there: that looks like a person, so model it on the contact table. But sometimes those Common Data Model entities come with, how do I put this politely, a set of baggage, some history, some constraints, and often a solution architect or developer will go, no, I can't live with those constraints, I'm better off creating my own custom entity, even when it looks a little bit like one of the system entities. Can your application make those kinds of trade-offs as well whenever it's making those decisions?
Yes. The way we do it is we put the decision in the user's hands. We work in three steps. One is you upload the source: screenshots, Excel or Access. Then you get a review step: we give you an overview of what we have detected, the suggested Common Data Model entities and the common fields, and you can decide whether or not to apply that logic. Because you're totally right. Been there, done that; I have the T-shirt from making those wrong decisions somewhere in my career and having to refactor it entirely afterwards. So that decision we actually leave to the user. We don't make the decision for them, but if we detect a person or an organisation we do make the suggestion: hey, it looks like we are seeing a person or an organisation, and you can map it through the tool.
So I guess once the data model is in place, you said that you also build some forms and views, all those boring things that developers don't enjoy doing. I guess there's still all the requirements for the user interface and validations, and some relationships maybe, but also all the business logic: this field is required, or whenever this field equals this value, then this other field becomes read-only, and all those kinds of user interface business rules that need to get applied. That's still the work of an implementation person, whether that's a citizen developer or a Microsoft partner with a professional developer. Is that right?
Correct. This is the third step: the automated generation of everything which is possible. We have some logic, like we populate the forms and the views with the fields which have the most data, so there's some help there, but the true screen logic and those business rules are something we cannot do yet. You never know in a couple of years, because one conversation we had with a customer was around: I have tens and tens of Excel files and some of them have complex formulas. Well, as you're reading the data model and the data out of Excel already, you can read the formulas, and based on the formulas you could make certain decisions. We looked into it, but it's just too open-ended. It's so dynamic that it's hard to define any rules for that at this stage. That is still the work of a citizen developer, or a professional developer in some cases.
Well, that's a bit of a relief, Bert, that we've still got jobs for the foreseeable future. Thank you for that. Beyond all this boring work, some of the statistics I've read about how much time and effort you can save on a typical implementation are pretty compelling. I think you've taken some of the classic example files: the Northwind database, and, I can't remember the name, there's a wine database as well that's pretty famous, and the old bike shop. Some of these classic Microsoft sample databases in SQL Server and Access, and you've illustrated how quickly those could be built in Power Accelerate on the Power Platform just by consuming some of those sources. Share with us some of the statistics again.
Yeah, and actually, literally yesterday we released a feature inside the tool which shows specific statistics for what you've just uploaded. It's a discovery feature: if you upload your database, it will list how many tables, how many records and fields, and how many Common Data Model suggestions there are. Based on that, we provide an estimation of how much effort it would be if you did it manually; it's almost like an estimation engine. And then we can say how much of that effort you would save if you used our solution. It's available on our preview website now. With the common databases which we use, like Northwind, AdventureWorks and BikeStores, we could see that we could move apps up to three times faster and automate up to sixty-five per cent of the effort. The categories where we can save and automate a lot of effort are around data modelling.
Data migration, of course, is a really big one, but also analysis and testing. We provide the least help in the areas around the build or implementation effort: the screen logic and all of that. While we populate the fields on the forms and the views and the canvas app, there's still logic there which people need to build. But from feedback from customers, that's not that big of a deal, because that's the kind of job citizen developers like to do. They like to create screens and make them look nice; they don't like to create fields and a data model and have to think about that.
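An effort estimator in the spirit of the discovery statistics described here might look like the toy model below. The sixty-five per cent automation figure is quoted from the interview; the per-table and per-field minute costs are invented assumptions purely to make the arithmetic concrete.

```python
# Assumed manual-effort costs (minutes); invented for illustration only.
MINUTES_PER_TABLE = 30   # creating a table plus basic form/view setup
MINUTES_PER_FIELD = 5    # creating one column by hand
AUTOMATION_RATE = 0.65   # "automate up to sixty-five per cent of the effort"

def estimate_effort(num_tables: int, num_fields: int) -> dict:
    """Estimate manual effort for a data model, and the remainder
    after automating AUTOMATION_RATE of it."""
    manual = num_tables * MINUTES_PER_TABLE + num_fields * MINUTES_PER_FIELD
    remaining = manual * (1 - AUTOMATION_RATE)
    return {"manual_minutes": manual, "with_tool_minutes": round(remaining)}
```

For a hypothetical 10-table, 100-field source this model predicts 800 manual minutes shrinking to 280, which is the shape of argument an estimation engine like the one described would present to a partner scoping a migration.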
And you mentioned data migration there. If I start with a database, do I just provide your tool with the schema, or am I actually providing the database and you're migrating the data as well?
If you can call Access a database as well... so there are two approaches. For Access, you can just upload the Access file and then we have everything. For SQL Server, we've learned the hard way that customers aren't very willing to run some kind of tool from a company they've never heard of on their production database, reading their data. So the way we built that, we leveraged queries. We ask them to execute one query which extracts the metadata, which gets us all of the structure: the tables, the fields, the relationships. They paste that result into our tool, and based on the result we generate a second query which fetches the data, and they paste that response into our tool as well. The big benefit is they don't have to install anything, there are no connectivity or security issues, they see exactly which data they give to us, and they can also run it first without the data. So it gives a lot of flexibility around that discussion as well. That was also a big learning we had from one of the first partners we started working with: there's no way this customer is going to allow us to run a tool on their production server.
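The two-step, paste-only flow described here could be sketched as follows. The metadata query uses SQL Server's standard `INFORMATION_SCHEMA` views, but the exact queries Power Accelerate generates are not public; this is a simplified assumption, and the second-step query builder below is a stand-in.

```python
# Step one: a read-only metadata query the customer runs themselves.
# INFORMATION_SCHEMA.COLUMNS is a standard SQL Server catalog view;
# a real tool would also pull keys and relationships.
METADATA_QUERY = """
SELECT c.TABLE_NAME, c.COLUMN_NAME, c.DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS AS c
ORDER BY c.TABLE_NAME, c.ORDINAL_POSITION;
"""

def build_data_queries(metadata_rows):
    """Step two: from pasted metadata rows of
    (table_name, column_name, data_type), generate one SELECT per table
    that the customer can run and paste back."""
    tables = {}
    for table, column, _dtype in metadata_rows:
        tables.setdefault(table, []).append(column)
    return {
        table: f"SELECT {', '.join(cols)} FROM [{table}];"
        for table, cols in tables.items()
    }
```

Because the customer runs both queries themselves and only pastes results, nothing is installed and no inbound connectivity is needed, which is exactly the objection this design works around.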
Well, I'm wondering if I could be a little bit more creative than that. Could I use this tool to migrate an old instance of Dynamics CRM 2015 or 2016, and say, here's my schema, go and build that on Dynamics 365 Customer Engagement and then migrate the data as well? Has anybody tried to use Power Accelerate for a CRM on-premises to online migration?
We've not done that exact scenario. I think the reason we would face some struggles is the way Microsoft puts out the data model, because the tool looks at table and field names, and I think it would be really hard to recognise those technical field names which Microsoft uses under the hood and map them. But in theory it's possible; there would be some manual mapping work. Some of the Common Data Model entities like account and contact we would recover quite easily, but for some of the other tables there would still be some manual effort. It's not impossible: if you have access to the database, whether it's an on-premises or a hosted model, it is certainly possible from a technology point of view.
And if a Microsoft customer or partner is interested in Power Accelerate and the advantages it could bring to an implementation, what sort of customer or partner do you think would benefit from it? For example, I do a lot of work with teams that are designing and building their applications in sprints, so we don't have a data model at the start of the project; it evolves as we learn more about the requirements. But if I'm moving a legacy application, and it's not quite lift and shift, but I need all that data to come across into a new Power Platform application, that sounds like a great scenario for Power Accelerate. For the first one, with iterative design and development, can I use Power Accelerate to achieve any advantages there?
Yes, we see two big scenarios which we are working on with partners and customers. One is indeed the modernisation piece, which is our initial use case and where we started from, really. It's a bit more obvious because you have something which you can take across. But after starting down that path, we did see Power Accelerate being useful as part of a centre of excellence as well, where you have citizen developers, but also other teams, starting to create these apps inside a customer environment, and they don't always take into account those best practices with their initial releases.
What we've seen is that once certain applications have a higher user base and value for the organisation, the people managing this take those applications up using Power Accelerate and move them to a more hosted instance, an environment which they have control over. A common example as well: SharePoint lists have been an extremely popular data source, mainly because of licensing constraints and licensing opportunities, and so a lot of the people we work with now use Power Accelerate to move apps from SharePoint lists to Dataverse as well. So the use case from the centre of excellence is growing. It can either be more experienced citizen developers who work with a set of citizen developers and take up an app or data model and move it into a different environment, or professional developers as well. The most simple case is where they draw something up on the phone during a design session, but also where a citizen developer has done something in Excel already which they want to quickly pull over. Those are the two main use cases, and we're trying to prioritise the centre of excellence use case a bit more and create some new features there.
What kind of limitations does the tool have? I know you've worked in a lot of enterprise deployments with big customers. Can I point this thing at a SQL Server backup of my Finance and Operations database and rebuild Finance and Operations on top of Dataverse? What kind of limitations does it have at the upper end, in terms of the number of tables and the complexity of the application we might be trying to modernise?
By default, we enforce some limits, because we've learned since we initially started the private preview, when we had some people who were very original and tried out a lot of scenarios. We make one Power Platform instance available to all of our users, and that instance got very polluted, very fast. So we enforce some limits, mainly on the number of records they can migrate into our instance. The idea is that if they bring their own instance, which is also supported, we remove that limit and they can go unlimited. Then again, unlimited is something we are learning about as well. To give you some examples: AdventureWorks, I think, has 81 tables and about 20 to 30,000 records, and we were able, after fixing many bugs and timeout issues, to support that one. But now we're also working with a partner in Spain who is giving us a database with 150 tables and almost 100,000 records, which we're working towards as well. We are facing some timeout issues which we're tackling one by one, and we tried to optimise the tool by making it use multithreading, but then we learned the hard way that the metadata API of CRM doesn't play nice with that, and we were getting customisation lock exceptions.
So we're now looking at splitting up the model and doing it in multiple runs in order to support that. I've worked in a lot of complex scenarios, so I know the space quite well, and we're trying to support as much as we can. But at the moment I would say the limit is somewhere around 100 tables and 20 to 30,000 records; anything below that should be able to go through the tool without our support. If someone does have a requirement larger than that, just feel free to reach out and we'll work with you to make it work. And one learning we have had as well is that when people do modernisation, they don't need all of the data; oftentimes there are data and tables which are only there for backup. So we do advise some kind of cleaning phase before handing the database to our tool.
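The "split it up and do multiple runs" idea can be sketched as a generic batch-with-retry loop. This is not Power Accelerate's implementation: `create_records` is a hypothetical stand-in for whatever client call pushes a batch to the target, and the batch size, retry count and backoff are assumed values.

```python
import time

def migrate_in_batches(records, create_records, batch_size=500,
                       max_retries=3, backoff_seconds=2):
    """Push records in fixed-size batches, retrying a batch with linear
    backoff when the platform raises a transient error (for example,
    a customisation lock exception surfacing as RuntimeError here)."""
    migrated = 0
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        for attempt in range(max_retries):
            try:
                create_records(batch)  # hypothetical bulk-create call
                migrated += len(batch)
                break
            except RuntimeError:
                if attempt == max_retries - 1:
                    raise  # give up after max_retries failures on one batch
                time.sleep(backoff_seconds * (attempt + 1))
    return migrated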
Building the application quickly is one part of it; migrating the data is one struggle that we all have, and there are already proven techniques and tools out there for that. So I can use Power Accelerate to help me build the application quickly, get the basic forms and views designed and the data model in place, and I can always use a more traditional approach to migrate the data. You've given me even more options there, which is fantastic. And where are you in terms of your journey towards general availability and having this, you know, globally used by anybody? You mentioned private preview. Where can customers or partners go if they're interested? Is it generally available yet, or is it still in a private preview phase?
Yeah, so we are planning to go GA somewhere in the next month or so. The initial target was the end of January, but a personal situation got a little bit in the way of that: I became a father for the first time three weeks ago, and that took quite a bit of time, so we had to reschedule a bit.
Thank you. So the idea is that, for private preview, we accept anyone who has an interest in using our product and is willing to commit to actively using it as well. So if someone really wants to try it out, has some use cases and reaches out to us, we will get them set up in the private preview and work with them. We currently have, I think, 14 organisations in the private preview, actually thirteen partners and one customer, trying out the product. A couple of them are doing production work for their customers, but we're not charging them anything for that yet; I'll still call it private preview. We are looking at coming up with a GA model sooner rather than later for those customers who are doing production work using our tool, and we will keep some kind of trial mode after that as well, so customers and partners can use the product in trial mode before they have to buy anything.
Well, that leads me on nicely to the next question about licensing. There are a million different ways you could license an application like this. Have you settled on a licensing model yet, and what are your thoughts in that direction?
Yeah, we worked with partners on that model and learned from them how they would like to use it. We landed on a model where we provide two big features. One is discovery, and the second one is the automation of the work. The discovery is the first layer, where scoping can be done: you get an estimation of how much work this would be, and that will always remain free. So you can just upload a file and get the statistics, and also provide an estimation to a customer if you are a partner. Then for the second layer, the automation of the work, we're looking at a sort of credit model, where a certain application has a complexity: if you upload two screenshots which generate three tables, that isn't as complex as if you do AdventureWorks. It's based on the complexity, and we use those statistics to define it. We will have a credit model where the customer or partner can buy a set of accelerate credits in a bundle, and with those they pay for the work. That's the idea behind it, and we have an entire ... model linked to it, worked out with those partners. So if anyone wants some more details or the real numbers, feel free to reach out to us at our website, poweraccelerate.com, or on the LinkedIn page; just send us a message and we can do something specific.
Yeah, I love that: the more I use it, the more value I get from it, the more I'll pay for it. That suits my consumption mindset really well. So yeah, I really like the direction of your thinking there. I would always advise any ISV building an ISV product: try and keep it as simple as possible. We all know how complicated Dynamics 365 licensing has become over the years.
Everybody found it much easier when it was nice and simple and there was almost one price for everything. But, yeah, it sounds like you've done some great thinking.
What's coming up on your roadmap? Talk us through the next three months, six months. Can you talk about what you're working on next and what else you'd like to achieve? And what kind of feedback have you heard from your early adopters?
Yeah, I think the big milestone is that we want to go GA and have the first stable version, because we've been in this private preview mode for six months now, building features which a lot of people have asked for and get value from. We really want to get to a phase where we have that stable basis which we can put out there and people can use, because otherwise you get into a mode where you keep releasing new features, and sometimes that comes with a couple of quirks as well. The big things on the roadmap, off the top of my head: one is support for larger databases, something we're actively working on that will be released in the next couple of weeks. We'll make something available where you can more easily select which tables you want to be part of the process. That comes in really handy if you have more than 100 tables; it becomes a real nightmare if you have to treat them one by one. So that's a big one. Also big volumes: we're starting to learn that some of these databases, even if it's just one table, can have a lot of records, and we do want to support that.
Even SharePoint lists?
Even SharePoint lists. We did some tests there with quite large lists, and the data migration on our side we've optimised using ExecuteMultiple and all of that, so it's quite performant. Other things on the radar: support for Dataverse for Teams, so we can add that as a target channel. We're looking a little bit at the API which is going to be made available for Dataverse for Teams; that's a big prerequisite, of course. And then there's also a public sector customer we're working with on moving paper forms into Power Platform apps. So that could be documents or paper forms, applying a bit of the same logic. So, adding more channels. And I guess the final one we're looking at is, yeah, Lotus Notes migration.
Oh yeah, I hadn't thought of that. I live in Queensland, here in Australia, and on the Queensland state government's IT website you can see all the projects that are ongoing at the moment. There's an eight and a half million dollar project to modernise ancient Lotus Notes databases. So, yeah, this is a massive market for that. I always wonder, though: in 10 years' time, is there going to be a 20 million dollar project for a Power Apps modernisation? Are we building the legacy apps of the future? Maybe my kids will sort that out.
That's the thing: if you have Power Accelerate, it won't be 20 million, because the idea is that we automate that.
I'll pay 19 million dollars to you and the Queensland government can save a million dollars.
That's the first part of my conversation with Bert. If you'd like to hear that bonus extended interview, please make sure you subscribe to the Amazing Apps show in your podcast player and set it to download new episodes automatically. The bonus episode is not going to be published on my website or promoted on social media; it's just my way of thanking Amazing Apps show subscribers with some extra content. In the bonus extended interview, we discuss the size of the Power Accelerate team, their approach to building their application, how Power Accelerate is being funded, and what Microsoft thinks of its employees building commercial Power Platform applications in their spare time. We'll learn what Bert's hopes are for Power Accelerate, and his advice for anyone else, like you or me, hoping to build a community application or a commercial application as a side hustle. Remember, you can find show notes for this episode at Customery dot com slash zero two six. If you find Amazing Applications insightful, please remember to subscribe and to share it with someone in your team. See you next time. Until then, keep sprinting.