WEBVTT
00:00:00.160 --> 00:00:02.799
People are talking about this vibe coding thing.
00:00:02.959 --> 00:00:05.679
Why don't I just try to vibe code it?
00:00:05.759 --> 00:00:16.559
I had no clue where to begin, so I watched some YouTube videos and, yeah, ended up vibe coding... Welcome everyone to another episode of Dynamics Corner.
00:00:16.719 --> 00:00:21.519
I love this episode because we all feel it: the pressure of AI.
00:00:21.920 --> 00:00:25.199
Instead of lifting all the pressure off... I'm your co-host, Chris.
00:00:25.359 --> 00:00:26.160
And this is Brad.
00:00:26.239 --> 00:00:28.960
This episode was recorded on March 12th, 2026.
00:00:29.120 --> 00:00:30.399
Chris, Chris, Chris.
00:00:30.960 --> 00:00:31.519
AI.
00:00:31.679 --> 00:00:33.280
I hear that word everywhere.
00:00:33.520 --> 00:00:36.079
And boy, do I feel like I need to use it.
00:00:36.320 --> 00:00:37.280
But what do I do?
00:00:37.439 --> 00:00:38.079
How do I use it?
00:00:38.159 --> 00:00:39.600
What do I do with all that pressure?
00:00:39.840 --> 00:00:42.640
We had the opportunity to speak about that.
00:00:42.960 --> 00:00:45.679
And the Payables Agent in Business Central.
00:00:45.759 --> 00:00:48.399
With us today, we have the opportunity to speak with Søren Alexandersen.
00:01:00.880 --> 00:01:01.439
Good afternoon.
00:01:01.600 --> 00:01:02.079
Good morning.
00:01:02.159 --> 00:01:03.439
How are you doing?
00:01:03.920 --> 00:01:05.120
Good afternoon.
00:01:05.359 --> 00:01:06.079
Good morning.
00:01:07.040 --> 00:01:08.079
It's been some time.
00:01:08.239 --> 00:01:09.120
Welcome back.
00:01:09.599 --> 00:01:10.640
Thank you for joining us again.
00:01:10.799 --> 00:01:11.280
But looking forward.
00:01:11.599 --> 00:01:20.560
For some reason, I don't know, as Chris and I were talking before you joined, I started thinking of Welcome Back, Kotter.
00:01:21.120 --> 00:01:27.599
You know, because this is a repeat... not a repeat, but you've been on speaking with us before, so this is not your first time.
00:01:27.680 --> 00:01:30.959
So I wanted to say, like, welcome back, you know, thank you for taking the time to speak with us.
00:01:31.200 --> 00:01:36.319
But then that tune just started bouncing around in my head, and then I said, boy, what year was that?
00:01:38.480 --> 00:01:38.799
Oh.
00:01:40.239 --> 00:01:41.439
I can't remember.
00:01:41.760 --> 00:01:51.680
Back when John Travolta was just starting out. I think it was back in the late 70s, maybe early 80s. But then I decided I want to go back to 1985.
00:01:52.480 --> 00:01:55.120
Why do I want to go back to 1985?
00:01:55.920 --> 00:02:05.200
I think we all thought... well, Chris, I don't know where you were in 1985, but I think I was one year old.
00:02:05.599 --> 00:02:07.200
I was running around in the street.
00:02:07.519 --> 00:02:12.400
I was running around in the streets, you know, playing ball, throwing those lawn darts.
00:02:12.479 --> 00:02:15.280
Sorry, did you guys have the lawn darts over there?
00:02:15.520 --> 00:02:25.439
Remember the lawn darts? The big long plastic lawn dart that had that big metal arrow at the end of it, and you threw it and it came down and stuck into the ground.
00:02:25.759 --> 00:02:26.240
Yep.
00:02:26.479 --> 00:02:27.439
I can remember those.
00:02:27.680 --> 00:02:29.120
I'm amazed no one got hurt.
00:02:29.199 --> 00:02:30.560
Like no children were ever hurt with that.
00:02:30.960 --> 00:02:32.000
I'm sure people got hurt.
00:02:32.719 --> 00:02:40.400
There was zero chance of them selling that today. Well, they sell lawn darts today, but they're all lightweight plastic.
00:02:40.560 --> 00:02:40.800
Yeah.
00:02:40.960 --> 00:02:48.400
But when I was growing up, we used to have this big plastic circle. Not a hula hoop type thing, but like a hula hoop, only smaller.
00:02:48.639 --> 00:02:59.120
And then you had this dart, they called it, that you threw up in the air, and it had to land in the circle for you to get the points.
00:02:59.360 --> 00:03:02.000
And it had a big... we'll look it up afterwards.
00:03:02.080 --> 00:03:06.960
Chris, maybe look it up and just show a picture of it when we do the editing.
00:03:08.080 --> 00:03:11.919
But it had a big, like, lead point.
00:03:12.319 --> 00:03:17.759
It wasn't even lead, but it was like a big point, and we would stand there and say, ah, you're not going to get us.
00:03:18.000 --> 00:03:21.439
And if that thing hit you in the head... today they wouldn't sell anything like that.
00:03:21.840 --> 00:03:26.560
But times sure have changed.
00:03:26.719 --> 00:03:27.360
Oh, yeah.
00:03:27.599 --> 00:03:32.000
From the times of being able to sit in the yard, run out.
00:03:32.080 --> 00:03:33.599
I remember my mother would give me a dime.
00:03:33.680 --> 00:03:37.439
I always had a dime to go to the payphone if I needed to make a phone call.
00:03:37.680 --> 00:03:44.400
And uh I remember walking up the street to go get stuff at the store and just being able to do stuff.
00:03:44.479 --> 00:03:49.520
And today we have AI: the craze, the pressure, delivered to you, man.
00:03:50.960 --> 00:03:53.599
And well, absolutely, delivered to you.
00:03:53.680 --> 00:03:59.439
I can order something at four o'clock in the morning, and Amazon will deliver it by 10 or 11 o'clock in the morning uh sometimes.
00:03:59.520 --> 00:04:02.479
We've really changed.
00:04:02.800 --> 00:04:12.879
And I think even more so now with technology, and, you know, speaking with a lot of individuals, there's also the appearance of a lot of pressure from AI.
00:04:12.960 --> 00:04:16.160
You know, we can do more, so we need to do more.
00:04:16.480 --> 00:04:17.680
And how do we do more?
00:04:17.759 --> 00:04:18.800
How do we manage to do more?
00:04:18.879 --> 00:04:20.160
We feel like we need to do more.
00:04:20.319 --> 00:04:21.680
Is that pressure real?
00:04:21.839 --> 00:04:25.360
Is that pressure self-induced or self-imposed?
00:04:25.519 --> 00:04:37.519
Or is it a combination of many forms, I don't know: self-imposed pressure, pressure from peers, pressure from information?
00:04:37.680 --> 00:04:49.519
And what I mean by that is we see a lot of, oh, this is new, you know. Now the big craze is all these "Claude bot", I'll put that in quotes, type systems that are being set up to help manage and orchestrate and do this.
00:04:49.600 --> 00:04:50.800
Everyone's coming out with one.
00:04:50.879 --> 00:04:56.000
So is it because we're exposed to it that we also feel that pressure?
00:04:56.399 --> 00:04:57.519
So I don't know.
00:04:58.000 --> 00:05:02.800
Uh there's so much to unpack in what you just said already.
00:05:03.040 --> 00:05:06.800
But I'm still stuck in 1985 by what you said there.
00:05:08.879 --> 00:05:13.600
One of my most vivid childhood memories is actually from... no, that's from 1986.
00:05:13.920 --> 00:05:21.120
I remember we listened to radio back then, as you did, in the morning and so on.
00:05:21.279 --> 00:05:35.839
And I remember the Challenger shuttle explosion, when they told us over the radio, and I remember then seeing the footage in the news or something like that afterwards.
00:05:36.160 --> 00:05:43.199
That is just one of my... I was just amazed by, first of all, this whole space thing.
00:05:43.519 --> 00:05:46.160
Like I was nine years old at the time, right?
00:05:46.240 --> 00:05:49.839
But then also... it blew up.
00:05:50.079 --> 00:05:52.800
Just, like, tragedy.
00:05:52.879 --> 00:06:01.600
Of course, I didn't understand the ramifications of it, but it was just amazing, and from then on I just wanted to know all about space.
00:06:01.759 --> 00:06:03.519
It was something I could sort of get my hands on.
00:06:03.600 --> 00:06:07.279
It was just sort of the ignition, uh if I can use that word.
00:06:07.519 --> 00:06:11.839
So yeah, the 80s, great time, man.
00:06:12.000 --> 00:06:12.240
Great time.
00:06:12.560 --> 00:06:14.000
It was a great time.
00:06:14.240 --> 00:06:15.839
I do remember that as well.
00:06:15.920 --> 00:06:22.480
I remember I was in class at school and they wheeled in the TV, because we used to watch the launches at that point.
00:06:22.720 --> 00:06:27.519
Again, back in the 80s, space exploration with the space shuttle was big.
00:06:27.759 --> 00:06:31.439
And I do remember that vividly.
00:06:31.519 --> 00:06:33.600
I think there are a lot of points in life that we go through that we remember.
00:06:34.240 --> 00:06:46.319
I remember when they used to wheel in the TV, and Chris, believe it or not, the TV was bolted to that big dolly, and they'd have a VCR or something in there too.
00:06:46.399 --> 00:06:49.839
Or, actually, I think when I was that young, we didn't even have VCRs.
00:06:49.920 --> 00:06:54.959
I think we had that when we wanted to watch a movie in class, you know, the big reel, yeah.
00:06:55.040 --> 00:06:56.240
Do you remember the reel projector?
00:06:56.959 --> 00:07:00.639
To be honest with you, that was one of my class jobs for a period of time.
00:07:00.800 --> 00:07:04.879
When we watched the movies, I would load up the projector.
00:07:05.040 --> 00:07:05.360
Yes.
00:07:10.720 --> 00:07:11.600
It meant something different.
00:07:12.399 --> 00:07:16.959
Those tube TVs, I think they went all the way to the end of the 90s.
00:07:18.240 --> 00:07:20.480
They had those tube TVs with VHS's.
00:07:20.639 --> 00:07:23.439
Um yeah, you were lucky you had VHS.
00:07:23.759 --> 00:07:26.720
Uh yeah, I don't think that was until I was almost out of high school.
00:07:26.879 --> 00:07:37.360
But there's a lot that we wanted to talk to you about today, in the world of AI, in the world of vibe coding, as well as another big addition to Business Central, which is the Payables Agent.
00:07:37.439 --> 00:07:45.920
And the Payables Agent has made some significant enhancements over the past couple of, I guess, releases, and I call them releases: major releases and minor updates.
00:07:46.079 --> 00:07:46.879
So I'd like to talk about that.
00:07:46.959 --> 00:07:48.879
Before we jump in, could you tell us a little bit about yourself?
00:07:49.199 --> 00:07:49.759
Yeah, sure.
00:07:49.920 --> 00:07:55.759
So my name is Søren, and I'm a product manager on the Business Central engineering team at Microsoft.
00:07:56.000 --> 00:08:00.000
I've been at Microsoft for 10 years come January 1st.
00:08:00.240 --> 00:08:18.000
Eight and a half of those years within the BC team, and I've moved around a bit, but always in the application workload, as we call it, sort of working on user-facing features, and now lately with the Payables Agent.
00:08:18.560 --> 00:08:36.399
I've been in the industry for almost 30 years, always doing ERP of sorts, all the way back to Axapta, so Dynamics AX, and its predecessor, Concorde XAL.
00:08:36.720 --> 00:08:40.879
So yeah, very nice.
00:08:41.039 --> 00:08:56.080
And back then, when I got my first job in an internal IT department, I got my feet wet in terms of being a consultant, an internal consultant, asking the questions: hey, why do you want that new button there on the screen?
00:08:56.240 --> 00:08:57.600
What do you use it for?
00:08:57.759 --> 00:09:01.519
What if you turned the pallet this way instead and scanned the barcode?
00:09:01.600 --> 00:09:13.679
All those kinds of questions. Learning very early on to question the why, question the desire to do customizations, which has been healthy, I think, throughout the years.
00:09:13.919 --> 00:09:15.279
Yeah, yeah, but that's that.
00:09:15.440 --> 00:09:17.840
And then um personal life.
00:09:18.000 --> 00:09:31.039
I live here in the countryside, an hour from Copenhagen, Denmark, with my wife and dog, and I have a big house, a big garden, a lot of stuff to, you know, spend time on.
00:09:31.279 --> 00:09:44.559
I play music as a, let's say, semi-professional hobby of mine, playing in a rock band, have a few gigs here and there, and yeah, just enjoy life.
00:09:44.799 --> 00:09:46.960
That's about it. That's good.
00:09:47.360 --> 00:09:48.720
In the countryside, I like that.
00:09:49.039 --> 00:09:50.000
The countryside's good.
00:09:50.159 --> 00:09:52.399
And it does, it gives... I think the countryside's good.
00:09:52.480 --> 00:09:56.879
I think it gives you the opportunity to have a little break, again, in my opinion.
00:09:56.960 --> 00:10:10.240
Some people like to have the city life and to always be exposed to stuff, but sometimes it's nice to be able to step back, remove yourself, and stare up at the sky, listen to the birds, smell the air, see the trees.
00:10:10.639 --> 00:10:11.519
Yeah, yeah.
00:10:12.000 --> 00:10:22.080
So, a lot of things going on with AI, a lot of pressure on a lot of organizations to do more with AI, you know, from a development point of view.
00:10:22.159 --> 00:10:28.879
And as I always say, a lot of people focus more on the coding portion of AI, but AI gets used in many different ways.
00:10:29.039 --> 00:10:32.480
We talked in previous episodes, even Tesla driving is artificial intelligence.
00:10:32.639 --> 00:10:44.480
Uh, from the functional consultant point of view, a lot of times there's creating documents, analysis documents, summary documents, you know, even down to a little bit of summarizing meetings and doing emails, for example.
00:10:44.720 --> 00:10:51.840
So it's funny that, you know, when we started the episode, we talked about the nostalgia, right?
00:10:51.919 --> 00:11:00.000
Like when in class you had the reel and then you had VHS, and then, you know, how did you develop in NAV when you started your career?
00:11:00.159 --> 00:11:08.320
And I think we're at a kind of pivotal point now, where maybe five, ten years from now, we'll talk about the same thing.
00:11:08.399 --> 00:11:12.720
We're like, yeah, remember before when we were trying to figure out AI, now it's part of your life.
00:11:16.080 --> 00:11:16.399
Yeah, yeah.
00:11:16.559 --> 00:11:17.759
Remember when you typed code?
00:11:18.080 --> 00:11:19.120
I think you're right, Chris.
00:11:19.200 --> 00:11:27.519
I think, I mean, looking back 10 years, we might have had a fairly good understanding of what the future would look like five years from that point.
00:11:28.080 --> 00:11:41.519
But now, who would even dare say what the future looks like in five years, technology-wise? The exponential change we've all been through the last two or three years has just been crazy.
00:11:41.600 --> 00:12:00.720
And I can only speak for myself, of course, but I can also hear from various colleagues and people in the business that it's easy to get frustrated about the pace of change, and it's easy to feel like you're left at the station and everyone else has figured it out.
00:12:00.960 --> 00:12:04.480
For me personally, a year or so ago, I felt that way.
00:12:04.639 --> 00:12:13.919
I felt like we had a mandate to build more AI into Business Central, as you know, but also the tools around us started changing.
00:12:14.000 --> 00:12:18.000
So it's not just what we build, it's also the tools we build with; they change as well.
00:12:18.080 --> 00:12:22.960
So everything in our work life is changing, both the platforms and the tools.
00:12:23.039 --> 00:12:26.960
You're used to your tools being fairly stable.
00:12:27.279 --> 00:12:42.559
I know once in a while a new version of VS Code will come out and so on, but this change is different. Everything is changing, and it feels like, or felt like a year ago at least, personally, that I was the only one who hadn't figured it out.
00:12:42.720 --> 00:13:15.200
And I think that, among many other things, pushed me towards thinking I wanted to leave Microsoft. Which I didn't, but that's what I thought I wanted. Grappling with AI, and having the feeling that I was behind, that I wasn't using it enough, that I wasn't doing good enough quality, probably because I wasn't using AI enough, blindsided me into thinking I wanted to leave. And then, for all kinds of reasons, I went on sick leave.
00:13:15.279 --> 00:13:39.120
But during that time, something remarkable happened while I was on sick leave. I needed something to be creative with, and of course I play my guitars, but I had started listening to my vinyl collection some more while I was away, and it struck me that I couldn't find an app for managing my vinyls that I was happy with.
00:13:39.279 --> 00:13:43.360
So I thought, hmm, people are talking about this vibe coding thing.
00:13:43.600 --> 00:13:46.240
Why don't I just try to vibe code it?
00:13:46.320 --> 00:13:53.919
I had no clue where to begin, so I watched some YouTube videos and, yeah, ended up vibe coding a very cool app, I think.
00:13:54.000 --> 00:14:04.159
Actually, within just one day I had something that was usable, but obviously then came scope creep, and, you know, I wanted to do more.
00:14:04.559 --> 00:14:11.679
And then after some weeks, it was published on the App Store. The App Store, not AppSource, sorry: the Apple App Store.
00:14:12.000 --> 00:14:33.759
And it was a super fun experience, and what I got from it, my key takeaway, is and was that from the moment I had an idea until I saw it realized, after some iterations, but just seeing that first attempt, was seconds, maybe minutes, right?
00:14:34.480 --> 00:14:40.159
And that brought a kind of teenage excitement.
00:14:40.320 --> 00:14:43.360
Hey, Søren, your dinner is getting cold, you know.
00:14:43.440 --> 00:14:46.320
Yeah, yeah, but I'm not ready, I'm not done yet, you know.
00:14:46.399 --> 00:14:48.240
That kind of excitement.
00:14:48.720 --> 00:14:53.600
My wife actually came in one day: hey, dinner's getting cold.
00:14:53.679 --> 00:15:05.679
And that was so fun. And, I mean, I wasn't against AI before, I just hadn't found out the great use cases for me.
00:15:06.000 --> 00:15:10.799
One thing is building it; I was all for building it for customers. But for me, where could I use it?
00:15:10.879 --> 00:15:12.639
Where would I want it to be part of my life?
00:15:12.799 --> 00:15:19.759
And at the same time, and we we can come back to this, when you open your LinkedIn, you just see AI all over the place.
00:15:20.000 --> 00:15:28.080
People who figured it out, people who sound like experts, people who you know uh talk deeply about AI models.
00:15:28.240 --> 00:15:40.159
I had no clue what model 4o mini did, like, in depth, compared to 4o. All those things I probably should know, but I didn't.
00:15:40.559 --> 00:16:02.240
And I think once I started using AI, it got demystified a little bit, and then I felt less behind, because I thought, well, I'm scratching the surface over here and getting acquainted with something I can use it for.
00:16:02.480 --> 00:16:15.759
Like my vinyl app detects images and can look up the artist, find similar albums, give me the DNA of my collection, things of that nature. That is super fun.
00:16:16.159 --> 00:16:37.519
And I think that was the moment when I was sort of converted, if you will, to someone who said, yeah, you know what, there are probably many great use cases in daily life. And obviously technology has also moved a lot since; this was six, no, four months ago, so a lot has happened since then.
00:16:37.840 --> 00:16:50.559
But the frustration was real, and I found out, after I told my colleagues I wanted to leave Microsoft, people reached out and said, hey, can we have dinner or lunch?
00:16:50.639 --> 00:17:16.960
And when I spoke to people, and I won't give away any of my colleagues here, I found out I was far from the only one who felt that way, who was frustrated, who felt behind. There's this term called pluralistic ignorance: you think everyone else has figured it out, so you don't speak up, and everyone else also thinks everyone else has figured it out, so they don't speak up either.
00:17:17.039 --> 00:17:21.920
So everyone goes around saying nothing, thinking the worst, right?
00:17:22.240 --> 00:17:33.440
And I think that's happening not just among me and my peers, but among many, many people, because you see all these experts, so-called experts, online, right?
00:17:33.599 --> 00:17:47.359
We're in a hype cycle, still trying to distinguish value from... there's this sense of urgency without direction, you could call it. That's just my take on hype.
00:17:47.440 --> 00:17:49.359
There's no direction, there's just urgency.
00:17:49.519 --> 00:17:52.720
Oh, go use AI. Now. Where?
00:17:53.279 --> 00:18:04.240
But you need to find a way to be grounded in it, in a way, so it doesn't dictate or take over your identity.
00:18:04.319 --> 00:18:07.359
Anyway, I could rant for a long time about this thing.
00:18:07.680 --> 00:18:11.279
No, no, you hit on many key points there.
00:18:11.359 --> 00:18:15.839
I think that pluralistic ignorance is true.
00:18:16.160 --> 00:18:32.960
And I can say with confidence, as you found when speaking with your colleagues, when you were at that point where you were uneasy with the use of AI and the direction you were going, or how it was making you feel.
00:18:33.200 --> 00:18:42.240
I have conversations, weekly if not daily, with many peers who feel the same exact way.
00:18:42.559 --> 00:18:51.359
And you summed it up best: we're all in a rush. I say it a little differently: we're all running, but we don't know where we're running to.
00:18:51.519 --> 00:18:53.759
And everyone's trying to run faster and faster.
00:18:53.839 --> 00:19:07.759
And as you said, with social media, and to your point, you see all these experts who've figured it out. With the rapid rate of change that we have, can anyone even really be an expert in totality?
00:19:07.920 --> 00:19:10.640
Or can I be an expert in maybe one thing?
00:19:10.799 --> 00:19:28.240
But again, with it being so new, how can someone be an expert? Their level of exposure is such a short period of time that they can't say with confidence that they really know how to tame the beast, as I'll say, or manage it.
00:19:28.480 --> 00:19:47.839
And I see a lot of people starting to feel this pressure, because there's so much hype about what it can do. And with that, how much of it is even true? Because now AI can create content, and people can create content with AI.
00:19:48.000 --> 00:19:52.960
So everything that I read now, I have to ask: is this true?
00:19:53.200 --> 00:19:54.240
Is this not true?
00:19:54.559 --> 00:20:00.799
Because we're being force-fed from a garden hose, as I say.
00:20:00.880 --> 00:20:03.279
Uh you know, this one's laying off all these people.
00:20:03.440 --> 00:20:09.119
Oh, this industry's not going to have a need for professionals any longer.
00:20:09.440 --> 00:20:17.839
It's like the end of the world is coming, and, like you said, if you're not on the train, you've already missed it.
00:20:18.400 --> 00:20:19.359
No, go ahead, Chris.
00:20:19.759 --> 00:20:26.799
Sorry, and you had mentioned trying to figure out what AI can do for you.
00:20:26.960 --> 00:20:29.759
Because you're right, we do, in our world, right?
00:20:29.839 --> 00:20:35.119
Uh where we are trying to build something for someone else.
00:20:35.519 --> 00:20:51.359
And I think a lot of us are trying to figure it out on behalf of others, and also trying to figure out where it fits for me, on not only a personal level but also a professional level, where it would improve my life.
00:20:51.519 --> 00:20:58.480
You're doing that for others, and what you mentioned hit me pretty hard, because I think a lot of people are going through that.
00:20:58.559 --> 00:21:06.319
And you also mentioned that um people don't tend to speak up, like, hey, I'm struggling, where does it fit?
00:21:06.480 --> 00:21:17.680
You know, you're trying to figure it out, but sometimes people look at you as someone who knows it better than others, and so people look at you and say, hey, where do we go, Søren?
00:21:18.000 --> 00:21:20.480
Or, Brad, where are we going with this AI?
00:21:20.640 --> 00:21:28.799
And at the same time, you're trying to figure it out yourself, but you don't want to say that, because, like, I need to help others.
00:21:28.960 --> 00:21:31.680
I think that's where a lot of us are stuck.
00:21:32.160 --> 00:21:34.079
That's so true, Chris.
00:21:34.319 --> 00:21:45.839
I've never been as uncomfortable in my professionalism, or what do you call it, in my capabilities, as I was a year ago when I felt left behind.
00:21:46.079 --> 00:21:51.440
I can come back to what I've done and where I'm at now, because I'm in a very different place now.
00:21:51.599 --> 00:22:09.200
But I think when you're expected to do something, and use tools and all these things, and have the answers to the questions, but you don't have them, you feel... I mean, that was imposter syndrome every day, and more than usual, right?
00:22:09.279 --> 00:22:33.440
Just by being at Microsoft. But I think the hype also does something else, which is it kind of tells us that if you're not all in on all these tools and models, you're not relevant. And that's a fallacy that we sort of fall for, in a sense, because it's not about knowing everything.
00:22:33.680 --> 00:22:35.599
As Brad said, no one can be an expert.
00:22:35.759 --> 00:22:39.680
This field has only existed for, what, three years now, with gen AI.
00:22:40.079 --> 00:22:47.920
You can't expect to be an expert, and maybe there's a point in that: maybe we shouldn't strive to be experts.
00:22:48.240 --> 00:23:02.319
It's all about a whole new way of thinking now, because we have this, I'll say, companion or assistant, or whatever we want to call it, that is AI, alongside us in everything that we do.
00:23:02.480 --> 00:23:04.480
So, where is it that we want to use it?
00:23:04.559 --> 00:23:12.000
And I think, first of all, again, I'm just ranting on the hype thing.
00:23:12.079 --> 00:23:20.000
There's so much, for lack of a better word, tool fetishism out there, you know. Oh, is it this model or that model?
00:23:20.079 --> 00:23:20.559
Yeah, great.
00:23:20.720 --> 00:23:24.880
And I know the improvement of the models is also part of the progress.
00:23:25.359 --> 00:23:40.240
But when you look online especially, it's easy to glorify the technology and the tools instead of talking about the problems we want to solve with them and what we want to realize, the gains we want to have from AI.
00:23:40.559 --> 00:24:07.279
So we're sort of told to use AI for everything, and that's taxing. I think we react to it in ways we don't notice, because we haven't really started integrating it into our lives, but we do react to all of this we're being told: hey, use it for everything. Coming back to the urgency without the orientation.
00:24:07.359 --> 00:24:15.599
Like if someone had told me a year ago: here are three things in your personal life where AI will just be, you know, and this is how you're going to do it.
00:24:15.839 --> 00:24:18.640
I just didn't see that, or maybe I wasn't looking for it.
00:24:18.720 --> 00:24:24.559
Maybe I did also want to sort of hide a bit from it, because maybe it would go away.
00:24:24.880 --> 00:24:31.519
I wasn't hoping AI would go away, don't get me wrong, but it's just one of those things when you have cognitive overload.
00:24:31.759 --> 00:24:38.640
Um, so I think part of it is because when it's unknown, it's scary by definition.
00:24:38.799 --> 00:24:40.960
That goes for anything, not just AI.
00:24:41.440 --> 00:24:48.720
And it's compounded by the fact that everyone else seems to have figured it out, because either they say nothing or they're part of the expert group.
00:24:49.039 --> 00:24:53.440
And yeah, I just think we shouldn't strive to be the expert.
00:24:53.519 --> 00:24:56.640
I mean, we're still in the... yeah, yeah, go ahead.
00:24:57.039 --> 00:24:59.359
I've got a question for Brad and Søren.
00:24:59.839 --> 00:25:23.440
Do you feel that the pressure of knowing AI, because of your profession, because you are a consultant, you are a developer, is more on you than on others, people it maybe doesn't really affect every day? Do you feel more pressure because of your profession?
00:25:24.640 --> 00:25:26.559
Yes, definitely.
00:25:26.799 --> 00:25:37.839
I mean, I can only speak from my own perspective, in the job that I have, in the company that I work for, where we want to win the AI race.
00:25:38.079 --> 00:25:46.160
I mean, it doesn't matter if I close my eyes, turn my head, and open my eyes again: there will be AI.
00:25:46.319 --> 00:25:51.039
There will be a conversation about AI, there will be AI for breakfast and lunch and dinner.
00:25:51.119 --> 00:25:53.200
It's it's all AI.
00:25:53.759 --> 00:25:55.839
Uh and that's just the nature of it.
00:25:56.319 --> 00:25:57.519
I mean, that's what I mean.
00:25:57.599 --> 00:26:08.160
I mean, my wife, who does leatherwork by hand, she has no interest whatsoever in AI.
00:26:08.240 --> 00:26:15.200
We have conversations at the dinner table, and she thinks it's exciting, you know, but it does nothing for her.
00:26:15.440 --> 00:26:18.559
Absolutely not. And she doesn't feel left behind, by the way.
00:26:18.799 --> 00:26:26.880
I agree with what you say, and I was going to bring up that point. And to answer your question, Chris:
00:26:27.200 --> 00:26:28.640
100% absolutely.
00:26:28.799 --> 00:26:32.400
And I think there's a lot of points, Soren, to what you were talking about.
00:26:32.640 --> 00:26:37.920
There's a lot of pressure to use AI, but for what?
00:26:38.240 --> 00:26:40.079
You should use AI, and you'll like it.
00:26:40.400 --> 00:26:52.960
So the focus, almost subconsciously, shifts from "what problems are we solving?" to "we have to use AI, how are we going to use AI?", and then we'll solve the problems after.
00:26:53.039 --> 00:26:57.519
Instead of doing what we traditionally did, which was: let's take a look at the problem and determine how to solve it.
00:26:57.680 --> 00:27:02.799
And Chris, to your point about the pressure, Soren, your example was uh perfect.
00:27:03.039 --> 00:27:12.720
I think within our field, we're in, I don't want to say a microcosm, but all of our peers are in this tech space, this tech space that's saturated with using AI.
00:27:12.799 --> 00:27:21.440
We need AI in our ERP software, we need AI in our development environments, we need AI in our email, in our office suite of products, we need AI.
00:27:21.599 --> 00:27:22.799
That's all we see.
00:27:23.119 --> 00:27:31.039
But if you take a step away and talk with someone, I've had the same conversations with others outside, and they say, Oh, what is this AI thing?
00:27:31.200 --> 00:27:31.359
Yeah.
00:27:31.599 --> 00:27:43.519
So they hear about it and they hear about the stuff that it's doing, but they don't have that same pressure, I think, because their level of exposure is different than ours.
00:27:43.599 --> 00:27:50.640
So as you said, I can't even go a day, and I can try, without hearing or seeing the word AI.
00:27:52.480 --> 00:27:55.440
But I can't even last maybe two minutes.
00:27:55.680 --> 00:27:56.000
Right?
00:27:56.240 --> 00:28:09.920
Within every two minutes, I'm going to be fed something with AI, whether I'm working, because we have AI in every single application that we use, or any news feed or any tech news or anything that I listen to.
00:28:10.079 --> 00:28:14.559
So I think 100% yes.
00:28:15.119 --> 00:28:27.519
Yeah, you're both right that anybody that's in this microcosm that we have, and I saw some statistics before that talked about this, like how much exposure people have to AI.
00:28:27.599 --> 00:28:35.279
And you know, I don't even want to go into the numbers, but it's a very small percentage of people that have the exposure to AI that we do.
00:28:36.000 --> 00:28:36.960
We think that everybody's in.
00:28:37.119 --> 00:28:39.920
And it goes to your point, which I wanted to kind of jump back to.
00:28:40.000 --> 00:28:50.880
Um, your vibe coding story is amazing as well, but to jump back to the first portion of it, when you felt left behind, because uh we know that people are in the same uh position.
00:28:51.039 --> 00:29:02.319
Or many people, not all people, but many people are in the same position of feeling left behind, feeling the imposter syndrome, feeling like they're missing out and they're not on the train.
00:29:03.359 --> 00:29:10.400
And you took some time off and you've done some things, and you're in a much better spot about all that now.
00:29:10.559 --> 00:29:23.440
What are some of the things that you realized along the way, or some of the things that you have done along the way to help you not feel so much pressure and feel so left behind?
00:29:23.920 --> 00:29:25.359
That's a great question.
00:29:25.599 --> 00:29:37.680
I think it started with me taking time off, first of all, like making my world much smaller uh and ridding myself of all the information overload of various kinds.
00:29:37.839 --> 00:29:38.640
That was one thing.
00:29:38.799 --> 00:29:41.599
So I I I needed that to get my head straight.
00:29:41.839 --> 00:29:49.039
Um, but also, I mean, then this spark ignited something, like with the vibe coding.
00:29:49.200 --> 00:29:56.240
I think uh it also enlightened me a little bit in terms of what could be done, what's actually possible.
00:29:56.400 --> 00:29:58.880
I've been using Copilot, of course, in my daily life.
00:29:59.200 --> 00:30:04.799
Uh it was not Copilot I used to vibe code, though, I used um I used Claude.
00:30:05.279 --> 00:30:11.279
Um but, I mean, let's not dive into models.
00:30:11.519 --> 00:30:35.680
Um but I think what I then found out, when I also figured out I needed to get back to work, was I needed to find a way where I could sort of shield myself from the hype and from the pressure, and find a way to be okay with missing out.
00:30:35.839 --> 00:30:41.920
So I was reframing the whole thing: oh, what's expected of me, what kind of professional should I be?
00:30:42.079 --> 00:30:44.720
Uh am I losing my job if I don't do this well?
00:30:44.960 --> 00:30:48.079
How do I stack up to my peers and all these things?
00:30:48.640 --> 00:31:16.799
I let that sort of flow through me, recognized that that was the feeling I had, but I had to reframe it another way and sort of say, you know what, there are probably things that I do really well already where AI can make me shine, or like compound the output or the quality of the output somehow, instead of trying to invent all new things that I hadn't done before in my work life.
00:31:17.039 --> 00:31:26.400
So I think that was where it started, trying to identify those places where uh I could be a great sparring partner for AI and vice versa.
00:31:26.559 --> 00:31:44.160
So it could be this uh compounding force or magnifier, if you will, for my capabilities, where I already was sort of very good, instead of trying to find my weak spots and have AI sort of close those gaps.
00:31:44.480 --> 00:31:55.680
And I think just by doing that alone, you don't have to think about all the other things where people say, Oh, you should use AI for this and this and this.
00:31:55.920 --> 00:31:56.319
Great.
00:31:56.559 --> 00:32:03.680
Do it at your own pace, and notice the small gains, or maybe even large gains.
00:32:03.759 --> 00:32:10.720
I mean, I can share my list in a few minutes of what I use AI for in my work life, uh, because it has changed my work life.
00:32:10.799 --> 00:32:17.759
And when I came back to work, I also promised myself I don't want to go back to work being the same.
00:32:18.000 --> 00:32:23.279
Well, of course, obviously I'm the same person, but I don't want to uh work the same way as before.
00:32:23.519 --> 00:32:29.920
If I find these repeatable tasks or things that I think I can use AI for, I will do it.
00:32:30.240 --> 00:32:45.759
So I think, and excuse the pun, try to keep the agency with yourself, uh take control over what you want AI to mean for you, because it's different for everyone.
00:32:46.000 --> 00:32:50.240
The three of us will have different interpretations of what AI should do for us.
00:32:50.400 --> 00:33:05.839
I think there's a conversation we can have about, in an organization and in society, if the promise of AI holds true about the level of productivity and the productivity increases, what will we spend that time on?
00:33:06.000 --> 00:33:23.279
And Chris, you created an interesting blog post about this thing, about the cognitive load and the expectancy of, you know, oh, if AI can make you work faster, then uh will your leadership, or the people around you, now suddenly expect more output?
00:33:23.519 --> 00:33:32.880
So there's definitely an upcoming question that leaders should ask themselves first and foremost, because it's a leadership discussion.
00:33:33.119 --> 00:33:39.200
Uh, what is value in our organization?
00:33:39.519 --> 00:33:44.880
Is the value the quality of the output or is it the amount of output?
00:33:44.960 --> 00:33:56.880
And it could be tempting to say, oh, we just produce X times more, because there's probably a KPI you can measure it with, and that's uh intriguing, and you know, it's it's it's kind of tempting.
00:33:57.200 --> 00:34:13.360
But maybe we should let the freed-up time be spent on better decision making, uh, you know, lowering the employee anxiety and you know, build better and more coherent products and take more time to do strategy work and things of that nature.
00:34:13.679 --> 00:34:16.880
Uh and also because these changes impact people.
00:34:16.960 --> 00:34:22.320
So we can't just fool ourselves into thinking that oh, you you free up two days per week.
00:34:22.400 --> 00:34:24.719
Now we just produce two days more work.
00:34:24.800 --> 00:34:26.159
Like you that doesn't work.
00:34:26.320 --> 00:34:33.760
You'll create entropy over time if you do that, because people cannot, there's still a human factor, there's still a cognitive load of all the things we do.
00:34:33.920 --> 00:34:45.679
So I I don't have the answer, but I think we need to defend the right to some slowness in our lives and in our professional work.
00:34:45.840 --> 00:35:01.679
So personally, and I don't know if I should be saying this in this podcast for the world to hear, but I'm using AI and the productivity increases to buy myself some time for deep thought, to create myself some buffer in daily life.
00:35:01.920 --> 00:35:07.840
Uh things that used to take me a day, maybe now it takes two hours, maybe it takes 30 minutes.
00:35:08.000 --> 00:35:09.599
Oh, now I've gained the day.
00:35:09.760 --> 00:35:11.360
Uh, what will I use that for?
00:35:11.519 --> 00:35:16.880
It's easy to be consumed and just start producing more, but do I produce better things?
00:35:16.960 --> 00:35:26.960
So I think the quantity versus quality question is something that every organization should, you know, have a deep conversation about before they just impose it on their employees.
00:35:27.280 --> 00:35:36.960
Anyway, that was a long sort of rant about you know what will we use the time for, but I have a lot of things I can tell you about what I use AI for if you're interested in that as well.
00:35:37.519 --> 00:35:38.559
I'm I'm interested.
00:35:38.719 --> 00:36:02.480
Don't think that you're going on too long; the stuff that you're saying is powerful, because I'm even reflecting as I'm listening to it, reflecting upon my own daily interactions, how I use AI, how I have been progressing with AI, and my shift with AI.
00:36:02.559 --> 00:36:08.800
And I think you hit on some key points: don't run just to run, basically; run with a purpose, right?
00:36:08.960 --> 00:36:13.679
So we're all running around with these nice, sharp, cool scissors, but we don't know where we're running to.
00:36:13.920 --> 00:36:19.920
Let's use these tools and determine what our output and our value will be.
00:36:20.000 --> 00:36:21.440
It's all value-based.
00:36:21.679 --> 00:36:32.159
It's not necessarily how much we can produce, because if you produce a lot and it's not quality or it's not of any value, then we uh have overproduction, right?
00:36:32.239 --> 00:36:46.719
And then production will exceed consumption, and then we're going to have another problem, uh or another feeling that we get when we overproduce and we don't get the consumption that we expect, uh, which will produce another issue.
00:36:46.960 --> 00:36:59.519
But it's powerful, and uh with it being so new, we're going through a transformation, and the transformation is far more accelerated than any other transformation that we've gone through.
00:36:59.760 --> 00:37:11.360
And to be honest, we're always in a state of transformation because ever since uh man uh was on the planet and we started inventing the axe, so we invented you know metalogy.
00:37:11.840 --> 00:37:12.800
Metallology?
00:37:12.880 --> 00:37:14.800
I can't even say these words sometimes.
00:37:15.039 --> 00:37:24.159
Um we've been transforming in how we use these tools to become more efficient, right?
00:37:24.320 --> 00:37:28.960
I guess you could say we want these tools to make it easier for us to do work.
00:37:29.119 --> 00:37:50.159
And here we have these tools that are transforming daily, and just by the time I think I understand how to use Visual Studio Code with, you know, a Copilot instruction, then they come up with a Copilot agent, then they come out with a Copilot skill, then they come up with a Copilot task.
00:37:50.239 --> 00:37:57.039
And it's like every day, when I finally feel I understand and have learned how to use this, now there's something else.
00:37:57.440 --> 00:37:57.840
No book.
00:37:58.079 --> 00:38:02.880
So we don't have the rate of adoption as well.
00:38:02.960 --> 00:38:06.320
Uh uh that's another topic we'll get into, but how are you using it in your daily life?
00:38:06.480 --> 00:38:06.960
Okay, go ahead.
00:38:07.360 --> 00:38:16.880
Before you go on, Soren, you had mentioned that uh, you know, uh perhaps we should celebrate slowing down.
00:38:17.039 --> 00:38:25.519
Because right now, as a lot of business leaders or owners, uh we, you know, this new tool, we could produce more, right?
00:38:25.679 --> 00:38:29.760
Try to get as much as we can out of someone's eight-hour day.
00:38:30.079 --> 00:38:32.960
And you hit it perfectly well.
00:38:33.119 --> 00:38:36.480
That we should encourage slowing down.
00:38:37.199 --> 00:38:42.000
We should be encouraging slowing down a little bit, because we shouldn't just go go go.
00:38:42.159 --> 00:38:49.519
Because you're right, it burns people out, especially when you're sprinting, you know, a thousand miles per hour, you're gonna wear out somebody.
00:38:49.679 --> 00:38:57.360
Even though you have more tools, you have more things to to to expedite some of your uh what you're trying to produce.
00:38:57.599 --> 00:39:03.840
But it goes back to, you know, the measure twice, cut once kind of thing, where it's like, I gotta take my time here.
00:39:03.920 --> 00:39:08.079
I gotta make sure that whatever I'm measuring, whatever I'm building, I understand.
00:39:08.320 --> 00:39:12.400
And then I need to measure it twice and then cut once.
00:39:12.559 --> 00:39:13.760
And I think we're forgetting that.
00:39:13.840 --> 00:39:18.559
I think we're just cut, cut, cut, here's pieces to the thing that I'm trying to build.
00:39:18.719 --> 00:39:21.519
And then we find it's like, ah, it doesn't quite fit there.
00:39:21.760 --> 00:39:25.440
Maybe it's not the right you know type of material that I'm using to build.
00:39:26.000 --> 00:39:36.480
Doing something because we can, not because we have to or need to, is what we end up doing: we do a lot of stuff, but we don't know why we're doing that stuff.
00:39:36.960 --> 00:39:39.920
And it it's crazy.
00:39:40.719 --> 00:39:41.280
It's true.
00:39:41.360 --> 00:39:42.719
And Chris, you have a good point.
00:39:42.880 --> 00:39:58.880
Like, um, because it's also a risk when we move that fast, when we move faster than we can actually cognitively process, there's such a high risk that we make the wrong decisions, and we should always make our own decisions, right?
00:39:58.960 --> 00:40:26.960
So AI is a tool that can help us uh with input to our decision making, but we should be the ones making the decisions. And we can't necessarily make faster decisions; we get more input faster, but you still need to think about things, you still need to strategize, you still need to uh consume and digest if you've had discussions with the AI, for example, if that's what you do.
00:40:27.119 --> 00:40:40.719
So I can say, for me, one of the things I just uh got here recently: I built a small agent that I now consider my sort of PM colleague, if you will.
00:40:40.880 --> 00:40:46.079
It's kind of a buddy, it's kind of a sparring partner that knows all about the product or knows a lot about the product.
00:40:46.159 --> 00:40:56.639
I fed it stuff as well; it knows about the ecosystem of partners, it knows about my kind of decision making, but it's also that neutral voice that grounds me somehow.
00:40:56.719 --> 00:41:12.159
So if I have a topic I want to discuss with it, like should I build this feature or not, for example, uh it's a really, really great sparring partner, and it's been building up a great framework for decision making.
00:41:12.239 --> 00:41:22.320
Say, hey, list me the pros and cons of building this feature, and I can actually have great conversations with it in lieu of that person, you know.
00:41:22.480 --> 00:41:35.440
I could do it with my colleagues as well, but there's so much in that kind of agent to enrich those conversations and that decision making, and you could do it with Copilot even, right?
00:41:35.519 --> 00:41:53.519
But that I find uh incredibly valuable, and also just the speed of things, because it can, you know, chew on stuff; I can provide input to it and it will do its thing while I do other things, and then when it's ready we can have a conversation about it.
00:41:53.599 --> 00:42:01.840
And I find it great because I of course have biases, as we all do.
00:42:02.159 --> 00:42:06.559
I think I know my customers and the personas that I serve through the product.
00:42:06.800 --> 00:42:11.920
I know partners and uh through relationships as we have.
00:42:12.559 --> 00:42:19.679
I have a view of the world that's been biased and you know pivoted by all kinds of things.
00:42:20.159 --> 00:42:32.079
But this PM buddy of mine uh helps keep me grounded sometimes, or let's say challenges my assumptions.
00:42:32.239 --> 00:42:38.960
And when you use that purposefully, it can be really, really powerful, and it can improve decision making.
00:42:39.039 --> 00:42:52.400
And this kind of work that I would like to think I was good at already, with this in hand, it's a bit like Gandalf looking at the ring and saying, Don't tempt me, Frodo. Through me, it would wield a power too terrible to imagine.
00:42:52.480 --> 00:42:55.119
I see myself a little like that.
00:42:55.199 --> 00:43:04.800
You know, AI used right can be like a times-50 uh magnifier of whatever power you have, like a multiplier of what you have already.
00:43:05.039 --> 00:43:12.000
But obviously, there's a risk, there's a down, you know, downside or potential downside because you need to use it purposefully.
00:43:12.239 --> 00:43:27.280
And for me, for example, and I would like to say on behalf of my colleagues, we make our own decisions, we don't treat the output as truth, and uh there's no such thing as "because AI said so."
00:43:27.440 --> 00:43:29.280
Like that that doesn't exist.
00:43:29.440 --> 00:43:35.440
It's an enrichment to all kinds of things like conversations, prioritization suggestions.
00:43:35.599 --> 00:43:37.920
Like, I have these three things.
00:43:38.159 --> 00:43:41.679
I know what I want to build first, prove me wrong.
00:43:41.760 --> 00:43:54.880
You know, just those kinds of things; just the output you get from an agent, you get amazed by what it knows and why and how it reasons.
00:43:55.039 --> 00:43:59.760
And it's actually like talking to a colleague, that colleague that brings some ideas that you haven't thought about, say, oh.
00:44:00.239 --> 00:44:01.760
I hadn't thought about it that way.
00:44:02.159 --> 00:44:04.320
And um yeah.
00:44:04.559 --> 00:44:15.679
So yeah, there are many of these kinds of things that help with actual product thinking and just uh thinking about what we can do for our customers.
00:44:15.920 --> 00:44:20.159
Then there's things like analyzing telemetry.
00:44:20.480 --> 00:44:44.320
Like, we sit on a huge bunch of data where, until very recently, it was incredibly hard, if not impossible, to find uh patterns, uh, you know, like find what is true for a certain uh demographic of customers, or the patterns in what they use.
00:44:44.639 --> 00:44:55.039
Oh, it seems that when customers create a uh purchase invoice, they always go through these eight clicks, you know, just simple things like that.
00:44:55.199 --> 00:45:04.000
You would think it's obvious, but when you get it visualized, which is the powerful thing now with AI, you can really start to see some patterns that were hard to see before.
00:45:04.320 --> 00:45:13.440
Um then, of course, there's all the daily stuff that everyone with a Copilot license can probably do, like have it manage your calendar.
00:45:13.519 --> 00:45:29.519
So for me, I automatically follow every meeting, uh, or follow every meeting that I get invited to after lunch, because right now I'm still sort of recuperating after my sick leave, so I don't want to just accept the meeting, but I'll follow it so I can catch up, and so on.
00:45:29.760 --> 00:45:35.039
Uh I automatically uh accept meetings if they're if they're from specific people.
00:45:35.199 --> 00:45:38.719
So I tell my Copilot agent that, you know, there are certain rules.
00:45:38.880 --> 00:45:44.559
These rules, by the way, do not live in your Outlook rules, like the list of rules.
00:45:44.719 --> 00:45:50.079
It's the Outlook agent where you tell Copilot what you want, how you want it to handle your calendar.
00:45:50.159 --> 00:45:51.599
It's a very cool feature, by the way.
00:45:52.000 --> 00:45:52.639
Where do you do it?
00:45:56.639 --> 00:45:57.280
No, no, I'm sorry.
00:45:57.360 --> 00:45:57.679
Do you do that?
00:45:57.840 --> 00:45:58.559
This is within Outlook.
00:45:59.519 --> 00:46:01.039
Within Outlook, there's an Outlook agent.
00:46:01.199 --> 00:46:01.519
Okay.
00:46:01.760 --> 00:46:03.119
But just for clarification.
00:46:03.280 --> 00:46:04.800
So yeah, yeah.
00:46:05.280 --> 00:46:05.920
Yeah.
00:46:06.639 --> 00:46:12.880
And about meetings, one of my favorite features is the audio recap uh feature of Teams.
00:46:13.199 --> 00:46:24.559
Like you get this podcast vibe of what happened in a meeting uh when you walk the dog or you know, uh on the on the rowing machine in the morning just to catch up on stuff.
00:46:24.719 --> 00:46:26.079
That's I I love that.
00:46:26.239 --> 00:46:46.400
And yeah, I could go on, but there are lots of improvements in tools, almost like mini AIs for specific tasks that you want, which surprisingly a lot of people have not turned on, or at least haven't gotten the most out of.
00:46:46.480 --> 00:47:04.480
But small little, uh, was that Atomic Habits, Brad? Where small little improvements every day, and you're just trying to kind of understand how this works, and you just have to apply it every day and get, what do you say, one percent better than the prior day, and by the end of the year you're way, way above everyone else.
00:47:04.800 --> 00:47:06.559
Yeah, you're more than better.
00:47:06.880 --> 00:47:09.920
Changing habits it it's important.
00:47:10.000 --> 00:47:45.280
I think there's a lot to it, but to summarize some of the things you were saying, or at least what I take from it, and I like to do that just because it's how I process what we're discussing: embrace it and find practical ways for you to use it in your daily life. Then it also helps you be creative about how you can use it to help solve problems for customers and other people, instead of feeling the pressure to use it. Use it and apply it, and then it helps you know how to apply it for others to solve their problems.
00:47:45.679 --> 00:47:57.599
And Chris, to your point, a lot of individuals don't turn this stuff on because I think the rate of change is so fast that sometimes it's difficult to adopt it.
00:47:58.800 --> 00:48:04.880
Meaning, how do you educate individuals to use these features, and then how to use these features?
00:48:05.039 --> 00:48:15.199
I know personally, with me, when this whole new thing of Copilot instructions came out, and skills and agents, I was a little overwhelmed with how do I create these?
00:48:15.519 --> 00:48:15.840
Right?
00:48:15.920 --> 00:48:25.440
So I felt more pressure of, I should know how to create these, I should know how to use these, and then one day I started working with them, like, oh, I'll just create something very simple.
00:48:25.679 --> 00:48:28.800
I had Copilot help me do it.
00:48:29.039 --> 00:48:30.559
But now I create these things all the time.
00:48:30.719 --> 00:48:33.280
I'm creating prompts, I'm creating this with the help of copilot.
00:48:33.360 --> 00:48:39.119
So it helped me adopt and learn how to use it, because I had a practical example of what I wanted to do.
00:48:39.360 --> 00:48:45.039
And the first thing I started with, to be honest with you, was a simple little command for committing code to a repo, right?
00:48:45.280 --> 00:48:58.159
And it was just so small and something that you could do so simply, you know, with me uh doing development manually, but that small building block was the foundation for me to start to create.
00:48:58.559 --> 00:49:10.320
I have different agents, like you, Soren, that do many different things for me on a day-to-day basis, and I have other instructions that I was able to grow into and feel less overwhelmed by.
00:49:10.559 --> 00:49:13.599
I need to have all these, I need to have all these if you follow what I'm saying.
00:49:13.760 --> 00:49:20.559
So I I think it's that rate of change and then trying to solve a simple problem.
00:49:24.960 --> 00:49:27.679
I think we just have to try. I think you're absolutely right.
00:49:27.840 --> 00:49:34.320
Just start using it, but choose where you use it, and take all the low-hanging fruit that you can.
00:49:34.480 --> 00:49:38.559
Because as you said, Brad, you you you grow into it.
00:49:38.800 --> 00:49:48.719
Learn how to be with AI without letting it distort your priorities or your identity and judgment, and you'll grow from there.
00:49:48.880 --> 00:49:56.239
Um, I mean, I can go to the hardware store and get confused and overwhelmed by all of the tons of tools.
00:49:56.400 --> 00:49:58.880
I have no idea what what they do.
00:49:59.199 --> 00:50:02.719
I just know that for some purpose they're great.
00:50:02.960 --> 00:50:04.079
I don't know how to use them.
00:50:04.159 --> 00:50:05.760
I probably don't even know they exist.
00:50:06.159 --> 00:50:15.280
But the difference there is that no one in my social media feed is, uh, you know, telling me that I should go and look at all these tools right now.
00:50:15.519 --> 00:50:16.960
That's the only difference.
00:50:17.199 --> 00:50:21.760
I mean, so don't let the world fool you into thinking that you're falling behind.
00:50:21.920 --> 00:50:25.119
If you're using it right now, you're already on a on sort of a great track.
00:50:25.199 --> 00:50:27.679
Don't don't uh don't despair.
00:50:28.400 --> 00:50:30.079
Quick question to you, Soren.
00:50:30.239 --> 00:50:34.239
You had mentioned the Copilot summary, but in audio mode.
00:50:34.559 --> 00:50:44.480
And um, you know, uh some of those features, I think, were available in Gemini, where you take some of the notes and things like that.
00:50:44.960 --> 00:50:46.000
That alone, right?
00:50:46.079 --> 00:51:09.440
When you're going for a walk, not everyone wants to read the summary, maybe it's too long, but having that audio played back to you as if you are listening to a podcast, it's fantastic, because you can almost hear it from the voices, I guess, like hearing the different tones of how you typically would read something.
00:51:09.679 --> 00:51:14.880
That in itself, sometimes your brain is maybe better with that than just trying to read it.
00:51:14.960 --> 00:51:18.559
And I'm like, oh, I gotta try to figure this out at the same time.
00:51:18.960 --> 00:51:19.519
Absolutely.
00:51:19.599 --> 00:51:29.039
And I will say about voices, like AI voices, I mean, the improvements we've seen there just in the last six months are just incredible.
00:51:29.280 --> 00:51:37.119
Uh I as part of me coming back to work, I wanted to set myself a challenge also to ramp myself up on some AI skills.
00:51:37.199 --> 00:51:43.599
So I've been studying for some of these certification exams, and I still have one more to go.
00:51:43.760 --> 00:52:04.719
Um and I wanted to study for those not by reading all the material; I heard about this guy who studied for an MBA just by running and listening to a quote-unquote podcast he made using Google's NotebookLM, feeding it all of the source material.
00:52:04.960 --> 00:52:11.679
Yeah, and I thought, well, if he can do that and, you know, take an MBA, I could do these exams.
00:52:11.920 --> 00:52:13.039
And so far, so good.
00:52:13.119 --> 00:52:20.159
I mean, it's so fun because it's really this engaging conversation; it's not the synthetic AI voice that we've been used to hearing.
00:52:20.400 --> 00:52:24.320
It's really a real conversation, and it works really, really well.
00:52:24.800 --> 00:52:33.440
Uh, and I want to encourage people listening to this to go try Copilot Labs and go try the Copilot audio expressions in there.
00:52:33.760 --> 00:52:47.519
It's so fun to fool around with; you get these neural voices that come from AI Foundry, and they're much more natural-sounding voices than um what you're used to in other places.
00:52:47.760 --> 00:53:00.480
So I think we're going in a direction where, and I think uh we had this conversation maybe over a year ago, you take AI as a tool that kind of summarizes into text and things like that.
00:53:00.719 --> 00:53:02.960
Eventually we want to make it more human.
00:53:03.119 --> 00:53:04.880
And so now we're doing the audio, right?
00:53:04.960 --> 00:53:06.559
Where it's just like having a conversation.
00:53:06.639 --> 00:53:16.320
I think with NotebookLM from uh Google Gemini, you can have two people having a conversation about it, and you can even have them do pros and cons.
00:53:16.719 --> 00:53:29.840
And then, you know, those things are becoming so human-like when I'm listening to two people uh based on the thoughts or maybe the notes or sources that I'm referring it to, and it's like, can you battle it out so that I can have a better understanding?
00:53:29.920 --> 00:53:35.679
And I'm like, Oh, yeah, I can see both points now, and then I can make a better decision.
00:53:35.760 --> 00:53:36.400
It's crazy.
00:53:36.480 --> 00:53:42.480
Now I'm you know, maybe one day we'll see uh avatars, just having a conversation.
00:53:42.800 --> 00:53:43.599
Uh uh true.
00:53:43.679 --> 00:53:55.679
I mean, and I think this point about making it seem more human, and we all know that it's not a human, it's still a machine, but it's still so valuable to be able to have.
00:53:55.920 --> 00:54:05.840
I mean, when I drive to the office and back, it's an hour each way, and I very often just have a conversation with, for example, ChatGPT about some topic.
00:54:06.000 --> 00:54:14.000
And yesterday, for example, I wanted to um no two days ago, I wanted to sort of catch up on the whole financial crisis.
00:54:14.159 --> 00:54:15.199
Uh, don't ask me why.
00:54:15.280 --> 00:54:18.000
I was just I really wanted to understand why did it happen.
00:54:18.320 --> 00:54:21.519
Like now it's 18 years ago, right?
00:54:21.599 --> 00:54:30.480
But I wanted to understand, okay, who were the actors, who were the stakeholders; there were the banks, there were the mortgages, there were the, you know, investors, and all these things.
00:54:30.639 --> 00:54:31.519
Why did it happen?
00:54:31.760 --> 00:54:36.719
And the cool thing, I mean, for me, it's just a great learning experience.
00:54:36.960 --> 00:54:50.079
You know, after listening to all of it, asking questions back and forth, I tried to say, okay, now quit uh quiz me on this topic, ask me 10 questions to check that I've understood it correctly, as you can also do in Notebook LM, right?
00:54:50.239 --> 00:55:04.880
And it's just a fun way to learn, you know, pick your topic and you can just learn about it, of course, with all the caveats that it might be incorrect and all these things, but right, but but but but the way you interact and the way you can get your information is just fantastic.
00:55:05.119 --> 00:55:18.559
So um, yeah, I'm very optimistic now, since I sort of found my way, and I try to not feel pressured by having to learn it all or produce more.
00:55:18.719 --> 00:55:24.079
I don't know, maybe at some point the organization will say, Sorin, you should produce 10x more, but we're not there yet.
00:55:24.159 --> 00:55:33.119
I think we want to research I think we want to use it for for other quality work that we never got around to do because we didn't have time for it.
00:55:33.280 --> 00:55:35.280
That's also value in that, right?
00:55:36.079 --> 00:55:38.719
So there is value in that.
00:55:38.800 --> 00:55:42.639
It's it's all those projects I wanted to do that I couldn't do that I didn't have time to do.
00:55:42.719 --> 00:55:44.880
I feel like I should be able to do what I want to do.
00:55:44.960 --> 00:55:49.599
So it almost adds an additional uh layer of pressure as well, too.
00:55:49.840 --> 00:56:01.920
Um but it is, it's it's the the technology and the examples that you're given are real-world examples of this is how you can use it, which again will help alleviate some of the pressure.
00:56:02.079 --> 00:56:09.440
Again, studying for an exam or giving it content, or even having a sounding board sometimes is good too.
00:56:09.599 --> 00:56:22.559
Uh, like you said, it's uh you feel like you're talking with a peer, and uh you don't have the inhibitions that you may have with another peer because you may feel a certain way to talk about um what we had talked about at the beginning of it.
00:56:22.719 --> 00:56:32.960
So uh there are some very good uses for it, in addition to just saying, hey, use it, you don't know what to use it for, you're you know, you're behind.
00:56:33.199 --> 00:56:38.800
But like you said, just find something simple that you want to solve, even if you can do it yourself manually.
00:56:38.960 --> 00:56:44.159
And when I say manual, like it seems like such a uh rudimentary task that why would I want to waste time with AI on it?
00:56:44.400 --> 00:56:49.039
But it's how we learn and progress and just give you the the foundation for it.
00:56:49.360 --> 00:56:59.920
Absolutely, and I think even though AI is everywhere and we're being told everywhere to use it, still somehow there's an inspiration deficit.
00:57:00.159 --> 00:57:21.119
Like we need to be inspired, we need to see how other people are using it, like in real life, and you would think there's a lot of that on social media, but it seems to be more around the technology and not so much what you use it for, and uh so we need to show each other what you can actually use it for. That's telling, by the way.
00:57:21.440 --> 00:57:32.880
To me, that's telling because everyone's talking about the technology and how great it is and what it's doing in uh micro forms versus saying this is what I've done with it.
00:57:33.039 --> 00:57:36.239
Yes, this is how it's made my life better.
00:57:36.400 --> 00:57:49.519
And when I say my life better, it could be solve a customer's problem, made made things a little bit easier for a customer that I was working with, made things a little bit easier for me, studying for an exam, planning a fitness workout, planning this or doing that.
00:57:49.840 --> 00:57:50.880
I agree with you.
00:57:51.039 --> 00:57:59.679
I see so much of, oh, this model versus this model, we go through this with these tokens and all this other stuff, but the reality is, why does that matter?
00:58:00.000 --> 00:58:01.119
I'm not saying it doesn't matter.
00:58:01.199 --> 00:58:03.039
I'm just saying, tell me why it matters.
00:58:03.280 --> 00:58:04.800
Yeah, it's only women, yeah.
00:58:04.960 --> 00:58:08.400
Yes, yeah, tell me why that matters versus it's just better.
00:58:08.639 --> 00:58:09.119
Yeah.
00:58:09.440 --> 00:58:10.159
Exactly.
00:58:10.480 --> 00:58:22.000
And I also want to just uh on the whole thing about you using AI and and and the whole and I mean if there's a two words we all know by now that will become sort of a skill for the future is is the prompt engineering, right?
00:58:22.159 --> 00:58:23.679
Like how to write a good prompt.
00:58:23.840 --> 00:58:39.840
But even then, just I mean, if you don't know how to write a good prompt, ask Copilot, ask ChatGPT, hey, uh give me a good prompt for this and this and this, and then it will give you really good inspiration for how to write a good prompt and how to make it much more concise and grounded and so on.
00:58:39.920 --> 00:58:44.719
So yeah, just just uh I I learned that way too way too late myself.
00:58:44.880 --> 00:58:47.920
Uh just a quick note.
00:58:48.079 --> 00:58:58.480
Uh you had mentioned, you know, when you're working with Copilot or AI, being able to have a conversation with it, it's typically just between the two of you.
00:58:58.559 --> 00:59:03.760
And I think that's still true with the tools right now, with its current um capabilities.
00:59:04.079 --> 00:59:11.599
But um you know, we talked about Gemini Notebook LM, and Copilot also has a notebook as well.
00:59:12.000 --> 00:59:26.800
And one of the things that maybe, perhaps, I don't know, uh, Sorin, maybe it's something you guys do at Microsoft, but organizations, partners, ISVs, or whomever, like a consultant, you could do collaboration with Copilot Notebook as well.
00:59:26.960 --> 00:59:35.519
So when you're talking about brainstorming, it's not just um the limitation of brainstorming between you and Copilot or the AI.
00:59:35.679 --> 00:59:43.280
You can actually then pull in others to provide additional context, to provide their perspective into this notebook.
00:59:43.519 --> 00:59:45.920
Now the conversation is still between you and Copilot.
00:59:46.000 --> 00:59:49.440
That's, I think, by design, right, for security and privacy purposes.
00:59:49.679 --> 00:59:57.440
But to be able to bring in and share your notebook, kind of like your notes and sources, and say, hey, can you add anything?
00:59:57.519 --> 00:59:58.880
This is what we're trying to solve.
00:59:59.119 --> 01:00:03.360
Add your opinion into this and add some instructions, help me out.
01:00:03.440 --> 01:00:08.719
And so you're all working in a collaboration in a single notebook.
01:00:08.960 --> 01:00:15.760
Uh is that something are you doing right now, or maybe it's something that Microsoft is doing internally?
01:00:16.320 --> 01:00:21.840
I I I think that's I mean, it's definitely the future as well.
01:00:21.920 --> 01:00:27.519
Uh, because I mean collaboration uh needs to be empowered as well.
01:00:27.760 --> 01:00:32.079
Uh like we've been we've been used to co-authoring an Excel spreadsheet, right?
01:00:32.159 --> 01:00:33.679
Or co-authoring a Word document.
01:00:33.840 --> 01:00:42.800
It's only natural if AI plays a part in that whole process as well, like between the stakeholders, and um yeah, that's super interesting.
01:00:42.960 --> 01:00:50.159
Uh I haven't actually had the time to look into it yet, but we just saw this Copilot co-work thing uh being announced here the other day.
01:00:50.239 --> 01:00:53.599
So it just I mean that sounds a bit like what you're what you're saying, Chris.
01:00:53.679 --> 01:01:11.519
But uh yeah, I mean it's hard, it's difficult to imagine all the places where AI will sort of uh play a part in our work life, uh and hopefully help us all to make some better decisions.
01:01:11.760 --> 01:01:21.840
But until then, you can you can imagine all kinds of agents, of course, uh doing their thing and uh working on all the in-between stuff between people as well.
01:01:22.159 --> 01:01:42.639
Uh yeah, and um obviously we do agents inside Business Central as well, but we also have agents each of us use to do things, and um, I mean, I don't think there's an engineering organization on the planet who's not using uh some agent to help them with code.
01:01:42.800 --> 01:01:55.039
Uh whether it's for writing code or reviewing code and being that extra layer of uh sanity, um I think that makes a lot of sense.
01:01:57.199 --> 01:01:57.760
Yeah.
01:01:58.639 --> 01:02:11.679
Speaking of agents within Business Central, I've been having a lot of fun with uh the agent preview, formerly Agent Playground; 27.4 released it as agent uh preview.
01:02:11.840 --> 01:02:14.639
Uh it's a lot of fun uh working with it.
01:02:14.719 --> 01:02:28.159
Uh, I know that there's still a lot with it being in preview, there's still a lot that uh can change with it, a lot that can be added to it, but uh I can see how it can enhance and help organizations within their business.
01:02:28.320 --> 01:02:32.159
Uh but one thing that you work very closely with is the payables agent.
01:02:32.320 --> 01:02:38.880
I know we first talked about the payables agents a bit ago, but what uh what's uh occurring with the payables agent today?
01:02:39.199 --> 01:02:40.079
Yeah, that's true.
01:02:40.159 --> 01:02:43.599
Uh we started developing on the payables agent quite some time ago.
01:02:43.679 --> 01:02:47.119
Uh it's not two years, it's one and a half years ago.
01:02:47.280 --> 01:02:56.960
And also because technology moves so fast, we sort of sometimes have to say, oh, now we switch up to this new model, or now we need to find another way of doing things, and and so on.
01:02:57.039 --> 01:03:04.639
So it actually, strangely enough, in this uh day and age, has taken quite a long time to build the agent.
01:03:04.880 --> 01:03:07.360
Um, it's uh yeah, it's out there.
01:03:07.440 --> 01:03:13.039
We released it uh in in in general availability, as we call it, uh back in the fall.
01:03:13.280 --> 01:03:18.800
And now with the upcoming major release here in a few weeks, it will be uh available worldwide.
01:03:18.960 --> 01:03:28.480
So in all countries where Business Central is available, the agent will be available, and it will be supported by all the languages that Microsoft provides as part of our localization.
01:03:28.559 --> 01:03:31.039
So these 20-something languages.
01:03:31.280 --> 01:03:39.199
Um and it might work in other languages as well, but those are sort of within what we claim support of.
01:03:39.440 --> 01:03:48.079
Yeah, and then we're adding like so we knew it was an early version we released because that's part of this whole new paradigm with AI.
01:03:48.320 --> 01:03:58.960
Basically, uh you might remember Kevin Scott, our CTO, a few years back saying the only way to know if there's value in AI is to release AI.
01:03:59.440 --> 01:04:16.079
Uh that can sound a bit provoking, but it changes the mindset of when you want to release something, and it definitely challenges uh a product manager like me, because we've been used to terms like uh minimum viable product and all that.
01:04:17.440 --> 01:04:19.679
Minimum viable product is dead.
01:04:20.079 --> 01:04:44.800
Like we want to go to a place where, and I say we, I think we're going toward a place where we ship experiments, we ship something that outlines the value of what is to come, and then, as we did with the payables agent in preview, in public preview, we use that to gauge the interest from the market, and then the market says thumbs up, which they overwhelmingly, overwhelmingly did.
01:04:44.960 --> 01:04:50.800
Yes, we would like this, but then we say great, then we GA'd, and then we go build it, basically.
01:04:50.880 --> 01:04:52.960
So it's a bit reverse from what you typically would know.
01:04:53.199 --> 01:04:55.840
That's also why you're still missing quite a few things.
01:04:56.000 --> 01:05:10.000
But that's the new rhythm, because you need to experiment with AI, because it's way too expensive to build something fully and then find out that, oh, it wasn't what they wanted anyway, and we'll see much more of that with AI.
01:05:10.320 --> 01:05:23.360
So um right now we're adding new capabilities to the agent, for example, like purchase order matching, being able to match three-way, like the invoice with the order, with the actual receipt of the products, for example.
01:05:23.599 --> 01:05:47.840
And then coming also soon is more kind of two-way matching, so where you have something that is not for inventory, but you still want to have a purchase order on it and match it, for example, like rent or utility bills or consultant contracts and quotes, like you have an order for you know 12 months of consultant services, and you want to still match the invoice with the lines of the order, things of that nature.
01:05:48.079 --> 01:05:52.000
Um, yeah, and and and um approvals.
01:05:52.159 --> 01:05:53.519
We're thinking a lot about approvals.
01:05:53.599 --> 01:05:57.519
How should approvals look with the agent in in this day and age?
01:05:57.760 --> 01:06:12.000
We have some Power Automate connectors. Do we enrich them maybe with more intelligent routing of approvals based on who's the project owner, who is the budget owner, who is the purchaser for this thing? Things of that nature is what we're thinking about.
01:06:12.239 --> 01:06:18.079
Um yeah, and then we're trying to make the agent easier to discover.
01:06:18.320 --> 01:06:26.320
So, right now, if you've been working with the agents in BC, you know that they present themselves as these avatars in the top right corner.
01:06:26.719 --> 01:06:33.119
But turns out only the ones with SUPER access or administrators can see them, or AAD administrators.
01:06:33.440 --> 01:06:49.280
But we want to have Business Central be a bit more, there's this term called product-led instead, where the product advertises things to the right audience.
01:06:49.599 --> 01:07:12.480
So if we can find a way to make the agent surface itself, make itself available and advertise itself to the users who actually work with the purchase invoices, and to create sort of a demand pull from the organization, if we can convey uh what's in it for you, what's in it for you to use this agent.
01:07:12.639 --> 01:07:13.119
Try it out.
01:07:13.519 --> 01:07:21.360
Role-based, like a role-based kind of thing where like if you assign someone a role, you have a checkbox, it's like these are the people who should see that.
01:07:21.679 --> 01:07:29.840
Yeah, role is tricky because people use all kinds of role centers, and we only have the name to go by, and they can be called all kinds of stuff.
01:07:30.000 --> 01:07:33.440
But what we do have, we do know which permissions the user has.
01:07:33.599 --> 01:07:38.719
So if they have permissions to create purchase invoices, maybe they're an okay audience to show it to.
01:07:38.880 --> 01:07:39.920
Things of that nature.
01:07:40.239 --> 01:07:43.280
Yeah, so um, really trying to... That is interesting.
01:07:43.599 --> 01:07:46.639
Really trying to improve uh the agent all over.
01:07:46.800 --> 01:08:00.159
So and now the point of making it available worldwide or in all countries where BC is available is also to get much more feedback because invoices look so different, as I'm sure you know, uh, from company to company.
01:08:00.400 --> 01:08:09.360
Sometimes they're very sleek and elegant, sometimes they're very messy, sometimes you have five textual lines per actual invoice line, or all kinds of information.
01:08:09.599 --> 01:08:15.280
We need to be sure that we can read invoices and interpret them correctly in all kinds of ways.
01:08:15.599 --> 01:08:23.520
And just for example, we got a piece of feedback saying, Oh, it doesn't it doesn't take the cost center from the invoice.
01:08:23.760 --> 01:08:29.520
Someone actually some vendor actually wrote cost center XYZ on the invoice.
01:08:30.560 --> 01:08:34.960
That's because we don't know what to do with it right now, but we want to be able to know what to do with it.
01:08:35.039 --> 01:08:42.479
Maybe your company policy says, Oh, if there's a cost center on the invoice, that goes to dimension uh department or whatever.
01:08:42.720 --> 01:08:46.800
Like, but so that's the thing that whole thing is also what we want to improve.
01:08:46.960 --> 01:08:53.439
So getting extraction even better, and then our interpretation of it even better as well.
01:08:53.680 --> 01:08:59.600
So yeah, please, if anyone is listening out there and using it or have customers, just keep the feedback coming.
01:08:59.760 --> 01:09:06.960
Make sure you use the provide feedback action inside the purchase document draft page where you can provide feedback.
01:09:07.199 --> 01:09:11.279
And when we reply to that, it goes to the feedback portal.
01:09:11.520 --> 01:09:16.960
Feedbackportal.microsoft.com is where you can see your feedback and our replies.
01:09:17.199 --> 01:09:20.640
Because when we send the reply, unfortunately, it goes into a void.
01:09:20.800 --> 01:09:23.039
You don't get an email, you don't get any notification.
01:09:23.119 --> 01:09:26.239
So please check it there, uh, at least for now.
01:09:26.479 --> 01:09:29.039
So uh yeah, lots of stuff.
01:09:29.119 --> 01:09:31.760
We're super excited, and we are very committed.
01:09:31.840 --> 01:09:39.359
Um, as a PM, I'm very happy to be able to work on something like this across multiple releases.
01:09:39.600 --> 01:09:46.560
Sometimes priorities have changed and so on, but everyone, everyone in the organization knows this is one of our big bets.
01:09:46.960 --> 01:09:54.399
This has to work, and we're committed to making this the best uh vendor invoice processing uh solution for BC.
01:09:54.560 --> 01:10:06.560
So that's uh, that's exciting, exciting times for this new agent, uh, a lot of improvements, and you're right, there's new tools or new uh models that you'd have to apply to these agents.
01:10:06.720 --> 01:10:09.199
And I gotta go back a little bit further.
01:10:09.279 --> 01:10:17.600
Uh uh you had made a comment about uh the CTO of Microsoft that uh you had said like MVP is dead, we just have to put it out there.
01:10:17.680 --> 01:10:18.720
Yeah, this tool.
01:10:18.800 --> 01:10:20.720
I I thought that was pretty fascinating.
01:10:20.880 --> 01:10:28.800
Of like you're yeah, I I I get that, where you're like, we have this tool, let's put it out there and get feedback, see what's the use case that's available.
01:10:28.960 --> 01:10:34.000
I thought that was uh a very interesting perspective uh for a lot of these tools.
01:10:34.800 --> 01:10:40.399
But I think it also challenges, and has challenged, everyone.
01:10:40.640 --> 01:10:51.119
And that's the whole thing with AI features because in in the before times, you as a user, you you clicked the button, you knew what it did, and if it didn't work, you filed a bug or something like that.
01:10:51.359 --> 01:11:04.319
But today, for many of these features, of the generative ones at least, we push the cognitive load of assessing whether something is correct or works correctly or not to the user, right?
01:11:04.479 --> 01:11:14.000
So, for example, we can only test with the test invoices we have, but whether it works with your invoice, that's for you to decide, for you to tell us.
01:11:14.159 --> 01:11:16.800
So it's it's not black and white anymore.
01:11:16.960 --> 01:11:22.159
AI features push the cognitive load to users, a challenge in itself.
01:11:22.319 --> 01:11:24.000
But that's and I think so.
01:11:24.399 --> 01:11:40.159
While you might think it's provoking to say, like Kevin did, that we're shipping experiments, and that the only way to figure out the value in AI is to release AI, it also just shows that we all need to learn how to live with AI in this day and age.
01:11:40.239 --> 01:11:42.159
That this is the nature of software.
01:11:42.319 --> 01:11:48.720
We we've just not been used to it in sort of business software, like like things have worked or they didn't, right?
01:11:48.800 --> 01:11:50.800
But uh that's not necessarily the case anymore.
01:11:51.520 --> 01:11:56.800
It is, and I think it leads to better results if you understand that process.
01:11:56.960 --> 01:12:08.239
And a lot of times it is easy for individuals to not be so satisfied with it because some things in this uh era uh do evolve.
01:12:08.399 --> 01:12:11.359
I do understand shipping, you mentioned shipping an experiment.
01:12:11.439 --> 01:12:30.000
I don't know if I'd call it shipping an experiment, um, but shipping uh an application with a certain piece of functionality and then relying on the users of that application to help determine the features and functionality, because I know that I've created some wonderful, cool apps that were great.
01:12:30.239 --> 01:12:34.720
I spent a lot of time doing it because I thought they were great features to spend my time on.
01:12:34.960 --> 01:12:36.800
Then you come to find out nobody cared.
01:12:37.119 --> 01:12:37.359
Right?
01:12:37.439 --> 01:12:39.680
So it's it it helps an organization.
01:12:39.840 --> 01:12:43.039
You you can think you have a great thing, right?
01:12:43.119 --> 01:13:14.800
Or you want to put a thing together, but it's the users of that thing that are going to determine what's valuable to have, and you don't want to sink a lot of time into trying to put bells and whistles and shiny uh objects on the areas that aren't important to those that are using it, when the ones that are using it can help you, me, or anyone that's creating something gauge the importance, and then you can focus on those areas and make those work really, really, really well instead of just eh, okay.
01:13:15.279 --> 01:13:36.159
It's so true, and it it just highlights the importance of you really need to establish a good feedback loop early on in the process so you can take the right decisions, because decisions for something like this are expensive, they're sticky, and it creates technical debt, and there's the opportunity cost with all of it.
01:13:36.239 --> 01:13:38.239
You could have just spent your time on something else.
01:13:38.479 --> 01:13:57.279
So, for example, right now we're assessing how far we should go with the approval scenario for the payables agent, because you have approval workflows already, both the internal ones in BC that are hard to set up, let's put it like that, and maybe not as powerful as the Power Automate ones, but you also have the Power Automate ones.
01:13:57.439 --> 01:14:04.000
So, depending on how much flexibility you want, does it or does it not make sense to have the agent be involved with that?
01:14:04.079 --> 01:14:07.600
Should the agent just let go of the invoice when it's become a purchase invoice?
01:14:07.840 --> 01:14:24.960
Maybe it could trigger the approval flow, or you do that yourself and then you take it from there, or should the agent be part of it and help the approver with summarization of an invoice, so they can already in the email see a summary of what the invoice is about and not have to log into BC.
01:14:25.680 --> 01:14:42.319
So we only know that by having customer conversations and conversations with people like yourself before we start developing it, because we could also risk developing something that was only for those last five percent of customers who would love it, and the other ones maybe wouldn't care.
01:14:42.640 --> 01:14:48.640
Then that's not wasted time, but we could probably spend our bucks on something else that had a larger impact, right?
01:14:48.720 --> 01:14:52.159
So that's yeah, feedback is so important.
01:14:53.840 --> 01:14:55.039
Yeah, that's great.
01:14:55.279 --> 01:14:55.760
That's great.
01:14:55.920 --> 01:15:01.840
A lot of great things coming to the application: the payables agent, sales order agent, agent preview.
01:15:02.079 --> 01:15:04.560
Uh there's a lot of agents coming into there.
01:15:04.720 --> 01:15:05.439
Expense agent.
01:15:06.720 --> 01:15:07.920
I can't even speak anymore.
01:15:08.079 --> 01:15:16.720
The expense agent, Chris, you have to help me out with the uh the expense agent, uh, and a lot of other AI features within Business Central.
01:15:16.800 --> 01:15:23.840
I just recently did a session on that, actually covering each of the features uh that are available in Business Central because many users may not be aware of them.
01:15:23.920 --> 01:15:32.560
And sometimes I found out that many users are using those features and they don't even know they're AI, uh, which um is interesting.
01:15:32.720 --> 01:15:55.199
So we all can do our part to uh educate or work with users to make sure they understand those features and the benefits of those features, uh because it really does, if you listen back to everything we were talking about, the use of these features can help them have a little more uh time, I guess you could say, or a little less pressure in completing the tasks they need to do.
01:15:55.279 --> 01:16:07.520
So I mean that's the intent behind a lot of these features and functionality is to help um the users and the business owners get a little more uh um accomplished within the application without as much pressure is the way I look at it.
01:16:08.239 --> 01:16:08.720
Well, Mr.
01:16:08.800 --> 01:16:14.399
Sorin, I could talk with you for days about all of this stuff.
01:16:14.560 --> 01:16:21.760
Um, but unfortunately we can't do that all in one sitting because we have other things to do.
01:16:21.920 --> 01:16:24.880
Uh but we do appreciate you taking the time to speak with us today to share your story.
01:16:24.960 --> 01:16:33.840
Uh we do have to have you come back on and uh talk about the vibe coding experience that you had gone through with getting an app from zero to published uh in basically a weekend.
01:16:33.920 --> 01:16:42.560
I know there's some testing time there and it took a little uh a little longer than you had hoped, but it wasn't the coding challenges that you faced, it's some of the administrative challenges.
01:16:42.720 --> 01:16:45.279
So uh it'd be interesting hearing that story.
01:16:45.439 --> 01:17:00.319
Um but if you would, if anyone would like to reach out to you to learn more about the payables agent uh within Business Central or how they could offer suggestions, provide feedback, uh and also maybe even if they themselves are feeling a little overwhelmed with AI, what's the best way to get in contact with you?
01:17:00.720 --> 01:17:02.479
Yeah, so anyone can reach out to me.
01:17:02.560 --> 01:17:05.760
Uh my email is soalex@microsoft.com.
01:17:05.920 --> 01:17:07.600
And they can also reach out to me on LinkedIn.
01:17:07.680 --> 01:17:11.680
So at linkedin.com/in/alexanderson.
01:17:12.159 --> 01:17:17.680
Uh yeah, I got on LinkedIn in 2006, so I was one of those who got my vanity URL.
01:17:17.920 --> 01:17:21.279
Uh so um, yeah, so anyone feel free to reach out.
01:17:21.359 --> 01:17:27.520
I'd be happy to, yeah, uh, always be a sounding board, or uh love to get your feedback on the payables agent.
01:17:27.760 --> 01:17:31.039
And then just want to close by saying, hey, uh take it easy.
01:17:31.199 --> 01:17:41.439
Uh you'll probably still have a job tomorrow, and uh, just start using AI and uh claim and defend your um your right to slowness.
01:17:42.399 --> 01:17:43.359
I like that.
01:17:43.520 --> 01:17:44.159
Thank you very much.
01:17:44.239 --> 01:17:45.680
I look forward to talking to you again very soon.
01:17:46.079 --> 01:17:48.640
I appreciate your time uh that you spent with us today.
01:17:48.720 --> 01:17:49.680
I really do appreciate it.
01:17:49.760 --> 01:17:55.039
Uh any moments you spend with us, uh it's time that you can't do something else and you don't get that time back.
01:17:55.199 --> 01:17:56.079
Uh talk with you soon.
01:17:56.159 --> 01:17:56.319
Thank you.
01:17:56.399 --> 01:17:56.880
Ciao ciao.
01:17:56.960 --> 01:17:57.439
Thanks, Sorin.
01:17:57.760 --> 01:17:58.000
Thank you.
01:17:58.800 --> 01:17:59.199
Bye.
01:18:00.239 --> 01:18:05.359
Thank you, Chris, for your time for another episode of In the Dynamics Corner Chair.
01:18:05.520 --> 01:18:07.520
And thank you to our guests for participating.
01:18:07.760 --> 01:18:09.359
Thank you, Brad, for your time.
01:18:09.520 --> 01:18:13.039
It is a wonderful episode of Dynamics Corner Chair.
01:18:13.199 --> 01:18:16.560
I would also like to thank our guests for joining us.
01:18:16.800 --> 01:18:19.520
Thank you for all of our listeners tuning in as well.
01:18:19.760 --> 01:18:23.600
You can find Brad at developerlife.com.
01:18:23.760 --> 01:18:28.159
That is D V L P R L I F E dot com.
01:18:28.399 --> 01:18:33.920
And you can interact with them via Twitter, D V L P R L I F E.
01:18:34.560 --> 01:18:47.279
You can also find me at matalino.io, m-a-t-a-l-i-n-o dot io, and my Twitter handle is matalino16.
01:18:48.079 --> 01:18:51.039
And you can see those links down below in the show notes.
01:18:51.199 --> 01:18:52.399
Again, thank you everyone.
01:18:52.560 --> 01:18:54.319
Thank you, and take care.