Transcript:

0:04 I am Patrick McInerney from North52.
0:07 today I'm here with my colleague Bruce
0:08 to discuss how North52 can help
0:11 to solve complex business requirements
0:13 for example the National Flood Insurance
0:15 Program in America the FEMA program
0:17 Bruce can you explain to us what the
0:20 National Flood Insurance Program is sure
0:22 it's a program managed by the Federal
0:24 Emergency Management Agency it provides
0:27 Insurance to help reduce the
0:28 socioeconomic impact of floods in the
0:30 United States homes and businesses in
0:32 high risk flood areas with mortgages
0:34 from government-backed lenders are
0:35 required to have flood insurance
0:37 okay I understand there have been some
0:39 rather complex changes to the rules
0:41 recently yes the Risk Rating 2.0
0:45 methodology was introduced on the 1st of
0:47 October 2021 it was a major overhaul of
0:49 the existing program and changed the way
0:51 premiums were calculated I see what
0:53 makes calculating National flood
0:55 insurance premiums such a challenge
0:58 I guess there are a significant number
1:00 of variables involved for example there
1:02 are over 35 property attributes things
1:04 like distance to a river or Coast
1:06 elevation measurements types of
1:08 foundation and then there are over 50
1:10 rating tables that are referenced this
1:11 all combines to about 1500 calculations
1:14 to arrive at the premium amount well
1:16 that does sound very complex how does
1:18 North52 go about solving this problem
1:20 well fortunately the calculations are
1:22 very well documented and real world
1:24 examples are provided by FEMA that we
1:26 can validate against the premiums are
1:28 calculated from a large number of
1:29 sub-calculations so we focused on
1:31 breaking down the problem and working
1:33 through it bit by bit I see and how does
1:35 North52 actually help with the process
1:37 well North52's multi-sheet decision
1:40 tables functionality makes it easy to
1:41 separate calculations out into logical
1:43 steps you can solve the problem for one
1:45 part test and validate and then move on
1:48 to the next problem and it's very quick
1:50 and easy to do this using the no-code
1:51 point-and-click interface I see can you
1:55 provide an example and show what it
1:57 actually looks like inside Dynamics 365
1:59 sure one part of the calculations
2:01 determines the base rates so we have a
2:03 separate decision table sheet for
2:05 this calculation you can see this on
2:07 screen now depending on whether the
2:08 property is leveed or non-leveed and has
2:11 a Barrier Island indicator different
2:12 rates will be used
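The base-rate decision logic described above can be pictured as condition columns mapping to a rate. The rows and figures below are invented purely for illustration; they are not FEMA's actual rates or North52's decision table contents.

```python
# Hypothetical sketch of a decision-table row set: each row pairs
# condition values (leveed status, barrier island indicator) with a
# base rate. All figures are invented for illustration only.
BASE_RATE_ROWS = [
    # (leveed, barrier_island) -> base rate
    ((True,  False), 0.95),
    ((True,  True),  1.10),
    ((False, False), 1.25),
    ((False, True),  1.40),
]

def base_rate(leveed: bool, barrier_island: bool) -> float:
    """Return the rate from the first row whose conditions match."""
    for (lv, bi), rate in BASE_RATE_ROWS:
        if lv == leveed and bi == barrier_island:
            return rate
    raise LookupError("no matching decision-table row")

print(base_rate(False, True))  # 1.40
```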
2:14 how does the North52 rules engine look
2:16 up all those rates
2:18 North52 has a feature called xcache
2:20 tables which allows you to store tabular
2:22 data on separate sheets to make them
2:24 easy to read columns which are used as
2:26 conditions are yellow and columns which
2:28 store the matching values are green you
2:30 can see an example on screen now which
2:32 shows the base rates for non-leveed
2:34 properties the conditions are region
2:36 segment and a single family home
2:38 indicator and the different rates are
2:40 available in the green columns there is
2:42 a North52 function which allows you to
2:44 specify the xcache table the sheet name
2:46 the filter criteria and the column you
2:48 wish to return
2:50 for example to retrieve the selected
2:52 cell we would specify the FEMA
2:54 underscore base rates table the
2:56 non-leveed sheet a filter criteria of
2:58 region equal to SC a segment equal to
3:01 one and the single family home indicator
3:04 equal to Yes the column to return would be
3:06 specified as Inland flood building
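The filtered lookup just described can be sketched in ordinary code. This is a hypothetical illustration only, not North52's actual xcache function syntax; the sheet data mirrors the on-screen example but the row values beyond the 2.255 rate mentioned later are invented.

```python
# Hypothetical sketch of an xcache-style lookup: filter a sheet's rows
# by condition columns, then return a single value column.
NON_LEVEED_SHEET = [
    {"region": "SC", "segment": 1, "sfh": "Yes", "inland_flood_building": 2.255},
    {"region": "SC", "segment": 2, "sfh": "No",  "inland_flood_building": 1.804},
]

def xcache_lookup(sheet, filters, return_column):
    """Return `return_column` from the first row matching every filter."""
    for row in sheet:
        if all(row.get(k) == v for k, v in filters.items()):
            return row[return_column]
    raise LookupError("no row matches the filter criteria")

rate = xcache_lookup(
    NON_LEVEED_SHEET,
    {"region": "SC", "segment": 1, "sfh": "Yes"},
    "inland_flood_building",
)
print(rate)  # 2.255
```

The real function additionally takes the xcache table and sheet name; here a single in-memory sheet stands in for both.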
3:09 okay that does make sense but I
3:11 understand there are some more
3:12 complicated rate table lookups as well
3:14 yeah some of the rating table values do
3:17 not correspond exactly to the attributes
3:19 of a property for example when
3:21 determining the rating based on a
3:22 property's distance to River value there
3:24 are many potential values that are not
3:26 defined in the rating table the guidance
3:28 from FEMA states that in this situation
3:30 the factors are linearly interpolated
3:32 between break points thankfully North52
3:35 provides an interpolation function for
3:37 this situation that is really
3:39 interesting can you explain how that
3:41 works please
3:42 sure
3:43 let me draw it for you
3:45 if you could imagine a chart with the
3:47 distance to river values on the x-axis
3:49 and the ratings on the y-axis and we
3:51 plot two known points
3:53 if we draw a straight line between the
3:54 points then for a given distance to
3:56 River value you can work out the rating
3:58 using the line drawn between the two
4:00 original points
4:01 so there are four values that are looked
4:03 up in the rate tables and that are then used
4:05 to calculate the rating yes for a given
4:08 distance to River value we find the
4:10 nearest lower value and the nearest
4:12 higher value and use the applicable
4:14 ratings for these two values going back
4:16 to my drawing let's take the distance to
4:18 River value of 111.2 and plot it here
4:21 between my two known values if we look
4:24 at the rating tables the distance
4:25 to river values which are specified
4:27 are 100 and 125. the corresponding
4:30 ratings for these values are 1.082 and
4:33 1.05 the North52 interpolation function
4:37 takes all these values and then works
4:39 out the interpolated value which in this
4:41 case is
4:42 1.068 rounded to three decimal places
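The worked example above can be reproduced with a few lines of code. This is a generic linear interpolation sketch using the numbers from the discussion, not North52's actual function; the break-point search uses Python's standard `bisect` module to find the nearest lower and higher values.

```python
import bisect

def interpolated_rating(x, break_points, ratings):
    """Find the nearest lower and higher break points around x and
    linearly interpolate the rating between them."""
    i = bisect.bisect_left(break_points, x)
    x_lo, x_hi = break_points[i - 1], break_points[i]
    y_lo, y_hi = ratings[i - 1], ratings[i]
    return y_lo + (x - x_lo) * (y_hi - y_lo) / (x_hi - x_lo)

# From the worked example: distance to river 111.2 falls between the
# 100 and 125 break points, whose ratings are 1.082 and 1.05.
print(round(interpolated_rating(111.2, [100, 125], [1.082, 1.05]), 3))  # 1.068
```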
4:45 it sounds like over the course of a
4:47 single premium calculation there could
4:49 be more than 100 rate lookups how does
4:52 North52 manage performance for this
4:54 yeah we could easily have over 100 data
4:56 point lookups in a single flood
4:57 insurance calculation uh depending on
4:59 the input values by the property owner
5:01 this is something we spent a significant
5:03 amount of time on tuning performance
5:04 to get the best results within
5:07 the Dynamics 365 sandbox it is
5:09 implemented by our xcache module which
5:11 has the smarts built in to
5:13 efficiently retrieve store and process
5:15 data used in the calculation and
5:18 minimize database round trips we even
5:20 had to design our own search algorithm
5:22 to allow us to quickly process tables in
5:24 excess of a hundred thousand rows this
5:26 is just one example of the value
5:27 proposition that you get by using
5:29 North52 we've seen many tabs for different
5:31 sheets in the formula how many sheets
5:33 are there
5:34 there are 20 sheets the main reason for
5:36 having so many is to break down the
5:38 calculations into related groups and
5:39 make it easy to manage it also makes it
5:42 easier for testing most of the sheets
5:44 are very similar looking up values from
5:45 the rate tables in xcache using the
5:47 building property attributes supplied in
5:49 an application
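The custom search algorithm mentioned for tables in excess of a hundred thousand rows is proprietary, but the general idea of fast lookups over a large sorted table can be sketched with a binary search. This is a generic technique under invented data, not North52's actual implementation.

```python
import bisect

# Sketch of fast lookups over a large rate table: sort once by key,
# then each lookup is O(log n) via binary search instead of a linear
# scan. Table contents are invented for illustration.
table = sorted((key, key * 0.001) for key in range(100_000))
keys = [k for k, _ in table]

def lookup(key):
    """Binary-search the sorted key list and return the matching rate."""
    i = bisect.bisect_left(keys, key)
    if i < len(keys) and keys[i] == key:
        return table[i][1]
    raise KeyError(key)

print(lookup(42_000))  # 42.0
```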
5:50 how did you decide to break down the
5:52 logic
5:54 the example Excel workbooks that FEMA
5:57 provide were a great start they had
5:58 grouped their calculations so we
6:00 predominantly followed that pattern take
6:02 a look here you can see a lot of the
6:04 named rows for example a to K are mapped
6:07 to decision table sheets this made it
6:09 easier to test as the formula was
6:10 growing in complexity too it allowed us
6:12 to incrementally build up the formula
6:14 and test against each row in the sample
6:16 Excel file I see what tooling does North52
6:19 provide to help you with testing as you
6:22 were building up that logic there's a
6:24 great feature in North52 where you can
6:25 have a split view of a test record and
6:27 a trace file sorry Bruce what's a trace
6:30 file a trace file is a detailed log of
6:33 everything that happens when a formula
6:34 executes it shows all the values that
6:36 are being used in a calculation and if
6:38 an error occurs the exact point in the
6:40 calculation it plays a crucial part of
6:42 the process when building complex logic
6:44 reading and understanding Trace files is
6:46 a whole topic on its own right and
6:48 there's plenty of information on our
6:49 knowledge base
6:50 that's great do you think we can see
6:52 this in action please yeah of
6:54 course let's load it up you can do this
6:56 directly from the formula
6:59 find a record that you want to trigger
7:01 the formula against from the explore Tab
7:03 and open it using the split view let's
7:05 take the example one record here and
7:07 click to open the split viewer we can
7:09 make a change to our record which then
7:12 triggers the formula execution once this
7:14 is done we can load the latest Trace
7:15 file and view the detail on the right
7:17 hand side
7:19 if we scroll down the trace file we can
7:21 see the values being used in the
7:22 decision tables for example if we look
7:24 at the base rate decision sheet we can
7:26 see that we're retrieving a value from
7:28 the xcache tables of 2.255 and this is
7:32 being stored in a variable for the
7:33 Inland flood building rating that looks
7:36 really valuable having all that detail
7:38 available to you when you're doing your
7:39 manual testing yeah it's very handy
7:41 something else you can do is load the
7:43 formula into the split view so that you
7:45 can have the trace file and formula side
7:47 by side this is very helpful in
7:49 troubleshooting errors I can see how
7:51 that would be helpful all right let me
7:53 show you we select the option from here
7:55 to load the formula we can then adjust
7:57 our view to make things easier to work
7:59 with like maximizing the window and
8:01 adjusting the split let's say we were
8:03 reviewing the logic for the base rates
8:05 we can navigate to the decision table
8:06 sheet and review the logic against the
8:08 values being displayed in the trace
8:10 file if there were any errors these
8:12 would show in the trace file at the point
8:13 of failure enabling a user to easily
8:16 locate the corresponding position in the
8:18 decision sheet and make Corrections
8:21 fascinating stuff it's been really
8:22 interesting to see how you would use
8:24 North52 to build out complex logic like
8:27 this but would you not just you know
8:29 could you not just use a Power Automate
8:31 flow for some of this stuff
8:33 well
8:34 technically it would be possible but the
8:36 reality is it would not be really
8:37 feasible this is because it would be
8:39 difficult to manage due to the sheer
8:40 number of steps that would be required
8:42 it might even breach the limits of
8:44 what's supported could you imagine
8:46 scrolling through dozens of steps to
8:47 find a specific calculation to edit and
8:50 it would be very difficult to understand
8:51 exactly where in the decision logic you
8:53 were nothing beats the compactness of a
8:55 multi-sheet decision table for
8:57 readability and ease of making changes
8:59 yeah definitely get that and I think
9:01 let's be honest here a Power Automate
9:03 flow would probably be a lot slower
9:05 yes yeah North52 does the calculations
9:08 in real time the result would be
9:09 available before the flow even gets to
9:11 the front of the queue to be processed I
9:13 see so do you consider North52 to be a
9:16 replacement for Power Automate flows
9:18 no it's it's definitely not a
9:19 replacement we see North52's decision
9:21 Suite as complementary to and an
9:23 enhancement of Microsoft Dynamics 365
9:25 and the Power Platform but where there is
9:28 complexity in an organization's
9:29 requirements North52 will provide great
9:31 value I see and where do the
9:35 organizations get this value from
9:37 well there are a few key areas but it
9:39 mostly comes down to time which always
9:40 equates to money and that can be savings
9:43 made on delivery but also benefits from
9:45 being quicker to market North52
9:47 will make complex rules faster to
9:49 implement easier to maintain and often
9:51 when complexity is involved people turn
9:53 to custom code and that brings a whole
9:55 layer of additional maintenance just to
9:57 keep up with Microsoft platform changes
9:59 with North52 we take care of that and we
10:01 also take care of the tooling for
10:03 ongoing automated testing I'm definitely
10:05 interested in learning more about the
10:07 automated testing tools can you tell
10:09 me about that yeah well that's a topic
10:11 we could talk about for hours in my
10:13 experience testing is something that
10:14 never gets the investment it really
10:16 should mainly because it's costly and
10:18 time consuming to set up and maintain
10:19 one thing we're focused on is enabling
10:22 our clients to build automated tests
10:23 very quickly and easily without any code
10:26 I understand your product test Shield
10:28 takes a slightly different approach from
10:30 other testing tools yeah that's right
10:32 most of the other tools focus on testing
10:35 via the user interface recording
10:36 scenarios in the browser and then
10:38 playing them back each time the test is
10:42 run while testing that the UI works is
10:42 important that's not where the real
10:44 value is the real value is in the
10:45 complex business logic that runs after
10:47 users have entered data and this is what
10:49 we have focused on I see what are the
10:52 main benefits of North52's approach in
10:54 this
10:55 I'd say speed and reliability speed for
10:58 building the tests and speed for running
10:59 the tests our tests execute on the
11:01 server side so we can push through many
11:03 hundreds of tests every minute much
11:05 faster than systems relying on simulated
11:07 browser input our tests have reliable
11:09 repeatability too compared with UI
11:12 testing which can be very brittle and
11:13 susceptible to changes you know
11:15 constantly evolving UI working on the
11:17 server side makes things much more
11:19 predictable and if a test fails it's
11:21 likely to be a real failure rather than
11:23 a quirk in the UI
11:25 let's see uh can we take a look please
11:28 sure let's have a look at one of the
11:29 tests for the flood insurance
11:30 calculations
11:32 we have a test here which focuses on
11:34 changing one parameter the previous
11:36 number of claims and testing its impact
11:38 on the premium in a test Shield test
11:40 there are four stages when setting it up
11:42 assemble arrange act and assert assemble
11:46 is the first stage and defines what the
11:48 test is doing usually this will
11:50 reference requirements documentation
11:52 it is also the stage where you would
11:54 create records in the system that
11:55 represent a perfect scenario we use that
11:58 scenario in the second arrange stage to
12:00 define the records which will be created
12:02 every time the test is run
12:04 each row represents a new record that
12:06 will be created and you can see that we
12:08 are changing the previous number of
12:10 claims for each row
12:12 the third stage Act is where we would
12:15 Define any steps in a process which
12:17 would normally be done by a user for
12:18 example executing a workflow
12:21 as everything is fully automated in our
12:22 scenario we don't have anything to
12:24 define here
12:25 the final stage is assert this is where
12:28 the perfect outcome parameters are
12:30 defined when the test is run these are
12:32 compared against the records that are
12:34 created in the arrange stage and if
12:36 there are any differences these are
12:37 highlighted with a failed test can we
12:40 see it in action
12:42 yeah of course
12:43 I'll manually execute it now but this
12:45 can be part of an automated Azure
12:47 DevOps process too
12:49 when this test runs it creates four
12:51 flood application records each with the
12:53 same field values except for the
12:55 previous number of claims
12:57 after the records are created test
12:59 Shield checks all the values defined in
13:01 the assert stage and we can view the
13:03 results here if there are any failures
13:05 we can view the trace log and
13:07 investigate what went wrong
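The assemble, arrange, act, assert flow demonstrated above can be sketched as a plain test harness. This is a hypothetical illustration of the pattern only, not Test Shield's actual API; the calculation stub and premium figures are invented.

```python
# Hypothetical sketch of the four-stage test pattern: assemble a
# scenario, arrange the records, act out any user steps, then assert
# on the outcomes. All names and values are invented.

def run_premium_calculation(record):
    # Stand-in for the rules engine: premium grows with prior claims.
    return 1000 + 250 * record["previous_claims"]

def test_previous_claims_impact():
    # Assemble: define what the test is checking.
    scenario = "premium increases with previous number of claims"
    # Arrange: one record per previous-claims value, all else equal.
    records = [{"previous_claims": n} for n in range(4)]
    # Act: nothing to do here, the calculation fires automatically.
    results = [run_premium_calculation(r) for r in records]
    # Assert: compare against the expected outcomes.
    expected = [1000, 1250, 1500, 1750]
    assert results == expected, (scenario, results)

test_previous_claims_impact()
print("test passed")
```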
13:10 that's impressive I only have a few more
13:12 questions and then we wrap this up when
13:14 should someone get in touch with North52
13:16 well there are a couple of tests that we
13:18 use one is around the number and
13:20 complexity of the rules if this is high
13:22 then North52 is likely to be a good fit
13:24 and the other is when you're considering
13:25 writing any custom code and how would
13:28 you initially engage
13:29 typically we'll review the requirements
13:31 for fit and then offer a free
13:33 proof of concept to showcase capability
13:35 we'll help out new customers and
13:37 partners as well to ensure their journey
13:38 with North52 is a success
13:40 that sounds great it's been a pleasure
13:43 speaking with you today around the FEMA
13:45 flood insurance example and how North52
13:48 adds real value to Dynamics 365 and
13:50 Power Platform projects thanks I've
13:53 enjoyed it too and I hope those who've
13:54 watched have found it interesting
14:01 thank you