  #31  
Old 04-18-2019, 10:48 AM
Actuarially Me Actuarially Me is offline
Member
CAS
 
Join Date: Jun 2013
Posts: 134
Default

How would you create a double lift chart for frequency?

Sort Ratio = Model A Frequency (Claim/Exposure) / Model B Frequency
Sort by Sort Ratio
Create Deciles by Sort Ratio
Group by Decile
Calculate Avg Frequency for Model A, Model B, and Actual
Plot? (Rough sketch of these steps below.)
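Something like this in pandas, as a rough sketch (the column names are made up):

Code:
import pandas as pd

def double_lift_table(df, n_bins=10):
    """Double lift table for two frequency models.

    Assumes a scored DataFrame with columns exposure, claim_count,
    pred_freq_a, pred_freq_b (predicted frequencies from Model A and
    Model B). All names are made up.
    """
    out = df.copy()
    # Step 1: sort ratio = Model A predicted frequency / Model B predicted frequency
    out["sort_ratio"] = out["pred_freq_a"] / out["pred_freq_b"]
    # Steps 2-3: rank by the ratio and cut into deciles (equal record counts here;
    # exposure-weighted deciles would be a refinement)
    out["decile"] = pd.qcut(out["sort_ratio"], n_bins, labels=False, duplicates="drop") + 1
    # Steps 4-5: within each decile, exposure-weighted average frequency
    # for Model A, Model B, and actual
    return out.groupby("decile").apply(
        lambda g: pd.Series({
            "model_a_freq": (g["pred_freq_a"] * g["exposure"]).sum() / g["exposure"].sum(),
            "model_b_freq": (g["pred_freq_b"] * g["exposure"]).sum() / g["exposure"].sum(),
            "actual_freq": g["claim_count"].sum() / g["exposure"].sum(),
        })
    )

# lift = double_lift_table(scored_policies)
# lift.plot()  # step 6: one line each for Model A, Model B, and actual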
Reply With Quote
  #32  
Old 04-18-2019, 04:29 PM
itGetsBetter itGetsBetter is offline
Member
CAS AAA
 
Join Date: Feb 2016
Location: Midwest
Studying for Awaiting Exam 9 Result
Favorite beer: Spruce Springsteen
Posts: 250
Default

Quote:
Originally Posted by Actuarially Me View Post
How would you create a double lift chart for frequency?

Sort Ratio = Model A Frequency (Claim/Exposure) / Model B Frequency
Sort by Sort Ratio
Create Deciles by Sort Ratio
Group by Decile
Calculate Avg Frequency for Model A, Model B, and Actual
Plot?
That sounds appropriate. Does it look okay?
Reply With Quote
  #33  
Old 04-19-2019, 08:25 AM
Actuarially Me Actuarially Me is offline
Member
CAS
 
Join Date: Jun 2013
Posts: 134
Default

Quote:
Originally Posted by itGetsBetter View Post
That sounds appropriate. Does it look okay?
Not as nice as the examples in the monograph lol.
Reply With Quote
  #34  
Old 04-22-2019, 10:45 AM
TDH TDH is offline
Member
CAS Non-Actuary
 
Join Date: Dec 2016
Posts: 52
Default

Can I ask why you've used an offset and no weight parameter?
Reply With Quote
  #35  
Old 04-23-2019, 01:35 PM
Actuarially Me Actuarially Me is offline
Member
CAS
 
Join Date: Jun 2013
Posts: 134
Default

Quote:
Originally Posted by TDH View Post
Can I ask why you've used an offset and no weight parameter?
In a GLM, the weight has an inverse relationship with the variance. So for claim counts, it doesn't make sense to reduce the variance just because you have more exposure. If your exposure is the number of years insured, more exposure means more expected claims and therefore higher variance (for a Poisson, the variance equals the mean), not lower.

An offset turns your model into a rate model for log-link functions, because:

log(Counts/Exposure) = Intercept + Variables + Error
log(Counts) - log(Exposure) = Intercept + Variables + Error
log(Counts) = Intercept + Variables + log(Exposure) + Error
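In Python/statsmodels the frequency setup looks roughly like this (just a sketch; the data frame, column, and predictor names are made up):

Code:
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

# policies: DataFrame with columns claim_count, exposure, driver_age, territory (made up)
# Poisson frequency model: the target is the claim count and log(exposure) enters
# as an offset with a fixed coefficient of 1, so the linear predictor models
# log(expected frequency).
freq_model = smf.glm(
    "claim_count ~ driver_age + territory",
    data=policies,
    family=sm.families.Poisson(),
    offset=np.log(policies["exposure"]),
).fit()

# Predictions made with the offset supplied are expected counts;
# dividing by exposure recovers the frequency.
expected_counts = freq_model.predict(policies, offset=np.log(policies["exposure"]))
expected_freq = expected_counts / policies["exposure"]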

For severity, you use weight = log(count) because it makes sense that a policy with more claims will have a more stable average severity. Your response variable can be either Incurred/Counts with no offset, or Incurred with log(Counts) as the offset.
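A similar sketch for severity (again with made-up names; the weight line is where the count vs. log(count) choice would go):

Code:
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

# claims: DataFrame of policies with at least one claim, columns avg_severity
# (incurred / claim_count), claim_count, driver_age, territory (made up)
# Gamma severity model: no offset needed since the response is already an average;
# the variance weight gives more credibility to severities based on more claims.
severity_weight = claims["claim_count"]   # or np.log(claims["claim_count"]) per the above
sev_model = smf.glm(
    "avg_severity ~ driver_age + territory",
    data=claims,
    family=sm.families.Gamma(link=sm.families.links.Log()),
    var_weights=severity_weight,
).fit()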

For pure premium, it's a little tricky because the Tweedie distribution has the extra power parameter. You can model Pure Premium = Incurred/Exposure with weight = log(Exposure). Alternatively, you can use Loss as the target with log(Exposure) as the offset, but then the weight would be log(Exposure^(p-1)). The book "Predictive Modeling Applications in Actuarial Science, Volume II" has the proof of why that's the case.

We use Exposure as the weight because policies with a higher exposure volume are expected to have more stable experience.
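And a pure premium sketch along the same lines, with a purely illustrative Tweedie power of 1.67 (in practice you'd profile or estimate p; names are made up):

Code:
import statsmodels.api as sm
import statsmodels.formula.api as smf

# policies: DataFrame with columns pure_premium (incurred / exposure), exposure,
# driver_age, territory (made up)
# Tweedie pure premium model: exposure enters as the variance weight so that
# higher-exposure policies are treated as more stable. var_power is the Tweedie
# power parameter p (1 < p < 2 for compound Poisson-gamma); 1.67 is illustrative only.
pp_model = smf.glm(
    "pure_premium ~ driver_age + territory",
    data=policies,
    family=sm.families.Tweedie(var_power=1.67),   # log link is the default
    var_weights=policies["exposure"],
).fit()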
Reply With Quote
  #36  
Old 04-25-2019, 08:58 AM
Vorian Atreides's Avatar
Vorian Atreides Vorian Atreides is offline
Wiki/Note Contributor
CAS
 
Join Date: Apr 2005
Location: As far as 3 cups of sugar will take you
Studying for ACAS
College: Hard Knocks
Favorite beer: Most German dark lagers
Posts: 63,870
Default

Quote:
Originally Posted by Actuarially Me View Post
How would you create a double lift chart for frequency?

Sort Ratio = Model A Frequency (Claim/Exposure) / Model B Frequency
Sort by Sort Ratio
Create Deciles by Sort Ratio
Group by Decile
Calculate Avg Frequency for Model A, Model B, and Actual
Plot?
Found the following (publicly available) in the iCAS material for Exam 3 of the CSPA credential that might be relevant to this discussion. I will add it to the predictive analytics resource thread as well.

https://thecasinstitute.org/wp-conte...n-01162019.pdf
__________________
I find your lack of faith disturbing

Why should I worry about dying? It's not going to happen in my lifetime!


Freedom of speech is not a license to discourtesy

#BLACKMATTERLIVES
Reply With Quote
  #37  
Old 04-27-2019, 04:40 PM
TDH TDH is offline
Member
CAS Non-Actuary
 
Join Date: Dec 2016
Posts: 52
Default

Quote:
Originally Posted by Actuarially Me View Post
In a GLM, the weight has an inverse relationship with the variance. So for claim counts, it doesn't make sense to reduce the variance just because you have more exposure. If your exposure is the number of years insured, more exposure means more expected claims and therefore higher variance (for a Poisson, the variance equals the mean), not lower.

An offset turns your model into a rate model for log-link functions, because:

log(Counts/Exposure) = Intercept + Variables + Error
log(Counts) - log(Exposure) = Intercept + Variables + Error
log(Counts) = Intercept + Variables + log(Exposure) + Error

For severity, you use weight = log(count) because it makes sense that a policy with more claims will have a more stable average severity. Your response variable can be either Incurred/Counts with no offset, or Incurred with log(Counts) as the offset.

For pure premium, it's a little tricky because the Tweedie distribution has the extra power parameter. You can model Pure Premium = Incurred/Exposure with weight = log(Exposure). Alternatively, you can use Loss as the target with log(Exposure) as the offset, but then the weight would be log(Exposure^(p-1)). The book "Predictive Modeling Applications in Actuarial Science, Volume II" has the proof of why that's the case.

We use Exposure as the weight because policies with a higher exposure volume are expected to have more stable experience.
When you say using an offset turns it into a rate, what do you mean exactly? If we fit a pure premium model with response = Loss and offset = log(exposure), would the predicted value still be the loss amount, or would it be Loss/exposure? Similarly, if we model claim count as our response with an offset of log(exposure), would the predicted value be claim count / exposure (i.e. the frequency), or would it still just be the claim count? If it's the latter, would we just divide through by the exposure to get back the frequency?

For severity, why is the weight log(count) and not just count? I'd expect it to be count.
Reply With Quote
  #38  
Old 04-30-2019, 11:19 AM
Actuarially Me Actuarially Me is offline
Member
CAS
 
Join Date: Jun 2013
Posts: 134
Default

Quote:
Originally Posted by Vorian Atreides View Post
Found the following (publicly available) in the iCAS material for Exam 3 of the CSPA credential that might be relevant to this discussion. I will add it to the predictive analytics resource thread as well.

https://thecasinstitute.org/wp-conte...n-01162019.pdf
Perfect, thanks! I've wondered if I should take those exams, considering I'm not an actuary. They would definitely be relevant to my current job.
Reply With Quote
  #39  
Old 04-30-2019, 11:22 AM
Vorian Atreides's Avatar
Vorian Atreides Vorian Atreides is offline
Wiki/Note Contributor
CAS
 
Join Date: Apr 2005
Location: As far as 3 cups of sugar will take you
Studying for ACAS
College: Hard Knocks
Favorite beer: Most German dark lagers
Posts: 63,870
Default

Quote:
Originally Posted by Actuarially Me View Post
Perfect, thanks! I've wondered if I should take those exams, considering I'm not an actuary. They would definitely be relevant to my current job.
Technically speaking, the CSPA isn't for actuaries per se. It's a bridge between actuarial-level knowledge of insurance data and master's-level knowledge of data science (more specifically, statistical modeling).
__________________
I find your lack of faith disturbing

Why should I worry about dying? It's not going to happen in my lifetime!


Freedom of speech is not a license to discourtesy

#BLACKMATTERLIVES
Reply With Quote
  #40  
Old 04-30-2019, 03:15 PM
Actuarially Me Actuarially Me is offline
Member
CAS
 
Join Date: Jun 2013
Posts: 134
Default

Quote:
Originally Posted by Vorian Atreides View Post
Technically speaking, the CSPA isn't for actuaries per se. It's a bridge between actuarial-level knowledge of insurance data and master's-level knowledge of data science (more specifically, statistical modeling).
Makes it even easier to convince my boss then!
Reply With Quote