Actuarial Outpost
#1 - 05-15-2019, 09:12 AM
Actuarially Me (Member, CAS)

Deploying R Code?

Almost all of my work has been done locally up to this point, so I'm clueless about deployment. We get a lot of our data from outside vendors, and one calculation in particular costs quite a bit of money per API call. I've been able to replicate the calculation in an R script: it pulls in a shapefile and a data source with lat/long coordinates and calculates the minimum distance between the shapefile and each coordinate.
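Roughly, the script does something like this (a simplified sketch using the sf package; the file and column names here are placeholders, not the real ones):

Code:
# read the vendor shapefile and the point data (hypothetical file names)
library(sf)

shape  <- st_read("boundary.shp")
points <- read.csv("policies.csv")      # assumed "lat" / "long" columns

pts   <- st_as_sf(points, coords = c("long", "lat"), crs = 4326)
shape <- st_transform(shape, 4326)

# st_distance() returns a points-by-features distance matrix;
# take the row-wise minimum to get each point's closest distance
d <- st_distance(pts, shape)
points$min_dist_m <- apply(d, 1, min)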

I'd like to test this in production, with the plan of using it as the primary calculation and falling back to the vendor only if the R script isn't working properly. I've heard you can use plumber to create an API and Dockerize it to make it friendly with other languages. I also have free rein to create an AWS account and test their various services, though my knowledge of AWS extends only as far as knowing what they offer. I saw Fargate offers serverless compute, but I'm not sure how or where I'd store the shapefile; I don't want it to reload every time there's a query. I picked up an AWS Udemy course last year in case I got a job that needed it, so I may dig into that.
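From what I've read so far, the plumber wrapper would look something like this (a sketch only; the endpoint and file names are made up):

Code:
# plumber.R -- hypothetical endpoint around the distance calculation.
# The shapefile is read once at startup, not on every request.
library(plumber)
library(sf)

shape <- st_transform(st_read("boundary.shp"), 4326)

#* Minimum distance (meters) from a lat/long point to the shape
#* @param lat:numeric
#* @param long:numeric
#* @get /min_distance
function(lat, long) {
  pt <- st_sfc(st_point(c(as.numeric(long), as.numeric(lat))), crs = 4326)
  as.numeric(min(st_distance(pt, shape)))
}

# run locally with:
# plumber::plumb("plumber.R")$run(host = "0.0.0.0", port = 8000)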

I also want the ability to scale to other calculations in the future. If this first project goes well, I'll have more buy-in from the company to use R for more advanced work such as predictive models and Shiny dashboards.

Only one other person here and I use R, so I can't really justify setting up an internal R server just yet. Curious whether anyone else has been in this position.
#2 - 05-15-2019, 01:51 PM
examsarehard (Member, CAS)

Our IT department allowed us to deploy an internal R server before they were able to get us dedicated resources.
#3 - 05-15-2019, 04:49 PM
sticks1839 (Member, CAS, AAA)

There are free R modules for SQL Server 2017 and later (with MS support). I'm not positive they would enable API-call-style functionality, though; my research was about running R code from RStudio remotely on the server to reduce the memory and speed bottlenecks of a local PC.
#4 - 05-16-2019, 08:43 PM
kevinykuo (Member, CAS)

What are your latency/throughput requirements?
#5 - 05-17-2019, 08:39 AM
Actuarially Me (Member, CAS)

Quote:
Originally Posted by kevinykuo
What are your latency/throughput requirements?
Pretty low initially; I'll most likely need to build everything from the ground up, but I want it to be able to scale. I plan on making some Shiny apps to automate some of the common reports and dashboards, all of which would be used internally.

I think AWS is my best bet after doing some research; it will just have a longer learning curve for me initially. For the smaller calculations, I can probably get away with running plumber on an EC2 instance. For machine learning models, AWS has SageMaker, which looks R-friendly.

For the larger projects, I'll probably deploy some Shiny apps on an EC2 instance.

I could also use the packrat package and put everything in a Docker container, but that just adds another layer of complexity.
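If I go that route, the packrat side would be something like this (sketch):

Code:
# snapshot the project's package versions so the same library can be
# rebuilt inside the Docker image
packrat::init()       # create a private, project-local library
packrat::snapshot()   # record exact versions in packrat.lock
# then inside the container: packrat::restore()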

I'd mostly be doing this alone, so I wanted to see if there are industry standards or easier ways before I start learning AWS.
#6 - 05-20-2019, 09:00 AM
Actuarially Me (Member, CAS)

For anyone else in a similar boat, here's the conclusion I came to after some research.

If you want IT to get involved, you can have them set up an internal RStudio Server along with a Shiny Server, which can be used to deploy code. They'd have to manage the server and the installation of packages for you; I'm sure most large companies already do this. It's free outside of server costs, and you only really need knowledge of R/Shiny and the plumber package.
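To give an idea of what deployment means there: on a default Shiny Server install, an app is just a directory under /srv/shiny-server/ containing an app.R, something like this (a sketch; calc_min_distance() is a hypothetical stand-in for your own function):

Code:
# app.R -- minimal Shiny app skeleton
library(shiny)

ui <- fluidPage(
  titlePanel("Minimum distance lookup"),
  numericInput("lat",  "Latitude",  value = 40.0),
  numericInput("long", "Longitude", value = -83.0),
  textOutput("dist")
)

server <- function(input, output, session) {
  output$dist <- renderText({
    # calc_min_distance() stands in for the real calculation
    paste("Distance (m):", calc_min_distance(input$lat, input$long))
  })
}

shinyApp(ui, server)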

The next option is to set up a cloud-based server, with AWS, Azure, and DigitalOcean being the most popular. With a lot of them, you're paying a monthly fee for a single virtual machine instance, so if you install R Server on the instance you can only have one API/app at a time. There are ways around this, though: if your apps are smaller in scale, you can containerize your code and run many Docker containers on one virtual machine instance. You'd have to set up all the permissions yourself, on top of the added complexity of Docker. There are also ways to bypass the single-instance limit using some .php code. It's relatively cheap, but it has a steeper learning curve to set everything up correctly, and it requires you to manage it when something goes wrong.
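On the "many small services on one box" idea: one sketch I've seen (assuming a recent plumber version with the pr_mount() interface; the file names are placeholders) is mounting several routers in a single plumber process, so each API doesn't need its own VM or container:

Code:
# serve several small APIs from one process on one port
library(plumber)

root <- pr()
root <- pr_mount(root, "/distance", pr("distance_api.R"))
root <- pr_mount(root, "/rating",   pr("rating_api.R"))

pr_run(root, host = "0.0.0.0", port = 8000)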

If you have a lot of buy-in for R at the company, RStudio Connect is an easy, albeit expensive, solution. It's $25K a year, but it requires no technical knowledge outside of R: it gives you push-button publishing and management of your apps/APIs, and it's easy to scale. If you factor in salary and how long it'd take to set up an internal server or learn AWS, it may even out. The problem is getting management to buy in.
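For what it's worth, publishing from code would look roughly like this with the rsconnect package (the server URL and names here are placeholders):

Code:
# one-time registration of the Connect server, then interactive login
rsconnect::addServer("https://connect.example.com/__api__", name = "connect")
rsconnect::connectUser(server = "connect")

# publish an app directory to the registered server
rsconnect::deployApp("my_dashboard", server = "connect")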


The route I'm most likely going is to set up an AWS account and create a few proof-of-concept items that automate some of the more tedious manual tasks; knowing AWS will come in handy for my career anyway. The end goal would be to set up RStudio Connect. I think it's worth the cost, especially if we plan on growing, and we already spend a large amount of money on BI tech as it is.
#7 - 05-20-2019, 09:51 AM
kevinykuo (Member, CAS)

Are the "serverless" options on the cloud not worth it for your use cases?

Also, not sure how big your company is, but there are discounts on RStudio products if you're under certain revenue thresholds (50% off for <$5mm and 75% off for <$1mm).
#8 - 05-20-2019, 10:03 AM
Actuarially Me (Member, CAS)

Quote:
Originally Posted by kevinykuo
Are the "serverless" options on the cloud not worth it for your use cases?

Also, not sure how big your company is, but there are discounts for RStudio stuff if you're under certain revenue thresholds (50% for <5mm and 75% for <1mm).
Not really. I want it to be able to scale, and eventually I want to create Shiny dashboards and such.

Unfortunately, we wouldn't qualify for the discounts. I work for an underwriter group, so a lot of the work I do isn't traditional actuarial work. A lot of what they do is done manually in Excel or spread across multiple BI tools, and I'm trying to bring some quality-of-life improvements and modernize.