

#12




If you combine the signal processing methods used for audio and images with statistical inference, odds are you're going to work with a complex random variable at some point. The question here was "why does this machine learning book discuss complex random variables?" A plausible answer is "because audio and image recognition are popular applications of machine learning." I don't own the book; I'm entertaining myself by seeing whether I can predict its contents.

That's a separate question from "do actuaries need to know complex random variables to do machine learning?" The answer is probably not, unless they're bringing in a specific method from signal processing, and even then they could probably adapt the method into some form of time series analysis with real-valued parameters. For learning the concepts, though, image and audio processing may be better domains to study, since there is an abundance of data on which concrete, unequivocal results can be demonstrated.
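To make the connection concrete: the discrete Fourier transform of a real-valued audio signal is a vector of complex random variables, since the signal contains noise and the DFT is a linear map into the complex plane. A minimal NumPy sketch (the 5 Hz tone, sampling rate, and noise level are made-up illustration values, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

# A real-valued "audio" signal: a 5 Hz tone plus Gaussian noise,
# sampled at 100 Hz for one second.
fs = 100
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.standard_normal(fs)

# Each DFT coefficient is a complex-valued random variable: a fixed
# tone term plus a complex linear transformation of the noise.
coeffs = np.fft.rfft(signal)
print(coeffs.dtype)             # complex128
print(np.abs(coeffs).argmax())  # dominant bin: 5, the tone frequency in Hz
```

The magnitude spectrum `np.abs(coeffs)` is the kind of feature vector that audio recognition pipelines feed into a classifier, which is presumably why such a book would bother defining moments of complex random variables at all.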
#14




My review of Chapter 1: http://ymmathstat.blogspot.com/2017/...smachine.html
__________________
If you want to add me on LinkedIn, PM me. Why I hate Microsoft Access. Studying/Reading: C 
#16




Math degree programs, I've realized, go through topics too slowly for applications.
#17




So for a periodic process, the Fourier transform constitutes sufficient statistics of the process. The Nyquist sampling theorem gives the condition — sampling faster than twice the highest frequency present — under which a sequence of discrete samples is sufficient to reconstruct a continuous band-limited periodic function.
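The "discrete samples are sufficient statistics" claim can be checked numerically: sample a band-limited periodic signal above the Nyquist rate, and its DFT coefficients reconstruct the signal at *any* point, not just the sample points. A small sketch (the particular signal and N = 8 are my own illustration):

```python
import numpy as np

# A band-limited periodic signal: highest frequency 3 cycles per period.
def f(t):
    return np.sin(2 * np.pi * t) + 0.5 * np.cos(2 * np.pi * 3 * t)

# Nyquist: more than 2 * 3 samples per period suffice; take N = 8.
N = 8
samples = f(np.arange(N) / N)

# The DFT of the samples recovers the Fourier coefficients exactly.
coeffs = np.fft.rfft(samples) / N

def reconstruct(t):
    # f(t) = c0 + 2 * sum Re(c_k e^{2*pi*i*k*t}) + Nyquist-bin term;
    # the k = 0 and k = N/2 bins are unpaired, so they get weight 1.
    k = np.arange(len(coeffs))
    weights = np.where((k == 0) | (k == N // 2), 1.0, 2.0)
    return np.sum(weights * (coeffs * np.exp(2j * np.pi * k * t)).real)

# Evaluate between the sample points: matches to machine precision.
print(reconstruct(0.37), f(0.37))
```

Eight numbers fully determine the whole continuous function — exactly the sense in which the sample is "sufficient."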
The ergodic hypothesis is the premise that a sample of a cross-section of a system yields sufficient statistics of the behavior of the system over time. Observations of a population over a short interval can be used as the basis for estimating probabilities for individuals over a longer interval.

There is a related but slightly different notion of ergodicity as the property that a sufficiently large time slice of an ergodic process gives you sufficient statistics for the process. This is conceptually similar to reconstructing a periodic function from a finite discrete sample. In this sense, both periodic processes and ergodic processes are characterized by frequencies or distributions of events. These are assumptions that events have rates of occurrence which can be estimated from observations.

Information theory is based on a transformation of problems involving probabilities into problems involving discrete codes. Under this correspondence, maximum likelihood estimates are mapped to representations that minimize the quantity of data used to describe the system. These are just ideas you've seen in statistics wearing other hats.
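The two readings of ergodicity — cross-section versus time slice — can be shown agreeing on the same quantity. A minimal sketch with a stationary AR(1) process (the process and its parameters are my own illustration): the variance estimated from many independent copies at one time matches the variance estimated from one long path.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stationary AR(1): x[t] = phi * x[t-1] + eps, eps ~ N(0, 1).
# It is ergodic, with stationary variance 1 / (1 - phi^2).
phi = 0.8
true_var = 1 / (1 - phi**2)  # about 2.78

# Ensemble estimate: a cross-section of many independent copies,
# each started in stationarity and observed at a single time.
n_paths = 20000
x = rng.standard_normal(n_paths) * np.sqrt(true_var)
for _ in range(20):
    x = phi * x + rng.standard_normal(n_paths)
ensemble_var = x.var()

# Time estimate: one long path of the same process.
T = 20000
path = np.empty(T)
path[0] = rng.standard_normal() * np.sqrt(true_var)
for t in range(1, T):
    path[t] = phi * path[t - 1] + rng.standard_normal()
time_var = path.var()

# Ergodicity: both estimates approximate the same stationary variance.
print(ensemble_var, time_var, true_var)
```

Both numbers land near 2.78; for a non-ergodic process (say, one whose mean is drawn once at random and then frozen) the two estimates would disagree no matter how much data you collect.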

