[Bmi212_c-news] Notes from yesterday's meeting with CSFoo
Daniel Newburger
daniel.newburger at gmail.com
Sat Oct 17 13:07:47 PDT 2009
Hi folks,
Sorry I forgot to send this last night.
-Dan
- Lasso linear regression (L1) to select features while fitting the model;
  the L1 term imposes a sparsity penalty

- Elastic net (L1 & L2 penalties at the same time)
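As a minimal sketch of the L1 idea (all data, sizes, and the alpha value below are illustrative assumptions, not from the meeting), scikit-learn's Lasso drives most coefficients exactly to zero, and the surviving indices are the selected features:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 1000)).astype(float)  # minor-allele counts 0/1/2
true_coef = np.zeros(1000)
true_coef[:5] = 1.0                     # only 5 SNPs carry real signal
y = X @ true_coef + rng.normal(0.0, 0.1, size=200)

lasso = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(lasso.coef_)  # features the L1 penalty left nonzero
print(len(selected), "features kept out of", X.shape[1])
```

Larger alpha means a harsher sparsity penalty and fewer surviving features.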
1. Reduce features (maybe to 10,000), then use logistic regression or SVM.

- Can use softmax or a multiclass SVM across all 8 classes (7 diseases and
  control).
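The softmax option is just multinomial logistic regression; a sketch with scikit-learn on random placeholder data (the 50-feature width and 8 random classes are assumptions standing in for the real post-selection data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 50))    # e.g. 50 features left after selection
y = rng.integers(0, 8, size=400)  # classes 0..6 = diseases, 7 = control

clf = LogisticRegression(max_iter=1000).fit(X, y)
probs = clf.predict_proba(X[:1])  # one softmax distribution over the 8 classes
print(probs.shape)  # (1, 8)
```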
- For lasso (or another algorithm that can select features), we need an
  algorithm that does not require all the data to be in memory.
SO:
Our pipeline:

1. Select the SNP feature vector using p-value and odds ratio.
2. Use softmax or SVM to learn across all disease classes.
3. Repeat for the binomial case (early vs. late onset) and for age of onset
   of disease. What about regression?
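Step 1 might look like the following sketch, which scores each SNP with Fisher's exact test on a 2x2 carrier table to get a p-value and odds ratio (the genotype coding, sample counts, and keep-top-5 cutoff are all made up for illustration):

```python
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)
# carrier (1) vs. non-carrier (0) genotypes for 100 cases and 100 controls
cases = rng.integers(0, 2, size=(100, 20))
controls = rng.integers(0, 2, size=(100, 20))

results = []
for snp in range(cases.shape[1]):
    a = int(cases[:, snp].sum())     # carriers among cases
    b = 100 - a                      # non-carriers among cases
    c = int(controls[:, snp].sum())  # carriers among controls
    d = 100 - c                      # non-carriers among controls
    odds_ratio, p_value = fisher_exact([[a, b], [c, d]])
    results.append((snp, p_value, odds_ratio))

# keep the SNPs with the smallest p-values (cutoff of 5 is arbitrary here)
selected = [snp for snp, p, orat in sorted(results, key=lambda r: r[1])[:5]]
print(selected)
```

In practice the odds ratio would be used alongside the p-value, e.g. to require both significance and a minimum effect size.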
Learning theory = a method of analyzing your learning method to see if you
have enough data.
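An empirical stand-in for that question is a learning curve: if validation performance has flattened as the training set grows, more data likely won't help. A sketch assuming scikit-learn (model, data, and split sizes are placeholders):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))
y = (X[:, 0] + rng.normal(0.0, 0.5, size=300) > 0).astype(int)

sizes, train_scores, val_scores = learning_curve(
    LogisticRegression(max_iter=1000), X, y,
    train_sizes=[0.2, 0.5, 1.0], cv=5)
# if mean validation accuracy has plateaued, extra data buys little
print(sizes, val_scores.mean(axis=1))
```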
Libraries:

- svmperf
- svmlight
- libsvm
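These libraries share the same sparse "svmlight/libsvm" text format for training data; a sketch of round-tripping a tiny matrix through it with scikit-learn's helpers (the example values are arbitrary):

```python
from io import BytesIO

import numpy as np
from sklearn.datasets import dump_svmlight_file, load_svmlight_file

X = np.array([[0.0, 2.0, 0.0],
              [1.0, 0.0, 3.0]])
y = np.array([1, -1])

buf = BytesIO()
dump_svmlight_file(X, y, buf)  # sparse "label index:value" lines, zeros omitted
buf.seek(0)
X2, y2 = load_svmlight_file(buf)  # X2 comes back as a scipy sparse matrix
print(y2)  # [ 1. -1.]
```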