Implementing the LR Algorithm with SparkML

Discrete Features

Example
  • Gender, with its two values male and female, is a discrete feature;
Discrete Features | Processing
  • One-hot encoding gives each category its own dimension: gender becomes a two-dimensional vector, (1, 0) for male and (0, 1) for female;
  • If the feature takes many distinct values, say 10 categories, the one-hot vector is ten-dimensional: the dimension the value falls on is 1 and all the others are 0;
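The encoding above can be sketched in plain Java (a minimal illustration; the class and method names are my own, not part of SparkML):

```java
import java.util.Arrays;
import java.util.List;

public class OneHot {

    // Encode a categorical value as a one-hot vector over a fixed category list:
    // the matching dimension gets 1.0, every other dimension stays 0.0.
    static double[] encode(String value, List<String> categories) {
        double[] vec = new double[categories.size()];
        int idx = categories.indexOf(value);
        if (idx >= 0) {
            vec[idx] = 1.0;
        }
        return vec;
    }

    public static void main(String[] args) {
        List<String> gender = Arrays.asList("M", "F");
        System.out.println(Arrays.toString(encode("M", gender))); // [1.0, 0.0]
        System.out.println(Arrays.toString(encode("F", gender))); // [0.0, 1.0]
    }
}
```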

Continuous Features

Examples
  • age, ranging from 0 to 100, is a continuous feature;
  • price_per_man (average price per person) is also a continuous feature;
  • Continuous features are generally not fed into the model directly;
Continuous Features | Standardization | Processing
  • z-score standardization: (x - mean) / std
    • Compute the mean and standard deviation (std) of the feature, e.g. price_per_man;
    • This rescales price_per_man to zero mean and unit variance (note it does not bound the values to 0~1);
  • max-min standardization: (x - min) / (max - min);
    • This compresses the values of price_per_man into the 0~1 range;
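Both standardizations can be sketched as follows (a minimal illustration with made-up helper names; it assumes the input is non-constant, i.e. max > min and std > 0):

```java
import java.util.Arrays;

public class Standardize {

    // z-score: (x - mean) / std -> zero mean, unit variance, NOT bounded to [0, 1]
    static double[] zScore(double[] xs) {
        double mean = Arrays.stream(xs).average().orElse(0.0);
        double var = Arrays.stream(xs).map(x -> (x - mean) * (x - mean)).average().orElse(0.0);
        double std = Math.sqrt(var);
        return Arrays.stream(xs).map(x -> (x - mean) / std).toArray();
    }

    // max-min: (x - min) / (max - min) -> bounded to [0, 1]
    static double[] maxMin(double[] xs) {
        double min = Arrays.stream(xs).min().orElse(0.0);
        double max = Arrays.stream(xs).max().orElse(0.0);
        return Arrays.stream(xs).map(x -> (x - min) / (max - min)).toArray();
    }

    public static void main(String[] args) {
        // price_per_man values from the sample data below
        double[] price = {64, 72, 82, 131, 136, 193, 205, 216, 227};
        System.out.println(Arrays.toString(maxMin(price))); // all values in [0, 1]
        System.out.println(Arrays.toString(zScore(price))); // mean 0, some values negative
    }
}
```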
Continuous Features | Discretization | Processing
  • bucket encoding;
  • Take age: define 1~10 as child, 10~30 as youth, 30~50 as middle-aged, and 50+ as senior. Although age is continuous, it can be dropped into these buckets and treated as a discrete feature, then one-hot encoded over the buckets;
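A minimal sketch of bucket encoding for age, using the boundaries named above (child / youth / middle-aged / senior); the helper name is my own:

```java
public class AgeBucket {

    // Map age to a bucket index -- 0 = child (<10), 1 = youth (10~30),
    // 2 = middle-aged (30~50), 3 = senior (50+) -- then one-hot encode it.
    static double[] bucketize(int age) {
        int bucket;
        if (age < 10) bucket = 0;
        else if (age < 30) bucket = 1;
        else if (age < 50) bucket = 2;
        else bucket = 3;
        double[] vec = new double[4];
        vec[bucket] = 1.0;
        return vec;
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(bucketize(22))); // [0.0, 1.0, 0.0, 0.0]
        System.out.println(java.util.Arrays.toString(bucketize(65))); // [0.0, 0.0, 0.0, 1.0]
    }
}
```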

Feature Processing

featurevalue.csv
"user_id","age","gender","shop_id","rating","price_per_man","clicked"
"1","22","M","315","4","193","0"
"1","16","F","431","3","193","1"
"1","62","F","489","3","72","1"
"1","12","M","398","0","216","1"
"1","76","M","307","3","131","0"
"1","54","M","490","1","205","0"
"1","38","M","308","2","227","1"
"1","56","M","400","3","82","1"
"1","65","F","426","0","136","0"
"2","48","F","328","3","64","1"
feature.csv
  • Drop user_id and shop_id from featurevalue.csv, since they carry no signal here, then map the remaining fields;
  • age is bucket-encoded into the first 4 columns;
  • gender is one-hot encoded into columns 5 and 6;
  • rating is max-min standardized into column 7;
  • price_per_man is bucket-encoded into columns 8 ~ 11;
  • clicked, the label, is the last column;
"1","0","0","0","1","0","0.8","0","0","1","0","0"
"1","0","0","0","0","1","0.6","0","0","1","0","1"
"0","0","0","1","0","1","0.6","0","1","0","0","1"
"1","0","0","0","1","0","0.0","0","0","0","1","1"
"0","0","0","1","1","0","0.6","0","0","1","0","0"
"0","0","1","0","1","0","0.2","0","0","0","1","0"
"0","1","0","0","1","0","0.4","0","0","0","1","1"
"0","0","1","0","1","0","0.6","0","1","0","0","1"
"0","0","0","1","0","1","0.0","0","0","1","0","0"
"0","0","1","0","0","1","0.6","0","1","0","0","1"
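The raw-to-encoded transform described above can be sketched in plain Java. The original text does not state the exact age and price bucket boundaries, so the ones below are assumptions chosen to reproduce the sample rows; the class and method names are illustrative:

```java
import java.util.StringJoiner;

public class FeatureEncode {

    // One-hot helper: n-dimensional vector with a 1 at position idx.
    static double[] oneHot(int idx, int n) {
        double[] v = new double[n];
        v[idx] = 1.0;
        return v;
    }

    // Encode one raw featurevalue.csv row into a feature.csv row.
    // Age boundaries (30/45/60) and price boundaries (50/100/200) are
    // assumptions reverse-engineered from the sample data, not stated rules.
    static String encode(int age, String gender, int rating, int price, int clicked) {
        int ageBucket = age < 30 ? 0 : age < 45 ? 1 : age < 60 ? 2 : 3;
        int priceBucket = price < 50 ? 0 : price < 100 ? 1 : price < 200 ? 2 : 3;
        double ratingNorm = rating / 5.0; // max-min with assumed min=0, max=5

        StringJoiner row = new StringJoiner(",");
        for (double d : oneHot(ageBucket, 4)) row.add(String.valueOf((int) d));
        for (double d : oneHot("M".equals(gender) ? 0 : 1, 2)) row.add(String.valueOf((int) d));
        row.add(String.valueOf(ratingNorm));
        for (double d : oneHot(priceBucket, 4)) row.add(String.valueOf((int) d));
        row.add(String.valueOf(clicked));
        return row.toString();
    }

    public static void main(String[] args) {
        // First sample row: age 22, M, rating 4, price 193, clicked 0
        System.out.println(encode(22, "M", 4, 193, 0));
        // -> 1,0,0,0,1,0,0.8,0,0,1,0,0
    }
}
```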

LR Model Training

LR Model Training | Steps
  • Train a model on the preprocessed feature values;
  • Once trained, evaluate the model;
LR Model Training | Code
package tech.lixinlei.dianping.recommand;

import java.io.IOException;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.ml.classification.LogisticRegression;
import org.apache.spark.ml.classification.LogisticRegressionModel;
import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator;
import org.apache.spark.ml.linalg.VectorUDT;
import org.apache.spark.ml.linalg.Vectors;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class LRTrain {

    public static void main(String[] args) throws IOException {

        // initialize the Spark runtime
        SparkSession spark = SparkSession.builder().master("local").appName("DianpingApp").getOrCreate();

        // load the feature + label training file
        JavaRDD<String> csvFile = spark.read().textFile("file:///home/lixinlei/project/gitee/dianping/src/main/resources/feature.csv").toJavaRDD();

        // convert each CSV line into a (label, features) Row
        JavaRDD<Row> rowJavaRDD = csvFile.map(new Function<String, Row>() {
            /**
             *
             * @param v1 one line of feature.csv;
             * @return a Row of (label, 11-dimensional feature vector)
             * @throws Exception
             */
            @Override
            public Row call(String v1) throws Exception {
                v1 = v1.replace("\"", "");
                String[] strArr = v1.split(",");
                return RowFactory.create(Double.valueOf(strArr[11]),
                                         Vectors.dense(
                                              Double.valueOf(strArr[0]),
                                              Double.valueOf(strArr[1]),
                                              Double.valueOf(strArr[2]),
                                              Double.valueOf(strArr[3]),
                                              Double.valueOf(strArr[4]),
                                              Double.valueOf(strArr[5]),
                                              Double.valueOf(strArr[6]),
                                              Double.valueOf(strArr[7]),
                                              Double.valueOf(strArr[8]),
                                              Double.valueOf(strArr[9]),
                                              Double.valueOf(strArr[10])));
            }
        });

        // define the schema
        StructType schema = new StructType(
                new StructField[]{
                        new StructField("label", DataTypes.DoubleType, false, Metadata.empty()),
                        new StructField("features", new VectorUDT(), false, Metadata.empty())
                }
        );

        // data has only two columns: the label, and an 11-dimensional feature vector;
        Dataset<Row> data = spark.createDataFrame(rowJavaRDD, schema);

        // split into training and test sets
        Dataset<Row>[] dataArr = data.randomSplit(new double[]{0.8, 0.2});
        Dataset<Row> trainData = dataArr[0];
        Dataset<Row> testData = dataArr[1];

        // model training | logistic regression
        LogisticRegression lr = new LogisticRegression()
                .setMaxIter(10) // maximum number of iterations
                .setRegParam(0.3) // regularization strength
                .setElasticNetParam(0.8) // elastic-net mixing: 0 = pure L2, 1 = pure L1
                .setFamily("multinomial"); // "binomial" would also work, since the label is 0/1
        LogisticRegressionModel lrModel = lr.fit(trainData);
        lrModel.save("file:///home/lixinlei/project/gitee/dianping/src/main/resources/lrmode");

        // evaluate on the test set
        Dataset<Row> predictions = lrModel.transform(testData);
        MulticlassClassificationEvaluator evaluator = new MulticlassClassificationEvaluator();
        double accuracy = evaluator.setMetricName("accuracy").evaluate(predictions);

        System.out.println("accuracy = " + accuracy);

    }

}