Getting Started with Streaming Data Processing in Apache Sedona

Introduction to Apache Flink

    Apache Flink is an open-source stream-processing framework developed by the Apache Software Foundation. Its core is a distributed streaming dataflow engine written in Java and Scala. Flink executes arbitrary dataflow programs in a data-parallel and pipelined manner, and its pipelined runtime can run both batch and stream processing programs.

Introduction to Apache Sedona

    Apache Sedona (incubating) is a cluster computing system for processing large-scale spatial data. Sedona extends Apache Spark and Apache Flink with a set of out-of-the-box distributed spatial datasets and spatial SQL, so large-scale spatial data can be loaded, processed, and analyzed efficiently across machines.

Processing Workflow

Java code

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;
import org.apache.sedona.flink.SedonaFlinkRegistrator;

// Set up the streaming environment and register Sedona's spatial types and SQL functions
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
EnvironmentSettings settings = EnvironmentSettings.newInstance().inStreamingMode().build();
StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env, settings);
SedonaFlinkRegistrator.registerType(env);
SedonaFlinkRegistrator.registerFunc(tableEnv);

// Read lines of the form "id,WKT" from a local socket (fed by netcat)
DataStream<String> socketTextStream = env.socketTextStream("localhost", 8888);
DataStream<Row> rowDataStream = socketTextStream.map(new MapFunction<String, Row>() {
    private static final long serialVersionUID = -3351062125994879777L;
    @Override
    public Row map(String line) throws Exception {
        // Split on the first comma only, so WKT that itself contains commas stays intact
        String[] fields = line.split(",", 2);
        int id = Integer.parseInt(fields[0]);
        String pointWkt = fields[1];
        return Row.of(pointWkt, id);
    }
}).returns(Types.ROW(Types.STRING, Types.INT));

// Expose the stream as a table, parse the WKT into a geometry column,
// then keep only the points contained in the query polygon
Table pointTable = tableEnv.fromDataStream(rowDataStream);
tableEnv.createTemporaryView("myTable", pointTable);
Table geomTbl = tableEnv.sqlQuery("SELECT ST_GeomFromWKT(f0) AS geom_polygon, f1 FROM myTable");
tableEnv.createTemporaryView("geoTable", geomTbl);
geomTbl = tableEnv.sqlQuery("SELECT f1, geom_polygon FROM geoTable WHERE ST_Contains(ST_GeomFromWKT('MultiPolygon (((110.499997 20.010307, 110.499995 20.010759, 110.500473 20.01076, 110.500475 20.010308, 110.499997 20.010307)))'), geom_polygon)");
geomTbl.execute().print();
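To make the filter concrete: ST_Contains keeps only the points that fall inside the query polygon. The standalone sketch below re-implements that check for the outer ring of the sample polygon with a plain ray-casting test, without Sedona or JTS; the class and method names here are made up for illustration only.

```java
// Hypothetical standalone illustration of what the ST_Contains filter decides:
// a ray-casting point-in-polygon test against the query polygon's outer ring.
public class PointInPolygonSketch {
    // Outer ring of the MultiPolygon used in the SQL query (x = lon, y = lat)
    static final double[][] RING = {
        {110.499997, 20.010307}, {110.499995, 20.010759},
        {110.500473, 20.01076},  {110.500475, 20.010308}
    };

    // Classic ray casting: count crossings of a horizontal ray from (x, y)
    public static boolean contains(double x, double y) {
        boolean inside = false;
        for (int i = 0, j = RING.length - 1; i < RING.length; j = i++) {
            double xi = RING[i][0], yi = RING[i][1];
            double xj = RING[j][0], yj = RING[j][1];
            if (((yi > y) != (yj > y))
                    && (x < (xj - xi) * (y - yi) / (yj - yi) + xi)) {
                inside = !inside;
            }
        }
        return inside;
    }

    public static void main(String[] args) {
        // Point 1 of the test data lies inside the polygon; point 2 does not
        System.out.println(contains(110.500235, 20.0105335));    // true
        System.out.println(contains(110.18832409, 20.06088375)); // false
    }
}
```

Running this against the test rows below shows why only id 1 survives the WHERE clause: it is the only sample point whose coordinates fall within the ring.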

pom.xml configuration

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>cn.hwang</groupId>
    <artifactId>geospark-dev</artifactId>
    <version>1.0</version>

    <properties>
        <scala.version>2.12</scala.version>
        <scala.compat.version>2.12</scala.compat.version>
        <geospark.version>1.2.0</geospark.version>
        <spark.compatible.version>3.0</spark.compatible.version>
        <spark.version>3.1.2</spark.version>
        <hadoop.version>3.2.0</hadoop.version>
        <geotools.version>24.0</geotools.version>
        <flink.version>1.14.3</flink.version>
        <kafka.version>2.8.1</kafka.version>
        <dependency.scope>compile</dependency.scope>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>

    </properties>

    <dependencies>

        <dependency>
            <groupId>org.apache.sedona</groupId>
            <artifactId>sedona-core-3.0_2.12</artifactId>
            <version>1.2.0-incubating</version>
        </dependency>
        <dependency>
            <groupId>org.apache.sedona</groupId>
            <artifactId>sedona-viz-3.0_2.12</artifactId>
            <version>1.2.0-incubating</version>
        </dependency>
        <dependency>
            <groupId>org.apache.sedona</groupId>
            <artifactId>sedona-sql-3.0_2.12</artifactId>
            <version>1.2.0-incubating</version>
        </dependency>
        <dependency>
            <groupId>org.apache.sedona</groupId>
            <artifactId>sedona-flink_2.12</artifactId>
            <version>1.2.0-incubating</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.12.13</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>${spark.version}</version>
            <scope>${dependency.scope}</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>${spark.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>${hadoop.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.postgresql/postgresql -->
        <dependency>
            <groupId>org.postgresql</groupId>
            <artifactId>postgresql</artifactId>
            <version>42.2.5</version>
        </dependency>

<!--        <dependency>-->
<!--            <groupId>org.geotools</groupId>-->
<!--            <artifactId>gt-shapefile</artifactId>-->
<!--            <version>22-RC</version>-->
<!--        </dependency>-->
        <dependency>
            <groupId>org.datasyslab</groupId>
            <artifactId>geotools-wrapper</artifactId>
            <version>1.1.0-25.2</version>
        </dependency>
        <dependency>
            <groupId>org.locationtech.jts</groupId>
            <artifactId>jts-core</artifactId>
            <version>1.18.0</version>
        </dependency>
<!--        <dependency>-->
<!--            <groupId>org.wololo</groupId>-->
<!--            <artifactId>jts2geojson</artifactId>-->
<!--            <version>0.16.1</version>-->
<!--        </dependency>-->
        <dependency>
            <groupId>org.wololo</groupId>
            <artifactId>jts2geojson</artifactId>
            <version>0.16.1</version>
            <exclusions>
                <exclusion>
                    <groupId>org.locationtech.jts</groupId>
                    <artifactId>jts-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.fasterxml.jackson.core</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.geotools</groupId>
            <artifactId>gt-main</artifactId>
            <version>24.0</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.geotools/gt-referencing -->
        <dependency>
            <groupId>org.geotools</groupId>
            <artifactId>gt-referencing</artifactId>
            <version>24.0</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.geotools/gt-epsg-hsql -->
        <dependency>
            <groupId>org.geotools</groupId>
            <artifactId>gt-epsg-hsql</artifactId>
            <version>24.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>${flink.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <!--        For Flink DataStream API-->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_${scala.compat.version}</artifactId>
            <version>${flink.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <!-- Kafka  -->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>${kafka.version}</version>
        </dependency>
        <!--        Flink Kafka connector-->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_${scala.compat.version}</artifactId>
            <version>${flink.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <!--        For running Flink inside the IDE-->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_${scala.compat.version}</artifactId>
            <version>${flink.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <!--        For the Flink Table API, planner, UDF/UDT, CSV-->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java-bridge_${scala.compat.version}</artifactId>
            <version>${flink.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <!--        Starting Flink 14, Blink planner has been renamed to the official Flink planner-->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_${scala.compat.version}</artifactId>
            <version>${flink.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-common</artifactId>
            <version>${flink.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-csv</artifactId>
            <version>${flink.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <!--        For Flink Web Ui in test-->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-runtime-web_${scala.compat.version}</artifactId>
            <version>${flink.version}</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.0</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>

    <repositories>
        <repository>
            <id>central</id>
            <name>maven.aliyun.com</name>
            <url>https://maven.aliyun.com/repository/public</url>
        </repository>
        <repository>
            <id>maven2-repository.dev.java.net</id>
            <name>Java.net repository</name>
            <url>https://download.java.net/maven/2</url>
        </repository>
        <repository>
            <id>osgeo</id>
            <name>OSGeo Release Repository</name>
            <url>https://repo.osgeo.org/repository/release/</url>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
            <releases>
                <enabled>true</enabled>
            </releases>
        </repository>
        <repository>
            <id>Central</id>
            <name>Central Repository</name>
            <url>https://repo1.maven.org/maven2/</url>
        </repository>

    </repositories>

</project>

    Download netcat, extract it locally, and run the nc program from cmd to simulate streaming input. Below is the test data template.


1,Point (110.500235 20.0105335)
2,Point (110.18832409 20.06088375)
3,Point (110.18784591 20.06088125)
4,Point (109.4116775 18.2997045)
5,Point (109.55539791 18.30762275)
6,Point (109.5483405 19.778523)

    Paste the test data into the nc session, then run the code.
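If netcat is not handy, the role it plays here can be sketched in plain Java: open a local server socket, write a couple of the sample rows, and read them back the way socketTextStream would. The MockNetcat class name, the ephemeral port, and the two-row sample are assumptions for illustration; the real job connects to port 8888.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.ArrayList;
import java.util.List;

// Minimal stand-in for netcat: serves sample rows on a local socket,
// then reads them back the way Flink's socketTextStream would.
public class MockNetcat {
    static final String[] ROWS = {
        "1,Point (110.500235 20.0105335)",
        "2,Point (110.18832409 20.06088375)"
    };

    public static List<String> serveAndRead() throws Exception {
        try (ServerSocket server = new ServerSocket(0)) { // ephemeral port; the job uses 8888
            Thread feeder = new Thread(() -> {
                try (Socket s = server.accept();
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    for (String row : ROWS) out.println(row); // one "id,WKT" line per row
                } catch (IOException ignored) { }
            });
            feeder.start();
            List<String> received = new ArrayList<>();
            try (Socket client = new Socket("localhost", server.getLocalPort());
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(client.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) received.add(line); // read until EOF
            }
            feeder.join();
            return received;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(serveAndRead());
    }
}
```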

    Apache Sedona can now successfully process spatial streaming data. Many thanks to JupiterChow for providing the sample code and configuration, and for the patient explanations.



Appendix (spent several days installing this but ended up not using it: deploying Flink 1.14 on Ubuntu)

Install the JDK

sudo apt update 
sudo apt install openjdk-11-jdk

Download and extract Flink

wget https://dlcdn.apache.org/flink/flink-1.14.4/flink-1.14.4-bin-scala_2.11.tgz
tar -xzf flink-1.14.4-bin-scala_2.11.tgz
cd flink-1.14.4

Start the cluster

./bin/start-cluster.sh

Run the bundled example

./bin/flink run ./examples/batch/WordCount.jar

Open the built-in web UI


wget https://github.com/glink-incubator/glink/releases/download/release-1.0.0/glink-1.0.0-bin.tar.gz
tar -zxvf glink-1.0.0-bin.tar.gz
./flink-1.14.4/bin/flink run ./examples/batch/WordCount.jar

References

https://zhuanlan.zhihu.com/p/447743903

https://www.cnblogs.com/liufei1983/p/15661322.html

https://blog.csdn.net/weixin_46684578/article/details/122803180

https://juejin.cn/post/7023210394894204936

https://cloud.tencent.com/developer/article/1626610

https://baike.baidu.com/item/Apache%20Flink/59924858

https://eternallybored.org/misc/netcat/

https://sedona.apache.org/
