Full Code of yangyichao-mango/flink-study for AI

Repository: yangyichao-mango/flink-study
Branch: main
Commit: 9e7daac7923b
Files: 393
Total size: 1.5 MB
Tokens: 379.8k
Symbols: 1459

Note: this is a truncated preview of the full export (1,743K characters total).

Directory structure:
gitextract_amrzktuf/
├── .gitignore
├── README.md
├── flink-examples-1.10/
│   ├── pom.xml
│   └── src/
│       └── main/
│           └── java/
│               └── flink/
│                   └── examples/
│                       └── sql/
│                           └── _07/
│                               └── query/
│                                   └── _06_joins/
│                                       └── _02_interval_joins/
│                                           └── _01_outer_join/
│                                               ├── WindowJoinFunction$46.java
│                                               └── _06_Interval_Outer_Joins_EventTime_Test.java
├── flink-examples-1.12/
│   ├── .gitignore
│   ├── pom.xml
│   └── src/
│       └── main/
│           └── java/
│               └── flink/
│                   └── examples/
│                       ├── datastream/
│                       │   └── _07/
│                       │       └── query/
│                       │           └── _04_window/
│                       │               └── _04_TumbleWindowTest.java
│                       └── sql/
│                           └── _07/
│                               └── query/
│                                   └── _04_window_agg/
│                                       ├── _04_TumbleWindowTest.java
│                                       ├── _04_TumbleWindowTest_GroupingWindowAggsHandler$59.java
│                                       ├── _04_TumbleWindowTest_KeyProjection$69.java
│                                       └── _04_TumbleWindowTest_WatermarkGenerator$6.java
├── flink-examples-1.13/
│   ├── .gitignore
│   ├── pom.xml
│   └── src/
│       ├── main/
│       │   ├── java/
│       │   │   └── flink/
│       │   │       ├── core/
│       │   │       │   └── source/
│       │   │       │       ├── JaninoUtils.java
│       │   │       │       └── SourceFactory.java
│       │   │       └── examples/
│       │   │           ├── FlinkEnvUtils.java
│       │   │           ├── JacksonUtils.java
│       │   │           ├── datastream/
│       │   │           │   ├── _01/
│       │   │           │   │   └── bytedance/
│       │   │           │   │       └── split/
│       │   │           │   │           ├── codegen/
│       │   │           │   │           │   ├── JaninoUtils.java
│       │   │           │   │           │   └── benchmark/
│       │   │           │   │           │       └── Benchmark.java
│       │   │           │   │           ├── job/
│       │   │           │   │           │   ├── SplitExampleJob.java
│       │   │           │   │           │   └── start.sh
│       │   │           │   │           ├── kafka/
│       │   │           │   │           │   ├── KafkaProducerCenter.java
│       │   │           │   │           │   └── demo/
│       │   │           │   │           │       ├── Application.java
│       │   │           │   │           │       ├── ConsumerThread.java
│       │   │           │   │           │       └── ProducerThread.java
│       │   │           │   │           ├── model/
│       │   │           │   │           │   ├── ClientLogSink.java
│       │   │           │   │           │   ├── ClientLogSource.java
│       │   │           │   │           │   ├── DynamicProducerRule.java
│       │   │           │   │           │   └── Evaluable.java
│       │   │           │   │           └── zkconfigcenter/
│       │   │           │   │               ├── ZkBasedConfigCenter.java
│       │   │           │   │               ├── new.json
│       │   │           │   │               └── old.json
│       │   │           │   ├── _02/
│       │   │           │   │   ├── DataStreamTest.java
│       │   │           │   │   └── DataStreamTest1.java
│       │   │           │   ├── _03/
│       │   │           │   │   ├── enums_state/
│       │   │           │   │   │   ├── EnumsStateTest.java
│       │   │           │   │   │   └── SenerioTest.java
│       │   │           │   │   └── state/
│       │   │           │   │       ├── StateExamplesTest.java
│       │   │           │   │       ├── _01_broadcast_state/
│       │   │           │   │       │   └── BroadcastStateTest.java
│       │   │           │   │       ├── _03_rocksdb/
│       │   │           │   │       │   ├── CreateStateBackendTest.java
│       │   │           │   │       │   ├── GettingStartDemo.java
│       │   │           │   │       │   ├── Rocksdb_OperatorAndKeyedState_StateStorageDIr_Test.java
│       │   │           │   │       │   ├── keyed_state/
│       │   │           │   │       │   │   ├── RocksBackendKeyedMapStateTest.java
│       │   │           │   │       │   │   └── RocksBackendKeyedValueStateTest.java
│       │   │           │   │       │   └── operator_state/
│       │   │           │   │       │       ├── KeyedStreamOperatorListStateTest.java
│       │   │           │   │       │       └── RocksBackendOperatorListStateTest.java
│       │   │           │   │       ├── _04_filesystem/
│       │   │           │   │       │   ├── keyed_state/
│       │   │           │   │       │   │   └── FsStateBackendKeyedMapStateTest.java
│       │   │           │   │       │   └── operator_state/
│       │   │           │   │       │       └── FsStateBackendOperatorListStateTest.java
│       │   │           │   │       └── _05_memory/
│       │   │           │   │           └── keyed_state/
│       │   │           │   │               └── MemoryStateBackendKeyedMapStateTest.java
│       │   │           │   ├── _04/
│       │   │           │   │   └── keyed_co_process/
│       │   │           │   │       ├── HashMapTest.java
│       │   │           │   │       └── _04_KeyedCoProcessFunctionTest.java
│       │   │           │   ├── _05_ken/
│       │   │           │   │   └── _01_watermark/
│       │   │           │   │       └── WatermarkTest.java
│       │   │           │   ├── _06_test/
│       │   │           │   │   └── _01_event_proctime/
│       │   │           │   │       ├── OneJobWIthProcAndEventTimeWIndowTest.java
│       │   │           │   │       └── OneJobWIthTimerTest.java
│       │   │           │   ├── _07_lambda_error/
│       │   │           │   │   └── LambdaErrorTest.java
│       │   │           │   ├── _08_late_record/
│       │   │           │   │   └── LatenessTest.java
│       │   │           │   ├── _09_join/
│       │   │           │   │   ├── _01_window_join/
│       │   │           │   │   │   └── _01_Window_Join_Test.java
│       │   │           │   │   └── _02_connect/
│       │   │           │   │       └── _01_Connect_Test.java
│       │   │           │   └── _10_agg/
│       │   │           │       └── AggTest.java
│       │   │           ├── practice/
│       │   │           │   └── _01/
│       │   │           │       └── dau/
│       │   │           │           └── _01_DataStream_Session_Window.java
│       │   │           ├── question/
│       │   │           │   ├── datastream/
│       │   │           │   │   └── _01/
│       │   │           │   │       └── kryo_protobuf_no_more_bytes_left/
│       │   │           │   │           └── KryoProtobufNoMoreBytesLeftTest.java
│       │   │           │   └── sql/
│       │   │           │       └── _01/
│       │   │           │           └── lots_source_fields_poor_performance/
│       │   │           │               ├── EmbeddedKafka.java
│       │   │           │               ├── _01_DataGenSourceTest.java
│       │   │           │               └── _01_JsonSourceTest.java
│       │   │           ├── runtime/
│       │   │           │   ├── _01/
│       │   │           │   │   └── future/
│       │   │           │   │       ├── CompletableFutureTest.java
│       │   │           │   │       ├── CompletableFutureTest4.java
│       │   │           │   │       ├── CompletableFuture_AnyOf_Test3.java
│       │   │           │   │       ├── CompletableFuture_ThenApplyAsync_Test2.java
│       │   │           │   │       ├── CompletableFuture_ThenComposeAsync_Test2.java
│       │   │           │   │       └── FutureTest.java
│       │   │           │   └── _04/
│       │   │           │       └── statebackend/
│       │   │           │           └── CancelAndRestoreWithCheckpointTest.java
│       │   │           └── sql/
│       │   │               ├── _01/
│       │   │               │   └── countdistincterror/
│       │   │               │       ├── CountDistinctErrorTest.java
│       │   │               │       ├── CountDistinctErrorTest2.java
│       │   │               │       ├── CountDistinctErrorTest3.java
│       │   │               │       └── udf/
│       │   │               │           ├── Mod_UDF.java
│       │   │               │           ├── StatusMapper1_UDF.java
│       │   │               │           └── StatusMapper_UDF.java
│       │   │               ├── _02/
│       │   │               │   └── timezone/
│       │   │               │       ├── TimeZoneTest.java
│       │   │               │       ├── TimeZoneTest2.java
│       │   │               │       └── TimeZoneTest3.java
│       │   │               ├── _03/
│       │   │               │   └── source_sink/
│       │   │               │       ├── CreateViewTest.java
│       │   │               │       ├── DataStreamSourceEventTimeTest.java
│       │   │               │       ├── DataStreamSourceProcessingTimeTest.java
│       │   │               │       ├── KafkaSourceTest.java
│       │   │               │       ├── RedisLookupTest.java
│       │   │               │       ├── RedisSinkTest.java
│       │   │               │       ├── SocketSourceTest.java
│       │   │               │       ├── TableApiKafkaSourceTest.java
│       │   │               │       ├── UpsertKafkaSinkProtobufFormatSupportTest.java
│       │   │               │       ├── UpsertKafkaSinkTest.java
│       │   │               │       ├── UserDefinedSourceTest.java
│       │   │               │       ├── abilities/
│       │   │               │       │   ├── sink/
│       │   │               │       │   │   ├── Abilities_SinkFunction.java
│       │   │               │       │   │   ├── Abilities_TableSink.java
│       │   │               │       │   │   ├── Abilities_TableSinkFactory.java
│       │   │               │       │   │   └── _01_SupportsWritingMetadata_Test.java
│       │   │               │       │   └── source/
│       │   │               │       │       ├── Abilities_SourceFunction.java
│       │   │               │       │       ├── Abilities_TableSource.java
│       │   │               │       │       ├── Abilities_TableSourceFactory.java
│       │   │               │       │       ├── _01_SupportsFilterPushDown_Test.java
│       │   │               │       │       ├── _02_SupportsLimitPushDown_Test.java
│       │   │               │       │       ├── _03_SupportsPartitionPushDown_Test.java
│       │   │               │       │       ├── _04_SupportsProjectionPushDown_JDBC_Test.java
│       │   │               │       │       ├── _04_SupportsProjectionPushDown_Test.java
│       │   │               │       │       ├── _05_SupportsReadingMetadata_Test.java
│       │   │               │       │       ├── _06_SupportsWatermarkPushDown_Test.java
│       │   │               │       │       ├── _07_SupportsSourceWatermark_Test.java
│       │   │               │       │       └── before/
│       │   │               │       │           ├── Before_Abilities_SourceFunction.java
│       │   │               │       │           ├── Before_Abilities_TableSource.java
│       │   │               │       │           ├── Before_Abilities_TableSourceFactory.java
│       │   │               │       │           ├── _01_Before_SupportsFilterPushDown_Test.java
│       │   │               │       │           ├── _02_Before_SupportsLimitPushDown_Test.java
│       │   │               │       │           ├── _03_Before_SupportsPartitionPushDown_Test.java
│       │   │               │       │           ├── _04_Before_SupportsProjectionPushDown_Test.java
│       │   │               │       │           ├── _05_Before_SupportsReadingMetadata_Test.java
│       │   │               │       │           ├── _06_Before_SupportsWatermarkPushDown_Test.java
│       │   │               │       │           └── _07_Before_SupportsSourceWatermark_Test.java
│       │   │               │       ├── ddl/
│       │   │               │       │   └── TableApiDDLTest.java
│       │   │               │       └── table/
│       │   │               │           ├── redis/
│       │   │               │           │   ├── container/
│       │   │               │           │   │   ├── RedisCommandsContainer.java
│       │   │               │           │   │   ├── RedisCommandsContainerBuilder.java
│       │   │               │           │   │   └── RedisContainer.java
│       │   │               │           │   ├── demo/
│       │   │               │           │   │   └── RedisDemo.java
│       │   │               │           │   ├── mapper/
│       │   │               │           │   │   ├── LookupRedisMapper.java
│       │   │               │           │   │   ├── RedisCommand.java
│       │   │               │           │   │   ├── RedisCommandDescription.java
│       │   │               │           │   │   └── SetRedisMapper.java
│       │   │               │           │   ├── options/
│       │   │               │           │   │   ├── RedisLookupOptions.java
│       │   │               │           │   │   ├── RedisOptions.java
│       │   │               │           │   │   └── RedisWriteOptions.java
│       │   │               │           │   ├── v1/
│       │   │               │           │   │   ├── RedisDynamicTableFactory.java
│       │   │               │           │   │   ├── sink/
│       │   │               │           │   │   │   └── RedisDynamicTableSink.java
│       │   │               │           │   │   └── source/
│       │   │               │           │   │       ├── RedisDynamicTableSource.java
│       │   │               │           │   │       └── RedisRowDataLookupFunction.java
│       │   │               │           │   └── v2/
│       │   │               │           │       ├── RedisDynamicTableFactory.java
│       │   │               │           │       ├── sink/
│       │   │               │           │       │   └── RedisDynamicTableSink.java
│       │   │               │           │       └── source/
│       │   │               │           │           ├── RedisDynamicTableSource.java
│       │   │               │           │           ├── RedisRowDataBatchLookupFunction.java
│       │   │               │           │           └── RedisRowDataLookupFunction.java
│       │   │               │           ├── socket/
│       │   │               │           │   ├── SocketDynamicTableFactory.java
│       │   │               │           │   ├── SocketDynamicTableSource.java
│       │   │               │           │   └── SocketSourceFunction.java
│       │   │               │           └── user_defined/
│       │   │               │               ├── UserDefinedDynamicTableFactory.java
│       │   │               │               ├── UserDefinedDynamicTableSource.java
│       │   │               │               └── UserDefinedSource.java
│       │   │               ├── _04/
│       │   │               │   └── type/
│       │   │               │       ├── BlinkPlannerTest.java
│       │   │               │       ├── JavaEnvTest.java
│       │   │               │       └── OldPlannerTest.java
│       │   │               ├── _05/
│       │   │               │   └── format/
│       │   │               │       └── formats/
│       │   │               │           ├── ProtobufFormatTest.java
│       │   │               │           ├── SocketWriteTest.java
│       │   │               │           ├── csv/
│       │   │               │           │   ├── ChangelogCsvDeserializer.java
│       │   │               │           │   ├── ChangelogCsvFormat.java
│       │   │               │           │   └── ChangelogCsvFormatFactory.java
│       │   │               │           ├── protobuf/
│       │   │               │           │   ├── descriptors/
│       │   │               │           │   │   ├── Protobuf.java
│       │   │               │           │   │   └── ProtobufValidator.java
│       │   │               │           │   ├── row/
│       │   │               │           │   │   ├── ProtobufDeserializationSchema.java
│       │   │               │           │   │   ├── ProtobufRowDeserializationSchema.java
│       │   │               │           │   │   ├── ProtobufRowFormatFactory.java
│       │   │               │           │   │   ├── ProtobufRowSerializationSchema.java
│       │   │               │           │   │   ├── ProtobufSerializationSchema.java
│       │   │               │           │   │   ├── ProtobufUtils.java
│       │   │               │           │   │   └── typeutils/
│       │   │               │           │   │       └── ProtobufSchemaConverter.java
│       │   │               │           │   └── rowdata/
│       │   │               │           │       ├── ProtobufFormatFactory.java
│       │   │               │           │       ├── ProtobufOptions.java
│       │   │               │           │       ├── ProtobufRowDataDeserializationSchema.java
│       │   │               │           │       ├── ProtobufRowDataSerializationSchema.java
│       │   │               │           │       ├── ProtobufToRowDataConverters.java
│       │   │               │           │       └── RowDataToProtobufConverters.java
│       │   │               │           └── utils/
│       │   │               │               ├── MoreRunnables.java
│       │   │               │               ├── MoreSuppliers.java
│       │   │               │               ├── ThrowableRunable.java
│       │   │               │               └── ThrowableSupplier.java
│       │   │               ├── _06/
│       │   │               │   └── calcite/
│       │   │               │       ├── CalciteTest.java
│       │   │               │       ├── ParserTest.java
│       │   │               │       └── javacc/
│       │   │               │           ├── JavaccCodeGenTest.java
│       │   │               │           ├── Simple1Test.java
│       │   │               │           └── generatedcode/
│       │   │               │               ├── ParseException.java
│       │   │               │               ├── Simple1.java
│       │   │               │               ├── Simple1Constants.java
│       │   │               │               ├── Simple1TokenManager.java
│       │   │               │               ├── SimpleCharStream.java
│       │   │               │               ├── Token.java
│       │   │               │               └── TokenMgrError.java
│       │   │               ├── _07/
│       │   │               │   └── query/
│       │   │               │       ├── _01_select_where/
│       │   │               │       │   ├── SelectWhereHiveDialect.java
│       │   │               │       │   ├── SelectWhereTest.java
│       │   │               │       │   ├── SelectWhereTest2.java
│       │   │               │       │   ├── SelectWhereTest3.java
│       │   │               │       │   ├── SelectWhereTest4.java
│       │   │               │       │   ├── SelectWhereTest5.java
│       │   │               │       │   └── StreamExecCalc$10.java
│       │   │               │       ├── _02_select_distinct/
│       │   │               │       │   ├── GroupAggsHandler$5.java
│       │   │               │       │   ├── KeyProjection$0.java
│       │   │               │       │   ├── SelectDistinctTest.java
│       │   │               │       │   └── SelectDistinctTest2.java
│       │   │               │       ├── _03_group_agg/
│       │   │               │       │   ├── _01_group_agg/
│       │   │               │       │   │   ├── GroupAggMiniBatchTest.java
│       │   │               │       │   │   ├── GroupAggTest.java
│       │   │               │       │   │   └── GroupAggsHandler$39.java
│       │   │               │       │   ├── _02_count_distinct/
│       │   │               │       │   │   ├── CountDistinctGroupAggTest.java
│       │   │               │       │   │   └── GroupAggsHandler$17.java
│       │   │               │       │   ├── _03_grouping_sets/
│       │   │               │       │   │   ├── GroupingSetsEqualsGroupAggUnionAllGroupAggTest2.java
│       │   │               │       │   │   ├── GroupingSetsGroupAggTest.java
│       │   │               │       │   │   ├── GroupingSetsGroupAggTest2.java
│       │   │               │       │   │   └── StreamExecExpand$20.java
│       │   │               │       │   ├── _04_cube/
│       │   │               │       │   │   ├── CubeGroupAggTest.java
│       │   │               │       │   │   └── CubeGroupAggTest2.java
│       │   │               │       │   └── _05_rollup/
│       │   │               │       │       ├── RollUpGroupAggTest.java
│       │   │               │       │       └── RollUpGroupAggTest2.java
│       │   │               │       ├── _04_window_agg/
│       │   │               │       │   ├── _01_tumble_window/
│       │   │               │       │   │   ├── TumbleWindow2GroupAggTest.java
│       │   │               │       │   │   ├── TumbleWindowTest.java
│       │   │               │       │   │   ├── TumbleWindowTest2.java
│       │   │               │       │   │   ├── TumbleWindowTest3.java
│       │   │               │       │   │   ├── TumbleWindowTest4.java
│       │   │               │       │   │   ├── TumbleWindowTest5.java
│       │   │               │       │   │   ├── global_agg/
│       │   │               │       │   │   │   ├── GlobalWindowAggsHandler$232.java
│       │   │               │       │   │   │   ├── LocalWindowAggsHandler$162.java
│       │   │               │       │   │   │   └── StateWindowAggsHandler$300.java
│       │   │               │       │   │   └── local_agg/
│       │   │               │       │   │       ├── KeyProjection$89.java
│       │   │               │       │   │       └── LocalWindowAggsHandler$88.java
│       │   │               │       │   ├── _02_cumulate_window/
│       │   │               │       │   │   ├── CumulateWindowGroupingSetsBigintTest.java
│       │   │               │       │   │   ├── CumulateWindowGroupingSetsTest.java
│       │   │               │       │   │   ├── CumulateWindowTest.java
│       │   │               │       │   │   ├── TumbleWindowEarlyFireTest.java
│       │   │               │       │   │   ├── cumulate/
│       │   │               │       │   │   │   ├── global_agg/
│       │   │               │       │   │   │   │   ├── GlobalWindowAggsHandler$232.java
│       │   │               │       │   │   │   │   ├── KeyProjection$301.java
│       │   │               │       │   │   │   │   ├── LocalWindowAggsHandler$162.java
│       │   │               │       │   │   │   │   └── StateWindowAggsHandler$300.java
│       │   │               │       │   │   │   └── local_agg/
│       │   │               │       │   │   │       ├── KeyProjection$89.java
│       │   │               │       │   │   │       └── LocalWindowAggsHandler$88.java
│       │   │               │       │   │   └── earlyfire/
│       │   │               │       │   │       ├── GroupAggsHandler$210.java
│       │   │               │       │   │       └── GroupingWindowAggsHandler$57.java
│       │   │               │       │   └── _03_hop_window/
│       │   │               │       │       └── HopWindowGroupWindowAggTest.java
│       │   │               │       ├── _05_over/
│       │   │               │       │   ├── _01_row_number/
│       │   │               │       │   │   ├── RowNumberOrderByBigintTest.java
│       │   │               │       │   │   ├── RowNumberOrderByStringTest.java
│       │   │               │       │   │   ├── RowNumberOrderByUnixTimestampTest.java
│       │   │               │       │   │   ├── RowNumberWithoutPartitionKeyTest.java
│       │   │               │       │   │   ├── RowNumberWithoutRowNumberEqual1Test.java
│       │   │               │       │   │   └── Scalar_UDF.java
│       │   │               │       │   └── _02_agg/
│       │   │               │       │       ├── RangeIntervalProctimeTest.java
│       │   │               │       │       ├── RangeIntervalRowtimeAscendingTest.java
│       │   │               │       │       ├── RangeIntervalRowtimeBoundedOutOfOrdernessTest.java
│       │   │               │       │       ├── RangeIntervalRowtimeStrictlyAscendingTest.java
│       │   │               │       │       └── RowIntervalTest.java
│       │   │               │       ├── _06_joins/
│       │   │               │       │   ├── _01_regular_joins/
│       │   │               │       │   │   ├── _01_inner_join/
│       │   │               │       │   │   │   ├── ConditionFunction$4.java
│       │   │               │       │   │   │   ├── _01_InnerJoinsTest.java
│       │   │               │       │   │   │   └── _02_InnerJoinsOnNotEqualTest.java
│       │   │               │       │   │   └── _02_outer_join/
│       │   │               │       │   │       ├── _01_LeftJoinsTest.java
│       │   │               │       │   │       ├── _02_RightJoinsTest.java
│       │   │               │       │   │       └── _03_FullJoinsTest.java
│       │   │               │       │   ├── _02_interval_joins/
│       │   │               │       │   │   ├── _01_proctime/
│       │   │               │       │   │   │   ├── Interval_Full_Joins_ProcesingTime_Test.java
│       │   │               │       │   │   │   ├── Interval_Inner_Joins_ProcesingTime_Test.java
│       │   │               │       │   │   │   ├── Interval_Left_Joins_ProcesingTime_Test.java
│       │   │               │       │   │   │   └── Interval_Right_Joins_ProcesingTime_Test.java
│       │   │               │       │   │   └── _02_row_time/
│       │   │               │       │   │       ├── Interval_Full_JoinsOnNotEqual_EventTime_Test.java
│       │   │               │       │   │       ├── Interval_Full_Joins_EventTime_Test.java
│       │   │               │       │   │       ├── Interval_Inner_Joins_EventTime_Test.java
│       │   │               │       │   │       ├── Interval_Left_Joins_EventTime_Test.java
│       │   │               │       │   │       └── Interval_Right_Joins_EventTime_Test.java
│       │   │               │       │   ├── _03_temporal_join/
│       │   │               │       │   │   ├── _01_proctime/
│       │   │               │       │   │   │   └── Temporal_Join_ProcesingTime_Test.java
│       │   │               │       │   │   └── _02_row_time/
│       │   │               │       │   │       └── Temporal_Join_EventTime_Test.java
│       │   │               │       │   ├── _04_lookup_join/
│       │   │               │       │   │   └── _01_redis/
│       │   │               │       │   │       ├── RedisBatchLookupTest2.java
│       │   │               │       │   │       ├── RedisDemo.java
│       │   │               │       │   │       ├── RedisLookupTest.java
│       │   │               │       │   │       ├── RedisLookupTest2.java
│       │   │               │       │   │       └── pipeline/
│       │   │               │       │   │           ├── BatchJoinTableFuncCollector$8.java
│       │   │               │       │   │           ├── BatchLookupFunction$4.java
│       │   │               │       │   │           ├── JoinTableFuncCollector$8.java
│       │   │               │       │   │           ├── JoinTableFuncCollector$9.java
│       │   │               │       │   │           ├── LookupFunction$4.java
│       │   │               │       │   │           ├── LookupFunction$5.java
│       │   │               │       │   │           └── T1.java
│       │   │               │       │   ├── _05_array_expansion/
│       │   │               │       │   │   └── _01_ArrayExpansionTest.java
│       │   │               │       │   └── _06_table_function/
│       │   │               │       │       └── _01_inner_join/
│       │   │               │       │           ├── TableFunctionInnerJoin_Test.java
│       │   │               │       │           └── TableFunctionInnerJoin_WithEmptyTableFunction_Test.java
│       │   │               │       ├── _07_deduplication/
│       │   │               │       │   ├── DeduplicationProcessingTimeTest.java
│       │   │               │       │   ├── DeduplicationProcessingTimeTest1.java
│       │   │               │       │   └── DeduplicationRowTimeTest.java
│       │   │               │       ├── _08_datastream_trans/
│       │   │               │       │   ├── AlertExample.java
│       │   │               │       │   ├── AlertExampleRetract.java
│       │   │               │       │   ├── AlertExampleRetractError.java
│       │   │               │       │   ├── RetractExample.java
│       │   │               │       │   └── Test.java
│       │   │               │       ├── _09_set_operations/
│       │   │               │       │   ├── Except_Test.java
│       │   │               │       │   ├── Exist_Test.java
│       │   │               │       │   ├── In_Test.java
│       │   │               │       │   ├── Intersect_Test.java
│       │   │               │       │   ├── UnionAll_Test.java
│       │   │               │       │   └── Union_Test.java
│       │   │               │       ├── _10_order_by/
│       │   │               │       │   ├── OrderBy_with_time_attr_Test.java
│       │   │               │       │   └── OrderBy_without_time_attr_Test.java
│       │   │               │       ├── _11_limit/
│       │   │               │       │   └── Limit_Test.java
│       │   │               │       ├── _12_topn/
│       │   │               │       │   └── TopN_Test.java
│       │   │               │       ├── _13_window_topn/
│       │   │               │       │   └── WindowTopN_Test.java
│       │   │               │       ├── _14_retract/
│       │   │               │       │   └── Retract_Test.java
│       │   │               │       ├── _15_exec_options/
│       │   │               │       │   ├── Default_Parallelism_Test.java
│       │   │               │       │   ├── Idle_Timeout_Test.java
│       │   │               │       │   └── State_Ttl_Test.java
│       │   │               │       ├── _16_optimizer_options/
│       │   │               │       │   ├── Agg_OnePhase_Strategy_window_Test.java
│       │   │               │       │   ├── Agg_TwoPhase_Strategy_unbounded_Test.java
│       │   │               │       │   ├── Agg_TwoPhase_Strategy_window_Test.java
│       │   │               │       │   ├── DistinctAgg_Split_One_Distinct_Key_Test.java
│       │   │               │       │   └── DistinctAgg_Split_Two_Distinct_Key_Test.java
│       │   │               │       ├── _17_table_options/
│       │   │               │       │   ├── Dml_Syc_False_Test.java
│       │   │               │       │   ├── Dml_Syc_True_Test.java
│       │   │               │       │   └── TimeZone_window_Test.java
│       │   │               │       └── _18_performance_tuning/
│       │   │               │           └── Count_Distinct_Filter_Test.java
│       │   │               ├── _08/
│       │   │               │   └── batch/
│       │   │               │       ├── Utils.java
│       │   │               │       ├── _01_ddl/
│       │   │               │       │   └── HiveDDLTest.java
│       │   │               │       ├── _02_dml/
│       │   │               │       │   ├── HiveDMLBetweenAndTest.java
│       │   │               │       │   ├── HiveDMLTest.java
│       │   │               │       │   ├── HiveTest2.java
│       │   │               │       │   ├── _01_hive_dialect/
│       │   │               │       │   │   └── HiveDMLTest.java
│       │   │               │       │   ├── _02_with_as/
│       │   │               │       │   │   └── HIveWIthAsTest.java
│       │   │               │       │   ├── _03_substr/
│       │   │               │       │   │   └── HiveSubstrTest.java
│       │   │               │       │   ├── _04_tumble_window/
│       │   │               │       │   │   ├── Test.java
│       │   │               │       │   │   ├── Test1.java
│       │   │               │       │   │   ├── Test2_BIGINT_SOURCE.java
│       │   │               │       │   │   ├── Test3.java
│       │   │               │       │   │   └── Test5.java
│       │   │               │       │   ├── _05_batch_to_datastream/
│       │   │               │       │   │   └── Test.java
│       │   │               │       │   └── _06_select_where/
│       │   │               │       │       └── Test.java
│       │   │               │       ├── _03_hive_udf/
│       │   │               │       │   ├── HiveModuleV2.java
│       │   │               │       │   ├── HiveUDFRegistryTest.java
│       │   │               │       │   ├── HiveUDFRegistryUnloadTest.java
│       │   │               │       │   ├── _01_GenericUDAFResolver2/
│       │   │               │       │   │   ├── HiveUDAF_hive_module_registry_Test.java
│       │   │               │       │   │   ├── HiveUDAF_sql_registry_create_function_Test.java
│       │   │               │       │   │   ├── HiveUDAF_sql_registry_create_temporary_function_Test.java
│       │   │               │       │   │   └── TestHiveUDAF.java
│       │   │               │       │   ├── _02_GenericUDTF/
│       │   │               │       │   │   ├── HiveUDTF_hive_module_registry_Test.java
│       │   │               │       │   │   ├── HiveUDTF_sql_registry_create_function_Test.java
│       │   │               │       │   │   ├── HiveUDTF_sql_registry_create_temporary_function_Test.java
│       │   │               │       │   │   └── TestHiveUDTF.java
│       │   │               │       │   ├── _03_built_in_udf/
│       │   │               │       │   │   ├── _01_get_json_object/
│       │   │               │       │   │   │   └── HiveUDF_get_json_object_Test.java
│       │   │               │       │   │   └── _02_rlike/
│       │   │               │       │   │       └── HiveUDF_rlike_Test.java
│       │   │               │       │   └── _04_GenericUDF/
│       │   │               │       │       ├── HiveUDF_hive_module_registry_Test.java
│       │   │               │       │       ├── HiveUDF_sql_registry_create_function_Test.java
│       │   │               │       │       ├── HiveUDF_sql_registry_create_temporary_function_Test.java
│       │   │               │       │       └── TestGenericUDF.java
│       │   │               │       ├── _04_flink_udf/
│       │   │               │       │   ├── FlinkUDAF_Test.java
│       │   │               │       │   ├── FlinkUDF_Test.java
│       │   │               │       │   └── FlinkUDTF_Test.java
│       │   │               │       └── _05_test/
│       │   │               │           └── _01_batch_to_datastream/
│       │   │               │               └── Test.java
│       │   │               ├── _09/
│       │   │               │   └── udf/
│       │   │               │       ├── _01_hive_udf/
│       │   │               │       │   └── _01_GenericUDF/
│       │   │               │       │       ├── HiveUDF_sql_registry_create_function_Test.java
│       │   │               │       │       ├── HiveUDF_sql_registry_create_function_with_hive_catalog_Test.java
│       │   │               │       │       ├── HiveUDF_sql_registry_create_temporary_function_Test.java
│       │   │               │       │       ├── HiveUDF_sql_registry_create_temporary_function_with_hive_catalog_Test.java
│       │   │               │       │       └── TestGenericUDF.java
│       │   │               │       ├── _02_stream_hive_udf/
│       │   │               │       │   ├── HiveUDF_Error_Test.java
│       │   │               │       │   ├── HiveUDF_create_temporary_error_Test.java
│       │   │               │       │   ├── HiveUDF_hive_module_registry_Test.java
│       │   │               │       │   ├── HiveUDF_load_first_Test.java
│       │   │               │       │   ├── HiveUDF_load_second_Test.java
│       │   │               │       │   ├── TestGenericUDF.java
│       │   │               │       │   └── UserDefinedSource.java
│       │   │               │       ├── _03_advanced_type_inference/
│       │   │               │       │   ├── AdvancedFunctionsExample.java
│       │   │               │       │   ├── InternalRowMergerFunction.java
│       │   │               │       │   └── LastDatedValueFunction.java
│       │   │               │       ├── _04_udf/
│       │   │               │       │   └── UDAF_Test.java
│       │   │               │       └── _05_scalar_function/
│       │   │               │           ├── ExplodeUDTF.java
│       │   │               │           ├── ExplodeUDTFV2.java
│       │   │               │           ├── GetMapValue.java
│       │   │               │           ├── GetSetValue.java
│       │   │               │           ├── ScalarFunctionTest.java
│       │   │               │           ├── ScalarFunctionTest2.java
│       │   │               │           ├── SetStringUDF.java
│       │   │               │           └── TableFunctionTest2.java
│       │   │               ├── _10_share/
│       │   │               │   └── A.java
│       │   │               ├── _11_explain/
│       │   │               │   └── Explain_Test.java
│       │   │               └── _12_data_type/
│       │   │                   ├── _01_interval/
│       │   │                   │   ├── Timestamp3_Interval_To_Test.java
│       │   │                   │   └── Timestamp_ltz3_Interval_To_Test.java
│       │   │                   ├── _02_user_defined/
│       │   │                   │   ├── User.java
│       │   │                   │   ├── UserDefinedDataTypes_Test.java
│       │   │                   │   ├── UserDefinedDataTypes_Test2.java
│       │   │                   │   └── UserScalarFunction.java
│       │   │                   └── _03_raw/
│       │   │                       ├── RawScalarFunction.java
│       │   │                       └── Raw_DataTypes_Test2.java
│       │   ├── javacc/
│       │   │   └── Simple1.jj
│       │   ├── proto/
│       │   │   ├── source.proto
│       │   │   └── test.proto
│       │   ├── resources/
│       │   │   └── META-INF/
│       │   │       └── services/
│       │   │           └── org.apache.flink.table.factories.Factory
│       │   └── scala/
│       │       └── flink/
│       │           └── examples/
│       │               └── sql/
│       │                   └── _04/
│       │                       └── type/
│       │                           └── TableFunc0.scala
│       └── test/
│           ├── java/
│           │   └── flink/
│           │       └── examples/
│           │           └── sql/
│           │               ├── _05/
│           │               │   └── format/
│           │               │       └── formats/
│           │               │           └── protobuf/
│           │               │               ├── row/
│           │               │               │   ├── ProtobufRowDeserializationSchemaTest.java
│           │               │               │   └── ProtobufRowSerializationSchemaTest.java
│           │               │               └── rowdata/
│           │               │                   ├── ProtobufRowDataDeserializationSchemaTest.java
│           │               │                   └── ProtobufRowDataSerializationSchemaTest.java
│           │               ├── _06/
│           │               │   └── calcite/
│           │               │       └── CalciteTest.java
│           │               └── _07/
│           │                   └── query/
│           │                       └── _06_joins/
│           │                           └── JaninoCompileTest.java
│           ├── proto/
│           │   └── person.proto
│           └── scala/
│               ├── ScalaEnv.scala
│               └── TableFunc0.scala
├── flink-examples-1.14/
│   ├── pom.xml
│   └── src/
│       └── main/
│           └── java/
│               └── flink/
│                   └── examples/
│                       └── sql/
│                           └── _08/
│                               └── batch/
│                                   ├── HiveModuleV2.java
│                                   └── Test.java
├── flink-examples-1.8/
│   ├── .gitignore
│   └── pom.xml
└── pom.xml

================================================
FILE CONTENTS
================================================

================================================
FILE: .gitignore
================================================
HELP.md
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**
#**/src/test/**
.idea/
*.iml
*.DS_Store

### IntelliJ IDEA ###
.idea
*.iws
*.ipr



================================================
FILE: README.md
================================================
# 1. Friendly Tips

> 1. Contact me: if you have any questions, feel free to reach out (WeChat Official Account: [`大数据羊说`](#32-wechat-official-account), and mention you came from `GitHub`)
> 2. This repository is continuously updated with practical flink tutorial material. If you are passing by, please give the project a `star` — writing all of this took real persistence, and a star is great encouragement to keep going!

![image](https://raw.githubusercontent.com/yangyichao-mango/yangyichao-mango.github.io/master/1631459281928.png)

![Stargazers over time](https://starchart.cc/yangyichao-mango/flink-study.svg)

<br>
<p align="center">
    <a href="#32-wechat-official-account" style="text-decoration:none;">
        <img src="https://img.shields.io/badge/WeChat-%E5%85%AC%E4%BC%97%E5%8F%B7-green" alt="WeChat Official Account" />
    </a>
    <a href="https://www.zhihu.com/people/onemango" target="_blank" style="text-decoration:none;">
        <img src="https://img.shields.io/badge/zhihu-%E7%9F%A5%E4%B9%8E-blue" alt="知乎" />
    </a>
    <a href="https://juejin.cn/user/562562548382926" target="_blank" style="text-decoration:none;">
        <img src="https://img.shields.io/badge/juejin-%E6%8E%98%E9%87%91-blue" alt="掘金" />
    </a>
    <a href="https://blog.csdn.net/qq_34608620?spm=1001.2014.3001.5343&type=blog" target="_blank" style="text-decoration:none;">
        <img src="https://img.shields.io/badge/csdn-CSDN-red" alt="CSDN" />
    </a>
    <a href="https://home.51cto.com/space?uid=15322900" target="_blank" style="text-decoration:none;">
        <img src="https://img.shields.io/badge/51cto-51CT0%E5%8D%9A%E5%AE%A2-orange" alt="51CT0博客" />
        </a>
    <img src="https://img.shields.io/github/stars/yangyichao-mango/flink-study" alt="投稿">           
</p>

# 2. Article Index

> Below is a continuously updated collection of the author's original articles and learning resources. If they helped you, please give the repo a star to show your support — thank you!

## 2.1.flink sql

1. [WeChat article: Pitfall Diary | flink sql count has this kind of pitfall too!](https://mp.weixin.qq.com/s/5XDkmuEIfHB_WsMHPeinkw), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror)
2. [WeChat article: Hands-on | flink sql meets the Weibo trending list!!!](https://mp.weixin.qq.com/s/GHLoWMBZxajA2nXPHhH8WA)
3. [WeChat article: flink sql: knowing the why (1) | source/sink internals](https://mp.weixin.qq.com/s/xIXh8B_suAlKSp56aO5aEg), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink)
4. [WeChat article: flink sql: knowing the why (2) | a custom redis lookup (dimension) table (source included)](https://mp.weixin.qq.com/s/b_zV_tGp5QJQjgnSaxNT_Q), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink)
5. [WeChat article: flink sql: knowing the why (3) | a custom redis sink table (source included)](https://mp.weixin.qq.com/s/7Fwey_AXNJ0jQZWfXvtNmw), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink)
6. [WeChat article: flink sql: knowing the why (4) | the SQL API type system](https://mp.weixin.qq.com/s/aqDRWgr3Kim7lblx10JvtA), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_04/type)
7. [WeChat article: flink sql: knowing the why (5) | a custom protobuf format](https://mp.weixin.qq.com/s/STUC4trW-HA3cnrsqT-N6g), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats)
8. [WeChat article: flink sql: knowing the why (6) | flink sql meets calcite (this article is all you need)](https://mp.weixin.qq.com/s/SxRKp368mYSKVmuduPoXFg), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite)
9. [WeChat article: flink sql: knowing the why (7): don't tell me you haven't even seen ETL and group agg, the scenarios best suited to flink sql?](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_07/query)
10. [WeChat article: flink sql: knowing the why (8): the wondrous journey of parsing the flink sql tumble window](https://mp.weixin.qq.com/s/IRmt8dWmxAmbBh696akHdw), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window)
11. [WeChat article: flink sql: knowing the why (9): clever insights into the window TVF tumble window](https://mp.weixin.qq.com/s/QVuu5_N4lHo5gXlt1tdncw), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window)
12. [WeChat article: flink sql: knowing the why (10): everyone is using cumulate window now](https://mp.weixin.qq.com/s/IqAzjrQmcGmnxvHm1FAV5g), [source](https://github.com/yangyichao-mango/flink-study/blob/main/flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_02_cumulate_window/CumulateWindowTest.java)
13. [WeChat article: flink sql: knowing the why (11): dedup is not just count distinct — there is also the powerful deduplication](https://mp.weixin.qq.com/s/VL6egD76B4J7IcpHShTq7Q), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_01_row_number)
14. [WeChat article: flink sql: knowing the why (12): is stream join really that hard??? (part 1)](https://mp.weixin.qq.com/s/Z8QfKfhrX5KEnR-s7gRtsA), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_01_regular_joins)
15. [WeChat article: flink sql: knowing the why (13): is stream join really that hard??? (part 2)](), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins)
16. [WeChat article: flink sql: knowing the why (14): the road to lookup join performance optimization (part 1), source included](), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_04_lookup_join/_01_redis)
17. [WeChat article: flink sql: knowing the why (15): tweaked the source code and implemented a batch lookup join (source included)](), [source](https://github.com/yangyichao-mango/flink-study/blob/main/flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_04_lookup_join/_01_redis/RedisBatchLookupTest2.java)
18. [WeChat article: flink sql: knowing the why (18): how do you use hive UDFs in flink? source included](), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_02_stream_hive_udf)
19. [WeChat article: flink sql: knowing the why (19): converting back and forth between Table and DataStream (source included)](), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_08_datastream_trans)
20. [WeChat article: (Part 1) The most comprehensive collection ever! The road to Flink SQL mastery (180,000 words, 138 examples, 42 figures)](), [source](https://github.com/yangyichao-mango/flink-study/blob/main/flink-examples-1.13/src/main/java/flink/examples/sql)
21. [WeChat article: (Part 2) The most comprehensive collection ever! The road to Flink SQL mastery (180,000 words, 138 examples, 42 figures)](), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql)
22. [WeChat article: (Part 3) The most comprehensive collection ever! The road to Flink SQL mastery (180,000 words, 138 examples, 42 figures)](), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/sql)

## 2.2. flink in practice

1. [WeChat article: Unveiling ByteDance's real-time dynamic processing engine for event-tracking data (source included)](https://mp.weixin.qq.com/s/PoK0XOA9OHIDJezb1fLOMw), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split)
2. [WeChat article: Pitfall Diary | flink state serialization of a java enum unexpectedly went wrong](https://mp.weixin.qq.com/s/YElwTL-wzo2UVVIsIH_9YA), [source](https://github.com/yangyichao-mango/flink-study/tree/main/flink-examples-1.13/src/main/java/flink/examples/datastream/_03/enums_state)
3. [WeChat article: Restoring state when debugging flink locally in IDEA](https://mp.weixin.qq.com/s/rLeKY_49q8rR9C_RmlTmhg), [source](https://github.com/yangyichao-mango/flink-study/blob/main/flink-examples-1.13/src/main/java/flink/examples/runtime/_04/statebackend/CancelAndRestoreWithCheckpointTest.java)

# 3. Contact Me

## 3.1. WeChat

If you have any questions while learning, feel free to add the author on WeChat — let's learn and exchange ideas together!

![WeChat QR code](https://raw.githubusercontent.com/yangyichao-mango/yangyichao-mango.github.io/master/1.png)

## 3.2. WeChat Official Account

To follow my new articles and shared material as soon as they are published, follow my WeChat Official Account: **大数据羊说**

![Official Account QR code](https://raw.githubusercontent.com/yangyichao-mango/yangyichao-mango.github.io/master/2.png)


================================================
FILE: flink-examples-1.10/pom.xml
================================================
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>flink-study</artifactId>
        <groupId>com.github.antigeneral</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.github.antigeneral</groupId>
    <artifactId>flink-examples-1.10</artifactId>


    <build>

        <extensions>
            <extension>
                <groupId>kr.motd.maven</groupId>
                <artifactId>os-maven-plugin</artifactId>
                <version>${os-maven-plugin.version}</version>
            </extension>
        </extensions>

        <plugins>


            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>8</source>
                    <target>8</target>
                </configuration>
            </plugin>

            <plugin>
                <groupId>org.xolstice.maven.plugins</groupId>
                <artifactId>protobuf-maven-plugin</artifactId>
                <version>${protobuf-maven-plugin.version}</version>
                <configuration>
                    <protoSourceRoot>
                        src/test/proto
                    </protoSourceRoot>
                    <protocArtifact>
                        com.google.protobuf:protoc:3.1.0:exe:${os.detected.classifier}
                    </protocArtifact>
                    <pluginId>grpc-java</pluginId>
                    <pluginArtifact>
                        io.grpc:protoc-gen-grpc-java:${grpc-plugin.version}:exe:${os.detected.classifier}
                    </pluginArtifact>
                </configuration>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>compile-custom</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

    <properties>
        <flink.version>1.10.1</flink.version>
        <lombok.version>1.18.20</lombok.version>
        <scala.binary.version>2.11</scala.binary.version>
        <mvel2.version>2.4.12.Final</mvel2.version>
        <curator.version>2.12.0</curator.version>
        <kafka.version>2.1.1</kafka.version>
        <groovy.version>2.5.7</groovy.version>
        <gson.version>2.2.4</gson.version>
        <guava.version>30.1.1-jre</guava.version>
        <guava.retrying.version>2.0.0</guava.retrying.version>
        <logback-classic.version>1.2.3</logback-classic.version>
        <slf4j-log4j12.version>1.8.0-beta2</slf4j-log4j12.version>

        <grpc-plugin.version>1.23.1</grpc-plugin.version>
        <protobuf-maven-plugin.version>0.6.1</protobuf-maven-plugin.version>
        <protobuf-java.version>3.11.0</protobuf-java.version>

        <joda-time.version>2.5</joda-time.version>

        <os-maven-plugin.version>1.6.2</os-maven-plugin.version>
    </properties>

<!--    <dependencies>-->

<!--        <dependency>-->
<!--            <groupId>org.apache.httpcomponents</groupId>-->
<!--            <artifactId>httpclient</artifactId>-->
<!--            <version>4.5.10</version>-->
<!--            <scope>compile</scope>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>joda-time</groupId>-->
<!--            <artifactId>joda-time</artifactId>-->
<!--            &lt;!&ndash; managed version &ndash;&gt;-->
<!--            <scope>provided</scope>-->
<!--            &lt;!&ndash; Avro records can contain JodaTime fields when using logical fields.-->
<!--                In order to handle them, we need to add an optional dependency.-->
<!--                Users with those Avro records need to add this dependency themselves. &ndash;&gt;-->
<!--            <optional>true</optional>-->
<!--            <version>${joda-time.version}</version>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>com.google.protobuf</groupId>-->
<!--            <artifactId>protobuf-java</artifactId>-->
<!--            <version>${protobuf-java.version}</version>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.github.rholder/guava-retrying &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>com.github.rholder</groupId>-->
<!--            <artifactId>guava-retrying</artifactId>-->
<!--            <version>${guava.retrying.version}</version>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>com.google.guava</groupId>-->
<!--            <artifactId>guava</artifactId>-->
<!--            <version>${guava.version}</version>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>org.projectlombok</groupId>-->
<!--            <artifactId>lombok</artifactId>-->
<!--            <version>${lombok.version}</version>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-java</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-streaming-java_2.11</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-clients_2.11</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; https://mvnrepository.com/artifact/org.mvel/mvel2 &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>org.mvel</groupId>-->
<!--            <artifactId>mvel2</artifactId>-->
<!--            <version>${mvel2.version}</version>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; https://mvnrepository.com/artifact/redis.clients/jedis &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>redis.clients</groupId>-->
<!--            <artifactId>jedis</artifactId>-->
<!--            <version>3.6.3</version>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; 对zookeeper的底层api的一些封装 &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>org.apache.curator</groupId>-->
<!--            <artifactId>curator-framework</artifactId>-->
<!--            <version>${curator.version}</version>-->
<!--        </dependency>-->
<!--        &lt;!&ndash; 封装了一些高级特性,如:Cache事件监听、选举、分布式锁、分布式Barrier &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>org.apache.curator</groupId>-->
<!--            <artifactId>curator-recipes</artifactId>-->
<!--            <version>${curator.version}</version>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>org.apache.kafka</groupId>-->
<!--            <artifactId>kafka-clients</artifactId>-->
<!--            <version>${kafka.version}</version>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-ant</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-cli-commons</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-cli-picocli</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-console</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-datetime</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-docgenerator</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-groovydoc</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-groovysh</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-jmx</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-json</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-jsr223</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-macro</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-nio</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-servlet</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-sql</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-swing</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-templates</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-test</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-test-junit5</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-testng</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.codehaus.groovy</groupId>-->
<!--            <artifactId>groovy-xml</artifactId>-->
<!--            <version>${groovy.version}</version>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-table-planner_2.11</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>com.google.code.gson</groupId>-->
<!--            <artifactId>gson</artifactId>-->
<!--            <version>${gson.version}</version>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-table-common</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--            <scope>compile</scope>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-table-api-java</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--            <scope>compile</scope>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-table-api-java-bridge_2.11</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--            <scope>compile</scope>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-table-planner-blink_2.11</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--            <scope>compile</scope>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; https://mvnrepository.com/artifact/org.apache.flink/flink-connector-jdbc &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-json</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; https://mvnrepository.com/artifact/org.apache.bahir/flink-connector-redis &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>org.apache.bahir</groupId>-->
<!--            <artifactId>flink-connector-redis_2.10</artifactId>-->
<!--            <version>1.0</version>-->
<!--        </dependency>-->


<!--        &lt;!&ndash; https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-connector-kafka_2.12</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--        </dependency>-->


<!--        <dependency>-->
<!--            <groupId>ch.qos.logback</groupId>-->
<!--            <artifactId>logback-classic</artifactId>-->
<!--            <scope>compile</scope>-->
<!--            <version>${logback-classic.version}</version>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; https://mvnrepository.com/artifact/org.slf4j/slf4j-log4j12 &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>org.slf4j</groupId>-->
<!--            <artifactId>slf4j-log4j12</artifactId>-->
<!--            <version>${slf4j-log4j12.version}</version>-->
<!--        </dependency>-->

<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-runtime-web_2.11</artifactId>-->
<!--            <version>${flink.version}</version>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>com.fasterxml.jackson.core</groupId>-->
<!--            <artifactId>jackson-databind</artifactId>-->
<!--            <version>2.12.4</version>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.fasterxml.jackson.module/jackson-module-kotlin &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>com.fasterxml.jackson.module</groupId>-->
<!--            <artifactId>jackson-module-kotlin</artifactId>-->
<!--            <version>2.12.4</version>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.fasterxml.jackson.module/jackson-module-parameter-names &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>com.fasterxml.jackson.module</groupId>-->
<!--            <artifactId>jackson-module-parameter-names</artifactId>-->
<!--            <version>2.12.4</version>-->
<!--        </dependency>-->

<!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.fasterxml.jackson.datatype/jackson-datatype-guava &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>com.fasterxml.jackson.datatype</groupId>-->
<!--            <artifactId>jackson-datatype-guava</artifactId>-->
<!--            <version>2.12.4</version>-->
<!--        </dependency>-->


<!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.hubspot.jackson/jackson-datatype-protobuf &ndash;&gt;-->
<!--        <dependency>-->
<!--            <groupId>com.hubspot.jackson</groupId>-->
<!--            <artifactId>jackson-datatype-protobuf</artifactId>-->
<!--            <version>0.9.12</version>-->
<!--        </dependency>-->


<!--    </dependencies>-->


</project>

================================================
FILE: flink-examples-1.10/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_01_outer_join/WindowJoinFunction$46.java
================================================
package flink.examples.sql._07.query._06_joins._02_interval_joins._01_outer_join;


/**
 * Captured code generated by the Flink 1.10 Blink planner: the FlatJoinFunction
 * that evaluates the equi-join condition (the first int field of each row) and,
 * when both sides are non-null and equal, emits the joined row.
 */
public class WindowJoinFunction$46
        extends org.apache.flink.api.common.functions.RichFlatJoinFunction {

    final org.apache.flink.table.dataformat.JoinedRow joinedRow = new org.apache.flink.table.dataformat.JoinedRow();

    public WindowJoinFunction$46(Object[] references) throws Exception {

    }


    @Override
    public void open(org.apache.flink.configuration.Configuration parameters) throws Exception {

    }

    @Override
    public void join(Object _in1, Object _in2, org.apache.flink.util.Collector c) throws Exception {
        org.apache.flink.table.dataformat.BaseRow in1 = (org.apache.flink.table.dataformat.BaseRow) _in1;
        org.apache.flink.table.dataformat.BaseRow in2 = (org.apache.flink.table.dataformat.BaseRow) _in2;

        int result$40;
        boolean isNull$40;
        int field$41;
        boolean isNull$41;
        int result$42;
        boolean isNull$42;
        int field$43;
        boolean isNull$43;
        boolean isNull$44;
        boolean result$45;
        result$40 = -1;
        isNull$40 = true;
        if (in1 != null) {
            isNull$41 = in1.isNullAt(0);
            field$41 = -1;
            if (!isNull$41) {
                field$41 = in1.getInt(0);
            }
            result$40 = field$41;
            isNull$40 = isNull$41;
        }
        result$42 = -1;
        isNull$42 = true;
        if (in2 != null) {
            isNull$43 = in2.isNullAt(0);
            field$43 = -1;
            if (!isNull$43) {
                field$43 = in2.getInt(0);
            }
            result$42 = field$43;
            isNull$42 = isNull$43;
        }


        isNull$44 = isNull$40 || isNull$42;
        result$45 = false;
        if (!isNull$44) {

            result$45 = result$40 == result$42;

        }

        if (result$45) {

            joinedRow.replace(in1, in2);
            c.collect(joinedRow);
        }
    }

    @Override
    public void close() throws Exception {

    }
}

================================================
FILE: flink-examples-1.10/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_01_outer_join/_06_Interval_Outer_Joins_EventTime_Test.java
================================================
package flink.examples.sql._07.query._06_joins._02_interval_joins._01_outer_join;

import java.util.concurrent.TimeUnit;

import org.apache.flink.api.common.restartstrategy.RestartStrategies;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.ResultTypeQueryable;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.CheckpointConfig;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;


/**
 * Flink 1.10 SQL example: an event-time interval FULL OUTER JOIN between two
 * user-defined sources, matched on user_id within a 30-second event-time interval.
 */
public class _06_Interval_Outer_Joins_EventTime_Test {

    public static void main(String[] args) throws Exception {

        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());

        ParameterTool parameterTool = ParameterTool.fromArgs(args);

        env.setRestartStrategy(RestartStrategies.failureRateRestart(6, org.apache.flink.api.common.time.Time
                .of(10L, TimeUnit.MINUTES), org.apache.flink.api.common.time.Time.of(5L, TimeUnit.SECONDS)));
        env.getConfig().setGlobalJobParameters(parameterTool);
        env.setParallelism(10);

        // checkpoint settings
        env.getCheckpointConfig().setFailOnCheckpointingErrors(false);
        env.enableCheckpointing(30 * 1000L, CheckpointingMode.EXACTLY_ONCE);
        env.getCheckpointConfig().setMinPauseBetweenCheckpoints(3L);
        env.getCheckpointConfig().enableExternalizedCheckpoints(CheckpointConfig.ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION);

        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);


        EnvironmentSettings settings = EnvironmentSettings
                .newInstance()
                .useBlinkPlanner()
                .inStreamingMode().build();

        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env, settings);

        tEnv.getConfig().getConfiguration().setString("pipeline.name", "1.10.1 Interval Join event-time example");

        DataStream<Row> sourceTable = env.addSource(new UserDefinedSource1())
                .assignTimestampsAndWatermarks(new BoundedOutOfOrdernessTimestampExtractor<Row>(Time.minutes(0L)) {
                    @Override
                    public long extractTimestamp(Row row) {
                        return (long) row.getField(2);
                    }
                });

        tEnv.createTemporaryView("source_table", sourceTable, "user_id, name, timestamp, rowtime.rowtime");

        DataStream<Row> dimTable = env.addSource(new UserDefinedSource2())
                .assignTimestampsAndWatermarks(new BoundedOutOfOrdernessTimestampExtractor<Row>(Time.minutes(0L)) {
                    @Override
                    public long extractTimestamp(Row row) {
                        return (long) row.getField(2);
                    }
                });

        tEnv.createTemporaryView("dim_table", dimTable, "user_id, platform, timestamp, rowtime.rowtime");

        String sql = "SELECT\n"
                + "    s.user_id as user_id,\n"
                + "    s.name as name,\n"
                + "    d.platform as platform\n"
                + "FROM source_table as s\n"
                + "FULL JOIN dim_table as d ON s.user_id = d.user_id\n"
                + "AND s.rowtime BETWEEN d.rowtime AND d.rowtime + INTERVAL '30' SECOND";

        /**
         * Join operator: {@link org.apache.flink.table.runtime.operators.join.KeyedCoProcessOperatorWithWatermarkDelay}
         *                -> {@link org.apache.flink.table.runtime.operators.join.RowTimeBoundedStreamJoin}
         */

        Table result = tEnv.sqlQuery(sql);

        tEnv.toAppendStream(result, Row.class)
                .print();

        env.execute("1.10.1 Interval Full Join event-time example");

    }

    private static class UserDefinedSource1 implements SourceFunction<Row>, ResultTypeQueryable<Row> {

        private volatile boolean isCancel;

        @Override
        public void run(SourceContext<Row> sourceContext) throws Exception {

            int i = 0;

            while (!this.isCancel) {

                Row row = new Row(3);

                row.setField(0, i);

                row.setField(1, "name");

                long timestamp = System.currentTimeMillis();

                row.setField(2, timestamp);

                sourceContext.collect(row);

                Thread.sleep(1000L);
                i++;
            }

        }

        @Override
        public void cancel() {
            this.isCancel = true;
        }

        @Override
        public TypeInformation<Row> getProducedType() {
            return new RowTypeInfo(BasicTypeInfo.INT_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.LONG_TYPE_INFO);
        }
    }

    private static class UserDefinedSource2 implements SourceFunction<Row>, ResultTypeQueryable<Row> {

        private volatile boolean isCancel;

        @Override
        public void run(SourceContext<Row> sourceContext) throws Exception {

            int i = 10;

            while (!this.isCancel) {

                Row row = new Row(3);

                row.setField(0, i);

                row.setField(1, "platform");

                long timestamp = System.currentTimeMillis();

                row.setField(2, timestamp);

                sourceContext.collect(row);

                Thread.sleep(1000L);
                i++;
            }

        }

        @Override
        public void cancel() {
            this.isCancel = true;
        }

        @Override
        public TypeInformation<Row> getProducedType() {
            return new RowTypeInfo(BasicTypeInfo.INT_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.LONG_TYPE_INFO);
        }
    }

}


================================================
FILE: flink-examples-1.12/.gitignore
================================================
HELP.md
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**
#**/src/test/**
.idea/
*.iml
*.DS_Store

### IntelliJ IDEA ###
.idea
*.iws
*.ipr



================================================
FILE: flink-examples-1.12/pom.xml
================================================
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>flink-study</artifactId>
        <groupId>com.github.antigeneral</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.github.antigeneral</groupId>
    <artifactId>flink-examples-1.12</artifactId>

    <!--    <build>-->

    <!--        <extensions>-->
    <!--            <extension>-->
    <!--                <groupId>kr.motd.maven</groupId>-->
    <!--                <artifactId>os-maven-plugin</artifactId>-->
    <!--                <version>${os-maven-plugin.version}</version>-->
    <!--            </extension>-->
    <!--        </extensions>-->

    <!--        <plugins>-->



    <!--            <plugin>-->
    <!--                <groupId>org.apache.maven.plugins</groupId>-->
    <!--                <artifactId>maven-compiler-plugin</artifactId>-->
    <!--                <configuration>-->
    <!--                    <source>8</source>-->
    <!--                    <target>8</target>-->
    <!--                </configuration>-->
    <!--            </plugin>-->

    <!--            <plugin>-->
    <!--                <groupId>org.xolstice.maven.plugins</groupId>-->
    <!--                <artifactId>protobuf-maven-plugin</artifactId>-->
    <!--                <version>${protobuf-maven-plugin.version}</version>-->
    <!--                <configuration>-->
    <!--                    <protoSourceRoot>-->
    <!--                        src/test/proto-->
    <!--                    </protoSourceRoot>-->
    <!--                    <protocArtifact>-->
    <!--                        com.google.protobuf:protoc:3.1.0:exe:${os.detected.classifier}-->
    <!--                    </protocArtifact>-->
    <!--                    <pluginId>grpc-java</pluginId>-->
    <!--                    <pluginArtifact>-->
    <!--                        io.grpc:protoc-gen-grpc-java:${grpc-plugin.version}:exe:${os.detected.classifier}-->
    <!--                    </pluginArtifact>-->
    <!--                </configuration>-->
    <!--                <executions>-->
    <!--                    <execution>-->
    <!--                        <goals>-->
    <!--                            <goal>compile</goal>-->
    <!--                            <goal>compile-custom</goal>-->
    <!--                        </goals>-->
    <!--                    </execution>-->
    <!--                </executions>-->
    <!--            </plugin>-->
    <!--        </plugins>-->
    <!--    </build>-->

    <!--    <properties>-->
    <!--        <flink.version>1.12.1</flink.version>-->
    <!--        <lombok.version>1.18.20</lombok.version>-->
    <!--        <scala.binary.version>2.11</scala.binary.version>-->
    <!--        <mvel2.version>2.4.12.Final</mvel2.version>-->
    <!--        <curator.version>2.12.0</curator.version>-->
    <!--        <kafka.version>2.1.1</kafka.version>-->
    <!--        <groovy.version>2.5.7</groovy.version>-->
    <!--        <gson.version>2.2.4</gson.version>-->
    <!--        <guava.version>30.1.1-jre</guava.version>-->
    <!--        <guava.retrying.version>2.0.0</guava.retrying.version>-->
    <!--        <logback-classic.version>1.2.3</logback-classic.version>-->
    <!--        <slf4j-log4j12.version>1.8.0-beta2</slf4j-log4j12.version>-->

    <!--        <grpc-plugin.version>1.23.1</grpc-plugin.version>-->
    <!--        <protobuf-maven-plugin.version>0.6.1</protobuf-maven-plugin.version>-->
    <!--        <protobuf-java.version>3.11.0</protobuf-java.version>-->

    <!--        <joda-time.version>2.5</joda-time.version>-->

    <!--        <os-maven-plugin.version>1.6.2</os-maven-plugin.version>-->
    <!--    </properties>-->

    <!--    <dependencies>-->

    <!--        <dependency>-->
    <!--            <groupId>org.apache.httpcomponents</groupId>-->
    <!--            <artifactId>httpclient</artifactId>-->
    <!--            <version>4.5.10</version>-->
    <!--            <scope>compile</scope>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>joda-time</groupId>-->
    <!--            <artifactId>joda-time</artifactId>-->
    <!--            &lt;!&ndash; managed version &ndash;&gt;-->
    <!--            <scope>provided</scope>-->
    <!--            &lt;!&ndash; Avro records can contain JodaTime fields when using logical fields.-->
    <!--                In order to handle them, we need to add an optional dependency.-->
    <!--                Users with those Avro records need to add this dependency themselves. &ndash;&gt;-->
    <!--            <optional>true</optional>-->
    <!--            <version>${joda-time.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>com.google.protobuf</groupId>-->
    <!--            <artifactId>protobuf-java</artifactId>-->
    <!--            <version>${protobuf-java.version}</version>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.github.rholder/guava-retrying &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>com.github.rholder</groupId>-->
    <!--            <artifactId>guava-retrying</artifactId>-->
    <!--            <version>${guava.retrying.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>com.google.guava</groupId>-->
    <!--            <artifactId>guava</artifactId>-->
    <!--            <version>${guava.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>org.projectlombok</groupId>-->
    <!--            <artifactId>lombok</artifactId>-->
    <!--            <version>${lombok.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-java</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-streaming-java_2.11</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-clients_2.11</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/org.mvel/mvel2 &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>org.mvel</groupId>-->
    <!--            <artifactId>mvel2</artifactId>-->
    <!--            <version>${mvel2.version}</version>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/redis.clients/jedis &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>redis.clients</groupId>-->
    <!--            <artifactId>jedis</artifactId>-->
    <!--            <version>3.6.3</version>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; A thin wrapper around ZooKeeper's low-level API &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>org.apache.curator</groupId>-->
    <!--            <artifactId>curator-framework</artifactId>-->
    <!--            <version>${curator.version}</version>-->
    <!--        </dependency>-->
    <!--        &lt;!&ndash; Adds high-level recipes such as cache event listeners, leader election, distributed locks, and distributed barriers &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>org.apache.curator</groupId>-->
    <!--            <artifactId>curator-recipes</artifactId>-->
    <!--            <version>${curator.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>org.apache.kafka</groupId>-->
    <!--            <artifactId>kafka-clients</artifactId>-->
    <!--            <version>${kafka.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-ant</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-cli-commons</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-cli-picocli</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-console</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-datetime</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-docgenerator</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-groovydoc</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-groovysh</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-jmx</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-json</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-jsr223</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-macro</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-nio</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-servlet</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-sql</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-swing</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-templates</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-test</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-test-junit5</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-testng</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.codehaus.groovy</groupId>-->
    <!--            <artifactId>groovy-xml</artifactId>-->
    <!--            <version>${groovy.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-table-planner_2.11</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>com.google.code.gson</groupId>-->
    <!--            <artifactId>gson</artifactId>-->
    <!--            <version>${gson.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-table-common</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--            <scope>compile</scope>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-table-api-java</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--            <scope>compile</scope>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-table-api-java-bridge_2.11</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--            <scope>compile</scope>-->
    <!--        </dependency>-->
    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-table-planner-blink_2.11</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--            <scope>compile</scope>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/org.apache.flink/flink-connector-jdbc &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-connector-jdbc_2.11</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-connector-hbase-2.2_2.11</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-json</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/org.apache.bahir/flink-connector-redis &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>org.apache.bahir</groupId>-->
    <!--            <artifactId>flink-connector-redis_2.10</artifactId>-->
    <!--            <version>1.0</version>-->
    <!--        </dependency>-->



    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-connector-kafka_2.12</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--        </dependency>-->


    <!--        <dependency>-->
    <!--            <groupId>ch.qos.logback</groupId>-->
    <!--            <artifactId>logback-classic</artifactId>-->
    <!--            <scope>compile</scope>-->
    <!--            <version>${logback-classic.version}</version>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/org.slf4j/slf4j-log4j12 &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>org.slf4j</groupId>-->
    <!--            <artifactId>slf4j-log4j12</artifactId>-->
    <!--            <version>${slf4j-log4j12.version}</version>-->
    <!--        </dependency>-->

    <!--        <dependency>-->
    <!--            <groupId>org.apache.flink</groupId>-->
    <!--            <artifactId>flink-runtime-web_2.11</artifactId>-->
    <!--            <version>${flink.version}</version>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>com.fasterxml.jackson.core</groupId>-->
    <!--            <artifactId>jackson-databind</artifactId>-->
    <!--            <version>2.12.4</version>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.fasterxml.jackson.module/jackson-module-kotlin &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>com.fasterxml.jackson.module</groupId>-->
    <!--            <artifactId>jackson-module-kotlin</artifactId>-->
    <!--            <version>2.12.4</version>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.fasterxml.jackson.module/jackson-module-parameter-names &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>com.fasterxml.jackson.module</groupId>-->
    <!--            <artifactId>jackson-module-parameter-names</artifactId>-->
    <!--            <version>2.12.4</version>-->
    <!--        </dependency>-->

    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.fasterxml.jackson.datatype/jackson-datatype-guava &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>com.fasterxml.jackson.datatype</groupId>-->
    <!--            <artifactId>jackson-datatype-guava</artifactId>-->
    <!--            <version>2.12.4</version>-->
    <!--        </dependency>-->


    <!--        &lt;!&ndash; https://mvnrepository.com/artifact/com.hubspot.jackson/jackson-datatype-protobuf &ndash;&gt;-->
    <!--        <dependency>-->
    <!--            <groupId>com.hubspot.jackson</groupId>-->
    <!--            <artifactId>jackson-datatype-protobuf</artifactId>-->
    <!--            <version>0.9.12</version>-->
    <!--        </dependency>-->



    <!--    </dependencies>-->


</project>

================================================
FILE: flink-examples-1.12/src/main/java/flink/examples/datastream/_07/query/_04_window/_04_TumbleWindowTest.java
================================================
package flink.examples.datastream._07.query._04_window;

import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple4;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor;
import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class _04_TumbleWindowTest {

    public static void main(String[] args) throws Exception {

        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());

        env.setParallelism(1);

        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);

        env.addSource(new UserDefinedSource())
                .assignTimestampsAndWatermarks(new BoundedOutOfOrdernessTimestampExtractor<Tuple4<String, String, Integer, Long>>(Time.seconds(0)) {
                    @Override
                    public long extractTimestamp(Tuple4<String, String, Integer, Long> element) {
                        return element.f3;
                    }
                })
                .keyBy(new KeySelector<Tuple4<String, String, Integer, Long>, String>() {
                    @Override
                    public String getKey(Tuple4<String, String, Integer, Long> row) throws Exception {
                        return row.f0;
                    }
                })
                .window(TumblingEventTimeWindows.of(Time.seconds(10)))
                .sum(2)
                .print();

        env.execute("1.12.1 DataStream TUMBLE WINDOW Example");
    }

    private static class UserDefinedSource implements SourceFunction<Tuple4<String, String, Integer, Long>> {

        private volatile boolean isCancel;

        @Override
        public void run(SourceContext<Tuple4<String, String, Integer, Long>> sourceContext) throws Exception {

            while (!this.isCancel) {

                sourceContext.collect(Tuple4.of("a", "b", 1, System.currentTimeMillis()));

                Thread.sleep(10L);
            }

        }

        @Override
        public void cancel() {
            this.isCancel = true;
        }
    }
}

================================================
FILE: flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest.java
================================================
package flink.examples.sql._07.query._04_window_agg;

import java.util.concurrent.TimeUnit;

import org.apache.flink.api.common.restartstrategy.RestartStrategies;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.environment.CheckpointConfig;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;


public class _04_TumbleWindowTest {

    public static void main(String[] args) throws Exception {

        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());

        ParameterTool parameterTool = ParameterTool.fromArgs(args);

        env.setRestartStrategy(RestartStrategies.failureRateRestart(6, org.apache.flink.api.common.time.Time
                .of(10L, TimeUnit.MINUTES), org.apache.flink.api.common.time.Time.of(5L, TimeUnit.SECONDS)));
        env.getConfig().setGlobalJobParameters(parameterTool);
        env.setParallelism(1);

        // checkpoint settings
        env.getCheckpointConfig().setFailOnCheckpointingErrors(false);
        env.enableCheckpointing(30 * 1000L, CheckpointingMode.EXACTLY_ONCE);
        env.getCheckpointConfig().setMinPauseBetweenCheckpoints(3L);
        env.getCheckpointConfig().enableExternalizedCheckpoints(CheckpointConfig.ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION);

        EnvironmentSettings settings = EnvironmentSettings
                .newInstance()
                .useBlinkPlanner()
                .inStreamingMode().build();

        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env, settings);

        String sourceSql = "CREATE TABLE source_table (\n"
                + "    dim STRING,\n"
                + "    user_id BIGINT,\n"
                + "    price BIGINT,\n"
                + "    row_time AS cast(CURRENT_TIMESTAMP as timestamp(3)),\n"
                + "    WATERMARK FOR row_time AS row_time - INTERVAL '5' SECOND\n"
                + ") WITH (\n"
                + "  'connector' = 'datagen',\n"
                + "  'rows-per-second' = '10',\n"
                + "  'fields.dim.length' = '1',\n"
                + "  'fields.user_id.min' = '1',\n"
                + "  'fields.user_id.max' = '100000',\n"
                + "  'fields.price.min' = '1',\n"
                + "  'fields.price.max' = '100000'\n"
                + ")";

        String sinkSql = "CREATE TABLE sink_table (\n"
                + "    dim STRING,\n"
                + "    pv BIGINT,\n"
                + "    sum_price BIGINT,\n"
                + "    max_price BIGINT,\n"
                + "    min_price BIGINT,\n"
                + "    uv BIGINT,\n"
                + "    window_start bigint\n"
                + ") WITH (\n"
                + "  'connector' = 'print'\n"
                + ")";

        String selectWhereSql = "insert into sink_table\n"
                + "select dim,\n"
                + "       sum(bucket_pv) as pv,\n"
                + "       sum(bucket_sum_price) as sum_price,\n"
                + "       max(bucket_max_price) as max_price,\n"
                + "       min(bucket_min_price) as min_price,\n"
                + "       sum(bucket_uv) as uv,\n"
                + "       max(window_start) as window_start\n"
                + "from (\n"
                + "     select dim,\n"
                + "            count(*) as bucket_pv,\n"
                + "            sum(price) as bucket_sum_price,\n"
                + "            max(price) as bucket_max_price,\n"
                + "            min(price) as bucket_min_price,\n"
                + "            count(distinct user_id) as bucket_uv,\n"
                + "            cast(tumble_start(row_time, interval '1' minute) as bigint) * 1000 as window_start\n"
                + "     from source_table\n"
                + "     group by\n"
                + "            mod(user_id, 1024),\n"
                + "            dim,\n"
                + "            tumble(row_time, interval '1' minute)\n"
                + ")\n"
                + "group by dim,\n"
                + "         window_start";

        tEnv.getConfig().getConfiguration().setString("pipeline.name", "1.12.1 TUMBLE WINDOW Example");

        tEnv.executeSql(sourceSql);
        tEnv.executeSql(sinkSql);
        tEnv.executeSql(selectWhereSql);
    }

}


================================================
FILE: flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest_GroupingWindowAggsHandler$59.java
================================================
package flink.examples.sql._07.query._04_window_agg;


public final class _04_TumbleWindowTest_GroupingWindowAggsHandler$59 implements
        org.apache.flink.table.runtime.generated.NamespaceAggsHandleFunction<org.apache.flink.table.runtime.operators.window.TimeWindow> {

    long agg0_count1;
    boolean agg0_count1IsNull;
    long agg1_sum;
    boolean agg1_sumIsNull;
    long agg2_max;
    boolean agg2_maxIsNull;
    long agg3_min;
    boolean agg3_minIsNull;
    long agg4_count;
    boolean agg4_countIsNull;
    private transient org.apache.flink.table.runtime.typeutils.ExternalSerializer externalSerializer$22;
    private transient org.apache.flink.table.runtime.typeutils.ExternalSerializer externalSerializer$23;
    private org.apache.flink.table.runtime.dataview.StateMapView distinctAcc_0_dataview;
    private org.apache.flink.table.data.binary.BinaryRawValueData distinctAcc_0_dataview_raw_value;
    private org.apache.flink.table.api.dataview.MapView distinct_view_0;
    org.apache.flink.table.data.GenericRowData acc$25 = new org.apache.flink.table.data.GenericRowData(6);
    org.apache.flink.table.data.GenericRowData acc$27 = new org.apache.flink.table.data.GenericRowData(6);
    org.apache.flink.table.data.GenericRowData aggValue$58 = new org.apache.flink.table.data.GenericRowData(9);

    private org.apache.flink.table.runtime.dataview.StateDataViewStore store;

    private org.apache.flink.table.runtime.operators.window.TimeWindow namespace;

    public _04_TumbleWindowTest_GroupingWindowAggsHandler$59(Object[] references) throws Exception {
        externalSerializer$22 = (((org.apache.flink.table.runtime.typeutils.ExternalSerializer) references[0]));
        externalSerializer$23 = (((org.apache.flink.table.runtime.typeutils.ExternalSerializer) references[1]));
    }

    private org.apache.flink.api.common.functions.RuntimeContext getRuntimeContext() {
        return store.getRuntimeContext();
    }

    @Override
    public void open(org.apache.flink.table.runtime.dataview.StateDataViewStore store) throws Exception {
        this.store = store;

        distinctAcc_0_dataview = (org.apache.flink.table.runtime.dataview.StateMapView) store
                .getStateMapView("distinctAcc_0", true, externalSerializer$22, externalSerializer$23);
        distinctAcc_0_dataview_raw_value =
                org.apache.flink.table.data.binary.BinaryRawValueData.fromObject(distinctAcc_0_dataview);

        distinct_view_0 = distinctAcc_0_dataview;
    }

    @Override
    public void accumulate(org.apache.flink.table.data.RowData accInput) throws Exception {

        boolean isNull$34;
        long result$35;
        long field$36;
        boolean isNull$36;
        boolean isNull$37;
        long result$38;
        boolean isNull$41;
        boolean result$42;
        boolean isNull$46;
        boolean result$47;
        long field$51;
        boolean isNull$51;
        boolean isNull$53;
        long result$54;
        isNull$51 = accInput.isNullAt(4);
        field$51 = -1L;
        if (!isNull$51) {
            field$51 = accInput.getLong(4);
        }
        isNull$36 = accInput.isNullAt(3);
        field$36 = -1L;
        if (!isNull$36) {
            field$36 = accInput.getLong(3);
        }


        isNull$34 = agg0_count1IsNull || false;
        result$35 = -1L;
        if (!isNull$34) {

            result$35 = (long) (agg0_count1 + ((long) 1L));

        }

        agg0_count1 = result$35;
        agg0_count1IsNull = isNull$34;


        long result$40 = -1L;
        boolean isNull$40;
        if (isNull$36) {

            isNull$40 = agg1_sumIsNull;
            if (!isNull$40) {
                result$40 = agg1_sum;
            }
        } else {
            long result$39 = -1L;
            boolean isNull$39;
            if (agg1_sumIsNull) {

                isNull$39 = isNull$36;
                if (!isNull$39) {
                    result$39 = field$36;
                }
            } else {


                isNull$37 = agg1_sumIsNull || isNull$36;
                result$38 = -1L;
                if (!isNull$37) {

                    result$38 = (long) (agg1_sum + field$36);

                }

                isNull$39 = isNull$37;
                if (!isNull$39) {
                    result$39 = result$38;
                }
            }
            isNull$40 = isNull$39;
            if (!isNull$40) {
                result$40 = result$39;
            }
        }
        agg1_sum = result$40;
        agg1_sumIsNull = isNull$40;


        long result$45 = -1L;
        boolean isNull$45;
        if (isNull$36) {

            isNull$45 = agg2_maxIsNull;
            if (!isNull$45) {
                result$45 = agg2_max;
            }
        } else {
            long result$44 = -1L;
            boolean isNull$44;
            if (agg2_maxIsNull) {

                isNull$44 = isNull$36;
                if (!isNull$44) {
                    result$44 = field$36;
                }
            } else {
                isNull$41 = isNull$36 || agg2_maxIsNull;
                result$42 = false;
                if (!isNull$41) {

                    result$42 = field$36 > agg2_max;

                }

                long result$43 = -1L;
                boolean isNull$43;
                if (result$42) {

                    isNull$43 = isNull$36;
                    if (!isNull$43) {
                        result$43 = field$36;
                    }
                } else {

                    isNull$43 = agg2_maxIsNull;
                    if (!isNull$43) {
                        result$43 = agg2_max;
                    }
                }
                isNull$44 = isNull$43;
                if (!isNull$44) {
                    result$44 = result$43;
                }
            }
            isNull$45 = isNull$44;
            if (!isNull$45) {
                result$45 = result$44;
            }
        }
        agg2_max = result$45;
        agg2_maxIsNull = isNull$45;


        long result$50 = -1L;
        boolean isNull$50;
        if (isNull$36) {

            isNull$50 = agg3_minIsNull;
            if (!isNull$50) {
                result$50 = agg3_min;
            }
        } else {
            long result$49 = -1L;
            boolean isNull$49;
            if (agg3_minIsNull) {

                isNull$49 = isNull$36;
                if (!isNull$49) {
                    result$49 = field$36;
                }
            } else {
                isNull$46 = isNull$36 || agg3_minIsNull;
                result$47 = false;
                if (!isNull$46) {

                    result$47 = field$36 < agg3_min;

                }

                long result$48 = -1L;
                boolean isNull$48;
                if (result$47) {

                    isNull$48 = isNull$36;
                    if (!isNull$48) {
                        result$48 = field$36;
                    }
                } else {

                    isNull$48 = agg3_minIsNull;
                    if (!isNull$48) {
                        result$48 = agg3_min;
                    }
                }
                isNull$49 = isNull$48;
                if (!isNull$49) {
                    result$49 = result$48;
                }
            }
            isNull$50 = isNull$49;
            if (!isNull$50) {
                result$50 = result$49;
            }
        }
        agg3_min = result$50;
        agg3_minIsNull = isNull$50;


        Long distinctKey$52 = (Long) field$51;
        if (isNull$51) {
            distinctKey$52 = null;
        }

        Long value$56 = (Long) distinct_view_0.get(distinctKey$52);
        if (value$56 == null) {
            value$56 = 0L;
        }

        boolean is_distinct_value_changed_0 = false;

        long existed$57 = ((long) value$56) & (1L << 0);
        if (existed$57 == 0) {  // not existed
            value$56 = ((long) value$56) | (1L << 0);
            is_distinct_value_changed_0 = true;

            long result$55 = -1L;
            boolean isNull$55;
            if (isNull$51) {

                isNull$55 = agg4_countIsNull;
                if (!isNull$55) {
                    result$55 = agg4_count;
                }
            } else {


                isNull$53 = agg4_countIsNull || false;
                result$54 = -1L;
                if (!isNull$53) {

                    result$54 = (long) (agg4_count + ((long) 1L));

                }

                isNull$55 = isNull$53;
                if (!isNull$55) {
                    result$55 = result$54;
                }
            }
            agg4_count = result$55;
            agg4_countIsNull = isNull$55;

        }

        if (is_distinct_value_changed_0) {
            distinct_view_0.put(distinctKey$52, value$56);
        }


    }

    @Override
    public void retract(org.apache.flink.table.data.RowData retractInput) throws Exception {

        throw new RuntimeException(
                "This function does not require the retract method, but the retract method was called.");

    }

    @Override
    public void merge(org.apache.flink.table.runtime.operators.window.TimeWindow ns,
            org.apache.flink.table.data.RowData otherAcc) throws Exception {
        namespace = (org.apache.flink.table.runtime.operators.window.TimeWindow) ns;

        throw new RuntimeException("This function does not require the merge method, but the merge method was called.");

    }

    @Override
    public void setAccumulators(org.apache.flink.table.runtime.operators.window.TimeWindow ns,
            org.apache.flink.table.data.RowData acc)
            throws Exception {
        namespace = (org.apache.flink.table.runtime.operators.window.TimeWindow) ns;

        long field$28;
        boolean isNull$28;
        long field$29;
        boolean isNull$29;
        long field$30;
        boolean isNull$30;
        long field$31;
        boolean isNull$31;
        long field$32;
        boolean isNull$32;
        org.apache.flink.table.data.binary.BinaryRawValueData field$33;
        boolean isNull$33;
        isNull$32 = acc.isNullAt(4);
        field$32 = -1L;
        if (!isNull$32) {
            field$32 = acc.getLong(4);
        }
        isNull$28 = acc.isNullAt(0);
        field$28 = -1L;
        if (!isNull$28) {
            field$28 = acc.getLong(0);
        }
        isNull$29 = acc.isNullAt(1);
        field$29 = -1L;
        if (!isNull$29) {
            field$29 = acc.getLong(1);
        }
        isNull$31 = acc.isNullAt(3);
        field$31 = -1L;
        if (!isNull$31) {
            field$31 = acc.getLong(3);
        }

        // when namespace is null, the dataview is used in heap, no key and namespace set
        if (namespace != null) {
            distinctAcc_0_dataview.setCurrentNamespace(namespace);
            distinct_view_0 = distinctAcc_0_dataview;
        } else {
            isNull$33 = acc.isNullAt(5);
            field$33 = null;
            if (!isNull$33) {
                field$33 = ((org.apache.flink.table.data.binary.BinaryRawValueData) acc.getRawValue(5));
            }
            distinct_view_0 = (org.apache.flink.table.api.dataview.MapView) field$33.getJavaObject();
        }

        isNull$30 = acc.isNullAt(2);
        field$30 = -1L;
        if (!isNull$30) {
            field$30 = acc.getLong(2);
        }

        agg0_count1 = field$28;
        agg0_count1IsNull = isNull$28;


        agg1_sum = field$29;
        agg1_sumIsNull = isNull$29;


        agg2_max = field$30;
        agg2_maxIsNull = isNull$30;


        agg3_min = field$31;
        agg3_minIsNull = isNull$31;


        agg4_count = field$32;
        agg4_countIsNull = isNull$32;


    }

    @Override
    public org.apache.flink.table.data.RowData getAccumulators() throws Exception {


        acc$27 = new org.apache.flink.table.data.GenericRowData(6);


        if (agg0_count1IsNull) {
            acc$27.setField(0, null);
        } else {
            acc$27.setField(0, agg0_count1);
        }


        if (agg1_sumIsNull) {
            acc$27.setField(1, null);
        } else {
            acc$27.setField(1, agg1_sum);
        }


        if (agg2_maxIsNull) {
            acc$27.setField(2, null);
        } else {
            acc$27.setField(2, agg2_max);
        }


        if (agg3_minIsNull) {
            acc$27.setField(3, null);
        } else {
            acc$27.setField(3, agg3_min);
        }


        if (agg4_countIsNull) {
            acc$27.setField(4, null);
        } else {
            acc$27.setField(4, agg4_count);
        }


        org.apache.flink.table.data.binary.BinaryRawValueData distinct_acc$26 =
                org.apache.flink.table.data.binary.BinaryRawValueData.fromObject(distinct_view_0);

        if (false) {
            acc$27.setField(5, null);
        } else {
            acc$27.setField(5, distinct_acc$26);
        }


        return acc$27;

    }

    @Override
    public org.apache.flink.table.data.RowData createAccumulators() throws Exception {


        acc$25 = new org.apache.flink.table.data.GenericRowData(6);


        if (false) {
            acc$25.setField(0, null);
        } else {
            acc$25.setField(0, ((long) 0L));
        }


        if (true) {
            acc$25.setField(1, null);
        } else {
            acc$25.setField(1, ((long) -1L));
        }


        if (true) {
            acc$25.setField(2, null);
        } else {
            acc$25.setField(2, ((long) -1L));
        }


        if (true) {
            acc$25.setField(3, null);
        } else {
            acc$25.setField(3, ((long) -1L));
        }


        if (false) {
            acc$25.setField(4, null);
        } else {
            acc$25.setField(4, ((long) 0L));
        }


        org.apache.flink.table.api.dataview.MapView mapview$24 = new org.apache.flink.table.api.dataview.MapView();
        org.apache.flink.table.data.binary.BinaryRawValueData distinct_acc$24 =
                org.apache.flink.table.data.binary.BinaryRawValueData.fromObject(mapview$24);

        if (false) {
            acc$25.setField(5, null);
        } else {
            acc$25.setField(5, distinct_acc$24);
        }


        return acc$25;

    }

    @Override
    public org.apache.flink.table.data.RowData getValue(org.apache.flink.table.runtime.operators.window.TimeWindow ns)
            throws Exception {
        namespace = (org.apache.flink.table.runtime.operators.window.TimeWindow) ns;


        aggValue$58 = new org.apache.flink.table.data.GenericRowData(9);


        if (agg0_count1IsNull) {
            aggValue$58.setField(0, null);
        } else {
            aggValue$58.setField(0, agg0_count1);
        }


        if (agg1_sumIsNull) {
            aggValue$58.setField(1, null);
        } else {
            aggValue$58.setField(1, agg1_sum);
        }


        if (agg2_maxIsNull) {
            aggValue$58.setField(2, null);
        } else {
            aggValue$58.setField(2, agg2_max);
        }


        if (agg3_minIsNull) {
            aggValue$58.setField(3, null);
        } else {
            aggValue$58.setField(3, agg3_min);
        }


        if (agg4_countIsNull) {
            aggValue$58.setField(4, null);
        } else {
            aggValue$58.setField(4, agg4_count);
        }


        if (false) {
            aggValue$58.setField(5, null);
        } else {
            aggValue$58.setField(5, org.apache.flink.table.data.TimestampData.fromEpochMillis(namespace.getStart()));
        }


        if (false) {
            aggValue$58.setField(6, null);
        } else {
            aggValue$58.setField(6, org.apache.flink.table.data.TimestampData.fromEpochMillis(namespace.getEnd()));
        }


        if (false) {
            aggValue$58.setField(7, null);
        } else {
            aggValue$58.setField(7, org.apache.flink.table.data.TimestampData.fromEpochMillis(namespace.getEnd() - 1));
        }


        if (true) {
            aggValue$58.setField(8, null);
        } else {
            aggValue$58.setField(8, org.apache.flink.table.data.TimestampData.fromEpochMillis(-1L));
        }


        return aggValue$58;

    }

    @Override
    public void cleanup(org.apache.flink.table.runtime.operators.window.TimeWindow ns) throws Exception {
        namespace = (org.apache.flink.table.runtime.operators.window.TimeWindow) ns;

        distinctAcc_0_dataview.setCurrentNamespace(namespace);
        distinctAcc_0_dataview.clear();


    }

    @Override
    public void close() throws Exception {

    }
}

================================================
FILE: flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest_KeyProjection$69.java
================================================
package flink.examples.sql._07.query._04_window_agg;


public final class _04_TumbleWindowTest_KeyProjection$69 implements
        org.apache.flink.table.runtime.generated.Projection<org.apache.flink.table.data.RowData,
                org.apache.flink.table.data.binary.BinaryRowData> {

    org.apache.flink.table.data.binary.BinaryRowData out = new org.apache.flink.table.data.binary.BinaryRowData(2);
    org.apache.flink.table.data.writer.BinaryRowWriter outWriter =
            new org.apache.flink.table.data.writer.BinaryRowWriter(out);

    public _04_TumbleWindowTest_KeyProjection$69(Object[] references) throws Exception {

    }

    @Override
    public org.apache.flink.table.data.binary.BinaryRowData apply(org.apache.flink.table.data.RowData in1) {
        int field$70;
        boolean isNull$70;
        org.apache.flink.table.data.binary.BinaryStringData field$71;
        boolean isNull$71;
        outWriter.reset();
        isNull$70 = in1.isNullAt(0);
        field$70 = -1;
        if (!isNull$70) {
            field$70 = in1.getInt(0);
        }
        if (isNull$70) {
            outWriter.setNullAt(0);
        } else {
            outWriter.writeInt(0, field$70);
        }

        isNull$71 = in1.isNullAt(1);
        field$71 = org.apache.flink.table.data.binary.BinaryStringData.EMPTY_UTF8;
        if (!isNull$71) {
            field$71 = ((org.apache.flink.table.data.binary.BinaryStringData) in1.getString(1));
        }
        if (isNull$71) {
            outWriter.setNullAt(1);
        } else {
            outWriter.writeString(1, field$71);
        }

        outWriter.complete();


        return out;
    }
}

================================================
FILE: flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest_WatermarkGenerator$6.java
================================================
package flink.examples.sql._07.query._04_window_agg;


public final class _04_TumbleWindowTest_WatermarkGenerator$6
        extends org.apache.flink.table.runtime.generated.WatermarkGenerator {


    public _04_TumbleWindowTest_WatermarkGenerator$6(Object[] references) throws Exception {

    }

    @Override
    public void open(org.apache.flink.configuration.Configuration parameters) throws Exception {

    }

    @Override
    public Long currentWatermark(org.apache.flink.table.data.RowData row) throws Exception {

        org.apache.flink.table.data.TimestampData field$7;
        boolean isNull$7;
        boolean isNull$8;
        org.apache.flink.table.data.TimestampData result$9;
        isNull$7 = row.isNullAt(3);
        field$7 = null;
        if (!isNull$7) {
            field$7 = row.getTimestamp(3, 3);
        }


        isNull$8 = isNull$7 || false;
        result$9 = null;
        if (!isNull$8) {

            result$9 = org.apache.flink.table.data.TimestampData
                    .fromEpochMillis(field$7.getMillisecond() - ((long) 5000L), field$7.getNanoOfMillisecond());

        }

        if (isNull$8) {
            return null;
        } else {
            return result$9.getMillisecond();
        }
    }

    @Override
    public void close() throws Exception {

    }
}

================================================
FILE: flink-examples-1.13/.gitignore
================================================
HELP.md
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**
#**/src/test/**
.idea/
*.iml
*.DS_Store

### IntelliJ IDEA ###
.idea
*.iws
*.ipr



================================================
FILE: flink-examples-1.13/pom.xml
================================================
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>flink-study</artifactId>
        <groupId>com.github.antigeneral</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.github.antigeneral</groupId>
    <artifactId>flink-examples-1.13</artifactId>

    <build>

        <extensions>
            <extension>
                <groupId>kr.motd.maven</groupId>
                <artifactId>os-maven-plugin</artifactId>
                <version>${os-maven-plugin.version}</version>
            </extension>
        </extensions>

        <plugins>


            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
            </plugin>

            <plugin>
                <groupId>org.xolstice.maven.plugins</groupId>
                <artifactId>protobuf-maven-plugin</artifactId>
            </plugin>

            <!--            <plugin>-->
            <!--                &lt;!&ndash; Extract parser grammar template from calcite-core.jar and put-->
            <!--                     it under ${project.build.directory} where all freemarker templates are. &ndash;&gt;-->
            <!--                <groupId>org.apache.maven.plugins</groupId>-->
            <!--                <artifactId>maven-dependency-plugin</artifactId>-->
            <!--                <executions>-->
            <!--                    <execution>-->
            <!--                        <id>unpack-parser-template</id>-->
            <!--                        <phase>initialize</phase>-->
            <!--                        <goals>-->
            <!--                            <goal>unpack</goal>-->
            <!--                        </goals>-->
            <!--                        <configuration>-->
            <!--                            <artifactItems>-->
            <!--                                <artifactItem>-->
            <!--                                    <groupId>org.apache.calcite</groupId>-->
            <!--                                    <artifactId>calcite-core</artifactId>-->
            <!--                                    <type>jar</type>-->
            <!--                                    <overWrite>true</overWrite>-->
            <!--                                    <outputDirectory>${project.build.directory}/</outputDirectory>-->
            <!--                                    <includes>**/Parser.jj</includes>-->
            <!--                                </artifactItem>-->
            <!--                            </artifactItems>-->
            <!--                        </configuration>-->
            <!--                    </execution>-->
            <!--                </executions>-->
            <!--            </plugin>-->
            <!--            &lt;!&ndash; adding fmpp code gen &ndash;&gt;-->
            <!--            <plugin>-->
            <!--                <artifactId>maven-resources-plugin</artifactId>-->
            <!--            </plugin>-->
            <!--            <plugin>-->
            <!--                <groupId>com.googlecode.fmpp-maven-plugin</groupId>-->
            <!--                <artifactId>fmpp-maven-plugin</artifactId>-->
            <!--            </plugin>-->
            <!--            <plugin>-->
            <!--                &lt;!&ndash; This must be run AFTER the fmpp-maven-plugin &ndash;&gt;-->
            <!--                <groupId>org.codehaus.mojo</groupId>-->
            <!--                <artifactId>javacc-maven-plugin</artifactId>-->
            <!--            </plugin>-->
            <!--            <plugin>-->
            <!--                <groupId>org.apache.maven.plugins</groupId>-->
            <!--                <artifactId>maven-surefire-plugin</artifactId>-->
            <!--            </plugin>-->
        </plugins>
    </build>


    <dependencies>



        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-hive_2.11</artifactId>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.1.0</version>
            <scope>compile</scope>
            <exclusions>
                <exclusion>
                    <artifactId>slf4j-log4j12</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-logging</artifactId>
                    <groupId>commons-logging</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>servlet-api</artifactId>
                    <groupId>javax.servlet</groupId>
                </exclusion>
            </exclusions>
            <optional>true</optional>
        </dependency>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <exclusions>
                <exclusion>
                    <artifactId>log4j-slf4j-impl</artifactId>
                    <groupId>org.apache.logging.log4j</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>guava</artifactId>
                    <groupId>com.google.guava</groupId>
                </exclusion>
                <!--                <exclusion>-->
                <!--                    <artifactId>hadoop-common</artifactId>-->
                <!--                    <groupId>org.apache.hadoop</groupId>-->
                <!--                </exclusion>-->
            </exclusions>
        </dependency>
<!--        <dependency>-->
<!--            <groupId>org.apache.hadoop</groupId>-->
<!--            <artifactId>hadoop-common</artifactId>-->
<!--            <version>${hadoop.version}</version>-->
<!--            <exclusions>-->
<!--                <exclusion>-->
<!--                    <artifactId>slf4j-log4j12</artifactId>-->
<!--                    <groupId>org.slf4j</groupId>-->
<!--                </exclusion>-->
<!--                <exclusion>-->
<!--                    <artifactId>jsr311-api</artifactId>-->
<!--                    <groupId>javax.ws.rs</groupId>-->
<!--                </exclusion>-->
<!--                <exclusion>-->
<!--                    <artifactId>jersey-core</artifactId>-->
<!--                    <groupId>com.sun.jersey</groupId>-->
<!--                </exclusion>-->
<!--                <exclusion>-->
<!--                    <artifactId>jersey-server</artifactId>-->
<!--                    <groupId>com.sun.jersey</groupId>-->
<!--                </exclusion>-->
<!--                <exclusion>-->
<!--                    <artifactId>jersey-servlet</artifactId>-->
<!--                    <groupId>com.sun.jersey</groupId>-->
<!--                </exclusion>-->
<!--                <exclusion>-->
<!--                    <artifactId>jersey-json</artifactId>-->
<!--                    <groupId>com.sun.jersey</groupId>-->
<!--                </exclusion>-->
<!--            </exclusions>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.apache.hadoop</groupId>-->
<!--            <artifactId>hadoop-client</artifactId>-->
<!--            <version>${hadoop.version}</version>-->
<!--            <exclusions>-->
<!--                <exclusion>-->
<!--                    <artifactId>guava</artifactId>-->
<!--                    <groupId>com.google.guava</groupId>-->
<!--                </exclusion>-->
<!--                <exclusion>-->
<!--                    <artifactId>hadoop-common</artifactId>-->
<!--                    <groupId>org.apache.hadoop</groupId>-->
<!--                </exclusion>-->
<!--            </exclusions>-->
<!--        </dependency>-->
<!--        <dependency>-->
<!--            <groupId>org.apache.hadoop</groupId>-->
<!--            <artifactId>hadoop-hdfs</artifactId>-->
<!--            <version>${hadoop.version}</version>-->
<!--            <exclusions>-->
<!--                <exclusion>-->
<!--                    <artifactId>jsr311-api</artifactId>-->
<!--                    <groupId>javax.ws.rs</groupId>-->
<!--                </exclusion>-->
<!--                <exclusion>-->
<!--                    <artifactId>jersey-core</artifactId>-->
<!--                    <groupId>com.sun.jersey</groupId>-->
<!--                </exclusion>-->
<!--                <exclusion>-->
<!--                    <artifactId>jersey-server</artifactId>-->
<!--                    <groupId>com.sun.jersey</groupId>-->
<!--                </exclusion>-->
<!--                <exclusion>-->
<!--                    <artifactId>guava</artifactId>-->
<!--                    <groupId>com.google.guava</groupId>-->
<!--                </exclusion>-->
<!--            </exclusions>-->
<!--        </dependency>-->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.0</version>
            <exclusions>
                <exclusion>
                    <artifactId>slf4j-log4j12</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jersey-client</artifactId>
                    <groupId>com.sun.jersey</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jersey-server</artifactId>
                    <groupId>com.sun.jersey</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jersey-servlet</artifactId>
                    <groupId>com.sun.jersey</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jersey-core</artifactId>
                    <groupId>com.sun.jersey</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jersey-json</artifactId>
                    <groupId>com.sun.jersey</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>guava</artifactId>
                    <groupId>com.google.guava</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <!-- https://mvnrepository.com/artifact/net.mguenther.kafka/kafka-junit -->
<!--        <dependency>-->
<!--            <groupId>net.mguenther.kafka</groupId>-->
<!--            <artifactId>kafka-junit</artifactId>-->
<!--        </dependency>-->

        <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
<!--        <dependency>-->
<!--            <groupId>org.scala-lang</groupId>-->
<!--            <artifactId>scala-library</artifactId>-->
<!--        </dependency>-->

        <dependency>
            <groupId>com.twitter</groupId>
            <artifactId>chill-protobuf</artifactId>
            <!-- exclusions for dependency convergence -->
            <exclusions>
                <exclusion>
                    <groupId>com.esotericsoftware.kryo</groupId>
                    <artifactId>kryo</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

<!--        <dependency>-->
<!--            <groupId>org.apache.kafka</groupId>-->
<!--            <artifactId>kafka_2.13</artifactId>-->
<!--        </dependency>-->


        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>net.java.dev.javacc</groupId>
            <artifactId>javacc</artifactId>
        </dependency>

        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
            <version>4.5.10</version>
            <scope>compile</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-statebackend-rocksdb_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <!-- managed version -->
            <scope>provided</scope>
            <!-- Avro records can contain JodaTime fields when using logical fields.
                In order to handle them, we need to add an optional dependency.
                Users with those Avro records need to add this dependency themselves. -->
            <optional>true</optional>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.github.rholder/guava-retrying -->
        <dependency>
            <groupId>com.github.rholder</groupId>
            <artifactId>guava-retrying</artifactId>
            <exclusions>
                <exclusion>
                    <artifactId>guava</artifactId>
                    <groupId>com.google.guava</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_2.11</artifactId>
            <version>${flink.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>flink-shaded-zookeeper-3</artifactId>
                    <groupId>org.apache.flink</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>flink-shaded-guava</artifactId>
                    <groupId>org.apache.flink</groupId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.mvel/mvel2 -->
        <dependency>
            <groupId>org.mvel</groupId>
            <artifactId>mvel2</artifactId>
        </dependency>

        <!-- https://mvnrepository.com/artifact/redis.clients/jedis -->
        <dependency>
            <groupId>redis.clients</groupId>
            <artifactId>jedis</artifactId>
        </dependency>

        <!-- Thin wrapper around ZooKeeper's low-level API -->
        <dependency>
            <groupId>org.apache.curator</groupId>
            <artifactId>curator-framework</artifactId>
        </dependency>
        <!-- Adds high-level recipes such as cache event listeners, leader election, distributed locks, and distributed barriers -->
        <dependency>
            <groupId>org.apache.curator</groupId>
            <artifactId>curator-recipes</artifactId>
        </dependency>

        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
        </dependency>

        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-ant</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-cli-commons</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-cli-picocli</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-console</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-datetime</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-docgenerator</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-groovydoc</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-groovysh</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-jmx</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-json</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-jsr223</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-macro</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-nio</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-servlet</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-sql</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-swing</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-templates</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-test</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-test-junit5</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-testng</artifactId>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-xml</artifactId>
        </dependency>

<!--        <dependency>-->
<!--            <groupId>org.apache.flink</groupId>-->
<!--            <artifactId>flink-table-planner_2.11</artifactId>-->
<!--            <version>1.13.5</version>-->
<!--&lt;!&ndash;            <scope>provided</scope>&ndash;&gt;-->
<!--        </dependency>-->

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-scala_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>${mysql.version}</version>
        </dependency>

        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-common</artifactId>
            <version>${flink.version}</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>${flink.version}</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java-bridge_2.11</artifactId>
            <version>${flink.version}</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner-blink_2.11</artifactId>
            <version>${flink.version}</version>
            <scope>compile</scope>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-jdbc -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-jdbc_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-hbase-2.2_2.11</artifactId>
            <version>${flink.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>hbase-shaded-miscellaneous</artifactId>
                    <groupId>org.apache.hbase.thirdparty</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.bahir/flink-connector-redis -->
        <dependency>
            <groupId>org.apache.bahir</groupId>
            <artifactId>flink-connector-redis_2.10</artifactId>
            <version>1.0</version>
        </dependency>


        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_2.12</artifactId>
        </dependency>


        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <scope>compile</scope>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-log4j12 -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-runtime-web_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
        </dependency>

        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
        </dependency>

        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-annotations</artifactId>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.module/jackson-module-kotlin -->
        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-kotlin</artifactId>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.module/jackson-module-parameter-names -->
        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-parameter-names</artifactId>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.datatype/jackson-datatype-guava -->
        <dependency>
            <groupId>com.fasterxml.jackson.datatype</groupId>
            <artifactId>jackson-datatype-guava</artifactId>
            <exclusions>
                <exclusion>
                    <artifactId>guava</artifactId>
                    <groupId>com.google.guava</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.hubspot.jackson/jackson-datatype-protobuf -->
        <dependency>
            <groupId>com.hubspot.jackson</groupId>
            <artifactId>jackson-datatype-protobuf</artifactId>
            <exclusions>
                <exclusion>
                    <artifactId>guava</artifactId>
                    <groupId>com.google.guava</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.calcite/calcite-core -->
        <dependency>
            <groupId>org.apache.calcite</groupId>
            <artifactId>calcite-core</artifactId>
            <exclusions>
                <exclusion>
                    <artifactId>guava</artifactId>
                    <groupId>com.google.guava</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
        </dependency>


    </dependencies>


</project>

================================================
FILE: flink-examples-1.13/src/main/java/flink/core/source/JaninoUtils.java
================================================
package flink.core.source;

import org.codehaus.janino.SimpleCompiler;

import lombok.extern.slf4j.Slf4j;


@Slf4j
public class JaninoUtils {

    private static final SimpleCompiler COMPILER = new SimpleCompiler();

    static {
        COMPILER.setParentClassLoader(JaninoUtils.class.getClassLoader());
    }

    @SuppressWarnings("unchecked")
    public static <T> Class<T> genClass(String className, String code, Class<T> clazz) throws Exception {

        COMPILER.cook(code);

        log.info("Generated code:\n{}", code);

        return (Class<T>) COMPILER.getClassLoader().loadClass(className);
    }

}

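JaninoUtils cooks a source string at runtime and loads the resulting class through the compiler's class loader. The same compile-then-load pattern can be sketched with the JDK's built-in `javax.tools.JavaCompiler` instead of Janino; the class name `Hello` and the `Supplier` contract below are illustrative assumptions, not part of this repository:

```java
import java.net.URI;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.function.Supplier;

import javax.tools.JavaCompiler;
import javax.tools.SimpleJavaFileObject;
import javax.tools.ToolProvider;

public class InMemoryCompileDemo {

    /** A compilation unit backed by a String instead of a file on disk. */
    static class StringSource extends SimpleJavaFileObject {
        private final String code;

        StringSource(String className, String code) {
            super(URI.create("string:///" + className + ".java"), Kind.SOURCE);
            this.code = code;
        }

        @Override
        public CharSequence getCharContent(boolean ignoreEncodingErrors) {
            return code;
        }
    }

    public static void main(String[] args) throws Exception {
        String source = "public class Hello implements java.util.function.Supplier<String> {"
                + " public String get() { return \"hi\"; } }";

        // Requires a JDK at runtime; returns null on a plain JRE.
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();

        // Compile into a temp directory, then load the class from there.
        Path out = Files.createTempDirectory("generated-classes");
        boolean ok = compiler.getTask(null, null, null,
                Arrays.asList("-d", out.toString()), null,
                Arrays.asList(new StringSource("Hello", source))).call();

        try (URLClassLoader loader = new URLClassLoader(new URL[] {out.toUri().toURL()})) {
            Supplier<?> hello = (Supplier<?>) loader.loadClass("Hello")
                    .getDeclaredConstructor().newInstance();
            if (!ok || !"hi".equals(hello.get())) {
                throw new AssertionError("runtime compilation failed");
            }
            System.out.println("compiled and loaded: " + hello.get());
        }
    }
}
```

Janino trades this ceremony for a single `cook` call and works without a JDK, which is why Flink's own codegen uses it; the sketch above only shows the equivalent stdlib mechanism.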

================================================
FILE: flink-examples-1.13/src/main/java/flink/core/source/SourceFactory.java
================================================
package flink.core.source;

import java.io.IOException;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.serialization.SerializationSchema;

import com.google.protobuf.GeneratedMessageV3;

import flink.examples.datastream._04.keyed_co_process.protobuf.Source;
import lombok.SneakyThrows;

public class SourceFactory {

    public static <Message extends GeneratedMessageV3> SerializationSchema<Message> getProtobufSer(Class<Message> clazz) {
        return new SerializationSchema<Message>() {
            @Override
            public byte[] serialize(Message element) {
                return element.toByteArray();
            }
        };
    }

    @SneakyThrows
    public static <Message extends GeneratedMessageV3> DeserializationSchema<Message> getProtobufDerse(Class<Message> clazz) {

        String code = TEMPLATE.replaceAll("\\$\\{ProtobufClassName}", clazz.getName())
                .replaceAll("\\$\\{SimpleProtobufName}", clazz.getSimpleName());

        String className = clazz.getSimpleName() + "_DeserializationSchema";

        Class<DeserializationSchema> deClass = JaninoUtils.genClass(className, code, DeserializationSchema.class);

        return deClass.newInstance();
    }

    private static final String TEMPLATE =
                        "public class ${SimpleProtobufName}_DeserializationSchema extends org.apache.flink.api.common"
                      + ".serialization.AbstractDeserializationSchema<${ProtobufClassName}> {\n"
                      + "\n"
                      + "    public ${SimpleProtobufName}_DeserializationSchema() {\n"
                      + "        super(${ProtobufClassName}.class);\n"
                      + "    }\n"
                      + "\n"
                      + "    @Override\n"
                      + "    public ${ProtobufClassName} deserialize(byte[] message) throws java.io.IOException {\n"
                      + "        return ${ProtobufClassName}.parseFrom(message);\n"
                      + "    }\n"
                      + "}";

    public static void main(String[] args) throws IOException {
        System.out.println(SourceFactory.class.getName());
        System.out.println(SourceFactory.class.getCanonicalName());
        System.out.println(SourceFactory.class.getSimpleName());
        System.out.println(SourceFactory.class.getTypeName());

        DeserializationSchema<Source> ds = getProtobufDerse(Source.class);

        Source s = Source.newBuilder()
                .addNames("antigeneral")
                .build();

        Source s1 = ds.deserialize(s.toByteArray());

        System.out.println(s1);
    }

}

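SourceFactory fills the `${...}` placeholders in TEMPLATE with `String.replaceAll`, which takes a regex, so `$` and `{` must be escaped in the pattern (hence the `\\$\\{...}` literals). A minimal stdlib-only sketch of that substitution, with illustrative placeholder names:

```java
public class TemplateSubstitutionDemo {
    public static void main(String[] args) {
        String template = "public class ${Simple}_Schema { /* parses ${Full} */ }";

        // "$" and "{" are regex metacharacters, so the pattern escapes them;
        // the closing "}" on its own is a plain literal and may stay unescaped.
        String code = template
                .replaceAll("\\$\\{Full}", "com.example.Event")
                .replaceAll("\\$\\{Simple}", "Event");

        System.out.println(code);
        // -> public class Event_Schema { /* parses com.example.Event */ }
    }
}
```

`String.replace` (literal, no regex) would avoid the escaping entirely; if a replacement value may itself contain `$` or `\`, wrap it in `Matcher.quoteReplacement` before passing it to `replaceAll`.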

================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/FlinkEnvUtils.java
================================================
package flink.examples;

import java.io.IOException;
import java.util.Optional;
import java.util.concurrent.TimeUnit;

import org.apache.flink.api.common.restartstrategy.RestartStrategies;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.contrib.streaming.state.PredefinedOptions;
import org.apache.flink.contrib.streaming.state.RocksDBStateBackend;
import org.apache.flink.runtime.state.StateBackend;
import org.apache.flink.runtime.state.filesystem.FsStateBackend;
import org.apache.flink.runtime.state.memory.MemoryStateBackend;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.environment.CheckpointConfig;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.SqlDialect;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;
import org.apache.flink.table.module.CoreModule;

import flink.examples.sql._08.batch._03_hive_udf.HiveModuleV2;
import lombok.Builder;
import lombok.Data;

public class FlinkEnvUtils {

    private static final boolean ENABLE_INCREMENTAL_CHECKPOINT = true;
    private static final int NUMBER_OF_TRANSFER_THREADS = 3;

    /**
     * Set the state backend to RocksDBStateBackend.
     *
     * @param env env
     */
    public static void setRocksDBStateBackend(StreamExecutionEnvironment env) throws IOException {
        setCheckpointConfig(env);

        RocksDBStateBackend rocksDBStateBackend = new RocksDBStateBackend(
                "file:///Users/flink/checkpoints", ENABLE_INCREMENTAL_CHECKPOINT);
        rocksDBStateBackend.setNumberOfTransferThreads(NUMBER_OF_TRANSFER_THREADS);
        rocksDBStateBackend.setPredefinedOptions(PredefinedOptions.SPINNING_DISK_OPTIMIZED_HIGH_MEM);
        env.setStateBackend((StateBackend) rocksDBStateBackend);
    }


    /**
     * Set the state backend to FsStateBackend.
     *
     * @param env env
     */
    public static void setFsStateBackend(StreamExecutionEnvironment env) throws IOException {
        setCheckpointConfig(env);
        FsStateBackend fsStateBackend = new FsStateBackend("file:///Users/flink/checkpoints");
        env.setStateBackend((StateBackend) fsStateBackend);
    }


    /**
     * Set the state backend to MemoryStateBackend.
     *
     * @param env env
     */
    public static void setMemoryStateBackend(StreamExecutionEnvironment env) throws IOException {
        setCheckpointConfig(env);
        env.setStateBackend((StateBackend) new MemoryStateBackend());
    }

    /**
     * Configure checkpoint parameters without setting a StateBackend,
     * i.e. the backend is taken from flink-conf.yaml.
     *
     * @param env env
     */
    public static void setCheckpointConfig(StreamExecutionEnvironment env) throws IOException {
        env.getCheckpointConfig().setCheckpointTimeout(TimeUnit.MINUTES.toMillis(3));
        // checkpoint settings
        env.getCheckpointConfig().setFailOnCheckpointingErrors(false);
        env.enableCheckpointing(180 * 1000L, CheckpointingMode.EXACTLY_ONCE);
        env.getCheckpointConfig().setMinPauseBetweenCheckpoints(3L);

        Configuration configuration = new Configuration();
        configuration.setString("state.checkpoints.num-retained", "3");

        env.configure(configuration, Thread.currentThread().getContextClassLoader());

        env.getCheckpointConfig().enableExternalizedCheckpoints(CheckpointConfig.ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION);
    }

    public static FlinkEnv getStreamTableEnv(String[] args) throws IOException {

        ParameterTool parameterTool = ParameterTool.fromArgs(args);

        Configuration configuration = Configuration.fromMap(parameterTool.toMap());

        configuration.setString("rest.flamegraph.enabled", "true");

        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(configuration);

        String stateBackend = parameterTool.get("state.backend", "rocksdb");

        env.setParallelism(1);

        if ("rocksdb".equals(stateBackend)) {
            setRocksDBStateBackend(env);
        } else if ("filesystem".equals(stateBackend)) {
            setFsStateBackend(env);
        } else if ("jobmanager".equals(stateBackend)) {
            setMemoryStateBackend(env);
        }


        env.setRestartStrategy(RestartStrategies.failureRateRestart(6, org.apache.flink.api.common.time.Time
                .of(10L, TimeUnit.MINUTES), org.apache.flink.api.common.time.Time.of(5L, TimeUnit.SECONDS)));
        env.getConfig().setGlobalJobParameters(parameterTool);

        EnvironmentSettings settings = EnvironmentSettings
                .newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();

        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env, settings);

        tEnv.getConfig().addConfiguration(configuration);

        FlinkEnv flinkEnv = FlinkEnv
                .builder()
                .streamExecutionEnvironment(env)
                .streamTableEnvironment(tEnv)
                .build();

        initHiveEnv(flinkEnv, parameterTool);

        return flinkEnv;
    }

    /**
     * Start Hadoop: /usr/local/Cellar/hadoop/3.2.1/sbin/start-all.sh
     * http://localhost:9870/
     * http://localhost:8088/cluster
     *
     * Start the Hive metastore: $HIVE_HOME/bin/hive --service metastore &
     * Hive CLI: $HIVE_HOME/bin/hive
     */
    private static void initHiveEnv(FlinkEnv flinkEnv, ParameterTool parameterTool) {
        String defaultDatabase = "default";
        String hiveConfDir = "/usr/local/Cellar/hive/3.1.2/libexec/conf";

        boolean enableHiveCatalog = parameterTool.getBoolean("enable.hive.catalog", false);

        if (enableHiveCatalog) {
            HiveCatalog hive = new HiveCatalog("default", defaultDatabase, hiveConfDir);

            Optional.ofNullable(flinkEnv.streamTEnv())
                    .ifPresent(s -> s.registerCatalog("default", hive));

            Optional.ofNullable(flinkEnv.batchTEnv())
                    .ifPresent(s -> s.registerCatalog("default", hive));

            // set the HiveCatalog as the current catalog of the session

            Optional.ofNullable(flinkEnv.streamTEnv())
                    .ifPresent(s -> s.useCatalog("default"));

            Optional.ofNullable(flinkEnv.batchTEnv())
                    .ifPresent(s -> s.useCatalog("default"));
        }

        boolean enableHiveDialect = parameterTool.getBoolean("enable.hive.dialect", false);

        if (enableHiveDialect) {

            Optional.ofNullable(flinkEnv.streamTEnv())
                    .ifPresent(s -> s.getConfig().setSqlDialect(SqlDialect.HIVE));

            Optional.ofNullable(flinkEnv.batchTEnv())
                    .ifPresent(s -> s.getConfig().setSqlDialect(SqlDialect.HIVE));
        }

        boolean enableHiveModuleV2 = parameterTool.getBoolean("enable.hive.module.v2", true);

        if (enableHiveModuleV2) {
            String version = "3.1.2";

            HiveModuleV2 hiveModuleV2 = new HiveModuleV2(version);

            final boolean enableHiveModuleLoadFirst = parameterTool.getBoolean("enable.hive.module.load-first", true);

            Optional.ofNullable(flinkEnv.streamTEnv())
                    .ifPresent(s -> {
                        if (enableHiveModuleLoadFirst) {
                            s.unloadModule("core");
                            s.loadModule("default", hiveModuleV2);
                            s.loadModule("core", CoreModule.INSTANCE);
                        } else {
                            s.loadModule("default", hiveModuleV2);
                        }
                    });

            Optional.ofNullable(flinkEnv.batchTEnv())
                    .ifPresent(s -> {
                        if (enableHiveModuleLoadFirst) {
                            s.unloadModule("core");
                            s.loadModule("default", hiveModuleV2);
                            s.loadModule("core", CoreModule.INSTANCE);
                        } else {
                            s.loadModule("default", hiveModuleV2);
                        }
                    });

            flinkEnv.setHiveModuleV2(hiveModuleV2);
        }
    }


    public static FlinkEnv getBatchTableEnv(String[] args) throws IOException {

        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());

        ParameterTool parameterTool = ParameterTool.fromArgs(args);

        env.setRestartStrategy(RestartStrategies.failureRateRestart(6, org.apache.flink.api.common.time.Time
                .of(10L, TimeUnit.MINUTES), org.apache.flink.api.common.time.Time.of(5L, TimeUnit.SECONDS)));
        env.getConfig().setGlobalJobParameters(parameterTool);
        env.setParallelism(1);

        // checkpoint settings
        env.getCheckpointConfig().setFailOnCheckpointingErrors(false);
        env.enableCheckpointing(30 * 1000L, CheckpointingMode.EXACTLY_ONCE);
        env.getCheckpointConfig().setMinPauseBetweenCheckpoints(3L);
        env.getCheckpointConfig()
                .enableExternalizedCheckpoints(CheckpointConfig.ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION);

        EnvironmentSettings settings = EnvironmentSettings
                .newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();

        TableEnvironment tEnv = TableEnvironment.create(settings);

        FlinkEnv flinkEnv = FlinkEnv
                .builder()
                .streamExecutionEnvironment(env)
                .tableEnvironment(tEnv)
                .build();


        initHiveEnv(flinkEnv, parameterTool);

        return flinkEnv;
    }

    @Builder
    @Data
    public static class FlinkEnv {
        private StreamExecutionEnvironment streamExecutionEnvironment;
        private StreamTableEnvironment streamTableEnvironment;
        private TableEnvironment tableEnvironment;
        private HiveModuleV2 hiveModuleV2;

        public StreamTableEnvironment streamTEnv() {
            return this.streamTableEnvironment;
        }

        public TableEnvironment batchTEnv() {
            return this.tableEnvironment;
        }

        public StreamExecutionEnvironment env() {
            return this.streamExecutionEnvironment;
        }

        public HiveModuleV2 hiveModuleV2() {
            return this.hiveModuleV2;
        }
    }

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/JacksonUtils.java
================================================
package flink.examples;

import static com.fasterxml.jackson.core.JsonParser.Feature.ALLOW_COMMENTS;
import static com.fasterxml.jackson.core.JsonParser.Feature.ALLOW_UNQUOTED_CONTROL_CHARS;
import static com.fasterxml.jackson.databind.DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES;

import java.util.List;
import java.util.Map;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.hubspot.jackson.datatype.protobuf.ProtobufModule;


public class JacksonUtils {

    private static ObjectMapper mapper = new ObjectMapper();

    static {
        mapper.registerModule(new ProtobufModule());
        mapper.disable(FAIL_ON_UNKNOWN_PROPERTIES);
        mapper.enable(ALLOW_UNQUOTED_CONTROL_CHARS);
        mapper.enable(ALLOW_COMMENTS);
    }

    public static String bean2Json(Object data) {
        try {
            String result = mapper.writeValueAsString(data);
            return result;
        } catch (JsonProcessingException e) {
            e.printStackTrace();
        }
        return null;
    }

    public static <T> T json2Bean(String jsonData, Class<T> beanType) {
        try {
            T result = mapper.readValue(jsonData, beanType);
            return result;
        } catch (Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    public static <T> List<T> json2List(String jsonData, Class<T> beanType) {
        JavaType javaType = mapper.getTypeFactory().constructParametricType(List.class, beanType);

        try {
            List<T> resultList = mapper.readValue(jsonData, javaType);
            return resultList;
        } catch (Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    public static <K, V> Map<K, V> json2Map(String jsonData, Class<K> keyType, Class<V> valueType) {
        JavaType javaType = mapper.getTypeFactory().constructMapType(Map.class, keyType, valueType);

        try {
            Map<K, V> resultMap = mapper.readValue(jsonData, javaType);
            return resultMap;
        } catch (Exception e) {
            e.printStackTrace();
        }

        return null;
    }

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/codegen/JaninoUtils.java
================================================
package flink.examples.datastream._01.bytedance.split.codegen;

import org.codehaus.janino.SimpleCompiler;

import flink.examples.datastream._01.bytedance.split.model.Evaluable;
import lombok.extern.slf4j.Slf4j;


@Slf4j
public class JaninoUtils {

    private static final SimpleCompiler COMPILER = new SimpleCompiler();

    static {
        COMPILER.setParentClassLoader(JaninoUtils.class.getClassLoader());
    }

    @SuppressWarnings("unchecked")
    public static Class<Evaluable> genCodeAndGetClazz(Long id, String topic, String condition) throws Exception {

        String className = "CodeGen_" + topic + "_" + id;

        String code = "import org.apache.commons.lang3.ArrayUtils;\n"
                + "\n"
                + "public class " + className + " implements flink.examples.datastream._01.bytedance.split.model.Evaluable {\n"
                + "    \n"
                + "    @Override\n"
                + "    public boolean eval(flink.examples.datastream._01.bytedance.split.model.ClientLogSource clientLogSource) {\n"
                + "        \n"
                + "        return " + condition + ";\n"
                + "    }\n"
                + "}\n";

        COMPILER.cook(code);

        System.out.println("Generated code:\n" + code);

        return (Class<Evaluable>) COMPILER.getClassLoader().loadClass(className);
    }

    public static void main(String[] args) throws Exception {
        Class<Evaluable> c = genCodeAndGetClazz(1L, "topic", "1==1");

        System.out.println(c);
    }

}
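For readers comparing approaches: the class Janino emits above is, behaviorally, just a compiled predicate over the incoming event. Below is a minimal stdlib-only sketch of the same routing shape with no code generation at all, to show what each generated class is equivalent to. `LogEvent`, `rules`, and `route` are hypothetical names for this sketch only, not part of this repo.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical stand-in for ClientLogSource, just enough for the sketch.
class LogEvent {
    private final int id;

    LogEvent(int id) {
        this.id = id;
    }

    int getId() {
        return id;
    }
}

public class PredicateRuleSketch {

    // Each entry plays the role of one compiled rule:
    // target topic -> predicate deciding whether an event is routed there.
    static Map<String, Predicate<LogEvent>> rules() {
        Map<String, Predicate<LogEvent>> rules = new LinkedHashMap<>();
        rules.put("tuzisir1", e -> String.valueOf(e.getId()).equals("1"));
        rules.put("tuzisir2", e -> e.getId() > 5);
        return rules;
    }

    // Collect every topic whose rule matches the event.
    static List<String> route(LogEvent event) {
        List<String> targets = new ArrayList<>();
        rules().forEach((topic, rule) -> {
            if (rule.test(event)) {
                targets.add(topic);
            }
        });
        return targets;
    }

    public static void main(String[] args) {
        System.out.println(route(new LogEvent(1))); // only the first rule matches
    }
}
```

The codegen version in JaninoUtils buys the same dispatch without megamorphic lambda call sites, which is what the Benchmark class below measures.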


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/codegen/benchmark/Benchmark.java
================================================
package flink.examples.datastream._01.bytedance.split.codegen.benchmark;

import org.codehaus.groovy.control.CompilerConfiguration;

import flink.examples.datastream._01.bytedance.split.model.ClientLogSource;
import flink.examples.datastream._01.bytedance.split.model.DynamicProducerRule;
import groovy.lang.GroovyClassLoader;
import groovy.lang.GroovyObject;
import lombok.extern.slf4j.Slf4j;


@Slf4j
public class Benchmark {

    private static void benchmarkForJava() {
        ClientLogSource s = ClientLogSource.builder().id(1).build();

        long start2 = System.currentTimeMillis();

        for (int i = 0; i < 50000000; i++) {
            boolean b = String.valueOf(s.getId()).equals("1");
        }

        long end2 = System.currentTimeMillis();

        System.out.println("java:" + (end2 - start2) + " ms");
    }

    public static void benchmarkForGroovyClassLoader() {

        CompilerConfiguration config = new CompilerConfiguration();
        config.setSourceEncoding("UTF-8");
        // use the current thread's context ClassLoader as this GroovyClassLoader's parent (the default)
        GroovyClassLoader groovyClassLoader =
                new GroovyClassLoader(Thread.currentThread().getContextClassLoader(), config);

        String groovyCode = "class demo_002 {\n"
                + "    boolean eval(flink.examples.datastream._01.bytedance.split.model.ClientLogSource sourceModel) {\n"
                + "        return String.valueOf(sourceModel.getId()).equals(\"1\");\n"
                + "    }\n"
                + "}";
        try {
            // compile the Groovy source and load the resulting class
            Class<?> groovyClass = groovyClassLoader.parseClass(groovyCode);
            // instantiate the compiled Groovy class
            GroovyObject groovyObject = (GroovyObject) groovyClass.newInstance();

            ClientLogSource s = ClientLogSource.builder().id(1).build();

            long start1 = System.currentTimeMillis();

            for (int i = 0; i < 50000000; i++) {
                Object methodResult = groovyObject.invokeMethod("eval", s);
            }

            long end1 = System.currentTimeMillis();

            System.out.println("groovy:" + (end1 - start1) + " ms");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void benchmarkForJanino() {

        String condition = "String.valueOf(clientLogSource.getId()).equals(\"1\")";

        DynamicProducerRule dynamicProducerRule = DynamicProducerRule
                .builder()
                .condition(condition)
                .targetTopic("t")
                .build();

        dynamicProducerRule.init(1L);

        ClientLogSource s = ClientLogSource.builder().id(1).build();

        long start2 = System.currentTimeMillis();

        for (int i = 0; i < 50000000; i++) {
            boolean b = dynamicProducerRule.eval(s);
        }

        long end2 = System.currentTimeMillis();

        System.out.println("janino:" + (end2 - start2) + " ms");
    }

    public static void main(String[] args) throws Exception {

        for (int i = 0; i < 10; i++) {
            benchmarkForJava();

            // janino
            benchmarkForJanino();

            // groovy classloader
            benchmarkForGroovyClassLoader();

            System.out.println();
        }
    }

}
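One caveat with hand-rolled loops like the ones above: the computed `b` is never consumed, so the JIT is free to dead-code-eliminate the loop body, and the "java" timing in particular can collapse to near zero. A minimal sketch of a loop whose result feeds an observable sink is shown below; the names are illustrative, and for real measurements a harness such as JMH is the safer tool.

```java
// Minimal sketch of a DCE-resistant timing loop (JDK-only; for serious
// numbers use JMH rather than hand-rolled loops like this one).
public class BenchmarkSketch {

    static long timeWithSink(int iterations) {
        long matches = 0; // sink: consuming the result keeps the loop body live
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            boolean b = String.valueOf(1).equals("1");
            if (b) {
                matches++; // the result feeds an observable value
            }
        }
        long elapsed = System.nanoTime() - start;
        System.out.println("matches=" + matches + ", elapsed=" + elapsed + " ns");
        return matches;
    }

    public static void main(String[] args) {
        timeWithSink(1_000_000);
    }
}
```

Returning (or printing) the accumulated count forces the JIT to keep the per-iteration work, so the measured time reflects the predicate evaluation rather than an empty loop.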


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/job/SplitExampleJob.java
================================================
package flink.examples.datastream._01.bytedance.split.job;

import java.util.Date;
import java.util.concurrent.TimeUnit;
import java.util.function.BiConsumer;

import org.apache.commons.lang3.RandomUtils;
import org.apache.flink.api.common.restartstrategy.RestartStrategies;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.environment.CheckpointConfig;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.util.Collector;

import flink.examples.datastream._01.bytedance.split.kafka.KafkaProducerCenter;
import flink.examples.datastream._01.bytedance.split.model.ClientLogSink;
import flink.examples.datastream._01.bytedance.split.model.ClientLogSource;
import flink.examples.datastream._01.bytedance.split.model.DynamicProducerRule;
import flink.examples.datastream._01.bytedance.split.zkconfigcenter.ZkBasedConfigCenter;

/**
 * zk:https://www.jianshu.com/p/5491d16e6abd
 * kafka:https://www.jianshu.com/p/dd2578d47ff6
 */
public class SplitExampleJob {

    public static void main(String[] args) throws Exception {

        ParameterTool parameters = ParameterTool.fromArgs(args);

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // misc job parameters
        env.setRestartStrategy(RestartStrategies.failureRateRestart(6, org.apache.flink.api.common.time.Time
                .of(10L, TimeUnit.MINUTES), org.apache.flink.api.common.time.Time.of(5L, TimeUnit.SECONDS)));
        env.getConfig().setGlobalJobParameters(parameters);
        env.setMaxParallelism(2);

        // checkpoint settings
        env.getCheckpointConfig().setFailOnCheckpointingErrors(false);
        env.enableCheckpointing(30 * 1000L, CheckpointingMode.EXACTLY_ONCE);
        env.getCheckpointConfig().setMinPauseBetweenCheckpoints(3L);
        env.getCheckpointConfig().enableExternalizedCheckpoints(CheckpointConfig.ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION);

        env.setParallelism(1);

        env.setStreamTimeCharacteristic(TimeCharacteristic.ProcessingTime);

        env.addSource(new UserDefinedSource())
                .process(new ProcessFunction<ClientLogSource, ClientLogSink>() {

                    private ZkBasedConfigCenter zkBasedConfigCenter;

                    private KafkaProducerCenter kafkaProducerCenter;

                    @Override
                    public void open(Configuration parameters) throws Exception {
                        super.open(parameters);
                        this.zkBasedConfigCenter = ZkBasedConfigCenter.getInstance();
                        this.kafkaProducerCenter = KafkaProducerCenter.getInstance();

                    }

                    @Override
                    public void processElement(ClientLogSource clientLogSource, Context context, Collector<ClientLogSink> collector)
                            throws Exception {

                        this.zkBasedConfigCenter.getMap().forEach(new BiConsumer<Long, DynamicProducerRule>() {
                            @Override
                            public void accept(Long id, DynamicProducerRule dynamicProducerRule) {

                                if (dynamicProducerRule.eval(clientLogSource)) {
                                    kafkaProducerCenter.send(dynamicProducerRule.getTargetTopic(), clientLogSource.toString());
                                }

                            }
                        });
                    }

                    @Override
                    public void close() throws Exception {
                        super.close();
                        this.zkBasedConfigCenter.close();
                        this.kafkaProducerCenter.close();
                    }
                });

        env.execute();
    }

    private static class UserDefinedSource implements SourceFunction<ClientLogSource> {

        private volatile boolean isCancel;

        @Override
        public void run(SourceContext<ClientLogSource> sourceContext) throws Exception {

            while (!this.isCancel) {
                sourceContext.collect(
                        ClientLogSource
                                .builder()
                                .id(RandomUtils.nextInt(0, 10))
                                .price(RandomUtils.nextInt(0, 100))
                                .timestamp(System.currentTimeMillis())
                                .date(new Date().toString())
                                .build()
                );

                Thread.sleep(1000L);
            }

        }

        @Override
        public void cancel() {
            this.isCancel = true;
        }
    }

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/job/start.sh
================================================
# 1. kafka setup

cd /path/to/kafka/bin

# start the kafka server
./kafka-server-start /usr/local/etc/kafka/server.properties &

# create 3 topics
kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic tuzisir

kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic tuzisir1

kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic tuzisir2

# start a console consumer

kafka-console-consumer --bootstrap-server localhost:9092 --topic tuzisir --from-beginning

# 2. zk setup

cd /path/to/zk/bin

zkServer start

zkCli -server 127.0.0.1:2181

# command to run inside zkCli
create /kafka-config {"1":{"condition":"1==1","targetTopic":"tuzisir1"},"2":{"condition":"1!=1","targetTopic":"tuzisir2"}}

get /kafka-config


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/KafkaProducerCenter.java
================================================
package flink.examples.datastream._01.bytedance.split.kafka;

import java.util.Properties;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.function.BiConsumer;
import java.util.function.Function;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

import flink.examples.datastream._01.bytedance.split.zkconfigcenter.ZkBasedConfigCenter;


public class KafkaProducerCenter {

    private final ConcurrentMap<String, Producer<String, String>> producerConcurrentMap
            = new ConcurrentHashMap<>();

    private KafkaProducerCenter() {
        ZkBasedConfigCenter.getInstance()
                .getMap()
                .values()
                .forEach(d -> getProducer(d.getTargetTopic()));
    }

    private static class Factory {
        private static final KafkaProducerCenter INSTANCE = new KafkaProducerCenter();
    }

    public static KafkaProducerCenter getInstance() {
        return Factory.INSTANCE;
    }

    private Producer<String, String> getProducer(String topicName) {

        Producer<String, String> producer = producerConcurrentMap.get(topicName);

        if (null != producer) {
            return producer;
        }

        return producerConcurrentMap.computeIfAbsent(topicName, new Function<String, Producer<String, String>>() {
            @Override
            public Producer<String, String> apply(String topicName) {
                Properties props = new Properties();
                props.put("bootstrap.servers", "localhost:9092");
                props.put("acks", "all");
                props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
                props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
                return new KafkaProducer<>(props);
            }
        });

    }

    public void send(String topicName, String message) {

        final ProducerRecord<String, String> record = new ProducerRecord<>(topicName,
                "", message);
        try {
            RecordMetadata metadata = getProducer(topicName).send(record).get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public void close() {
        this.producerConcurrentMap.forEach(new BiConsumer<String, Producer<String, String>>() {
            @Override
            public void accept(String s, Producer<String, String> stringStringProducer) {
                stringStringProducer.flush();
                stringStringProducer.close();
            }
        });
    }

}
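KafkaProducerCenter (like ZkBasedConfigCenter in this package) gets its singleton via the initialization-on-demand holder idiom: the nested `Factory` class is only initialized on the first `getInstance()` call, and JVM class-initialization guarantees make that both lazy and thread-safe without explicit locking. A minimal self-contained sketch of the idiom follows; the class name and the construction counter are hypothetical, added only to make the single construction observable.

```java
// Minimal sketch of the initialization-on-demand holder idiom used by
// KafkaProducerCenter and ZkBasedConfigCenter (names here are illustrative).
public class HolderSingletonSketch {

    private static int constructions = 0; // instrumentation for the sketch only

    private HolderSingletonSketch() {
        constructions++;
    }

    // The JVM initializes Factory at most once, on first access, so INSTANCE
    // is created lazily and thread-safely without any synchronized block.
    private static class Factory {
        private static final HolderSingletonSketch INSTANCE = new HolderSingletonSketch();
    }

    public static HolderSingletonSketch getInstance() {
        return Factory.INSTANCE;
    }

    public static int constructionCount() {
        return constructions;
    }

    public static void main(String[] args) {
        System.out.println(getInstance() == getInstance()); // same instance both times
    }
}
```

Compared with eager static initialization, the holder defers the (potentially expensive) constructor, which matters here because constructing KafkaProducerCenter touches ZooKeeper and creates Kafka producers.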


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/demo/Application.java
================================================
package flink.examples.datastream._01.bytedance.split.kafka.demo;


public class Application {

    private String topicName = "tuzisir";
    private String consumerGrp = "consumerGrp";
    private String brokerUrl = "localhost:9092";

    public static void main(String[] args) throws InterruptedException {


        System.out.println(1);

        Application application = new Application();
        new Thread(new ProducerThread(application), "Producer : ").start();
        new Thread(new ConsumerThread(application), "Consumer1 : ").start();

        //for multiple consumers in same group, start new consumer threads
        //new Thread(new ConsumerThread(application), "Consumer2 : ").start();
    }

    public String getTopicName() {
        return topicName;
    }

    public String getConsumerGrp() {
        return consumerGrp;
    }

    public String getBrokerUrl() {
        return brokerUrl;
    }

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/demo/ConsumerThread.java
================================================
package flink.examples.datastream._01.bytedance.split.kafka.demo;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;


public class ConsumerThread implements Runnable {

    private Consumer<String, String> consumer;

    public ConsumerThread(Application application) {
        Properties props = new Properties();
        props.put("bootstrap.servers", application.getBrokerUrl());
        props.put("group.id", application.getConsumerGrp());
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        //props.put("auto.offset.reset", "earliest");
        consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList(application.getTopicName()));
    }

    @Override
    public void run() {
        String threadName = Thread.currentThread().getName();
        int noMessageToFetch = 1;
        while (noMessageToFetch < 3) {
            System.out.println(threadName + "poll start..");
            final ConsumerRecords<String, String> consumerRecords = consumer.poll(Duration.ofSeconds(1));
            System.out.println(threadName + "records polled : " + consumerRecords.count());
            if (consumerRecords.count() == 0) {
                noMessageToFetch++;
                continue;
            }
            for (ConsumerRecord<String, String> record : consumerRecords) {
                System.out.printf(threadName + "offset = %d, key = %s, value = %s, partition =%d%n",
                        record.offset(), record.key(), record.value(), record.partition());
            }
            consumer.commitAsync();
        }
    }

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/demo/ProducerThread.java
================================================
package flink.examples.datastream._01.bytedance.split.kafka.demo;

import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;


public class ProducerThread implements Runnable {

    private Producer<String, String> producer;
    private String topicName;

    public ProducerThread(Application application) {
        this.topicName = application.getTopicName();
        Properties props = new Properties();
        props.put("bootstrap.servers", application.getBrokerUrl());
        props.put("acks", "all");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
    }

    @Override
    public void run() {
        String threadName = Thread.currentThread().getName();
        for (int index = 1; index < 100; index++) {
            final ProducerRecord<String, String> record = new ProducerRecord<>(topicName,
                    Integer.toString(index), Integer.toString(index));
            try {
                RecordMetadata metadata = producer.send(record).get();
                System.out
                        .println(threadName + "Record sent with key " + index + " to partition " + metadata.partition()
                                + " with offset " + metadata.offset());
            } catch (Exception e) {
                System.out.println(threadName + "Error in sending record : " + e);
                throw new RuntimeException(e);
            }
        }
    }

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/ClientLogSink.java
================================================
package flink.examples.datastream._01.bytedance.split.model;

import lombok.Builder;
import lombok.Data;


@Data
@Builder
public class ClientLogSink {
    private int id;
    private int price;
    private long timestamp;

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/ClientLogSource.java
================================================
package flink.examples.datastream._01.bytedance.split.model;

import lombok.Builder;
import lombok.Data;


@Data
@Builder
public class ClientLogSource {

    private int id;
    private int price;
    private long timestamp;
    private String date;
    private String page;

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/DynamicProducerRule.java
================================================
package flink.examples.datastream._01.bytedance.split.model;


import flink.examples.datastream._01.bytedance.split.codegen.JaninoUtils;
import lombok.Builder;
import lombok.Data;


@Data
@Builder
public class DynamicProducerRule implements Evaluable {

    private String condition;

    private String targetTopic;

    private Evaluable evaluable;

    public void init(Long id) {
        try {
            Class<Evaluable> clazz = JaninoUtils.genCodeAndGetClazz(id, targetTopic, condition);
            this.evaluable = clazz.newInstance();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    public boolean eval(ClientLogSource clientLogSource) {
        return this.evaluable.eval(clientLogSource);
    }

    public static void main(String[] args) throws Exception {
        String condition = "String.valueOf(clientLogSource.getId()).equals(\"1\")";

        DynamicProducerRule dynamicProducerRule = DynamicProducerRule
                .builder()
                .condition(condition)
                .targetTopic("t")
                .build();

        dynamicProducerRule.init(1L);

        boolean b = dynamicProducerRule.eval(ClientLogSource.builder().id(1).build());

        System.out.println(b);
    }

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/Evaluable.java
================================================
package flink.examples.datastream._01.bytedance.split.model;


public interface Evaluable {

    boolean eval(ClientLogSource clientLogSource);

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/zkconfigcenter/ZkBasedConfigCenter.java
================================================
package flink.examples.datastream._01.bytedance.split.zkconfigcenter;

import java.lang.reflect.Type;
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.function.BiConsumer;
import java.util.function.Consumer;

import org.apache.curator.framework.CuratorFramework;
import org.apache.curator.framework.CuratorFrameworkFactory;
import org.apache.curator.framework.recipes.cache.TreeCache;
import org.apache.curator.framework.recipes.cache.TreeCacheEvent;
import org.apache.curator.framework.recipes.cache.TreeCacheListener;
import org.apache.curator.retry.RetryOneTime;

import com.google.common.collect.Sets;
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import flink.examples.datastream._01.bytedance.split.model.DynamicProducerRule;


public class ZkBasedConfigCenter {

    private TreeCache treeCache;

    private CuratorFramework zkClient;

    private static class Factory {
        private static final ZkBasedConfigCenter INSTANCE = new ZkBasedConfigCenter();
    }

    public static ZkBasedConfigCenter getInstance() {
        return Factory.INSTANCE;
    }

    private ZkBasedConfigCenter() {
        try {
            open();
        } catch (Exception e) {
            e.printStackTrace();
            throw new RuntimeException(e);
        }
    }

    private ConcurrentMap<Long, DynamicProducerRule> map = new ConcurrentHashMap<>();

    public ConcurrentMap<Long, DynamicProducerRule> getMap() {
        return map;
    }


    private void setData() throws Exception {
        String path = "/kafka-config";
        zkClient = CuratorFrameworkFactory.newClient("127.0.0.1:2181", new RetryOneTime(1000));
        zkClient.start();

        zkClient.setData().forPath(path, ("{\n"
                + "  \"1\": {\n"
                + "    \"condition\": \"1==1\",\n"
                + "    \"targetTopic\": \"tuzisir1\"\n"
                + "  },\n"
                + "  \"2\": {\n"
                + "    \"condition\": \"1!=1\",\n"
                + "    \"targetTopic\": \"tuzisir2\"\n"
                + "  }\n"
                + "}").getBytes());
    }

    private void open() throws Exception {

        String path = "/kafka-config";

        zkClient = CuratorFrameworkFactory.newClient("127.0.0.1:2181", new RetryOneTime(1000));
        zkClient.start();
        // read the current config from the remote config center at startup

        String json = new String(zkClient.getData().forPath(path));

        this.update(json);

        treeCache = new TreeCache(zkClient, path);
        treeCache.start();
        treeCache.getListenable().addListener(new TreeCacheListener() {
            @Override
            public void childEvent(CuratorFramework curatorFramework, TreeCacheEvent treeCacheEvent) throws Exception {
                switch (treeCacheEvent.getType()) {
                    case NODE_UPDATED:
                        // the notification carries both the path and the new value
                        byte[] data = treeCacheEvent.getData().getData();

                        String json = new String(data);

                        System.out.println("Config changed to: " + json);

                        // apply the updated config
                        update(json);
                        break;
                    default:

                }

            }
        });

    }

    public void close() {
        this.treeCache.close();
        this.zkClient.close();
    }

    private void update(String json) {

        Map<Long, DynamicProducerRule>
                result = getNewMap(json);

        Set<Long> needAddId = Sets.difference(result.keySet(), map.keySet()).immutableCopy();

        Set<Long> needDeleteId = Sets.difference(map.keySet(), result.keySet()).immutableCopy();

        needAddId.forEach(new Consumer<Long>() {
            @Override
            public void accept(Long id) {
                DynamicProducerRule dynamicProducerRule = result.get(id);
                dynamicProducerRule.init(id);
                map.put(id, dynamicProducerRule);
            }
        });

        needDeleteId.forEach(new Consumer<Long>() {
            @Override
            public void accept(Long id) {
                map.remove(id);
            }
        });
    }

    private Map<Long, DynamicProducerRule> getNewMap(String json) {

        Gson gson = new Gson();

        Map<String, DynamicProducerRule> newMap = null;

        Type type = new TypeToken<Map<String, DynamicProducerRule>>() {
        }.getType();

        newMap = gson.fromJson(json, type);

        Map<Long, DynamicProducerRule> result = new HashMap<>();

        Optional.ofNullable(newMap)
                .ifPresent(new Consumer<Map<String, DynamicProducerRule>>() {
                    @Override
                    public void accept(Map<String, DynamicProducerRule> stringDynamicProducerRuleMap) {
                        stringDynamicProducerRuleMap.forEach(new BiConsumer<String, DynamicProducerRule>() {
                            @Override
                            public void accept(String s, DynamicProducerRule dynamicProducerRule) {
                                result.put(Long.parseLong(s), dynamicProducerRule);
                            }
                        });
                    }
                });


        return result;

    }

}
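The `update` method above diffs the freshly parsed config against the loaded one with Guava's `Sets.difference`, then initializes added rules and drops removed ones. A minimal JDK-only sketch of the same add/remove reconciliation (the class and method names here are hypothetical, and plain `String` values stand in for `DynamicProducerRule`):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class ConfigDiffSketch {

    /** Reconciles {@code current} in place so its key set matches {@code incoming}. */
    static void applyDiff(ConcurrentMap<Long, String> current, Map<Long, String> incoming) {
        // ids present in the new config but not yet loaded
        Set<Long> toAdd = new HashSet<>(incoming.keySet());
        toAdd.removeAll(current.keySet());

        // ids loaded locally but removed from the new config
        Set<Long> toDelete = new HashSet<>(current.keySet());
        toDelete.removeAll(incoming.keySet());

        toAdd.forEach(id -> current.put(id, incoming.get(id)));
        toDelete.forEach(current::remove);
    }

    public static void main(String[] args) {
        ConcurrentMap<Long, String> current = new ConcurrentHashMap<>();
        current.put(1L, "tuzisir1");
        current.put(2L, "tuzisir2");

        Map<Long, String> incoming = new HashMap<>();
        incoming.put(1L, "tuzisir1");
        incoming.put(3L, "tuzisir3");

        applyDiff(current, incoming);
        System.out.println(new TreeMap<>(current)); // {1=tuzisir1, 3=tuzisir3}
    }
}
```

Note that, like the original `update`, this only acts on added and deleted ids: an entry whose id survives but whose value changes (e.g. a rule's `condition` edited in place) is not refreshed.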


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/zkconfigcenter/new.json
================================================
{"1":{"condition":"1==1","targetTopic":"tuzisir1"},"2":{"condition":"1!=1","targetTopic":"tuzisir2"},"3":{"condition":"1==1","targetTopic":"tuzisir"}}

================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/zkconfigcenter/old.json
================================================
{"1":{"condition":"1==1","targetTopic":"tuzisir1"},"2":{"condition":"1!=1","targetTopic":"tuzisir2"}}

================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_02/DataStreamTest.java
================================================
package flink.examples.datastream._02;

import java.io.IOException;
import java.util.Properties;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

import org.apache.commons.lang3.RandomUtils;
import org.apache.flink.api.common.restartstrategy.RestartStrategies;
import org.apache.flink.api.common.serialization.AbstractDeserializationSchema;
import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.CheckpointConfig;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor;
import org.apache.flink.streaming.api.functions.windowing.ProcessWindowFunction;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.util.Collector;

import lombok.Builder;
import lombok.Data;


public class DataStreamTest {

    public static void main(String[] args) throws Exception {

        ParameterTool parameters = ParameterTool.fromArgs(args);

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();


        // misc job settings
        env.setRestartStrategy(RestartStrategies.failureRateRestart(6, org.apache.flink.api.common.time.Time
                .of(10L, TimeUnit.MINUTES), org.apache.flink.api.common.time.Time.of(5L, TimeUnit.SECONDS)));
        env.getConfig().setGlobalJobParameters(parameters);
        env.setMaxParallelism(2);

        // checkpoint settings
        env.getCheckpointConfig().setFailOnCheckpointingErrors(false);
        env.enableCheckpointing(30 * 1000L, CheckpointingMode.EXACTLY_ONCE);
        env.getCheckpointConfig().setMinPauseBetweenCheckpoints(3L);
        env.getCheckpointConfig()
                .enableExternalizedCheckpoints(CheckpointConfig.ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION);

        env.setParallelism(1);

        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);

        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("group.id", "test");

        DeserializationSchema<Tuple2<String, String>> d = new AbstractDeserializationSchema<Tuple2<String, String>>() {

            @Override
            public Tuple2<String, String> deserialize(byte[] message) throws IOException {
                // stub for illustration only; a real job would decode the Kafka record here
                return null;
            }
        };

        DataStream<Tuple2<String, String>> stream = env
                .addSource(new FlinkKafkaConsumer<>("topic", d, properties));

        DataStream<MidModel> eventTimeResult =
                env
                        .addSource(new UserDefinedSource())
                        .assignTimestampsAndWatermarks(
                                new BoundedOutOfOrdernessTimestampExtractor<SourceModel>(Time.seconds(1L)) {
                                    @Override
                                    public long extractTimestamp(SourceModel sourceModel) {
                                        return sourceModel.getTimestamp();
                                    }
                                }
                        )
                        .uid("source")
                        .keyBy(new KeySelector<SourceModel, Integer>() {
                            @Override
                            public Integer getKey(SourceModel sourceModel) throws Exception {
                                return sourceModel.getId();
                            }
                        })
                        // !!! event-time window
                        .timeWindow(Time.seconds(1L))
                        .process(new ProcessWindowFunction<SourceModel, MidModel, Integer, TimeWindow>() {
                            @Override
                            public void process(Integer integer, Context context, Iterable<SourceModel> iterable,
                                    Collector<MidModel> collector) throws Exception {

                                iterable.forEach(new Consumer<SourceModel>() {
                                    @Override
                                    public void accept(SourceModel sourceModel) {
                                        collector.collect(
                                                MidModel
                                                        .builder()
                                                        .id(sourceModel.getId())
                                                        .price(sourceModel.getPrice())
                                                        .timestamp(sourceModel.getTimestamp())
                                                        .build()
                                        );
                                    }
                                });
                            }
                        })
                        .uid("process-event-time");


        DataStream<SinkModel> processingTimeResult = eventTimeResult
                .keyBy(new KeySelector<MidModel, Integer>() {
                    @Override
                    public Integer getKey(MidModel midModel) throws Exception {
                        return midModel.getId();
                    }
                })
                // !!! processing-time window
                .window(TumblingProcessingTimeWindows.of(Time.seconds(1L)))
                .process(new ProcessWindowFunction<MidModel, SinkModel, Integer, TimeWindow>() {
                    @Override
                    public void process(Integer integer, Context context, Iterable<MidModel> iterable,
                            Collector<SinkModel> collector) throws Exception {

                        iterable.forEach(new Consumer<MidModel>() {
                            @Override
                            public void accept(MidModel midModel) {
                                collector.collect(
                                        SinkModel
                                                .builder()
                                                .id(midModel.getId())
                                                .price(midModel.getPrice())
                                                .timestamp(midModel.getTimestamp())
                                                .build()
                                );
                            }
                        });

                    }
                })
                .uid("process-process-time");

        processingTimeResult.print();

        env.execute();
    }

    @Data
    @Builder
    private static class SourceModel {
        private int id;
        private int price;
        private long timestamp;
    }

    @Data
    @Builder
    private static class MidModel {
        private int id;
        private int price;
        private long timestamp;
    }

    @Data
    @Builder
    private static class SinkModel {
        private int id;
        private int price;
        private long timestamp;
    }

    private static class UserDefinedSource implements SourceFunction<SourceModel> {

        private volatile boolean isCancel;

        @Override
        public void run(SourceContext<SourceModel> sourceContext) throws Exception {

            while (!this.isCancel) {
                sourceContext.collect(
                        SourceModel
                                .builder()
                                .id(RandomUtils.nextInt(0, 10))
                                .price(RandomUtils.nextInt(0, 100))
                                .timestamp(System.currentTimeMillis())
                                .build()
                );

                Thread.sleep(10L);
            }

        }

        @Override
        public void cancel() {
            this.isCancel = true;
        }
    }

}
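`DataStreamTest` assigns watermarks with `BoundedOutOfOrdernessTimestampExtractor(Time.seconds(1L))`. The rule that extractor implements is simply "watermark = largest event timestamp seen so far minus the allowed out-of-orderness". A dependency-free sketch of that semantics (this is an illustrative model, not Flink's actual implementation; the class name is hypothetical):

```java
public class BoundedOutOfOrdernessSketch {

    private final long maxOutOfOrdernessMs;
    private long maxTimestamp = Long.MIN_VALUE;

    public BoundedOutOfOrdernessSketch(long maxOutOfOrdernessMs) {
        this.maxOutOfOrdernessMs = maxOutOfOrdernessMs;
    }

    /** Tracks the max event timestamp; out-of-order events never move it backwards. */
    public void onEvent(long eventTimestampMs) {
        maxTimestamp = Math.max(maxTimestamp, eventTimestampMs);
    }

    /** The watermark trails the largest timestamp by the allowed out-of-orderness. */
    public long currentWatermark() {
        return maxTimestamp - maxOutOfOrdernessMs;
    }

    public static void main(String[] args) {
        BoundedOutOfOrdernessSketch wm = new BoundedOutOfOrdernessSketch(1000L);
        wm.onEvent(5_000L);
        System.out.println(wm.currentWatermark()); // 4000
        wm.onEvent(3_000L); // an out-of-order event
        System.out.println(wm.currentWatermark()); // still 4000
    }
}
```

Any event with a timestamp at or below the current watermark (here, the 3000 ms event after the watermark reached 4000) is considered late for a 1-second-lateness event-time window.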


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_02/DataStreamTest1.java
================================================
//package flink.examples.datastream._02;
//
//import java.io.IOException;
//import java.util.Properties;
//import java.util.concurrent.TimeUnit;
//import java.util.function.Consumer;
//
//import org.apache.commons.lang3.RandomUtils;
//import org.apache.flink.api.common.restartstrategy.RestartStrategies;
//import org.apache.flink.api.common.serialization.AbstractDeserializationSchema;
//import org.apache.flink.api.common.serialization.DeserializationSchema;
//import org.apache.flink.api.java.functions.KeySelector;
//import org.apache.flink.api.java.tuple.Tuple2;
//import org.apache.flink.api.java.utils.ParameterTool;
//import org.apache.flink.streaming.api.CheckpointingMode;
//import org.apache.flink.streaming.api.TimeCharacteristic;
//import org.apache.flink.streaming.api.datastream.DataStream;
//import org.apache.flink.streaming.api.environment.CheckpointConfig;
//import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
//import org.apache.flink.streaming.api.functions.source.SourceFunction;
//import org.apache.flink.streaming.api.functions.windowing.ProcessWindowFunction;
//import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
//import org.apache.flink.streaming.api.windowing.time.Time;
//import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
//import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
//import org.apache.flink.util.Collector;
//
//import lombok.Builder;
//import lombok.Data;
//
//
//public class DataStreamTest1 {
//
//    public static void main(String[] args) throws Exception {
//
//        ParameterTool parameters = ParameterTool.fromArgs(args);
//
//        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
//
//
//        // misc job settings
//        env.setRestartStrategy(RestartStrategies.failureRateRestart(6, org.apache.flink.api.common.time.Time
//                .of(10L, TimeUnit.MINUTES), org.apache.flink.api.common.time.Time.of(5L, TimeUnit.SECONDS)));
//        env.getConfig().setGlobalJobParameters(parameters);
//        env.setMaxParallelism(2);
//
//        // checkpoint settings
//        env.getCheckpointConfig().setFailOnCheckpointingErrors(false);
//        env.enableCheckpointing(30 * 1000L, CheckpointingMode.EXACTLY_ONCE);
//        env.getCheckpointConfig().setMinPauseBetweenCheckpoints(3L);
//        env.getCheckpointConfig()
//                .enableExternalizedCheckpoints(CheckpointConfig.ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION);
//
//        env.setParallelism(1);
//
//        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
//
//        Properties properties = new Properties();
//        properties.setProperty("bootstrap.servers", "localhost:9092");
//        properties.setProperty("group.id", "test");
//
//        DeserializationSchema<Tuple2<String, String>> d = new AbstractDeserializationSchema<Tuple2<String, String>>() {
//
//            @Override
//            public Tuple2<String, String> deserialize(byte[] message) throws IOException {
//                return null;
//            }
//        };
//
//        DataStream<Tuple2<String, String>> stream = env
//                .addSource(new FlinkKafkaConsumer<>("topic", d, properties));
//
//        DataStream<MidModel> eventTimeResult =
//                env
//                        .addSource(new UserDefinedSource())
//                        .map()
//                        .flatMap()
//                        .process()
//                        .keyBy()
//                        .sum()
//
//
//        DataStream<SinkModel> processingTimeResult = eventTimeResult
//                .keyBy(new KeySelector<MidModel, Integer>() {
//                    @Override
//                    public Integer getKey(MidModel midModel) throws Exception {
//                        return midModel.getId();
//                    }
//                })
//                // !!! processing-time window
//                .window(TumblingProcessingTimeWindows.of(Time.seconds(1L)))
//                .process(new ProcessWindowFunction<MidModel, SinkModel, Integer, TimeWindow>() {
//                    @Override
//                    public void process(Integer integer, Context context, Iterable<MidModel> iterable,
//                            Collector<SinkModel> collector) throws Exception {
//
//                        iterable.forEach(new Consumer<MidModel>() {
//                            @Override
//                            public void accept(MidModel midModel) {
//                                collector.collect(
//                                        SinkModel
//                                                .builder()
//                                                .id(midModel.getId())
//                                                .price(midModel.getPrice())
//                                                .timestamp(midModel.getTimestamp())
//                                                .build()
//                                );
//                            }
//                        });
//
//                    }
//                })
//                .uid("process-process-time");
//
//        processingTimeResult.print();
//
//        env.execute();
//    }
//
//    @Data
//    @Builder
//    private static class SourceModel {
//        private int id;
//        private int price;
//        private long timestamp;
//    }
//
//    @Data
//    @Builder
//    private static class MidModel {
//        private int id;
//        private int price;
//        private long timestamp;
//    }
//
//    @Data
//    @Builder
//    private static class SinkModel {
//        private int id;
//        private int price;
//        private long timestamp;
//    }
//
//    private static class UserDefinedSource implements SourceFunction<SourceModel> {
//
//        private volatile boolean isCancel;
//
//        @Override
//        public void run(SourceContext<SourceModel> sourceContext) throws Exception {
//
//            while (!this.isCancel) {
//                sourceContext.collect(
//                        SourceModel
//                                .builder()
//                                .id(RandomUtils.nextInt(0, 10))
//                                .price(RandomUtils.nextInt(0, 100))
//                                .timestamp(System.currentTimeMillis())
//                                .build()
//                );
//
//                Thread.sleep(10L);
//            }
//
//        }
//
//        @Override
//        public void cancel() {
//            this.isCancel = true;
//        }
//    }
//
//}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/enums_state/EnumsStateTest.java
================================================
package flink.examples.datastream._03.enums_state;

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeutils.base.EnumSerializer;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.core.memory.DataOutputSerializer;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;


public class EnumsStateTest {


    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());

        env.setParallelism(1);

        env.setStreamTimeCharacteristic(TimeCharacteristic.ProcessingTime);

        TypeInformation<StateTestEnums> t = TypeInformation.of(StateTestEnums.class);

        EnumSerializer<StateTestEnums> e = (EnumSerializer<StateTestEnums>) t.createSerializer(env.getConfig());

        DataOutputSerializer d = new DataOutputSerializer(10000);

        e.serialize(StateTestEnums.A, d);

        env.execute();
    }

    enum StateTestEnums {
        A,
        B,
        C
        ;
    }

}
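`EnumsStateTest` serializes an enum constant with Flink's `EnumSerializer`. Conceptually, an enum value is written as a small integer index, which is compact but sensitive to the constant order staying stable across job versions. A plain-JDK sketch of index-based enum serialization (hypothetical class name; Flink's real serializer is more careful and keeps a reconfigurable constant-to-index mapping):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class EnumOrdinalSketch {

    enum StateTestEnums { A, B, C }

    /** Writes the constant as its ordinal -- compact, but breaks if constants are reordered. */
    static byte[] serialize(StateTestEnums value) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new DataOutputStream(bos).writeInt(value.ordinal());
        return bos.toByteArray();
    }

    /** Reads the ordinal back and maps it to the constant at that position. */
    static StateTestEnums deserialize(byte[] bytes) throws IOException {
        int ordinal = new DataInputStream(new ByteArrayInputStream(bytes)).readInt();
        return StateTestEnums.values()[ordinal];
    }

    public static void main(String[] args) throws IOException {
        byte[] bytes = serialize(StateTestEnums.A);
        System.out.println(deserialize(bytes)); // A
    }
}
```

If `B` were inserted before `A` between writing and reading, the stored ordinal would silently map to a different constant; that is the state-evolution hazard this test pokes at.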


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/enums_state/SenerioTest.java
================================================
package flink.examples.datastream._03.enums_state;

import java.util.HashMap;
import java.util.Map;
import java.util.function.BiConsumer;

import org.apache.flink.api.common.functions.AggregateFunction;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor;
import org.apache.flink.streaming.api.functions.windowing.ProcessWindowFunction;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

import com.google.common.collect.Lists;

import lombok.Builder;
import lombok.Data;
import lombok.extern.slf4j.Slf4j;


@Slf4j
public class SenerioTest {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());

        Tuple2<DimNameEnum, String> k = Tuple2.of(DimNameEnum.sex, "male");

        System.out.println(k.toString());

        env.setParallelism(1);

        env.setStreamTimeCharacteristic(TimeCharacteristic.ProcessingTime);

        env.addSource(new SourceFunction<SourceModel>() {

            private volatile boolean isCancel = false;

            @Override
            public void run(SourceContext<SourceModel> ctx) throws Exception {

            }

            @Override
            public void cancel() {
                this.isCancel = true;
            }
        })
                .assignTimestampsAndWatermarks(new BoundedOutOfOrdernessTimestampExtractor<SourceModel>(Time.minutes(1L)) {
                    @Override
                    public long extractTimestamp(SourceModel element) {
                        return element.getTimestamp();
                    }
                })
                .keyBy(new KeySelector<SourceModel, Long>() {
                    @Override
                    public Long getKey(SourceModel value) throws Exception {
                        return value.getUserId() % 1000;
                    }
                })
                .timeWindow(Time.minutes(1))
                .aggregate(
                        new AggregateFunction<SourceModel, Map<Tuple2<DimNameEnum, String>, Long>, Map<Tuple2<DimNameEnum, String>, Long>>() {

                            @Override
                            public Map<Tuple2<DimNameEnum, String>, Long> createAccumulator() {
                                return new HashMap<>();
                            }

                            @Override
                            public Map<Tuple2<DimNameEnum, String>, Long> add(SourceModel value,
                                    Map<Tuple2<DimNameEnum, String>, Long> accumulator) {

                                Lists.newArrayList(Tuple2.of(DimNameEnum.province, value.getProvince())
                                        , Tuple2.of(DimNameEnum.age, value.getAge())
                                        , Tuple2.of(DimNameEnum.sex, value.getSex()))
                                        .forEach(t -> {
                                            Long l = accumulator.get(t);

                                            if (null == l) {
                                                accumulator.put(t, 1L);
                                            } else {
                                                accumulator.put(t, l + 1);
                                            }
                                        });

                                return accumulator;
                            }

                            @Override
                            public Map<Tuple2<DimNameEnum, String>, Long> getResult(
                                    Map<Tuple2<DimNameEnum, String>, Long> accumulator) {
                                return accumulator;
                            }

                            @Override
                            public Map<Tuple2<DimNameEnum, String>, Long> merge(
                                    Map<Tuple2<DimNameEnum, String>, Long> a,
                                    Map<Tuple2<DimNameEnum, String>, Long> b) {
                                // only invoked for merging (session) windows; combine both accumulators
                                b.forEach((k, v) -> a.merge(k, v, Long::sum));
                                return a;
                            }
                        },
                        new ProcessWindowFunction<Map<Tuple2<DimNameEnum, String>, Long>, SinkModel, Long, TimeWindow>() {

                            private transient ValueState<Map<Tuple2<DimNameEnum, String>, Long>> todayPv;

                            @Override
                            public void open(Configuration parameters) throws Exception {
                                super.open(parameters);
                                this.todayPv = getRuntimeContext().getState(new ValueStateDescriptor<Map<Tuple2<DimNameEnum, String>, Long>>(
                                        "todayPv", TypeInformation.of(
                                        new TypeHint<Map<Tuple2<DimNameEnum, String>, Long>>() {
                                        })));
                            }

                            @Override
                            public void process(Long aLong, Context context,
                                    Iterable<Map<Tuple2<DimNameEnum, String>, Long>> elements, Collector<SinkModel> out)
                                    throws Exception {
                                // Merge the incoming elements into the todayPv state here,
                                // then emit the merged result via out#collect.

                                Map<Tuple2<DimNameEnum, String>, Long> pv = this.todayPv.value();

                                // ValueState#value() returns null until the state is first written
                                if (pv == null) {
                                    return;
                                }

                                pv.forEach(new BiConsumer<Tuple2<DimNameEnum, String>, Long>() {
                                    @Override
                                    public void accept(Tuple2<DimNameEnum, String> k, Long v) {
                                        log.info("key: {}, value: {}", k.toString(), v);
                                    }
                                });
                            }
                        });

        env.execute();
    }

    @Data
    @Builder
    private static class SourceModel {
        private long userId;
        private String province;
        private String age;
        private String sex;
        private long timestamp;
    }


    @Data
    @Builder
    private static class SinkModel {
        private String dimName;
        private String dimValue;
        private long timestamp;
    }

    enum DimNameEnum {
        province,
        age,
        sex,
        ;
    }

}
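The `AggregateFunction` in `SenerioTest` keeps a map accumulator and bumps one counter per dimension tuple of every record, using an explicit null check before `put`. The same counting step can be written more compactly with `Map.merge`; a JDK-only sketch (hypothetical names, with `String` keys standing in for the `Tuple2<DimNameEnum, String>` dimension keys):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class DimPvSketch {

    /** Adds one record's dimension keys to the accumulator, like AggregateFunction#add. */
    static void add(Map<String, Long> accumulator, List<String> dimKeys) {
        // merge(k, 1L, Long::sum): insert 1 if the key is absent, otherwise increment
        dimKeys.forEach(k -> accumulator.merge(k, 1L, Long::sum));
    }

    public static void main(String[] args) {
        Map<String, Long> acc = new HashMap<>();
        add(acc, Arrays.asList("province=beijing", "age=20", "sex=male"));
        add(acc, Arrays.asList("province=beijing", "age=30", "sex=male"));
        System.out.println(acc.get("province=beijing")); // 2
        System.out.println(acc.get("age=20"));           // 1
    }
}
```

The same `merge(k, v, Long::sum)` idiom also covers the accumulator-combining case that `AggregateFunction#merge` has to handle for merging windows.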


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/StateExamplesTest.java
================================================
package flink.examples.datastream._03.state;

import java.util.LinkedList;
import java.util.List;

import org.apache.flink.api.common.functions.AggregateFunction;
import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.common.state.AggregatingState;
import org.apache.flink.api.common.state.AggregatingStateDescriptor;
import org.apache.flink.api.common.state.ListState;
import org.apache.flink.api.common.state.ListStateDescriptor;
import org.apache.flink.api.common.state.MapState;
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.common.state.ReducingState;
import org.apache.flink.api.common.state.ReducingStateDescriptor;
import org.apache.flink.api.common.state.StateTtlConfig;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.time.Time;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.typeutils.ListTypeInfo;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.streaming.api.functions.source.ParallelSourceFunction;
import org.apache.flink.util.Collector;

import flink.examples.FlinkEnvUtils;
import flink.examples.FlinkEnvUtils.FlinkEnv;
import lombok.Builder;
import lombok.Data;

/**
 * https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/datastream/fault-tolerance/broadcast_state/
 */

public class StateExamplesTest {


    public static void main(String[] args) throws Exception {
        FlinkEnv flinkEnv = FlinkEnvUtils.getStreamTableEnv(args);

        flinkEnv.env().setParallelism(1);

        flinkEnv.env()
                .addSource(new ParallelSourceFunction<Item>() {

                    private volatile boolean isCancel = false;

                    @Override
                    public void run(SourceContext<Item> ctx) throws Exception {

                        int i = 0;

                        while (!this.isCancel) {
                            ctx.collect(
                                    Item.builder()
                                            .name("item")
                                            .color(Color.RED)
                                            .shape(Shape.CIRCLE)
                                            .build()
                            );
                            i++;
                            Thread.sleep(1000);
                        }
                    }

                    @Override
                    public void cancel() {
                        this.isCancel = true;
                    }
                })
                .keyBy(new KeySelector<Item, Integer>() {
                    @Override
                    public Integer getKey(Item item) throws Exception {
                        return item.color.ordinal();
                    }
                })
                .process(new KeyedProcessFunction<Integer, Item, String>() {

                    // store partial matches, i.e. first elements of the pair waiting for their second element
                    // we keep a list as we may have many first elements waiting
                    private final MapStateDescriptor<String, List<Item>> mapStateDesc =
                            new MapStateDescriptor<>(
                                    "itemsMap",
                                    BasicTypeInfo.STRING_TYPE_INFO,
                                    new ListTypeInfo<>(Item.class));

                    private final ListStateDescriptor<Item> listStateDesc =
                            new ListStateDescriptor<>(
                                    "itemsList",
                                    Item.class);

                    private final ValueStateDescriptor<Item> valueStateDesc =
                            new ValueStateDescriptor<>(
                                    "itemsValue"
                                    , Item.class);

                    private final ReducingStateDescriptor<String> reducingStateDesc =
                            new ReducingStateDescriptor<>(
                                    "itemsReducing"
                                    , new ReduceFunction<String>() {
                                @Override
                                public String reduce(String value1, String value2) throws Exception {
                                    return value1 + value2;
                                }
                            }, String.class);

                    private final AggregatingStateDescriptor<Item, String, String> aggregatingStateDesc =
                            new AggregatingStateDescriptor<Item, String, String>("itemsAgg",
                                    new AggregateFunction<Item, String, String>() {
                                        @Override
                                        public String createAccumulator() {
                                            return "";
                                        }

                                        @Override
                                        public String add(Item value, String accumulator) {
                                            return accumulator + value.name;
                                        }

                                        @Override
                                        public String getResult(String accumulator) {
                                            return accumulator;
                                        }

                                        @Override
                                        public String merge(String a, String b) {
                                            // concatenate partial accumulators; returning null here would break merging
                                            return a + b;
                                        }
                                    }, String.class);

                    @Override
                    public void open(Configuration parameters) throws Exception {
                        super.open(parameters);

                        mapStateDesc.enableTimeToLive(StateTtlConfig
                                // a 1 ms TTL is deliberately tiny so expiry is easy to observe in this demo
                                .newBuilder(Time.milliseconds(1))
                                .setUpdateType(StateTtlConfig.UpdateType.OnCreateAndWrite)
                                .setStateVisibility(StateTtlConfig.StateVisibility.NeverReturnExpired)
                                // during RocksDB compaction, refresh the expiry-check timestamp every 10 processed entries
                                .cleanupInRocksdbCompactFilter(10)
                                .build());

                    }


                    @Override
                    public void processElement(Item value, Context ctx, Collector<String> out) throws Exception {

                        MapState<String, List<Item>> mapState = getRuntimeContext().getMapState(mapStateDesc);

                        List<Item> l = mapState.get(value.name);

                        if (null == l) {
                            l = new LinkedList<>();
                        }

                        l.add(value);

                        mapState.put(value.name, l);

                        ListState<Item> listState = getRuntimeContext().getListState(listStateDesc);

                        listState.add(value);

                        Iterable<Item> listItems = listState.get();

                        ValueState<Item> valueState = getRuntimeContext().getState(valueStateDesc);

                        valueState.update(value);

                        Item i = valueState.value();

                        AggregatingState<Item, String> aggregatingState = getRuntimeContext().getAggregatingState(aggregatingStateDesc);

                        aggregatingState.add(value);

                        String aggResult = aggregatingState.get();

                        ReducingState<String> reducingState = getRuntimeContext().getReducingState(reducingStateDesc);

                        reducingState.add(value.name);

                        String reducingResult = reducingState.get();

                        System.out.println("agg = " + aggResult + ", reduce = " + reducingResult);

                    }
                })
                .print();


        flinkEnv.env().execute("Keyed state test job");

    }

    @Builder
    @Data
    private static class Rule {
        private String name;
        private Shape first;
        private Shape second;
    }

    @Builder
    @Data
    private static class Item {
        private String name;
        private Shape shape;
        private Color color;

    }


    private enum Shape {
        CIRCLE,
        SQUARE
        ;
    }

    private enum Color {
        RED,
        BLUE,
        BLACK,
        ;
    }

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_01_broadcast_state/BroadcastStateTest.java
================================================
package flink.examples.datastream._03.state._01_broadcast_state;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import org.apache.flink.api.common.state.MapState;
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.typeutils.ListTypeInfo;
import org.apache.flink.streaming.api.datastream.BroadcastStream;
import org.apache.flink.streaming.api.functions.co.KeyedBroadcastProcessFunction;
import org.apache.flink.streaming.api.functions.source.ParallelSourceFunction;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.util.Collector;

import flink.examples.FlinkEnvUtils;
import flink.examples.FlinkEnvUtils.FlinkEnv;
import lombok.Builder;
import lombok.Data;

/**
 * https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/datastream/fault-tolerance/broadcast_state/
 */

public class BroadcastStateTest {


    public static void main(String[] args) throws Exception {
        FlinkEnv flinkEnv = FlinkEnvUtils.getStreamTableEnv(args);

        // a map descriptor to store the name of the rule (string) and the rule itself.
        MapStateDescriptor<String, Rule> ruleStateDescriptor = new MapStateDescriptor<>(
                "RulesBroadcastState",
                BasicTypeInfo.STRING_TYPE_INFO,
                TypeInformation.of(new TypeHint<Rule>() {
                }));

        // broadcast the rules and create the broadcast state
        BroadcastStream<Rule> ruleBroadcastStream = flinkEnv.env()
                .addSource(new SourceFunction<Rule>() {

                    private volatile boolean isCancel = false;

                    @Override
                    public void run(SourceContext<Rule> ctx) throws Exception {

                        int i = 0;

                        while (!this.isCancel) {
                            ctx.collect(
                                    Rule.builder()
                                            .name("rule" + i)
                                            .first(Shape.CIRCLE)
                                            .second(Shape.SQUARE)
                                            .build()
                            );
                            i++;
                            Thread.sleep(1000);
                        }
                    }

                    @Override
                    public void cancel() {
                        this.isCancel = true;
                    }
                })
                .setParallelism(1)
                .broadcast(ruleStateDescriptor);

        flinkEnv.env()
                .addSource(new ParallelSourceFunction<Item>() {

                    private volatile boolean isCancel = false;

                    @Override
                    public void run(SourceContext<Item> ctx) throws Exception {

                        int i = 0;

                        while (!this.isCancel) {
                            ctx.collect(
                                    Item.builder()
                                            .name("item" + i)
                                            .color(Color.RED)
                                            .shape(Shape.CIRCLE)
                                            .build()
                            );
                            i++;
                            Thread.sleep(1000);
                        }
                    }

                    @Override
                    public void cancel() {
                        this.isCancel = true;
                    }
                })
                .keyBy(new KeySelector<Item, Color>() {
                    @Override
                    public Color getKey(Item item) throws Exception {
                        return item.color;
                    }
                })
                .connect(ruleBroadcastStream)
                .process(new KeyedBroadcastProcessFunction<Color, Item, Rule, String>() {

                    // store partial matches, i.e. first elements of the pair waiting for their second element
                    // we keep a list as we may have many first elements waiting
                    private final MapStateDescriptor<String, List<Item>> mapStateDesc =
                            new MapStateDescriptor<>(
                                    "items",
                                    BasicTypeInfo.STRING_TYPE_INFO,
                                    new ListTypeInfo<>(Item.class));

                    // identical to our ruleStateDescriptor above
                    private final MapStateDescriptor<String, Rule> ruleStateDescriptor =
                            new MapStateDescriptor<>(
                                    "RulesBroadcastState",
                                    BasicTypeInfo.STRING_TYPE_INFO,
                                    TypeInformation.of(new TypeHint<Rule>() {
                                    }));

                    @Override
                    public void processBroadcastElement(Rule value,
                            Context ctx,
                            Collector<String> out) throws Exception {
                        ctx.getBroadcastState(ruleStateDescriptor).put(value.name, value);
                    }

                    @Override
                    public void processElement(Item value,
                            ReadOnlyContext ctx,
                            Collector<String> out) throws Exception {

                        final MapState<String, List<Item>> state = getRuntimeContext().getMapState(mapStateDesc);
                        final Shape shape = value.getShape();

                        for (Map.Entry<String, Rule> entry
                                : ctx.getBroadcastState(ruleStateDescriptor).immutableEntries()) {
                            final String ruleName = entry.getKey();
                            final Rule rule = entry.getValue();

                            List<Item> stored = state.get(ruleName);
                            if (stored == null) {
                                stored = new ArrayList<>();
                            }

                            if (shape == rule.second && !stored.isEmpty()) {
                                for (Item i : stored) {
                                    out.collect("MATCH: " + i + " - " + value);
                                }
                                stored.clear();
                            }

                            // there is no else{} to cover if rule.first == rule.second
                            if (shape.equals(rule.first)) {
                                stored.add(value);
                            }

                            if (stored.isEmpty()) {
                                state.remove(ruleName);
                            } else {
                                state.put(ruleName, stored);
                            }
                        }
                    }
                })
                .print();


        flinkEnv.env().execute("Broadcast state test job");

    }

    @Builder
    @Data
    private static class Rule {
        private String name;
        private Shape first;
        private Shape second;
    }

    @Builder
    @Data
    private static class Item {
        private String name;
        private Shape shape;
        private Color color;

    }


    private enum Shape {
        CIRCLE,
        SQUARE
        ;
    }

    private enum Color {
        RED,
        BLUE,
        BLACK,
        ;
    }

}


================================================
FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/CreateStateBackendTest.java
================================================
//package flink.examples.datastream._03.state._03_rocksdb;
//
//import java.util.LinkedList;
//import java.util.List;
//
//import org.apache.flink.api.common.state.MapState;
//import org.apache.flink.api.common.state.MapStateDescriptor;
//import org.apache.flink.api.common.state.StateTtlConfig;
//import org.apache.flink.api.common.state.StateTtlConfig.TtlTimeCharacteristic;
//import org.apache.flink.api.common.time.Time;
//import org.apache.flink.api.java.functions.KeySelector;
//import org.apache.flink.configuration.Configuration;
//import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
//import org.apache.flink.streaming.api.functions.source.ParallelSourceFunction;
//import org.apache.flink.util.Collector;
//
//import flink.examples.FlinkEnvUtils;
//import flink.examples.FlinkEnvUtils.FlinkEnv;
//import lombok.Builder;
//import lombok.Data;
//
///**
// * https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/datastream/fault-tolerance/broadcast_state/
// */
//
//public class CreateStateBackendTest {
//
//
//    public static void main(String[] args) throws Exception {
//        FlinkEnv flinkEnv = FlinkEnvUtils.getStreamTableEnv(args);
//
//        flinkEnv.env().setParallelism(1);
//
//        flinkEnv.env()
//                .addSource(new ParallelSourceFunction<Item>() {
//
//                    private volatile boolean isCancel = false;
//
//                    @Override
//                    public void run(SourceContext<Item> ctx) throws Exception {
//
//                        int i = 0;
//
//                        while (!this.isCancel) {
//                            ctx.collect(
//                                    Item.builder()
//                                            .name("item")
//                                            .color(Color.RED)
//                                            .shape(Shape.CIRCLE)
//                                            .build()
//                            );
//                            i++;
//                            Thread.sleep(1000);
//                        }
//                    }
//
//                    @Override
//                    public void cancel() {
//                        this.isCancel = true;
//                    }
//                })
//                .keyBy(new KeySelector<Item, Integer>() {
//                    @Override
//                    public Integer getKey(Item item) throws Exception {
//                        return item.color.ordinal();
//                    }
//                })
//                .process(new KeyedProcessFunction<Integer, Item, String>() {
//
//                    // store partial matches, i.e. first elements of the pair waiting for their second element
//                    // we keep a list as we may have many first elements waiting
//                    private MapStateDescriptor<String, String> mapStateDescriptor =
//                            new MapStateDescriptor<>("map state name", String.class, String.class);
//
//                    private transient MapState<String, String> mapState;
//
//                    @Override
//                    public void open(Configuration parameters) throws Exception {
//                        super.open(parameters);
//
//                        StateTtlConfig stateTtlConfig = StateTtlConfig
//                                // 1. TTL duration
//                                .newBuilder(Time.milliseconds(1))
//
//                                // 2. update type
//                                .setUpdateType(StateTtlConfig.UpdateType.OnCreateAndWrite)
//                                // refresh TTL on create and write
//                                .updateTtlOnCreateAndWrite()
//                                // refresh TTL on read and write
//             
gitextract_amrzktuf/

├── .gitignore
├── README.md
├── flink-examples-1.10/
│   ├── pom.xml
│   └── src/
│       └── main/
│           └── java/
│               └── flink/
│                   └── examples/
│                       └── sql/
│                           └── _07/
│                               └── query/
│                                   └── _06_joins/
│                                       └── _02_interval_joins/
│                                           └── _01_outer_join/
│                                               ├── WindowJoinFunction$46.java
│                                               └── _06_Interval_Outer_Joins_EventTime_Test.java
├── flink-examples-1.12/
│   ├── .gitignore
│   ├── pom.xml
│   └── src/
│       └── main/
│           └── java/
│               └── flink/
│                   └── examples/
│                       ├── datastream/
│                       │   └── _07/
│                       │       └── query/
│                       │           └── _04_window/
│                       │               └── _04_TumbleWindowTest.java
│                       └── sql/
│                           └── _07/
│                               └── query/
│                                   └── _04_window_agg/
│                                       ├── _04_TumbleWindowTest.java
│                                       ├── _04_TumbleWindowTest_GroupingWindowAggsHandler$59.java
│                                       ├── _04_TumbleWindowTest_KeyProjection$69.java
│                                       └── _04_TumbleWindowTest_WatermarkGenerator$6.java
├── flink-examples-1.13/
│   ├── .gitignore
│   ├── pom.xml
│   └── src/
│       ├── main/
│       │   ├── java/
│       │   │   └── flink/
│       │   │       ├── core/
│       │   │       │   └── source/
│       │   │       │       ├── JaninoUtils.java
│       │   │       │       └── SourceFactory.java
│       │   │       └── examples/
│       │   │           ├── FlinkEnvUtils.java
│       │   │           ├── JacksonUtils.java
│       │   │           ├── datastream/
│       │   │           │   ├── _01/
│       │   │           │   │   └── bytedance/
│       │   │           │   │       └── split/
│       │   │           │   │           ├── codegen/
│       │   │           │   │           │   ├── JaninoUtils.java
│       │   │           │   │           │   └── benchmark/
│       │   │           │   │           │       └── Benchmark.java
│       │   │           │   │           ├── job/
│       │   │           │   │           │   ├── SplitExampleJob.java
│       │   │           │   │           │   └── start.sh
│       │   │           │   │           ├── kafka/
│       │   │           │   │           │   ├── KafkaProducerCenter.java
│       │   │           │   │           │   └── demo/
│       │   │           │   │           │       ├── Application.java
│       │   │           │   │           │       ├── ConsumerThread.java
│       │   │           │   │           │       └── ProducerThread.java
│       │   │           │   │           ├── model/
│       │   │           │   │           │   ├── ClientLogSink.java
│       │   │           │   │           │   ├── ClientLogSource.java
│       │   │           │   │           │   ├── DynamicProducerRule.java
│       │   │           │   │           │   └── Evaluable.java
│       │   │           │   │           └── zkconfigcenter/
│       │   │           │   │               ├── ZkBasedConfigCenter.java
│       │   │           │   │               ├── new.json
│       │   │           │   │               └── old.json
│       │   │           │   ├── _02/
│       │   │           │   │   ├── DataStreamTest.java
│       │   │           │   │   └── DataStreamTest1.java
│       │   │           │   ├── _03/
│       │   │           │   │   ├── enums_state/
│       │   │           │   │   │   ├── EnumsStateTest.java
│       │   │           │   │   │   └── SenerioTest.java
│       │   │           │   │   └── state/
│       │   │           │   │       ├── StateExamplesTest.java
│       │   │           │   │       ├── _01_broadcast_state/
│       │   │           │   │       │   └── BroadcastStateTest.java
│       │   │           │   │       ├── _03_rocksdb/
│       │   │           │   │       │   ├── CreateStateBackendTest.java
│       │   │           │   │       │   ├── GettingStartDemo.java
│       │   │           │   │       │   ├── Rocksdb_OperatorAndKeyedState_StateStorageDIr_Test.java
│       │   │           │   │       │   ├── keyed_state/
│       │   │           │   │       │   │   ├── RocksBackendKeyedMapStateTest.java
│       │   │           │   │       │   │   └── RocksBackendKeyedValueStateTest.java
│       │   │           │   │       │   └── operator_state/
│       │   │           │   │       │       ├── KeyedStreamOperatorListStateTest.java
│       │   │           │   │       │       └── RocksBackendOperatorListStateTest.java
│       │   │           │   │       ├── _04_filesystem/
│       │   │           │   │       │   ├── keyed_state/
│       │   │           │   │       │   │   └── FsStateBackendKeyedMapStateTest.java
│       │   │           │   │       │   └── operator_state/
│       │   │           │   │       │       └── FsStateBackendOperatorListStateTest.java
│       │   │           │   │       └── _05_memory/
│       │   │           │   │           └── keyed_state/
│       │   │           │   │               └── MemoryStateBackendKeyedMapStateTest.java
│       │   │           │   ├── _04/
│       │   │           │   │   └── keyed_co_process/
│       │   │           │   │       ├── HashMapTest.java
│       │   │           │   │       └── _04_KeyedCoProcessFunctionTest.java
│       │   │           │   ├── _05_ken/
│       │   │           │   │   └── _01_watermark/
│       │   │           │   │       └── WatermarkTest.java
│       │   │           │   ├── _06_test/
│       │   │           │   │   └── _01_event_proctime/
│       │   │           │   │       ├── OneJobWIthProcAndEventTimeWIndowTest.java
│       │   │           │   │       └── OneJobWIthTimerTest.java
│       │   │           │   ├── _07_lambda_error/
│       │   │           │   │   └── LambdaErrorTest.java
│       │   │           │   ├── _08_late_record/
│       │   │           │   │   └── LatenessTest.java
│       │   │           │   ├── _09_join/
│       │   │           │   │   ├── _01_window_join/
│       │   │           │   │   │   └── _01_Window_Join_Test.java
│       │   │           │   │   └── _02_connect/
│       │   │           │   │       └── _01_Connect_Test.java
│       │   │           │   └── _10_agg/
│       │   │           │       └── AggTest.java
│       │   │           ├── practice/
│       │   │           │   └── _01/
│       │   │           │       └── dau/
│       │   │           │           └── _01_DataStream_Session_Window.java
│       │   │           ├── question/
│       │   │           │   ├── datastream/
│       │   │           │   │   └── _01/
│       │   │           │   │       └── kryo_protobuf_no_more_bytes_left/
│       │   │           │   │           └── KryoProtobufNoMoreBytesLeftTest.java
│       │   │           │   └── sql/
│       │   │           │       └── _01/
│       │   │           │           └── lots_source_fields_poor_performance/
│       │   │           │               ├── EmbeddedKafka.java
│       │   │           │               ├── _01_DataGenSourceTest.java
│       │   │           │               └── _01_JsonSourceTest.java
│       │   │           ├── runtime/
│       │   │           │   ├── _01/
│       │   │           │   │   └── future/
│       │   │           │   │       ├── CompletableFutureTest.java
│       │   │           │   │       ├── CompletableFutureTest4.java
│       │   │           │   │       ├── CompletableFuture_AnyOf_Test3.java
│       │   │           │   │       ├── CompletableFuture_ThenApplyAsync_Test2.java
│       │   │           │   │       ├── CompletableFuture_ThenComposeAsync_Test2.java
│       │   │           │   │       └── FutureTest.java
│       │   │           │   └── _04/
│       │   │           │       └── statebackend/
│       │   │           │           └── CancelAndRestoreWithCheckpointTest.java
│       │   │           └── sql/
│       │   │               ├── _01/
│       │   │               │   └── countdistincterror/
│       │   │               │       ├── CountDistinctErrorTest.java
│       │   │               │       ├── CountDistinctErrorTest2.java
│       │   │               │       ├── CountDistinctErrorTest3.java
│       │   │               │       └── udf/
│       │   │               │           ├── Mod_UDF.java
│       │   │               │           ├── StatusMapper1_UDF.java
│       │   │               │           └── StatusMapper_UDF.java
│       │   │               ├── _02/
│       │   │               │   └── timezone/
│       │   │               │       ├── TimeZoneTest.java
│       │   │               │       ├── TimeZoneTest2.java
│       │   │               │       └── TimeZoneTest3.java
│       │   │               ├── _03/
│       │   │               │   └── source_sink/
│       │   │               │       ├── CreateViewTest.java
│       │   │               │       ├── DataStreamSourceEventTimeTest.java
│       │   │               │       ├── DataStreamSourceProcessingTimeTest.java
│       │   │               │       ├── KafkaSourceTest.java
│       │   │               │       ├── RedisLookupTest.java
│       │   │               │       ├── RedisSinkTest.java
│       │   │               │       ├── SocketSourceTest.java
│       │   │               │       ├── TableApiKafkaSourceTest.java
│       │   │               │       ├── UpsertKafkaSinkProtobufFormatSupportTest.java
│       │   │               │       ├── UpsertKafkaSinkTest.java
│       │   │               │       ├── UserDefinedSourceTest.java
│       │   │               │       ├── abilities/
│       │   │               │       │   ├── sink/
│       │   │               │       │   │   ├── Abilities_SinkFunction.java
│       │   │               │       │   │   ├── Abilities_TableSink.java
│       │   │               │       │   │   ├── Abilities_TableSinkFactory.java
│       │   │               │       │   │   └── _01_SupportsWritingMetadata_Test.java
│       │   │               │       │   └── source/
│       │   │               │       │       ├── Abilities_SourceFunction.java
│       │   │               │       │       ├── Abilities_TableSource.java
│       │   │               │       │       ├── Abilities_TableSourceFactory.java
│       │   │               │       │       ├── _01_SupportsFilterPushDown_Test.java
│       │   │               │       │       ├── _02_SupportsLimitPushDown_Test.java
│       │   │               │       │       ├── _03_SupportsPartitionPushDown_Test.java
│       │   │               │       │       ├── _04_SupportsProjectionPushDown_JDBC_Test.java
│       │   │               │       │       ├── _04_SupportsProjectionPushDown_Test.java
│       │   │               │       │       ├── _05_SupportsReadingMetadata_Test.java
│       │   │               │       │       ├── _06_SupportsWatermarkPushDown_Test.java
│       │   │               │       │       ├── _07_SupportsSourceWatermark_Test.java
│       │   │               │       │       └── before/
│       │   │               │       │           ├── Before_Abilities_SourceFunction.java
│       │   │               │       │           ├── Before_Abilities_TableSource.java
│       │   │               │       │           ├── Before_Abilities_TableSourceFactory.java
│       │   │               │       │           ├── _01_Before_SupportsFilterPushDown_Test.java
│       │   │               │       │           ├── _02_Before_SupportsLimitPushDown_Test.java
│       │   │               │       │           ├── _03_Before_SupportsPartitionPushDown_Test.java
│       │   │               │       │           ├── _04_Before_SupportsProjectionPushDown_Test.java
│       │   │               │       │           ├── _05_Before_SupportsReadingMetadata_Test.java
│       │   │               │       │           ├── _06_Before_SupportsWatermarkPushDown_Test.java
│       │   │               │       │           └── _07_Before_SupportsSourceWatermark_Test.java
│       │   │               │       ├── ddl/
│       │   │               │       │   └── TableApiDDLTest.java
│       │   │               │       └── table/
│       │   │               │           ├── redis/
│       │   │               │           │   ├── container/
│       │   │               │           │   │   ├── RedisCommandsContainer.java
│       │   │               │           │   │   ├── RedisCommandsContainerBuilder.java
│       │   │               │           │   │   └── RedisContainer.java
│       │   │               │           │   ├── demo/
│       │   │               │           │   │   └── RedisDemo.java
│       │   │               │           │   ├── mapper/
│       │   │               │           │   │   ├── LookupRedisMapper.java
│       │   │               │           │   │   ├── RedisCommand.java
│       │   │               │           │   │   ├── RedisCommandDescription.java
│       │   │               │           │   │   └── SetRedisMapper.java
│       │   │               │           │   ├── options/
│       │   │               │           │   │   ├── RedisLookupOptions.java
│       │   │               │           │   │   ├── RedisOptions.java
│       │   │               │           │   │   └── RedisWriteOptions.java
│       │   │               │           │   ├── v1/
│       │   │               │           │   │   ├── RedisDynamicTableFactory.java
│       │   │               │           │   │   ├── sink/
│       │   │               │           │   │   │   └── RedisDynamicTableSink.java
│       │   │               │           │   │   └── source/
│       │   │               │           │   │       ├── RedisDynamicTableSource.java
│       │   │               │           │   │       └── RedisRowDataLookupFunction.java
│       │   │               │           │   └── v2/
│       │   │               │           │       ├── RedisDynamicTableFactory.java
│       │   │               │           │       ├── sink/
│       │   │               │           │       │   └── RedisDynamicTableSink.java
│       │   │               │           │       └── source/
│       │   │               │           │           ├── RedisDynamicTableSource.java
│       │   │               │           │           ├── RedisRowDataBatchLookupFunction.java
│       │   │               │           │           └── RedisRowDataLookupFunction.java
│       │   │               │           ├── socket/
│       │   │               │           │   ├── SocketDynamicTableFactory.java
│       │   │               │           │   ├── SocketDynamicTableSource.java
│       │   │               │           │   └── SocketSourceFunction.java
│       │   │               │           └── user_defined/
│       │   │               │               ├── UserDefinedDynamicTableFactory.java
│       │   │               │               ├── UserDefinedDynamicTableSource.java
│       │   │               │               └── UserDefinedSource.java
│       │   │               ├── _04/
│       │   │               │   └── type/
│       │   │               │       ├── BlinkPlannerTest.java
│       │   │               │       ├── JavaEnvTest.java
│       │   │               │       └── OldPlannerTest.java
│       │   │               ├── _05/
│       │   │               │   └── format/
│       │   │               │       └── formats/
│       │   │               │           ├── ProtobufFormatTest.java
│       │   │               │           ├── SocketWriteTest.java
│       │   │               │           ├── csv/
│       │   │               │           │   ├── ChangelogCsvDeserializer.java
│       │   │               │           │   ├── ChangelogCsvFormat.java
│       │   │               │           │   └── ChangelogCsvFormatFactory.java
│       │   │               │           ├── protobuf/
│       │   │               │           │   ├── descriptors/
│       │   │               │           │   │   ├── Protobuf.java
│       │   │               │           │   │   └── ProtobufValidator.java
│       │   │               │           │   ├── row/
│       │   │               │           │   │   ├── ProtobufDeserializationSchema.java
│       │   │               │           │   │   ├── ProtobufRowDeserializationSchema.java
│       │   │               │           │   │   ├── ProtobufRowFormatFactory.java
│       │   │               │           │   │   ├── ProtobufRowSerializationSchema.java
│       │   │               │           │   │   ├── ProtobufSerializationSchema.java
│       │   │               │           │   │   ├── ProtobufUtils.java
│       │   │               │           │   │   └── typeutils/
│       │   │               │           │   │       └── ProtobufSchemaConverter.java
│       │   │               │           │   └── rowdata/
│       │   │               │           │       ├── ProtobufFormatFactory.java
│       │   │               │           │       ├── ProtobufOptions.java
│       │   │               │           │       ├── ProtobufRowDataDeserializationSchema.java
│       │   │               │           │       ├── ProtobufRowDataSerializationSchema.java
│       │   │               │           │       ├── ProtobufToRowDataConverters.java
│       │   │               │           │       └── RowDataToProtobufConverters.java
│       │   │               │           └── utils/
│       │   │               │               ├── MoreRunnables.java
│       │   │               │               ├── MoreSuppliers.java
│       │   │               │               ├── ThrowableRunable.java
│       │   │               │               └── ThrowableSupplier.java
│       │   │               ├── _06/
│       │   │               │   └── calcite/
│       │   │               │       ├── CalciteTest.java
│       │   │               │       ├── ParserTest.java
│       │   │               │       └── javacc/
│       │   │               │           ├── JavaccCodeGenTest.java
│       │   │               │           ├── Simple1Test.java
│       │   │               │           └── generatedcode/
│       │   │               │               ├── ParseException.java
│       │   │               │               ├── Simple1.java
│       │   │               │               ├── Simple1Constants.java
│       │   │               │               ├── Simple1TokenManager.java
│       │   │               │               ├── SimpleCharStream.java
│       │   │               │               ├── Token.java
│       │   │               │               └── TokenMgrError.java
│       │   │               ├── _07/
│       │   │               │   └── query/
│       │   │               │       ├── _01_select_where/
│       │   │               │       │   ├── SelectWhereHiveDialect.java
│       │   │               │       │   ├── SelectWhereTest.java
│       │   │               │       │   ├── SelectWhereTest2.java
│       │   │               │       │   ├── SelectWhereTest3.java
│       │   │               │       │   ├── SelectWhereTest4.java
│       │   │               │       │   ├── SelectWhereTest5.java
│       │   │               │       │   └── StreamExecCalc$10.java
│       │   │               │       ├── _02_select_distinct/
│       │   │               │       │   ├── GroupAggsHandler$5.java
│       │   │               │       │   ├── KeyProjection$0.java
│       │   │               │       │   ├── SelectDistinctTest.java
│       │   │               │       │   └── SelectDistinctTest2.java
│       │   │               │       ├── _03_group_agg/
│       │   │               │       │   ├── _01_group_agg/
│       │   │               │       │   │   ├── GroupAggMiniBatchTest.java
│       │   │               │       │   │   ├── GroupAggTest.java
│       │   │               │       │   │   └── GroupAggsHandler$39.java
│       │   │               │       │   ├── _02_count_distinct/
│       │   │               │       │   │   ├── CountDistinctGroupAggTest.java
│       │   │               │       │   │   └── GroupAggsHandler$17.java
│       │   │               │       │   ├── _03_grouping_sets/
│       │   │               │       │   │   ├── GroupingSetsEqualsGroupAggUnionAllGroupAggTest2.java
│       │   │               │       │   │   ├── GroupingSetsGroupAggTest.java
│       │   │               │       │   │   ├── GroupingSetsGroupAggTest2.java
│       │   │               │       │   │   └── StreamExecExpand$20.java
│       │   │               │       │   ├── _04_cube/
│       │   │               │       │   │   ├── CubeGroupAggTest.java
│       │   │               │       │   │   └── CubeGroupAggTest2.java
│       │   │               │       │   └── _05_rollup/
│       │   │               │       │       ├── RollUpGroupAggTest.java
│       │   │               │       │       └── RollUpGroupAggTest2.java
│       │   │               │       ├── _04_window_agg/
│       │   │               │       │   ├── _01_tumble_window/
│       │   │               │       │   │   ├── TumbleWindow2GroupAggTest.java
│       │   │               │       │   │   ├── TumbleWindowTest.java
│       │   │               │       │   │   ├── TumbleWindowTest2.java
│       │   │               │       │   │   ├── TumbleWindowTest3.java
│       │   │               │       │   │   ├── TumbleWindowTest4.java
│       │   │               │       │   │   ├── TumbleWindowTest5.java
│       │   │               │       │   │   ├── global_agg/
│       │   │               │       │   │   │   ├── GlobalWindowAggsHandler$232.java
│       │   │               │       │   │   │   ├── LocalWindowAggsHandler$162.java
│       │   │               │       │   │   │   └── StateWindowAggsHandler$300.java
│       │   │               │       │   │   └── local_agg/
│       │   │               │       │   │       ├── KeyProjection$89.java
│       │   │               │       │   │       └── LocalWindowAggsHandler$88.java
│       │   │               │       │   ├── _02_cumulate_window/
│       │   │               │       │   │   ├── CumulateWindowGroupingSetsBigintTest.java
│       │   │               │       │   │   ├── CumulateWindowGroupingSetsTest.java
│       │   │               │       │   │   ├── CumulateWindowTest.java
│       │   │               │       │   │   ├── TumbleWindowEarlyFireTest.java
│       │   │               │       │   │   ├── cumulate/
│       │   │               │       │   │   │   ├── global_agg/
│       │   │               │       │   │   │   │   ├── GlobalWindowAggsHandler$232.java
│       │   │               │       │   │   │   │   ├── KeyProjection$301.java
│       │   │               │       │   │   │   │   ├── LocalWindowAggsHandler$162.java
│       │   │               │       │   │   │   │   └── StateWindowAggsHandler$300.java
│       │   │               │       │   │   │   └── local_agg/
│       │   │               │       │   │   │       ├── KeyProjection$89.java
│       │   │               │       │   │   │       └── LocalWindowAggsHandler$88.java
│       │   │               │       │   │   └── earlyfire/
│       │   │               │       │   │       ├── GroupAggsHandler$210.java
│       │   │               │       │   │       └── GroupingWindowAggsHandler$57.java
│       │   │               │       │   └── _03_hop_window/
│       │   │               │       │       └── HopWindowGroupWindowAggTest.java
│       │   │               │       ├── _05_over/
│       │   │               │       │   ├── _01_row_number/
│       │   │               │       │   │   ├── RowNumberOrderByBigintTest.java
│       │   │               │       │   │   ├── RowNumberOrderByStringTest.java
│       │   │               │       │   │   ├── RowNumberOrderByUnixTimestampTest.java
│       │   │               │       │   │   ├── RowNumberWithoutPartitionKeyTest.java
│       │   │               │       │   │   ├── RowNumberWithoutRowNumberEqual1Test.java
│       │   │               │       │   │   └── Scalar_UDF.java
│       │   │               │       │   └── _02_agg/
│       │   │               │       │       ├── RangeIntervalProctimeTest.java
│       │   │               │       │       ├── RangeIntervalRowtimeAscendingTest.java
│       │   │               │       │       ├── RangeIntervalRowtimeBoundedOutOfOrdernessTest.java
│       │   │               │       │       ├── RangeIntervalRowtimeStrictlyAscendingTest.java
│       │   │               │       │       └── RowIntervalTest.java
│       │   │               │       ├── _06_joins/
│       │   │               │       │   ├── _01_regular_joins/
│       │   │               │       │   │   ├── _01_inner_join/
│       │   │               │       │   │   │   ├── ConditionFunction$4.java
│       │   │               │       │   │   │   ├── _01_InnerJoinsTest.java
│       │   │               │       │   │   │   └── _02_InnerJoinsOnNotEqualTest.java
│       │   │               │       │   │   └── _02_outer_join/
│       │   │               │       │   │       ├── _01_LeftJoinsTest.java
│       │   │               │       │   │       ├── _02_RightJoinsTest.java
│       │   │               │       │   │       └── _03_FullJoinsTest.java
│       │   │               │       │   ├── _02_interval_joins/
│       │   │               │       │   │   ├── _01_proctime/
│       │   │               │       │   │   │   ├── Interval_Full_Joins_ProcesingTime_Test.java
│       │   │               │       │   │   │   ├── Interval_Inner_Joins_ProcesingTime_Test.java
│       │   │               │       │   │   │   ├── Interval_Left_Joins_ProcesingTime_Test.java
│       │   │               │       │   │   │   └── Interval_Right_Joins_ProcesingTime_Test.java
│       │   │               │       │   │   └── _02_row_time/
│       │   │               │       │   │       ├── Interval_Full_JoinsOnNotEqual_EventTime_Test.java
│       │   │               │       │   │       ├── Interval_Full_Joins_EventTime_Test.java
│       │   │               │       │   │       ├── Interval_Inner_Joins_EventTime_Test.java
│       │   │               │       │   │       ├── Interval_Left_Joins_EventTime_Test.java
│       │   │               │       │   │       └── Interval_Right_Joins_EventTime_Test.java
│       │   │               │       │   ├── _03_temporal_join/
│       │   │               │       │   │   ├── _01_proctime/
│       │   │               │       │   │   │   └── Temporal_Join_ProcesingTime_Test.java
│       │   │               │       │   │   └── _02_row_time/
│       │   │               │       │   │       └── Temporal_Join_EventTime_Test.java
│       │   │               │       │   ├── _04_lookup_join/
│       │   │               │       │   │   └── _01_redis/
│       │   │               │       │   │       ├── RedisBatchLookupTest2.java
│       │   │               │       │   │       ├── RedisDemo.java
│       │   │               │       │   │       ├── RedisLookupTest.java
│       │   │               │       │   │       ├── RedisLookupTest2.java
│       │   │               │       │   │       └── pipeline/
│       │   │               │       │   │           ├── BatchJoinTableFuncCollector$8.java
│       │   │               │       │   │           ├── BatchLookupFunction$4.java
│       │   │               │       │   │           ├── JoinTableFuncCollector$8.java
│       │   │               │       │   │           ├── JoinTableFuncCollector$9.java
│       │   │               │       │   │           ├── LookupFunction$4.java
│       │   │               │       │   │           ├── LookupFunction$5.java
│       │   │               │       │   │           └── T1.java
│       │   │               │       │   ├── _05_array_expansion/
│       │   │               │       │   │   └── _01_ArrayExpansionTest.java
│       │   │               │       │   └── _06_table_function/
│       │   │               │       │       └── _01_inner_join/
│       │   │               │       │           ├── TableFunctionInnerJoin_Test.java
│       │   │               │       │           └── TableFunctionInnerJoin_WithEmptyTableFunction_Test.java
│       │   │               │       ├── _07_deduplication/
│       │   │               │       │   ├── DeduplicationProcessingTimeTest.java
│       │   │               │       │   ├── DeduplicationProcessingTimeTest1.java
│       │   │               │       │   └── DeduplicationRowTimeTest.java
│       │   │               │       ├── _08_datastream_trans/
│       │   │               │       │   ├── AlertExample.java
│       │   │               │       │   ├── AlertExampleRetract.java
│       │   │               │       │   ├── AlertExampleRetractError.java
│       │   │               │       │   ├── RetractExample.java
│       │   │               │       │   └── Test.java
│       │   │               │       ├── _09_set_operations/
│       │   │               │       │   ├── Except_Test.java
│       │   │               │       │   ├── Exist_Test.java
│       │   │               │       │   ├── In_Test.java
│       │   │               │       │   ├── Intersect_Test.java
│       │   │               │       │   ├── UnionAll_Test.java
│       │   │               │       │   └── Union_Test.java
│       │   │               │       ├── _10_order_by/
│       │   │               │       │   ├── OrderBy_with_time_attr_Test.java
│       │   │               │       │   └── OrderBy_without_time_attr_Test.java
│       │   │               │       ├── _11_limit/
│       │   │               │       │   └── Limit_Test.java
│       │   │               │       ├── _12_topn/
│       │   │               │       │   └── TopN_Test.java
│       │   │               │       ├── _13_window_topn/
│       │   │               │       │   └── WindowTopN_Test.java
│       │   │               │       ├── _14_retract/
│       │   │               │       │   └── Retract_Test.java
│       │   │               │       ├── _15_exec_options/
│       │   │               │       │   ├── Default_Parallelism_Test.java
│       │   │               │       │   ├── Idle_Timeout_Test.java
│       │   │               │       │   └── State_Ttl_Test.java
│       │   │               │       ├── _16_optimizer_options/
│       │   │               │       │   ├── Agg_OnePhase_Strategy_window_Test.java
│       │   │               │       │   ├── Agg_TwoPhase_Strategy_unbounded_Test.java
│       │   │               │       │   ├── Agg_TwoPhase_Strategy_window_Test.java
│       │   │               │       │   ├── DistinctAgg_Split_One_Distinct_Key_Test.java
│       │   │               │       │   └── DistinctAgg_Split_Two_Distinct_Key_Test.java
│       │   │               │       ├── _17_table_options/
│       │   │               │       │   ├── Dml_Syc_False_Test.java
│       │   │               │       │   ├── Dml_Syc_True_Test.java
│       │   │               │       │   └── TimeZone_window_Test.java
│       │   │               │       └── _18_performance_tuning/
│       │   │               │           └── Count_Distinct_Filter_Test.java
│       │   │               ├── _08/
│       │   │               │   └── batch/
│       │   │               │       ├── Utils.java
│       │   │               │       ├── _01_ddl/
│       │   │               │       │   └── HiveDDLTest.java
│       │   │               │       ├── _02_dml/
│       │   │               │       │   ├── HiveDMLBetweenAndTest.java
│       │   │               │       │   ├── HiveDMLTest.java
│       │   │               │       │   ├── HiveTest2.java
│       │   │               │       │   ├── _01_hive_dialect/
│       │   │               │       │   │   └── HiveDMLTest.java
│       │   │               │       │   ├── _02_with_as/
│       │   │               │       │   │   └── HIveWIthAsTest.java
│       │   │               │       │   ├── _03_substr/
│       │   │               │       │   │   └── HiveSubstrTest.java
│       │   │               │       │   ├── _04_tumble_window/
│       │   │               │       │   │   ├── Test.java
│       │   │               │       │   │   ├── Test1.java
│       │   │               │       │   │   ├── Test2_BIGINT_SOURCE.java
│       │   │               │       │   │   ├── Test3.java
│       │   │               │       │   │   └── Test5.java
│       │   │               │       │   ├── _05_batch_to_datastream/
│       │   │               │       │   │   └── Test.java
│       │   │               │       │   └── _06_select_where/
│       │   │               │       │       └── Test.java
│       │   │               │       ├── _03_hive_udf/
│       │   │               │       │   ├── HiveModuleV2.java
│       │   │               │       │   ├── HiveUDFRegistryTest.java
│       │   │               │       │   ├── HiveUDFRegistryUnloadTest.java
│       │   │               │       │   ├── _01_GenericUDAFResolver2/
│       │   │               │       │   │   ├── HiveUDAF_hive_module_registry_Test.java
│       │   │               │       │   │   ├── HiveUDAF_sql_registry_create_function_Test.java
│       │   │               │       │   │   ├── HiveUDAF_sql_registry_create_temporary_function_Test.java
│       │   │               │       │   │   └── TestHiveUDAF.java
│       │   │               │       │   ├── _02_GenericUDTF/
│       │   │               │       │   │   ├── HiveUDTF_hive_module_registry_Test.java
│       │   │               │       │   │   ├── HiveUDTF_sql_registry_create_function_Test.java
│       │   │               │       │   │   ├── HiveUDTF_sql_registry_create_temporary_function_Test.java
│       │   │               │       │   │   └── TestHiveUDTF.java
│       │   │               │       │   ├── _03_built_in_udf/
│       │   │               │       │   │   ├── _01_get_json_object/
│       │   │               │       │   │   │   └── HiveUDF_get_json_object_Test.java
│       │   │               │       │   │   └── _02_rlike/
│       │   │               │       │   │       └── HiveUDF_rlike_Test.java
│       │   │               │       │   └── _04_GenericUDF/
│       │   │               │       │       ├── HiveUDF_hive_module_registry_Test.java
│       │   │               │       │       ├── HiveUDF_sql_registry_create_function_Test.java
│       │   │               │       │       ├── HiveUDF_sql_registry_create_temporary_function_Test.java
│       │   │               │       │       └── TestGenericUDF.java
│       │   │               │       ├── _04_flink_udf/
│       │   │               │       │   ├── FlinkUDAF_Test.java
│       │   │               │       │   ├── FlinkUDF_Test.java
│       │   │               │       │   └── FlinkUDTF_Test.java
│       │   │               │       └── _05_test/
│       │   │               │           └── _01_batch_to_datastream/
│       │   │               │               └── Test.java
│       │   │               ├── _09/
│       │   │               │   └── udf/
│       │   │               │       ├── _01_hive_udf/
│       │   │               │       │   └── _01_GenericUDF/
│       │   │               │       │       ├── HiveUDF_sql_registry_create_function_Test.java
│       │   │               │       │       ├── HiveUDF_sql_registry_create_function_with_hive_catalog_Test.java
│       │   │               │       │       ├── HiveUDF_sql_registry_create_temporary_function_Test.java
│       │   │               │       │       ├── HiveUDF_sql_registry_create_temporary_function_with_hive_catalog_Test.java
│       │   │               │       │       └── TestGenericUDF.java
│       │   │               │       ├── _02_stream_hive_udf/
│       │   │               │       │   ├── HiveUDF_Error_Test.java
│       │   │               │       │   ├── HiveUDF_create_temporary_error_Test.java
│       │   │               │       │   ├── HiveUDF_hive_module_registry_Test.java
│       │   │               │       │   ├── HiveUDF_load_first_Test.java
│       │   │               │       │   ├── HiveUDF_load_second_Test.java
│       │   │               │       │   ├── TestGenericUDF.java
│       │   │               │       │   └── UserDefinedSource.java
│       │   │               │       ├── _03_advanced_type_inference/
│       │   │               │       │   ├── AdvancedFunctionsExample.java
│       │   │               │       │   ├── InternalRowMergerFunction.java
│       │   │               │       │   └── LastDatedValueFunction.java
│       │   │               │       ├── _04_udf/
│       │   │               │       │   └── UDAF_Test.java
│       │   │               │       └── _05_scalar_function/
│       │   │               │           ├── ExplodeUDTF.java
│       │   │               │           ├── ExplodeUDTFV2.java
│       │   │               │           ├── GetMapValue.java
│       │   │               │           ├── GetSetValue.java
│       │   │               │           ├── ScalarFunctionTest.java
│       │   │               │           ├── ScalarFunctionTest2.java
│       │   │               │           ├── SetStringUDF.java
│       │   │               │           └── TableFunctionTest2.java
│       │   │               ├── _10_share/
│       │   │               │   └── A.java
│       │   │               ├── _11_explain/
│       │   │               │   └── Explain_Test.java
│       │   │               └── _12_data_type/
│       │   │                   ├── _01_interval/
│       │   │                   │   ├── Timestamp3_Interval_To_Test.java
│       │   │                   │   └── Timestamp_ltz3_Interval_To_Test.java
│       │   │                   ├── _02_user_defined/
│       │   │                   │   ├── User.java
│       │   │                   │   ├── UserDefinedDataTypes_Test.java
│       │   │                   │   ├── UserDefinedDataTypes_Test2.java
│       │   │                   │   └── UserScalarFunction.java
│       │   │                   └── _03_raw/
│       │   │                       ├── RawScalarFunction.java
│       │   │                       └── Raw_DataTypes_Test2.java
│       │   ├── javacc/
│       │   │   └── Simple1.jj
│       │   ├── proto/
│       │   │   ├── source.proto
│       │   │   └── test.proto
│       │   ├── resources/
│       │   │   └── META-INF/
│       │   │       └── services/
│       │   │           └── org.apache.flink.table.factories.Factory
│       │   └── scala/
│       │       └── flink/
│       │           └── examples/
│       │               └── sql/
│       │                   └── _04/
│       │                       └── type/
│       │                           └── TableFunc0.scala
│       └── test/
│           ├── java/
│           │   └── flink/
│           │       └── examples/
│           │           └── sql/
│           │               ├── _05/
│           │               │   └── format/
│           │               │       └── formats/
│           │               │           └── protobuf/
│           │               │               ├── row/
│           │               │               │   ├── ProtobufRowDeserializationSchemaTest.java
│           │               │               │   └── ProtobufRowSerializationSchemaTest.java
│           │               │               └── rowdata/
│           │               │                   ├── ProtobufRowDataDeserializationSchemaTest.java
│           │               │                   └── ProtobufRowDataSerializationSchemaTest.java
│           │               ├── _06/
│           │               │   └── calcite/
│           │               │       └── CalciteTest.java
│           │               └── _07/
│           │                   └── query/
│           │                       └── _06_joins/
│           │                           └── JaninoCompileTest.java
│           ├── proto/
│           │   └── person.proto
│           └── scala/
│               ├── ScalaEnv.scala
│               └── TableFunc0.scala
├── flink-examples-1.14/
│   ├── pom.xml
│   └── src/
│       └── main/
│           └── java/
│               └── flink/
│                   └── examples/
│                       └── sql/
│                           └── _08/
│                               └── batch/
│                                   ├── HiveModuleV2.java
│                                   └── Test.java
├── flink-examples-1.8/
│   ├── .gitignore
│   └── pom.xml
└── pom.xml
SYMBOL INDEX (1459 symbols across 348 files)

FILE: flink-examples-1.10/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_01_outer_join/WindowJoinFunction$46.java
  class WindowJoinFunction$46 (line 4) | public class WindowJoinFunction$46
    method WindowJoinFunction$46 (line 9) | public WindowJoinFunction$46(Object[] references) throws Exception {
    method open (line 14) | @Override
    method join (line 19) | @Override
    method close (line 73) | @Override

FILE: flink-examples-1.10/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_01_outer_join/_06_Interval_Outer_Joins_EventTime_Test.java
  class _06_Interval_Outer_Joins_EventTime_Test (line 26) | public class _06_Interval_Outer_Joins_EventTime_Test {
    method main (line 28) | public static void main(String[] args) throws Exception {
    class UserDefinedSource1 (line 100) | private static class UserDefinedSource1 implements SourceFunction<Row>...
      method run (line 104) | @Override
      method cancel (line 129) | @Override
      method getProducedType (line 134) | @Override
    class UserDefinedSource2 (line 140) | private static class UserDefinedSource2 implements SourceFunction<Row>...
      method run (line 144) | @Override
      method cancel (line 169) | @Override
      method getProducedType (line 174) | @Override

FILE: flink-examples-1.12/src/main/java/flink/examples/datastream/_07/query/_04_window/_04_TumbleWindowTest.java
  class _04_TumbleWindowTest (line 13) | public class _04_TumbleWindowTest {
    method main (line 15) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 44) | private static class UserDefinedSource implements SourceFunction<Tuple...
      method run (line 48) | @Override
      method cancel (line 60) | @Override

FILE: flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest.java
  class _04_TumbleWindowTest (line 15) | public class _04_TumbleWindowTest {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest_GroupingWindowAggsHandler$59.java
  class _04_TumbleWindowTest_GroupingWindowAggsHandler$59 (line 4) | public final class _04_TumbleWindowTest_GroupingWindowAggsHandler$59 imp...
    method _04_TumbleWindowTest_GroupingWindowAggsHandler$59 (line 30) | public _04_TumbleWindowTest_GroupingWindowAggsHandler$59(Object[] refe...
    method getRuntimeContext (line 35) | private org.apache.flink.api.common.functions.RuntimeContext getRuntim...
    method open (line 39) | @Override
    method accumulate (line 51) | @Override
    method retract (line 302) | @Override
    method merge (line 310) | @Override
    method setAccumulators (line 319) | @Override
    method getAccumulators (line 404) | @Override
    method createAccumulators (line 460) | @Override
    method getValue (line 517) | @Override
    method cleanup (line 593) | @Override
    method close (line 603) | @Override

FILE: flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest_KeyProjection$69.java
  class _04_TumbleWindowTest_KeyProjection$69 (line 4) | public final class _04_TumbleWindowTest_KeyProjection$69 implements
    method _04_TumbleWindowTest_KeyProjection$69 (line 12) | public _04_TumbleWindowTest_KeyProjection$69(Object[] references) thro...
    method apply (line 16) | @Override

FILE: flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest_WatermarkGenerator$6.java
  class _04_TumbleWindowTest_WatermarkGenerator$6 (line 4) | public final class _04_TumbleWindowTest_WatermarkGenerator$6
    method _04_TumbleWindowTest_WatermarkGenerator$6 (line 8) | public _04_TumbleWindowTest_WatermarkGenerator$6(Object[] references) ...
    method open (line 12) | @Override
    method currentWatermark (line 17) | @Override
    method close (line 47) | @Override

FILE: flink-examples-1.13/src/main/java/flink/core/source/JaninoUtils.java
  class JaninoUtils (line 8) | @Slf4j
    method genClass (line 17) | public static <T> Class<T> genClass(String className, String code, Cla...

FILE: flink-examples-1.13/src/main/java/flink/core/source/SourceFactory.java
  class SourceFactory (line 13) | public class SourceFactory {
    method getProtobufSer (line 15) | public static <Message extends GeneratedMessageV3> SerializationSchema...
    method getProtobufDerse (line 24) | @SneakyThrows
    method main (line 51) | public static void main(String[] args) throws IOException {

FILE: flink-examples-1.13/src/main/java/flink/examples/FlinkEnvUtils.java
  class FlinkEnvUtils (line 29) | public class FlinkEnvUtils {
    method setRocksDBStateBackend (line 39) | public static void setRocksDBStateBackend(StreamExecutionEnvironment e...
    method setFsStateBackend (line 55) | public static void setFsStateBackend(StreamExecutionEnvironment env) t...
    method setMemoryStateBackend (line 67) | public static void setMemoryStateBackend(StreamExecutionEnvironment en...
    method setCheckpointConfig (line 77) | public static void setCheckpointConfig(StreamExecutionEnvironment env)...
    method getStreamTableEnv (line 92) | public static FlinkEnv getStreamTableEnv(String[] args) throws IOExcep...
    method initHiveEnv (line 149) | private static void initHiveEnv(FlinkEnv flinkEnv, ParameterTool param...
    method getBatchTableEnv (line 220) | public static FlinkEnv getBatchTableEnv(String[] args) throws IOExcept...
    class FlinkEnv (line 259) | @Builder
      method streamTEnv (line 267) | public StreamTableEnvironment streamTEnv() {
      method batchTEnv (line 271) | public TableEnvironment batchTEnv() {
      method env (line 275) | public StreamExecutionEnvironment env() {
      method hiveModuleV2 (line 279) | public HiveModuleV2 hiveModuleV2() {

FILE: flink-examples-1.13/src/main/java/flink/examples/JacksonUtils.java
  class JacksonUtils (line 16) | public class JacksonUtils {
    method bean2Json (line 27) | public static String bean2Json(Object data) {
    method json2Bean (line 37) | public static <T> T json2Bean(String jsonData, Class<T> beanType) {
    method json2List (line 48) | public static <T> List<T> json2List(String jsonData, Class<T> beanType) {
    method json2Map (line 61) | public static <K, V> Map<K, V> json2Map(String jsonData, Class<K> keyT...

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/codegen/JaninoUtils.java
  class JaninoUtils (line 9) | @Slf4j
    method genCodeAndGetClazz (line 18) | public static Class<Evaluable> genCodeAndGetClazz(Long id, String topi...
    method main (line 40) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/codegen/benchmark/Benchmark.java
  class Benchmark (line 12) | @Slf4j
    method benchmarkForJava (line 15) | private static void benchmarkForJava() {
    method benchmarkForGroovyClassLoader (line 29) | public static void benchmarkForGroovyClassLoader() {
    method benchmarkForJanino (line 64) | public static void benchmarkForJanino() {
    method main (line 89) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/job/SplitExampleJob.java
  class SplitExampleJob (line 29) | public class SplitExampleJob {
    method main (line 31) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 95) | private static class UserDefinedSource implements SourceFunction<Clien...
      method run (line 99) | @Override
      method cancel (line 118) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/KafkaProducerCenter.java
  class KafkaProducerCenter (line 17) | public class KafkaProducerCenter {
    method KafkaProducerCenter (line 22) | private KafkaProducerCenter() {
    class Factory (line 29) | private static class Factory {
    method getInstance (line 33) | public static KafkaProducerCenter getInstance() {
    method getProducer (line 37) | private Producer<String, String> getProducer(String topicName) {
    method send (line 59) | public void send(String topicName, String message) {
    method close (line 70) | public void close() {
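
The `KafkaProducerCenter` outline above (private constructor, a private static `Factory` class, and a static `getInstance()`) suggests the initialization-on-demand-holder singleton idiom. A minimal stand-alone sketch of that shape, with the Kafka-producer payload replaced by nothing (the real class manages `Producer<String, String>` instances per topic):

```java
// Sketch of the holder-based singleton shape suggested by
// KafkaProducerCenter's private constructor + Factory + getInstance().
// Payload omitted; the real class lazily creates Kafka producers.
public class HolderSingletonSketch {

    private HolderSingletonSketch() {
        // private: instances only come from the holder below
    }

    // JVM class-initialization rules make this both lazy and thread-safe
    // without explicit synchronization
    private static class Factory {
        private static final HolderSingletonSketch INSTANCE = new HolderSingletonSketch();
    }

    public static HolderSingletonSketch getInstance() {
        return Factory.INSTANCE;
    }

    public static void main(String[] args) {
        // Both calls return the same instance
        System.out.println(getInstance() == getInstance());  // true
    }
}
```

The holder class is not loaded until `getInstance()` first touches it, which is why this idiom needs no `volatile` or double-checked locking.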

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/demo/Application.java
  class Application (line 4) | public class Application {
    method main (line 10) | public static void main(String[] args) throws InterruptedException {
    method getTopicName (line 23) | public String getTopicName() {
    method getConsumerGrp (line 27) | public String getConsumerGrp() {
    method getBrokerUrl (line 31) | public String getBrokerUrl() {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/demo/ConsumerThread.java
  class ConsumerThread (line 13) | public class ConsumerThread implements Runnable {
    method ConsumerThread (line 17) | public ConsumerThread(Application application) {
    method run (line 30) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/demo/ProducerThread.java
  class ProducerThread (line 12) | public class ProducerThread implements Runnable {
    method ProducerThread (line 17) | public ProducerThread(Application application) {
    method run (line 27) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/ClientLogSink.java
  class ClientLogSink (line 7) | @Data

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/ClientLogSource.java
  class ClientLogSource (line 7) | @Data

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/DynamicProducerRule.java
  class DynamicProducerRule (line 9) | @Data
    method init (line 19) | public void init(Long id) {
    method eval (line 28) | @Override
    method main (line 33) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/Evaluable.java
  type Evaluable (line 4) | public interface Evaluable {
    method eval (line 6) | boolean eval(ClientLogSource clientLogSource);
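
The `Evaluable` interface above is the contract that the runtime-generated split rules implement (see `JaninoUtils.genCodeAndGetClazz` and `DynamicProducerRule` in this package). A minimal sketch of the contract, assuming a simplified `ClientLogSource` with only an `id` field; the real `@Data` model's fields are not visible in this index:

```java
// Sketch of the Evaluable contract listed above. The ClientLogSource
// stand-in is hypothetical: the real model's fields are not shown here.
public class EvaluableSketch {

    // Simplified stand-in for the repo's ClientLogSource @Data model
    static class ClientLogSource {
        final long id;
        ClientLogSource(long id) { this.id = id; }
    }

    // Same single-method shape as the repo's Evaluable interface
    interface Evaluable {
        boolean eval(ClientLogSource clientLogSource);
    }

    // In the repo such rules are code-generated at runtime with Janino;
    // a plain lambda demonstrates the same contract statically.
    static final Evaluable EVEN_ID_RULE = log -> log.id % 2 == 0;

    public static void main(String[] args) {
        System.out.println(EVEN_ID_RULE.eval(new ClientLogSource(4L)));  // true
        System.out.println(EVEN_ID_RULE.eval(new ClientLogSource(5L)));  // false
    }
}
```

Because `Evaluable` has a single abstract method, generated classes and lambdas are interchangeable at the call site; only the codegen path differs.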

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/zkconfigcenter/ZkBasedConfigCenter.java
  class ZkBasedConfigCenter (line 27) | public class ZkBasedConfigCenter {
    class Factory (line 33) | private static class Factory {
    method getInstance (line 37) | public static ZkBasedConfigCenter getInstance() {
    method ZkBasedConfigCenter (line 41) | private ZkBasedConfigCenter() {
    method getMap (line 52) | public ConcurrentMap<Long, DynamicProducerRule> getMap() {
    method setData (line 57) | private void setData() throws Exception {
    method open (line 74) | private void open() throws Exception {
    method close (line 112) | public void close() {
    method update (line 117) | private void update(String json) {
    method getNewMap (line 143) | private Map<Long, DynamicProducerRule> getNewMap(String json) {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_02/DataStreamTest.java
  class DataStreamTest (line 33) | public class DataStreamTest {
    method main (line 35) | public static void main(String[] args) throws Exception {
    class SourceModel (line 154) | @Data
    class MidModel (line 162) | @Data
    class SinkModel (line 170) | @Data
    class UserDefinedSource (line 178) | private static class UserDefinedSource implements SourceFunction<Sourc...
      method run (line 182) | @Override
      method cancel (line 200) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/enums_state/EnumsStateTest.java
  class EnumsStateTest (line 11) | public class EnumsStateTest {
    method main (line 14) | public static void main(String[] args) throws Exception {
    type StateTestEnums (line 33) | enum StateTestEnums {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/enums_state/SenerioTest.java
  class SenerioTest (line 31) | @Slf4j
    method main (line 34) | public static void main(String[] args) throws Exception {
    class SourceModel (line 148) | @Data
    class SinkModel (line 159) | @Data
    type DimNameEnum (line 167) | enum DimNameEnum {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/StateExamplesTest.java
  class StateExamplesTest (line 37) | public class StateExamplesTest {
    method main (line 40) | public static void main(String[] args) throws Exception {
    class Rule (line 197) | @Builder
    class Item (line 205) | @Builder
    type Shape (line 215) | private enum Shape {
    type Color (line 221) | private enum Color {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_01_broadcast_state/BroadcastStateTest.java
  class BroadcastStateTest (line 29) | public class BroadcastStateTest {
    method main (line 32) | public static void main(String[] args) throws Exception {
    class Rule (line 179) | @Builder
    class Item (line 187) | @Builder
    type Shape (line 197) | private enum Shape {
    type Color (line 203) | private enum Color {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/GettingStartDemo.java
  class GettingStartDemo (line 7) | public class GettingStartDemo {
    method main (line 20) | public static void main(String[] args) throws RocksDBException {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/Rocksdb_OperatorAndKeyedState_StateStorageDIr_Test.java
  class Rocksdb_OperatorAndKeyedState_StateStorageDIr_Test (line 32) | public class Rocksdb_OperatorAndKeyedState_StateStorageDIr_Test {
    method main (line 35) | public static void main(String[] args) throws Exception {
    class Rule (line 127) | @Builder
    class Item (line 135) | @Builder
    type Shape (line 145) | private enum Shape {
    type Color (line 151) | private enum Color {
    class UserDefinedSource (line 158) | private static class UserDefinedSource extends RichParallelSourceFunct...
      method run (line 168) | @Override
      method cancel (line 195) | @Override
      method snapshotState (line 200) | @Override
      method initializeState (line 205) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/keyed_state/RocksBackendKeyedMapStateTest.java
  class RocksBackendKeyedMapStateTest (line 28) | public class RocksBackendKeyedMapStateTest {
    method main (line 31) | public static void main(String[] args) throws Exception {
    class Rule (line 166) | @Builder
    class Item (line 174) | @Builder
    type Shape (line 184) | private enum Shape {
    type Color (line 190) | private enum Color {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/keyed_state/RocksBackendKeyedValueStateTest.java
  class RocksBackendKeyedValueStateTest (line 26) | public class RocksBackendKeyedValueStateTest {
    method main (line 29) | public static void main(String[] args) throws Exception {
    class Rule (line 115) | @Builder
    class Item (line 123) | @Builder
    type Shape (line 133) | private enum Shape {
    type Color (line 139) | private enum Color {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/operator_state/KeyedStreamOperatorListStateTest.java
  class KeyedStreamOperatorListStateTest (line 26) | public class KeyedStreamOperatorListStateTest {
    method main (line 29) | public static void main(String[] args) throws Exception {
    class Rule (line 50) | @Builder
    class Item (line 58) | @Builder
    type Shape (line 68) | private enum Shape {
    type Color (line 74) | private enum Color {
    class UserDefinedSource (line 81) | private static class UserDefinedSource extends RichParallelSourceFunct...
      method run (line 91) | @Override
      method cancel (line 121) | @Override
      method snapshotState (line 126) | @Override
      method initializeState (line 131) | @Override
    class UserDefinedKeyPF (line 137) | private static class UserDefinedKeyPF extends KeyedProcessFunction<Int...
      method processElement (line 144) | @Override
      method snapshotState (line 149) | @Override
      method initializeState (line 154) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/operator_state/RocksBackendOperatorListStateTest.java
  class RocksBackendOperatorListStateTest (line 24) | public class RocksBackendOperatorListStateTest {
    method main (line 27) | public static void main(String[] args) throws Exception {
    class Rule (line 54) | @Builder
    class Item (line 62) | @Builder
    type Shape (line 72) | private enum Shape {
    type Color (line 78) | private enum Color {
    class UserDefinedSource (line 85) | private static class UserDefinedSource extends RichParallelSourceFunct...
      method run (line 95) | @Override
      method cancel (line 125) | @Override
      method snapshotState (line 130) | @Override
      method initializeState (line 135) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_04_filesystem/keyed_state/FsStateBackendKeyedMapStateTest.java
  class FsStateBackendKeyedMapStateTest (line 25) | public class FsStateBackendKeyedMapStateTest {
    method main (line 28) | public static void main(String[] args) throws Exception {
    class Rule (line 119) | @Builder
    class Item (line 127) | @Builder
    type Shape (line 137) | private enum Shape {
    type Color (line 143) | private enum Color {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_04_filesystem/operator_state/FsStateBackendOperatorListStateTest.java
  class FsStateBackendOperatorListStateTest (line 24) | public class FsStateBackendOperatorListStateTest {
    method main (line 27) | public static void main(String[] args) throws Exception {
    class Rule (line 54) | @Builder
    class Item (line 62) | @Builder
    type Shape (line 72) | private enum Shape {
    type Color (line 77) | private enum Color {
    class UserDefinedSource (line 84) | private static class UserDefinedSource extends RichParallelSourceFunct...
      method run (line 94) | @Override
      method cancel (line 124) | @Override
      method snapshotState (line 129) | @Override
      method initializeState (line 134) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_05_memory/keyed_state/MemoryStateBackendKeyedMapStateTest.java
  class MemoryStateBackendKeyedMapStateTest (line 27) | public class MemoryStateBackendKeyedMapStateTest {
    method main (line 30) | public static void main(String[] args) throws Exception {
    class Rule (line 121) | @Builder
    class Item (line 129) | @Builder
    type Shape (line 139) | private enum Shape {
    type Color (line 145) | private enum Color {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_04/keyed_co_process/HashMapTest.java
  class HashMapTest (line 7) | public class HashMapTest {
    method main (line 9) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_04/keyed_co_process/_04_KeyedCoProcessFunctionTest.java
  class _04_KeyedCoProcessFunctionTest (line 32) | public class _04_KeyedCoProcessFunctionTest {
    method main (line 34) | public static void main(String[] args) throws Exception {
    class UserDefineSource1 (line 200) | private static class UserDefineSource1 extends RichSourceFunction<Sour...
      method run (line 204) | @Override
      method cancel (line 224) | @Override
    class UserDefineSource2 (line 230) | private static class UserDefineSource2 extends RichSourceFunction<Sour...
      method run (line 234) | @Override
      method cancel (line 252) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_05_ken/_01_watermark/WatermarkTest.java
  class WatermarkTest (line 23) | public class WatermarkTest {
    method main (line 25) | public static void main(String[] args) throws Exception {
    class SourceModel (line 103) | @Data
    class SinkModel (line 111) | @Data

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_06_test/_01_event_proctime/OneJobWIthProcAndEventTimeWIndowTest.java
  class OneJobWIthProcAndEventTimeWIndowTest (line 23) | public class OneJobWIthProcAndEventTimeWIndowTest {
    method main (line 25) | public static void main(String[] args) throws Exception {
    class SourceModel (line 114) | @Data
    class MiddleModel (line 122) | @Data
    class SinkModel (line 129) | @Data

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_06_test/_01_event_proctime/OneJobWIthTimerTest.java
  class OneJobWIthTimerTest (line 16) | public class OneJobWIthTimerTest {
    method main (line 18) | public static void main(String[] args) throws Exception {
    class SourceModel (line 91) | @Data
    class MiddleModel (line 99) | @Data
    class SinkModel (line 106) | @Data

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_07_lambda_error/LambdaErrorTest.java
  class LambdaErrorTest (line 11) | public class LambdaErrorTest {
    method main (line 13) | public static void main(String[] args) throws Exception {
    class SourceModel (line 52) | @Data
    class MiddleModel (line 60) | @Data
    class SinkModel (line 67) | @Data

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_08_late_record/LatenessTest.java
  class LatenessTest (line 18) | public class LatenessTest {
    method main (line 20) | public static void main(String[] args) throws Exception {
    class SourceModel (line 101) | @Data
    class MiddleModel (line 109) | @Data
    class SinkModel (line 116) | @Data

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_09_join/_02_connect/_01_Connect_Test.java
  class _01_Connect_Test (line 15) | public class _01_Connect_Test {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/datastream/_10_agg/AggTest.java
  class AggTest (line 15) | public class AggTest {
    method main (line 17) | public static void main(String[] args) throws Exception {
    class SourceModel (line 88) | @Data
    class MiddleModel (line 96) | @Data
    class SinkModel (line 103) | @Data

FILE: flink-examples-1.13/src/main/java/flink/examples/practice/_01/dau/_01_DataStream_Session_Window.java
  class _01_DataStream_Session_Window (line 17) | public class _01_DataStream_Session_Window {
    method main (line 19) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/question/datastream/_01/kryo_protobuf_no_more_bytes_left/KryoProtobufNoMoreBytesLeftTest.java
  class KryoProtobufNoMoreBytesLeftTest (line 14) | public class KryoProtobufNoMoreBytesLeftTest {
    method main (line 16) | public static void main(String[] args) throws Exception {
    method testGetParse (line 49) | private static void testGetParse() throws Exception {
    class ProtobufSerializerV2 (line 61) | private static class ProtobufSerializerV2 extends ProtobufSerializer {
      method getParse (line 62) | @Override
    method newKryo (line 68) | private static Kryo newKryo() {

FILE: flink-examples-1.13/src/main/java/flink/examples/question/sql/_01/lots_source_fields_poor_performance/_01_DataGenSourceTest.java
  class _01_DataGenSourceTest (line 16) | public class _01_DataGenSourceTest {
    method main (line 18) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/question/sql/_01/lots_source_fields_poor_performance/_01_JsonSourceTest.java
  class _01_JsonSourceTest (line 23) | public class _01_JsonSourceTest {
    method main (line 25) | public static void main(String[] args) throws Exception {
    class UserDefineSource1 (line 144) | public static class UserDefineSource1 extends RichSourceFunction<RowDa...
      method UserDefineSource1 (line 150) | public UserDefineSource1(DeserializationSchema<RowData> dser) {
      method run (line 154) | @Override
      method cancel (line 167) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/CompletableFutureTest.java
  class CompletableFutureTest (line 6) | public class CompletableFutureTest {
    method main (line 8) | public static void main(String[] args) throws Exception {
    method fetchPrice (line 24) | static Double fetchPrice() {

FILE: flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/CompletableFutureTest4.java
  class CompletableFutureTest4 (line 6) | public class CompletableFutureTest4 {
    method main (line 8) | public static void main(String[] args) throws Exception {
    method queryCode (line 25) | static String queryCode(String name) {
    method fetchPrice (line 33) | static String fetchPrice(String code) {

FILE: flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/CompletableFuture_AnyOf_Test3.java
  class CompletableFuture_AnyOf_Test3 (line 6) | public class CompletableFuture_AnyOf_Test3 {
    method main (line 8) | public static void main(String[] args) throws Exception {
    method queryCode (line 39) | static String queryCode(String name, String url) {
    method fetchPrice (line 48) | static Double fetchPrice(String code, String url) {

FILE: flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/CompletableFuture_ThenApplyAsync_Test2.java
  class CompletableFuture_ThenApplyAsync_Test2 (line 6) | public class CompletableFuture_ThenApplyAsync_Test2 {
    method main (line 8) | public static void main(String[] args) throws Exception {
    method queryCode (line 25) | static String queryCode(String name) {
    method fetchPrice (line 33) | static String fetchPrice(String code) {

FILE: flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/CompletableFuture_ThenComposeAsync_Test2.java
  class CompletableFuture_ThenComposeAsync_Test2 (line 6) | public class CompletableFuture_ThenComposeAsync_Test2 {
    method main (line 8) | public static void main(String[] args) throws Exception {
    method queryCode (line 25) | static String queryCode(String name) {
    method fetchPrice (line 33) | static String fetchPrice(String code) {

FILE: flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/FutureTest.java
  class FutureTest (line 11) | public class FutureTest {
    method main (line 13) | public static void main(String[] args) throws ExecutionException, Inte...
    class Task (line 29) | private static class Task implements Callable<String> {
      method call (line 30) | public String call() throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/runtime/_04/statebackend/CancelAndRestoreWithCheckpointTest.java
  class CancelAndRestoreWithCheckpointTest (line 19) | public class CancelAndRestoreWithCheckpointTest {
    method main (line 24) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/CountDistinctErrorTest.java
  class CountDistinctErrorTest (line 17) | public class CountDistinctErrorTest {
    method main (line 19) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/CountDistinctErrorTest2.java
  class CountDistinctErrorTest2 (line 17) | public class CountDistinctErrorTest2 {
    method main (line 19) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/CountDistinctErrorTest3.java
  class CountDistinctErrorTest3 (line 17) | public class CountDistinctErrorTest3 {
    method main (line 19) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/udf/Mod_UDF.java
  class Mod_UDF (line 6) | public class Mod_UDF extends ScalarFunction {
    method eval (line 8) | public int eval(long id, int remainder) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/udf/StatusMapper1_UDF.java
  class StatusMapper1_UDF (line 6) | public class StatusMapper1_UDF extends ScalarFunction {
    method eval (line 10) | public String eval(String status) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/udf/StatusMapper_UDF.java
  class StatusMapper_UDF (line 6) | public class StatusMapper_UDF extends TableFunction<String> {
    method eval (line 10) | public void eval(String status) throws InterruptedException {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_02/timezone/TimeZoneTest.java
  class TimeZoneTest (line 18) | public class TimeZoneTest {
    method main (line 20) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_02/timezone/TimeZoneTest2.java
  class TimeZoneTest2 (line 19) | @Slf4j
    method main (line 22) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_02/timezone/TimeZoneTest3.java
  class TimeZoneTest3 (line 7) | public class TimeZoneTest3 {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/CreateViewTest.java
  class CreateViewTest (line 8) | public class CreateViewTest {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/DataStreamSourceEventTimeTest.java
  class DataStreamSourceEventTimeTest (line 17) | public class DataStreamSourceEventTimeTest {
    method main (line 19) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 60) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 64) | @Override
      method cancel (line 79) | @Override
      method getProducedType (line 84) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/DataStreamSourceProcessingTimeTest.java
  class DataStreamSourceProcessingTimeTest (line 15) | public class DataStreamSourceProcessingTimeTest {
    method main (line 17) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 53) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 57) | @Override
      method cancel (line 72) | @Override
      method getProducedType (line 77) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/KafkaSourceTest.java
  class KafkaSourceTest (line 11) | public class KafkaSourceTest {
    method main (line 13) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/RedisLookupTest.java
  class RedisLookupTest (line 19) | public class RedisLookupTest {
    method main (line 21) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 76) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 80) | @Override
      method cancel (line 93) | @Override
      method getProducedType (line 98) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/RedisSinkTest.java
  class RedisSinkTest (line 18) | public class RedisSinkTest {
    method main (line 20) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 66) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 70) | @Override
      method cancel (line 82) | @Override
      method getProducedType (line 87) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/SocketSourceTest.java
  class SocketSourceTest (line 11) | public class SocketSourceTest {
    method main (line 13) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/TableApiKafkaSourceTest.java
  class TableApiKafkaSourceTest (line 16) | public class TableApiKafkaSourceTest {
    method main (line 18) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 54) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 58) | @Override
      method cancel (line 73) | @Override
      method getProducedType (line 78) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/UpsertKafkaSinkProtobufFormatSupportTest.java
  class UpsertKafkaSinkProtobufFormatSupportTest (line 21) | public class UpsertKafkaSinkProtobufFormatSupportTest {
    method main (line 23) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/UpsertKafkaSinkTest.java
  class UpsertKafkaSinkTest (line 21) | public class UpsertKafkaSinkTest {
    method main (line 23) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/UserDefinedSourceTest.java
  class UserDefinedSourceTest (line 8) | public class UserDefinedSourceTest {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/sink/Abilities_SinkFunction.java
  class Abilities_SinkFunction (line 11) | public class Abilities_SinkFunction extends RichSinkFunction<RowData> {
    method Abilities_SinkFunction (line 18) | public Abilities_SinkFunction(
    method open (line 24) | @Override
    method invoke (line 31) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/sink/Abilities_TableSink.java
  class Abilities_TableSink (line 23) | @Slf4j
    method Abilities_TableSink (line 37) | public Abilities_TableSink(
    method getChangelogMode (line 45) | @Override
    method getSinkRuntimeProvider (line 50) | @Override
    method copy (line 57) | @Override
    method asSummaryString (line 62) | @Override
    method applyOverwrite (line 67) | @Override
    method applyStaticPartition (line 72) | @Override
    method listWritableMetadata (line 77) | @Override
    method applyWritableMetadata (line 84) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/sink/Abilities_TableSinkFactory.java
  class Abilities_TableSinkFactory (line 14) | public class Abilities_TableSinkFactory implements DynamicTableSinkFacto...
    method factoryIdentifier (line 32) | @Override
    method requiredOptions (line 37) | @Override
    method optionalOptions (line 42) | @Override
    method createDynamicTableSink (line 51) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/sink/_01_SupportsWritingMetadata_Test.java
  class _01_SupportsWritingMetadata_Test (line 8) | public class _01_SupportsWritingMetadata_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/Abilities_SourceFunction.java
  class Abilities_SourceFunction (line 12) | public class Abilities_SourceFunction extends RichSourceFunction<RowData> {
    method Abilities_SourceFunction (line 22) | public Abilities_SourceFunction(DeserializationSchema<RowData> dser) {
    method Abilities_SourceFunction (line 26) | public Abilities_SourceFunction(DeserializationSchema<RowData> dser, l...
    method Abilities_SourceFunction (line 31) | public Abilities_SourceFunction(DeserializationSchema<RowData> dser, b...
    method run (line 36) | @Override
    method cancel (line 62) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/Abilities_TableSource.java
  class Abilities_TableSource (line 37) | @Slf4j
    method Abilities_TableSource (line 59) | public Abilities_TableSource(
    method getChangelogMode (line 76) | @Override
    method getScanRuntimeProvider (line 83) | @SneakyThrows
    method copy (line 108) | @Override
    method asSummaryString (line 113) | @Override
    method applyFilters (line 118) | @Override
    method applyLimit (line 129) | @Override
    method listPartitions (line 134) | @Override
    method applyPartitions (line 139) | @Override
    method supportsNestedProjection (line 144) | @Override
    method applyProjection (line 149) | @Override
    method listReadableMetadata (line 154) | @Override
    method applyReadableMetadata (line 161) | @Override
    method applyWatermark (line 167) | @Override
    method applySourceWatermark (line 174) | @Override
    method projectSchemaWithMetadata (line 181) | public static TableSchema projectSchemaWithMetadata(TableSchema tableS...
    method getSchemaWithMetadata (line 206) | public static TableSchema getSchemaWithMetadata(TableSchema tableSchem...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/Abilities_TableSourceFactory.java
  class Abilities_TableSourceFactory (line 22) | public class Abilities_TableSourceFactory implements DynamicTableSourceF...
    method factoryIdentifier (line 29) | @Override
    method requiredOptions (line 34) | @Override
    method optionalOptions (line 42) | @Override
    method createDynamicTableSource (line 48) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_01_SupportsFilterPushDown_Test.java
  class _01_SupportsFilterPushDown_Test (line 8) | public class _01_SupportsFilterPushDown_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_02_SupportsLimitPushDown_Test.java
  class _02_SupportsLimitPushDown_Test (line 8) | public class _02_SupportsLimitPushDown_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_03_SupportsPartitionPushDown_Test.java
  class _03_SupportsPartitionPushDown_Test (line 8) | public class _03_SupportsPartitionPushDown_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_04_SupportsProjectionPushDown_JDBC_Test.java
  class _04_SupportsProjectionPushDown_JDBC_Test (line 8) | public class _04_SupportsProjectionPushDown_JDBC_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_04_SupportsProjectionPushDown_Test.java
  class _04_SupportsProjectionPushDown_Test (line 8) | public class _04_SupportsProjectionPushDown_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_05_SupportsReadingMetadata_Test.java
  class _05_SupportsReadingMetadata_Test (line 8) | public class _05_SupportsReadingMetadata_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_06_SupportsWatermarkPushDown_Test.java
  class _06_SupportsWatermarkPushDown_Test (line 8) | public class _06_SupportsWatermarkPushDown_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_07_SupportsSourceWatermark_Test.java
  class _07_SupportsSourceWatermark_Test (line 8) | public class _07_SupportsSourceWatermark_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/Before_Abilities_SourceFunction.java
  class Before_Abilities_SourceFunction (line 12) | public class Before_Abilities_SourceFunction extends RichSourceFunction<...
    method Before_Abilities_SourceFunction (line 22) | public Before_Abilities_SourceFunction(DeserializationSchema<RowData> ...
    method Before_Abilities_SourceFunction (line 26) | public Before_Abilities_SourceFunction(DeserializationSchema<RowData> ...
    method Before_Abilities_SourceFunction (line 31) | public Before_Abilities_SourceFunction(DeserializationSchema<RowData> ...
    method run (line 36) | @Override
    method cancel (line 62) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/Before_Abilities_TableSource.java
  class Before_Abilities_TableSource (line 19) | @Slf4j
    method Before_Abilities_TableSource (line 32) | public Before_Abilities_TableSource(
    method getChangelogMode (line 47) | @Override
    method getScanRuntimeProvider (line 54) | @SneakyThrows
    method copy (line 77) | @Override
    method asSummaryString (line 82) | @Override
    method getSchemaWithMetadata (line 87) | public static TableSchema getSchemaWithMetadata(TableSchema tableSchem...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/Before_Abilities_TableSourceFactory.java
  class Before_Abilities_TableSourceFactory (line 22) | public class Before_Abilities_TableSourceFactory implements DynamicTable...
    method factoryIdentifier (line 29) | @Override
    method requiredOptions (line 34) | @Override
    method optionalOptions (line 42) | @Override
    method createDynamicTableSource (line 48) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_01_Before_SupportsFilterPushDown_Test.java
  class _01_Before_SupportsFilterPushDown_Test (line 8) | public class _01_Before_SupportsFilterPushDown_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_02_Before_SupportsLimitPushDown_Test.java
  class _02_Before_SupportsLimitPushDown_Test (line 8) | public class _02_Before_SupportsLimitPushDown_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_03_Before_SupportsPartitionPushDown_Test.java
  class _03_Before_SupportsPartitionPushDown_Test (line 8) | public class _03_Before_SupportsPartitionPushDown_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_04_Before_SupportsProjectionPushDown_Test.java
  class _04_Before_SupportsProjectionPushDown_Test (line 8) | public class _04_Before_SupportsProjectionPushDown_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_05_Before_SupportsReadingMetadata_Test.java
  class _05_Before_SupportsReadingMetadata_Test (line 8) | public class _05_Before_SupportsReadingMetadata_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_06_Before_SupportsWatermarkPushDown_Test.java
  class _06_Before_SupportsWatermarkPushDown_Test (line 8) | public class _06_Before_SupportsWatermarkPushDown_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_07_Before_SupportsSourceWatermark_Test.java
  class _07_Before_SupportsSourceWatermark_Test (line 8) | public class _07_Before_SupportsSourceWatermark_Test {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/ddl/TableApiDDLTest.java
  class TableApiDDLTest (line 19) | public class TableApiDDLTest {
    method main (line 23) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 76) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 80) | @Override
      method cancel (line 92) | @Override
      method getProducedType (line 97) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/container/RedisCommandsContainer.java
  type RedisCommandsContainer (line 26) | public interface RedisCommandsContainer extends Closeable, Serializable {
    method open (line 28) | void open() throws Exception;
    method get (line 30) | byte[] get(byte[] key);
    method multiGet (line 32) | List<Object> multiGet(List<byte[]> key);
    method hget (line 34) | byte[] hget(byte[] key, byte[] hashField);

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/container/RedisCommandsContainerBuilder.java
  class RedisCommandsContainerBuilder (line 12) | public class RedisCommandsContainerBuilder {
    method build (line 14) | public static RedisCommandsContainer build(FlinkJedisConfigBase flinkJ...
    method build (line 33) | public static RedisCommandsContainer build(FlinkJedisPoolConfig jedisP...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/container/RedisContainer.java
  class RedisContainer (line 16) | public class RedisContainer implements RedisCommandsContainer, Closeable {
    method RedisContainer (line 27) | public RedisContainer(JedisPool jedisPool) {
    method RedisContainer (line 33) | public RedisContainer(JedisSentinelPool sentinelPool) {
    method getInstance (line 39) | private Jedis getInstance() {
    method releaseInstance (line 47) | private void releaseInstance(final Jedis jedis) {
    method open (line 58) | @Override
    method multiGet (line 63) | @Override
    method get (line 82) | @Override
    method hget (line 99) | @Override
    method close (line 116) | @Override
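`RedisContainer` pairs a private `getInstance()` with a private `releaseInstance(Jedis)` around every command (`get`, `multiGet`, `hget`), which is the standard borrow-and-return discipline for pooled connections. A minimal stdlib-only sketch of that discipline, assuming a hypothetical `FakeConnection` in place of Jedis and a plain deque in place of `JedisPool`:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of RedisContainer's getInstance()/releaseInstance() pattern: each
// command borrows a connection from the pool and returns it in a finally
// block, so a failing command never leaks the connection. FakeConnection and
// the deque-backed pool are stand-ins for Jedis and JedisPool.
public class PooledContainerSketch {

    static class FakeConnection {
        String get(String key) { return "value-of-" + key; }
    }

    private final Deque<FakeConnection> pool = new ArrayDeque<>();

    PooledContainerSketch(int size) {
        for (int i = 0; i < size; i++) {
            pool.push(new FakeConnection());
        }
    }

    private FakeConnection getInstance() {
        FakeConnection c = pool.poll();
        if (c == null) {
            throw new IllegalStateException("pool exhausted");
        }
        return c;
    }

    private void releaseInstance(FakeConnection c) {
        pool.push(c); // return the connection even when the command failed
    }

    public String get(String key) {
        FakeConnection c = getInstance();
        try {
            return c.get(key);
        } finally {
            releaseInstance(c);
        }
    }

    public static void main(String[] args) {
        PooledContainerSketch container = new PooledContainerSketch(2);
        System.out.println(container.get("k1")); // value-of-k1
    }
}
```

In the real class the same shape repeats for `multiGet` and `hget`; only the command inside the `try` changes.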

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/demo/RedisDemo.java
  class RedisDemo (line 14) | public class RedisDemo {
    method main (line 16) | public static void main(String[] args) {
    method singleConnect (line 21) | public static void singleConnect() {
    method poolConnect (line 40) | public static void poolConnect() {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/mapper/LookupRedisMapper.java
  class LookupRedisMapper (line 14) | public class LookupRedisMapper extends AbstractDeserializationSchema<Row...
    method LookupRedisMapper (line 19) | public LookupRedisMapper(DeserializationSchema<RowData> valueDeseriali...
    method getCommandDescription (line 25) | public RedisCommandDescription getCommandDescription() {
    method deserialize (line 29) | @Override
    method serialize (line 38) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/mapper/RedisCommand.java
  type RedisCommand (line 6) | public enum RedisCommand {
    method RedisCommand (line 16) | RedisCommand(RedisDataType redisDataType) {
    method getRedisDataType (line 20) | public RedisDataType getRedisDataType() {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/mapper/RedisCommandDescription.java
  class RedisCommandDescription (line 6) | public class RedisCommandDescription {
    method RedisCommandDescription (line 14) | public RedisCommandDescription(RedisCommand redisCommand, String addit...
    method RedisCommandDescription (line 26) | public RedisCommandDescription(RedisCommand redisCommand) {
    method getRedisCommand (line 31) | public RedisCommand getRedisCommand() {
    method getAdditionalKey (line 35) | public String getAdditionalKey() {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/mapper/SetRedisMapper.java
  class SetRedisMapper (line 9) | public class SetRedisMapper implements RedisMapper<RowData> {
    method getCommandDescription (line 11) | @Override
    method getKeyFromData (line 16) | @Override
    method getValueFromData (line 21) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/options/RedisLookupOptions.java
  class RedisLookupOptions (line 6) | public class RedisLookupOptions implements Serializable {
    method getHostname (line 13) | public String getHostname() {
    method getPort (line 17) | public int getPort() {
    method RedisLookupOptions (line 32) | public RedisLookupOptions(
    method getCacheMaxSize (line 54) | public long getCacheMaxSize() {
    method getCacheExpireMs (line 58) | public long getCacheExpireMs() {
    method getMaxRetryTimes (line 62) | public int getMaxRetryTimes() {
    method getLookupAsync (line 66) | public boolean getLookupAsync() {
    method builder (line 70) | public static Builder builder() {
    method isBatchMode (line 74) | public boolean isBatchMode() {
    method getBatchSize (line 78) | public int getBatchSize() {
    method getBatchMinTriggerDelayMs (line 82) | public int getBatchMinTriggerDelayMs() {
    class Builder (line 87) | public static class Builder {
      method setIsBatchMode (line 96) | public Builder setIsBatchMode(boolean isBatchMode) {
      method setBatchSize (line 103) | public Builder setBatchSize(int batchSize) {
      method setBatchMinTriggerDelayMs (line 110) | public Builder setBatchMinTriggerDelayMs(int batchMinTriggerDelayMs) {
      method setCacheMaxSize (line 116) | public Builder setCacheMaxSize(long cacheMaxSize) {
      method setCacheExpireMs (line 122) | public Builder setCacheExpireMs(long cacheExpireMs) {
      method setMaxRetryTimes (line 128) | public Builder setMaxRetryTimes(int maxRetryTimes) {
      method setLookupAsync (line 134) | public Builder setLookupAsync(boolean lookupAsync) {
      method setHostname (line 146) | public Builder setHostname(String hostname) {
      method setPort (line 154) | public Builder setPort(int port) {
      method build (line 159) | public RedisLookupOptions build() {
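Both `RedisLookupOptions` and `RedisWriteOptions` follow the immutable-options-with-`Builder` idiom: a final-field options class whose only public constructor path is `builder()...build()`, with setters that return `this` for chaining. A stand-alone sketch with a hypothetical subset of the fields (the default values below are invented for illustration, not taken from the repo):

```java
// Minimal sketch of the immutable-options-plus-Builder idiom used by
// RedisLookupOptions/RedisWriteOptions. Field names mirror the listing but
// this is a reduced, hypothetical subset with invented defaults.
public class LookupOptionsSketch {
    private final String hostname;
    private final int port;
    private final long cacheMaxSize;
    private final long cacheExpireMs;

    private LookupOptionsSketch(String hostname, int port,
                                long cacheMaxSize, long cacheExpireMs) {
        this.hostname = hostname;
        this.port = port;
        this.cacheMaxSize = cacheMaxSize;
        this.cacheExpireMs = cacheExpireMs;
    }

    public String getHostname() { return hostname; }
    public int getPort() { return port; }
    public long getCacheMaxSize() { return cacheMaxSize; }
    public long getCacheExpireMs() { return cacheExpireMs; }

    public static Builder builder() { return new Builder(); }

    public static class Builder {
        private String hostname = "localhost";
        private int port = 6379;          // conventional Redis port
        private long cacheMaxSize = -1L;  // -1: lookup cache disabled
        private long cacheExpireMs = -1L;

        public Builder setHostname(String hostname) { this.hostname = hostname; return this; }
        public Builder setPort(int port) { this.port = port; return this; }
        public Builder setCacheMaxSize(long cacheMaxSize) { this.cacheMaxSize = cacheMaxSize; return this; }
        public Builder setCacheExpireMs(long cacheExpireMs) { this.cacheExpireMs = cacheExpireMs; return this; }

        public LookupOptionsSketch build() {
            return new LookupOptionsSketch(hostname, port, cacheMaxSize, cacheExpireMs);
        }
    }

    public static void main(String[] args) {
        LookupOptionsSketch opts = builder()
                .setHostname("redis-1")
                .setCacheMaxSize(10_000)
                .build();
        System.out.println(opts.getHostname() + ":" + opts.getPort()); // redis-1:6379
    }
}
```

The payoff is that the built options object is immutable and therefore safe to ship into Flink operators that are serialized and run on multiple task managers.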

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/options/RedisOptions.java
  class RedisOptions (line 20) | public class RedisOptions {
    method getRedisLookupOptions (line 155) | public static RedisLookupOptions getRedisLookupOptions(ReadableConfig ...
    method getRedisWriteOptions (line 167) | public static RedisWriteOptions getRedisWriteOptions(ReadableConfig ta...
    method createValueFormatProjection (line 184) | public static int[] createValueFormatProjection(

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/options/RedisWriteOptions.java
  class RedisWriteOptions (line 7) | public class RedisWriteOptions {
    method getHostname (line 12) | public String getHostname() {
    method getPort (line 16) | public int getPort() {
    method RedisWriteOptions (line 52) | public RedisWriteOptions(int writeTtl, String hostname, int port, Stri...
    method getWriteTtl (line 61) | public int getWriteTtl() {
    method builder (line 65) | public static Builder builder() {
    method getWriteMode (line 69) | public String getWriteMode() {
    method isBatchMode (line 73) | public boolean isBatchMode() {
    method getBatchSize (line 77) | public int getBatchSize() {
    class Builder (line 82) | public static class Builder {
      method setWriteTtl (line 86) | public Builder setWriteTtl(int writeTtl) {
      method setHostname (line 104) | public Builder setHostname(String hostname) {
      method setPort (line 112) | public Builder setPort(int port) {
      method setWriteMode (line 117) | public Builder setWriteMode(String writeMode) {
      method build (line 122) | public RedisWriteOptions build() {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v1/RedisDynamicTableFactory.java
  class RedisDynamicTableFactory (line 30) | public class RedisDynamicTableFactory implements DynamicTableSourceFacto...
    method createDynamicTableSink (line 32) | @Override
    method factoryIdentifier (line 62) | @Override
    method requiredOptions (line 67) | @Override
    method optionalOptions (line 76) | @Override
    method createDynamicTableSource (line 87) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v1/source/RedisDynamicTableSource.java
  class RedisDynamicTableSource (line 20) | public class RedisDynamicTableSource implements LookupTableSource {
    method RedisDynamicTableSource (line 34) | public RedisDynamicTableSource(
    method getLookupRuntimeProvider (line 48) | @Override
    method createDeserialization (line 55) | private @Nullable DeserializationSchema<RowData> createDeserialization(
    method copy (line 67) | @Override
    method asSummaryString (line 72) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v1/source/RedisRowDataLookupFunction.java
  class RedisRowDataLookupFunction (line 47) | @Internal
    method RedisRowDataLookupFunction (line 67) | public RedisRowDataLookupFunction(
    method eval (line 84) | public void eval(Object... objects) throws IOException {
    method open (line 106) | @Override
    method close (line 170) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v2/RedisDynamicTableFactory.java
  class RedisDynamicTableFactory (line 36) | public class RedisDynamicTableFactory implements DynamicTableSourceFacto...
    method factoryIdentifier (line 38) | @Override
    method requiredOptions (line 43) | @Override
    method optionalOptions (line 51) | @Override
    method createDynamicTableSource (line 64) | @Override
    method createDynamicTableSink (line 97) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v2/sink/RedisDynamicTableSink.java
  class RedisDynamicTableSink (line 28) | public class RedisDynamicTableSink implements DynamicTableSink {
    method RedisDynamicTableSink (line 37) | public RedisDynamicTableSink(
    method createSerialization (line 48) | private @Nullable
    method getChangelogMode (line 61) | @Override
    method getSinkRuntimeProvider (line 73) | @Override
    method copy (line 95) | @Override
    method asSummaryString (line 100) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v2/source/RedisDynamicTableSource.java
  class RedisDynamicTableSource (line 23) | public class RedisDynamicTableSource implements LookupTableSource {
    method RedisDynamicTableSource (line 39) | public RedisDynamicTableSource(
    method getLookupRuntimeProvider (line 56) | @Override
    method createDeserialization (line 84) | private @Nullable DeserializationSchema<RowData> createDeserialization(
    method copy (line 96) | @Override
    method asSummaryString (line 101) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v2/source/RedisRowDataBatchLookupFunction.java
  class RedisRowDataBatchLookupFunction (line 51) | @Internal
    method RedisRowDataBatchLookupFunction (line 83) | public RedisRowDataBatchLookupFunction(
    method eval (line 112) | public void eval(Object... objects) throws IOException {
    method open (line 134) | @Override
    method close (line 257) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v2/source/RedisRowDataLookupFunction.java
  class RedisRowDataLookupFunction (line 48) | @Internal
    method RedisRowDataLookupFunction (line 78) | public RedisRowDataLookupFunction(
    method eval (line 107) | public void eval(Object... objects) throws IOException {
    method open (line 129) | @Override
    method close (line 212) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/socket/SocketDynamicTableFactory.java
  class SocketDynamicTableFactory (line 19) | public class SocketDynamicTableFactory implements DynamicTableSourceFact...
    method factoryIdentifier (line 34) | @Override
    method requiredOptions (line 39) | @Override
    method optionalOptions (line 48) | @Override
    method createDynamicTableSource (line 55) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/socket/SocketDynamicTableSource.java
  class SocketDynamicTableSource (line 14) | public class SocketDynamicTableSource implements ScanTableSource {
    method SocketDynamicTableSource (line 22) | public SocketDynamicTableSource(
    method getChangelogMode (line 35) | @Override
    method getScanRuntimeProvider (line 42) | @Override
    method copy (line 60) | @Override
    method asSummaryString (line 65) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/socket/SocketSourceFunction.java
  class SocketSourceFunction (line 15) | public class SocketSourceFunction extends RichSourceFunction<RowData> im...
    method SocketSourceFunction (line 25) | public SocketSourceFunction(String hostname, int port, byte byteDelimi...
    method getProducedType (line 33) | @Override
    method open (line 38) | @Override
    method run (line 47) | @Override
    method cancel (line 66) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/user_defined/UserDefinedDynamicTableFactory.java
  class UserDefinedDynamicTableFactory (line 19) | public class UserDefinedDynamicTableFactory implements DynamicTableSourc...
    method factoryIdentifier (line 26) | @Override
    method requiredOptions (line 31) | @Override
    method optionalOptions (line 39) | @Override
    method createDynamicTableSource (line 45) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/user_defined/UserDefinedDynamicTableSource.java
  class UserDefinedDynamicTableSource (line 33) | public class UserDefinedDynamicTableSource implements ScanTableSource
    method UserDefinedDynamicTableSource (line 46) | public UserDefinedDynamicTableSource(
    method getChangelogMode (line 55) | @Override
    method getScanRuntimeProvider (line 62) | @SneakyThrows
    method copy (line 81) | @Override
    method asSummaryString (line 86) | @Override
    method applyFilters (line 91) | @Override
    method applyLimit (line 96) | @Override
    method listPartitions (line 101) | @Override
    method applyPartitions (line 106) | @Override
    method supportsNestedProjection (line 111) | @Override
    method applyProjection (line 116) | @Override
    method listReadableMetadata (line 121) | @Override
    method applyReadableMetadata (line 128) | @Override
    method applyWatermark (line 133) | @Override
    method applySourceWatermark (line 138) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/user_defined/UserDefinedSource.java
  class UserDefinedSource (line 11) | public class UserDefinedSource extends RichSourceFunction<RowData> {
    method UserDefinedSource (line 17) | public UserDefinedSource(DeserializationSchema<RowData> dser) {
    method run (line 21) | @Override
    method cancel (line 31) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_04/type/BlinkPlannerTest.java
  class BlinkPlannerTest (line 19) | public class BlinkPlannerTest {
    method main (line 21) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_04/type/OldPlannerTest.java
  class OldPlannerTest (line 19) | public class OldPlannerTest {
    method main (line 21) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/ProtobufFormatTest.java
  class ProtobufFormatTest (line 12) | public class ProtobufFormatTest {
    method main (line 14) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/SocketWriteTest.java
  class SocketWriteTest (line 14) | public class SocketWriteTest {
    method main (line 17) | public static void main(String[] args) throws IOException, Interrupted...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/csv/ChangelogCsvDeserializer.java
  class ChangelogCsvDeserializer (line 17) | public class ChangelogCsvDeserializer implements DeserializationSchema<R...
    method ChangelogCsvDeserializer (line 24) | public ChangelogCsvDeserializer(
    method getProducedType (line 35) | @Override
    method open (line 41) | @Override
    method deserialize (line 47) | @Override
    method parse (line 60) | private static Object parse(LogicalTypeRoot root, String value) {
    method isEndOfStream (line 71) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/csv/ChangelogCsvFormat.java
  class ChangelogCsvFormat (line 17) | public class ChangelogCsvFormat implements DecodingFormat<Deserializatio...
    method ChangelogCsvFormat (line 21) | public ChangelogCsvFormat(String columnDelimiter) {
    method createRuntimeDecoder (line 25) | @Override
    method getChangelogMode (line 45) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/csv/ChangelogCsvFormatFactory.java
  class ChangelogCsvFormatFactory (line 18) | public class ChangelogCsvFormatFactory implements DeserializationFormatF...
    method factoryIdentifier (line 25) | @Override
    method requiredOptions (line 30) | @Override
    method optionalOptions (line 35) | @Override
    method createDecodingFormat (line 42) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/descriptors/Protobuf.java
  class Protobuf (line 15) | @PublicEvolving
    method Protobuf (line 21) | public Protobuf() {
    method messageClass (line 30) | public Protobuf messageClass(Class<? extends Message> messageClass) {
    method protobufDescriptorHttpGetUrl (line 41) | public Protobuf protobufDescriptorHttpGetUrl(String protobufDescriptor...
    method toFormatProperties (line 47) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/descriptors/ProtobufValidator.java
  class ProtobufValidator (line 10) | public class ProtobufValidator extends FormatDescriptorValidator {
    method validate (line 16) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufDeserializationSchema.java
  class ProtobufDeserializationSchema (line 21) | public class ProtobufDeserializationSchema<T extends Message> extends Ab...
    method forGenericMessage (line 43) | public static ProtobufDeserializationSchema<Message> forGenericMessage...
    method forSpecificMessage (line 53) | public static <T extends GeneratedMessageV3> ProtobufDeserializationSc...
    method ProtobufDeserializationSchema (line 63) | @SuppressWarnings("unchecked")
    method deserialize (line 77) | @SuppressWarnings("unchecked")
    method readObject (line 84) | @SuppressWarnings("unchecked")
    method writeObject (line 97) | private void writeObject(ObjectOutputStream outputStream) throws IOExc...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufRowDeserializationSchema.java
  class ProtobufRowDeserializationSchema (line 44) | public class ProtobufRowDeserializationSchema extends AbstractDeserializ...
    type DeserializationRuntimeConverter (line 77) | @FunctionalInterface
      method convert (line 79) | Object convert(Object object);
    method ProtobufRowDeserializationSchema (line 88) | public ProtobufRowDeserializationSchema(Class<? extends GeneratedMessa...
    method ProtobufRowDeserializationSchema (line 103) | public ProtobufRowDeserializationSchema(byte[] descriptorBytes) {
    method deserialize (line 113) | @Override
    method getProducedType (line 126) | @Override
    method createRowConverter (line 133) | private DeserializationRuntimeConverter createRowConverter(
    method createConverter (line 160) | @SuppressWarnings("unchecked")
    method createListConverter (line 297) | @SuppressWarnings("unchecked")
    method createObjectConverter (line 320) | private DeserializationRuntimeConverter createObjectConverter(TypeInfo...
    method convertToDecimal (line 332) | private BigDecimal convertToDecimal(byte[] bytes) {
    method convertToDate (line 336) | private Date convertToDate(Object object) {
    method convertToTime (line 351) | private Time convertToTime(Object object) {
    method convertToTimestamp (line 363) | private Timestamp convertToTimestamp(Object object) {
    method writeObject (line 375) | private void writeObject(ObjectOutputStream outputStream) throws IOExc...
    method readObject (line 383) | @SuppressWarnings("unchecked")
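`ProtobufRowDeserializationSchema` builds its deserializer around a `@FunctionalInterface DeserializationRuntimeConverter` with a single `convert(Object)` method: one small converter is created per field type up front (`createConverter`, `createListConverter`, `createObjectConverter`), and `createRowConverter` composes them so the per-record hot path is just an array walk. A stdlib-only sketch of that composition, with simplified type names standing in for the Flink/protobuf classes:

```java
import java.util.List;
import java.util.function.Function;

// Sketch of the DeserializationRuntimeConverter pattern from
// ProtobufRowDeserializationSchema: pick a converter per field type once,
// then compose them into a row converter applied to every record. The type
// names and the String-based type tags are simplified stand-ins.
public class ConverterCompositionSketch {

    @FunctionalInterface
    interface RuntimeConverter {
        Object convert(Object value);
    }

    // Chosen once per field, based on the declared type (not per record).
    static RuntimeConverter createConverter(String typeName) {
        switch (typeName) {
            case "STRING": return v -> v.toString();
            case "INT":    return v -> ((Number) v).intValue();
            case "DOUBLE": return v -> ((Number) v).doubleValue();
            default: throw new IllegalArgumentException("unsupported type: " + typeName);
        }
    }

    // Compose the field converters into a converter for a whole row.
    static Function<Object[], Object[]> createRowConverter(List<String> fieldTypes) {
        RuntimeConverter[] fieldConverters = fieldTypes.stream()
                .map(ConverterCompositionSketch::createConverter)
                .toArray(RuntimeConverter[]::new);
        return raw -> {
            Object[] row = new Object[raw.length];
            for (int i = 0; i < raw.length; i++) {
                row[i] = fieldConverters[i].convert(raw[i]);
            }
            return row;
        };
    }

    public static void main(String[] args) {
        Function<Object[], Object[]> rowConverter =
                createRowConverter(List.of("STRING", "INT"));
        Object[] row = rowConverter.apply(new Object[] {123, 42L});
        System.out.println(row[0] + "," + row[1]); // 123,42
    }
}
```

Doing the type dispatch at construction time rather than inside `deserialize` is the point of the pattern: the per-record path contains no `switch` on type, which matters when the schema is fixed and records arrive by the millions.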

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufRowFormatFactory.java
  class ProtobufRowFormatFactory (line 31) | public class ProtobufRowFormatFactory extends TableFormatFactoryBase<Row>
    method ProtobufRowFormatFactory (line 34) | public ProtobufRowFormatFactory() {
    method supportedFormatProperties (line 38) | @Override
    method createDeserializationSchema (line 46) | @Override
    method createSerializationSchema (line 64) | @Override
    method httpGetDescriptorBytes (line 82) | public static byte[] httpGetDescriptorBytes(final String descriptorHtt...
    method getValidatedProperties (line 110) | private static DescriptorProperties getValidatedProperties(Map<String,...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufRowSerializationSchema.java
  class ProtobufRowSerializationSchema (line 50) | public class ProtobufRowSerializationSchema implements SerializationSche...
    type SerializationRuntimeConverter (line 80) | @FunctionalInterface
      method convert (line 82) | Object convert(Object object);
    method ProtobufRowSerializationSchema (line 90) | public ProtobufRowSerializationSchema(Class<? extends GeneratedMessage...
    method ProtobufRowSerializationSchema (line 105) | public ProtobufRowSerializationSchema(byte[] descriptorBytes) {
    method serialize (line 115) | @Override
    method createRowConverter (line 126) | private SerializationRuntimeConverter createRowConverter(Descriptors.D...
    method createListConverter (line 152) | private SerializationRuntimeConverter createListConverter(TypeInformat...
    method createConverter (line 187) | @SuppressWarnings("unchecked")
    method convertFromDecimal (line 282) | private byte[] convertFromDecimal(BigDecimal decimal) {
    method convertFromDate (line 288) | private int convertFromDate(Date date) {
    method convertFromTime (line 294) | private int convertFromTime(Time date) {
    method convertFromTimestamp (line 300) | private long convertFromTimestamp(Timestamp date) {
    method convertFromEnum (line 306) | private Object convertFromEnum(FieldDescriptor fieldDescriptor, Object...
    method writeObject (line 329) | private void writeObject(ObjectOutputStream outputStream) throws IOExc...
    method readObject (line 337) | @SuppressWarnings("unchecked")

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufSerializationSchema.java
  class ProtobufSerializationSchema (line 7) | public class ProtobufSerializationSchema<T extends Message> implements S...
    method serialize (line 9) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufUtils.java
  class ProtobufUtils (line 22) | public class ProtobufUtils {
    method getDefaultInstance (line 24) | @SuppressWarnings("unchecked")
    method getDefaultInstance (line 34) | public static Message getDefaultInstance(byte[] descriptorBytes) {
    method getDescriptor (line 39) | public static Descriptors.Descriptor getDescriptor(byte[] descriptorBy...
    method getFileDescriptor (line 52) | public static FileDescriptor getFileDescriptor(byte[] descriptorBytes) {
    method getDescriptor (line 63) | public static Descriptors.Descriptor getDescriptor(Class<? extends Mes...
    method getBytes (line 68) | public static byte[] getBytes(InputStream is) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/typeutils/ProtobufSchemaConverter.java
  class ProtobufSchemaConverter (line 38) | public class ProtobufSchemaConverter {
    method ProtobufSchemaConverter (line 40) | private ProtobufSchemaConverter() {
    method convertToTypeInfo (line 51) | @SuppressWarnings("unchecked")
    method convertToRowDataTypeInfo (line 59) | @SuppressWarnings("unchecked")
    method convertToTypeInfo (line 74) | @SuppressWarnings("unchecked")
    method convertToRowDataTypeInfo (line 82) | public static LogicalType convertToRowDataTypeInfo(Descriptors.Generic...
    method convertToTypeInfo (line 174) | public static TypeInformation<?> convertToTypeInfo(Descriptors.Generic...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/ProtobufFormatFactory.java
  class ProtobufFormatFactory (line 28) | public class ProtobufFormatFactory implements DeserializationFormatFacto...
    method createDecodingFormat (line 33) | @Override
    method createEncodingFormat (line 67) | @Override
    method factoryIdentifier (line 73) | @Override
    method requiredOptions (line 78) | @Override
    method optionalOptions (line 83) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/ProtobufOptions.java
  class ProtobufOptions (line 7) | public class ProtobufOptions {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/ProtobufRowDataDeserializationSchema.java
  class ProtobufRowDataDeserializationSchema (line 23) | public class ProtobufRowDataDeserializationSchema extends AbstractDeseri...
    method ProtobufRowDataDeserializationSchema (line 65) | public ProtobufRowDataDeserializationSchema(
    method deserialize (line 108) | @Override
    method writeObject (line 130) | private void writeObject(ObjectOutputStream outputStream) throws IOExc...
    method readObject (line 139) | @SuppressWarnings("unchecked")

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/ProtobufRowDataSerializationSchema.java
  class ProtobufRowDataSerializationSchema (line 7) | public class ProtobufRowDataSerializationSchema implements Serialization...
    method serialize (line 8) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/ProtobufToRowDataConverters.java
  class ProtobufToRowDataConverters (line 40) | public class ProtobufToRowDataConverters implements Serializable {
    method ProtobufToRowDataConverters (line 49) | public ProtobufToRowDataConverters(boolean isDynamicMessage) {
    type ProtobufToRowDataConverter (line 54) | @FunctionalInterface
      method convert (line 56) | Object convert(Object object);
    method createRowDataConverterByLogicalType (line 59) | public ProtobufToRowDataConverter createRowDataConverterByLogicalType(
    method createConverterByLogicalType (line 88) | @SuppressWarnings("unchecked")
    method createRowDataConverterByDescriptor (line 204) | public ProtobufToRowDataConverter createRowDataConverterByDescriptor(
    method createConverterByDescriptor (line 234) | @SuppressWarnings("unchecked")
    method createArrayConverter (line 366) | @SuppressWarnings("unchecked")
    method convertToString (line 391) | private StringData convertToString(Object filedO) {
    method createObjectConverter (line 396) | private ProtobufToRowDataConverter createObjectConverter(LogicalType i...
    method convertToDecimal (line 410) | private BigDecimal convertToDecimal(byte[] bytes) {
    method convertToDate (line 414) | private Date convertToDate(Object object) {
    method convertToTime (line 429) | private Time convertToTime(Object object) {
    method convertToTimestamp (line 441) | private Timestamp convertToTimestamp(Object object) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/RowDataToProtobufConverters.java
  class RowDataToProtobufConverters (line 4) | public class RowDataToProtobufConverters {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/utils/MoreRunnables.java
  class MoreRunnables (line 3) | public class MoreRunnables {
    method throwing (line 6) | public static <EXCEPTION extends Throwable> void throwing(ThrowableRun...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/utils/MoreSuppliers.java
  class MoreSuppliers (line 4) | public class MoreSuppliers {
    method MoreSuppliers (line 6) | private MoreSuppliers() {
    method throwing (line 10) | public static <OUT> OUT throwing(ThrowableSupplier<OUT, Throwable> thr...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/utils/ThrowableRunable.java
  type ThrowableRunable (line 3) | @FunctionalInterface
    method run (line 6) | void run() throws EXCEPTION;

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/utils/ThrowableSupplier.java
  type ThrowableSupplier (line 3) | @FunctionalInterface
    method get (line 6) | OUT get() throws EXCEPTION;
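The `ThrowableRunable` / `ThrowableSupplier` interfaces above declare a generic checked exception on `run()` / `get()`, and `MoreRunnables` / `MoreSuppliers` expose a `throwing(...)` wrapper around them. A minimal self-contained sketch of that pattern (the interface and class names mirror the repo; the method bodies here are an assumption, since the index only shows signatures):

```java
// Sketch of the checked-exception wrapper pattern suggested by the
// ThrowableSupplier / MoreSuppliers entries. Bodies are hypothetical.
@FunctionalInterface
interface ThrowableSupplier<OUT, EXCEPTION extends Throwable> {
    OUT get() throws EXCEPTION; // signature as shown in the index
}

final class MoreSuppliers {
    private MoreSuppliers() {
    }

    // Runs the supplier and rethrows any checked Throwable as unchecked,
    // so callers can use throwing lambdas where a plain value is expected.
    static <OUT> OUT throwing(ThrowableSupplier<OUT, Throwable> supplier) {
        try {
            return supplier.get();
        } catch (Throwable t) {
            throw new RuntimeException(t);
        }
    }
}

public class MoreSuppliersSketch {
    public static void main(String[] args) {
        // A lambda that could declare a checked exception is usable inline.
        String value = MoreSuppliers.throwing(() -> "ok");
        System.out.println(value);
    }
}
```

This lets serialization helpers such as `ProtobufRowDataDeserializationSchema` call checked-exception code inside lambdas without try/catch noise at every call site.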

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/CalciteTest.java
  class CalciteTest (line 8) | public class CalciteTest {
    method main (line 10) | public static void main(String[] args) throws SqlParseException {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/ParserTest.java
  class ParserTest (line 19) | public class ParserTest {
    method main (line 21) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/JavaccCodeGenTest.java
  class JavaccCodeGenTest (line 5) | public class JavaccCodeGenTest {
    method main (line 7) | public static void main(String[] args) throws Exception {
    method version (line 12) | private static void version() throws Exception {
    method javacc (line 16) | private static void javacc() throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/Simple1Test.java
  class Simple1Test (line 7) | public class Simple1Test {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/ParseException.java
  class ParseException (line 12) | public class ParseException extends Exception {
    method ParseException (line 32) | public ParseException(Token currentTokenVal,
    method ParseException (line 53) | public ParseException() {
    method ParseException (line 58) | public ParseException(String message) {
    method initialise (line 91) | private static String initialise(Token currentToken,
    method add_escapes (line 149) | static String add_escapes(String str) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/Simple1.java
  class Simple1 (line 4) | public class Simple1 implements Simple1Constants {
    method main (line 7) | public static void main(String args[]) throws ParseException {
    method Input (line 13) | static final public void Input() throws ParseException {
    method MatchedBraces (line 46) | static final public void MatchedBraces() throws ParseException {
    method jj_la1_init_0 (line 75) | private static void jj_la1_init_0() {
    method Simple1 (line 80) | public Simple1(java.io.InputStream stream) {
    method Simple1 (line 84) | public Simple1(java.io.InputStream stream, String encoding) {
    method ReInit (line 101) | static public void ReInit(java.io.InputStream stream) {
    method ReInit (line 105) | static public void ReInit(java.io.InputStream stream, String encoding) {
    method Simple1 (line 115) | public Simple1(java.io.Reader stream) {
    method ReInit (line 132) | static public void ReInit(java.io.Reader stream) {
    method Simple1 (line 150) | public Simple1(Simple1TokenManager tm) {
    method ReInit (line 166) | public void ReInit(Simple1TokenManager tm) {
    method jj_consume_token (line 174) | static private Token jj_consume_token(int kind) throws ParseException {
    method getNextToken (line 190) | static final public Token getNextToken() {
    method getToken (line 199) | static final public Token getToken(int index) {
    method jj_ntk_f (line 208) | static private int jj_ntk_f() {
    method generateParseException (line 220) | static public ParseException generateParseException() {
    method trace_enabled (line 253) | static final public boolean trace_enabled() {
    method enable_tracing (line 258) | static final public void enable_tracing() {
    method disable_tracing (line 262) | static final public void disable_tracing() {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/Simple1Constants.java
  type Simple1Constants (line 7) | public interface Simple1Constants {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/Simple1TokenManager.java
  class Simple1TokenManager (line 5) | public class Simple1TokenManager implements Simple1Constants {
    method setDebugStream (line 10) | public static  void setDebugStream(java.io.PrintStream ds) { debugStre...
    method jjStopAtPos (line 11) | static private int jjStopAtPos(int pos, int kind)
    method jjMoveStringLiteralDfa0_0 (line 17) | static private int jjMoveStringLiteralDfa0_0(){
    method jjFillToken (line 36) | static protected Token jjFillToken()
    method getNextToken (line 70) | public static Token getNextToken()
    method SkipLexicalActions (line 123) | static void SkipLexicalActions(Token matchedToken)
    method MoreLexicalActions (line 131) | static void MoreLexicalActions()
    method TokenLexicalActions (line 140) | static void TokenLexicalActions(Token matchedToken)
    method jjCheckNAdd (line 148) | static private void jjCheckNAdd(int state)
    method jjAddStates (line 156) | static private void jjAddStates(int start, int end)
    method jjCheckNAddTwoStates (line 162) | static private void jjCheckNAddTwoStates(int state1, int state2)
    method Simple1TokenManager (line 169) | public Simple1TokenManager(SimpleCharStream stream){
    method Simple1TokenManager (line 178) | public Simple1TokenManager (SimpleCharStream stream, int lexState){
    method ReInit (line 185) | static public void ReInit(SimpleCharStream stream)
    method ReInitRounds (line 197) | static private void ReInitRounds()
    method ReInit (line 206) | static public void ReInit(SimpleCharStream stream, int lexState)
    method SwitchTo (line 214) | public static void SwitchTo(int lexState)

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/SimpleCharStream.java
  class SimpleCharStream (line 8) | public class SimpleCharStream
    method setTabSize (line 34) | static public void setTabSize(int i) { tabSize = i; }
    method getTabSize (line 35) | static public int getTabSize() { return tabSize; }
    method ExpandBuff (line 39) | static protected void ExpandBuff(boolean wrapAround)
    method FillBuff (line 88) | static protected void FillBuff() throws java.io.IOException
    method BeginToken (line 133) | static public char BeginToken() throws java.io.IOException
    method UpdateLineColumn (line 142) | static protected void UpdateLineColumn(char c)
    method readChar (line 183) | static public char readChar() throws java.io.IOException
    method getColumn (line 204) | @Deprecated
    method getLine (line 214) | @Deprecated
    method getEndColumn (line 225) | static public int getEndColumn() {
    method getEndLine (line 230) | static public int getEndLine() {
    method getBeginColumn (line 235) | static public int getBeginColumn() {
    method getBeginLine (line 240) | static public int getBeginLine() {
    method backup (line 245) | static public void backup(int amount) {
    method SimpleCharStream (line 253) | public SimpleCharStream(java.io.Reader dstream, int startline,
    method SimpleCharStream (line 271) | public SimpleCharStream(java.io.Reader dstream, int startline,
    method SimpleCharStream (line 278) | public SimpleCharStream(java.io.Reader dstream)
    method ReInit (line 284) | public void ReInit(java.io.Reader dstream, int startline,
    method ReInit (line 304) | public void ReInit(java.io.Reader dstream, int startline,
    method ReInit (line 311) | public void ReInit(java.io.Reader dstream)
    method SimpleCharStream (line 316) | public SimpleCharStream(java.io.InputStream dstream, String encoding, ...
    method SimpleCharStream (line 323) | public SimpleCharStream(java.io.InputStream dstream, int startline,
    method SimpleCharStream (line 330) | public SimpleCharStream(java.io.InputStream dstream, String encoding, ...
    method SimpleCharStream (line 337) | public SimpleCharStream(java.io.InputStream dstream, int startline,
    method SimpleCharStream (line 344) | public SimpleCharStream(java.io.InputStream dstream, String encoding) ...
    method SimpleCharStream (line 350) | public SimpleCharStream(java.io.InputStream dstream)
    method ReInit (line 356) | public void ReInit(java.io.InputStream dstream, String encoding, int s...
    method ReInit (line 363) | public void ReInit(java.io.InputStream dstream, int startline,
    method ReInit (line 370) | public void ReInit(java.io.InputStream dstream, String encoding) throw...
    method ReInit (line 376) | public void ReInit(java.io.InputStream dstream)
    method ReInit (line 381) | public void ReInit(java.io.InputStream dstream, String encoding, int s...
    method ReInit (line 387) | public void ReInit(java.io.InputStream dstream, int startline,
    method GetImage (line 393) | static public String GetImage()
    method GetSuffix (line 403) | static public char[] GetSuffix(int len)
    method Done (line 420) | static public void Done()
    method adjustBeginLineColumn (line 430) | static public void adjustBeginLineColumn(int newLine, int newCol)
    method getTrackLineColumn (line 473) | static boolean getTrackLineColumn() { return trackLineColumn; }
    method setTrackLineColumn (line 474) | static void setTrackLineColumn(boolean tlc) { trackLineColumn = tlc; }

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/Token.java
  class Token (line 7) | public class Token implements java.io.Serializable {
    method getValue (line 69) | public Object getValue() {
    method Token (line 76) | public Token() {}
    method Token (line 81) | public Token(int kind)
    method Token (line 89) | public Token(int kind, String image)
    method toString (line 98) | @Override
    method newToken (line 116) | public static Token newToken(int ofKind, String image)
    method newToken (line 124) | public static Token newToken(int ofKind)

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/TokenMgrError.java
  class TokenMgrError (line 4) | public class TokenMgrError extends Error
    method addEscapes (line 48) | protected static final String addEscapes(String str) {
    method LexicalErr (line 103) | protected static String LexicalErr(boolean EOFSeen, int lexState, int ...
    method getMessage (line 121) | @Override
    method TokenMgrError (line 131) | public TokenMgrError() {
    method TokenMgrError (line 135) | public TokenMgrError(String message, int reason) {
    method TokenMgrError (line 141) | public TokenMgrError(boolean EOFSeen, int lexState, int errorLine, int...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereHiveDialect.java
  class SelectWhereHiveDialect (line 23) | public class SelectWhereHiveDialect {
    method main (line 26) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 47) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 51) | @Override
      method cancel (line 66) | @Override
      method getProducedType (line 71) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereTest.java
  class SelectWhereTest (line 23) | public class SelectWhereTest {
    method main (line 26) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 76) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 80) | @Override
      method cancel (line 95) | @Override
      method getProducedType (line 100) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereTest2.java
  class SelectWhereTest2 (line 13) | public class SelectWhereTest2 {
    method main (line 16) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 49) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 53) | @Override
      method cancel (line 68) | @Override
      method getProducedType (line 73) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereTest3.java
  class SelectWhereTest3 (line 23) | public class SelectWhereTest3 {
    method main (line 26) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 75) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 79) | @Override
      method cancel (line 94) | @Override
      method getProducedType (line 99) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereTest4.java
  class SelectWhereTest4 (line 22) | public class SelectWhereTest4 {
    method main (line 25) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 66) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 70) | @Override
      method cancel (line 85) | @Override
      method getProducedType (line 90) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereTest5.java
  class SelectWhereTest5 (line 9) | public class SelectWhereTest5 {
    method main (line 12) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/StreamExecCalc$10.java
  class StreamExecCalc$10 (line 3) | public class StreamExecCalc$10 extends org.apache.flink.table.runtime.op...
    method StreamExecCalc$10 (line 11) | public StreamExecCalc$10(
    method open (line 26) | @Override
    method processElement (line 32) | @Override
    method close (line 90) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_02_select_distinct/GroupAggsHandler$5.java
  class GroupAggsHandler$5 (line 4) | public final class GroupAggsHandler$5 implements org.apache.flink.table....
    method GroupAggsHandler$5 (line 12) | public GroupAggsHandler$5(Object[] references) throws Exception {
    method getRuntimeContext (line 16) | private org.apache.flink.api.common.functions.RuntimeContext getRuntim...
    method open (line 20) | @Override
    method accumulate (line 26) | @Override
    method retract (line 32) | @Override
    method merge (line 40) | @Override
    method setAccumulators (line 47) | @Override
    method resetAccumulators (line 53) | @Override
    method getAccumulators (line 59) | @Override
    method createAccumulators (line 65) | @Override
    method getValue (line 71) | @Override
    method cleanup (line 77) | @Override
    method close (line 83) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_02_select_distinct/KeyProjection$0.java
  class KeyProjection$0 (line 4) | public class KeyProjection$0 implements
    method KeyProjection$0 (line 12) | public KeyProjection$0(Object[] references) throws Exception {
    method apply (line 16) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_02_select_distinct/SelectDistinctTest.java
  class SelectDistinctTest (line 23) | public class SelectDistinctTest {
    method main (line 26) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 82) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 86) | @Override
      method cancel (line 101) | @Override
      method getProducedType (line 106) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_02_select_distinct/SelectDistinctTest2.java
  class SelectDistinctTest2 (line 15) | public class SelectDistinctTest2 {
    method main (line 18) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_01_group_agg/GroupAggMiniBatchTest.java
  class GroupAggMiniBatchTest (line 16) | public class GroupAggMiniBatchTest {
    method main (line 18) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_01_group_agg/GroupAggTest.java
  class GroupAggTest (line 15) | public class GroupAggTest {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_01_group_agg/GroupAggsHandler$39.java
  class GroupAggsHandler$39 (line 4) | public final class GroupAggsHandler$39 implements org.apache.flink.table...
    method GroupAggsHandler$39 (line 24) | public GroupAggsHandler$39(Object[] references) throws Exception {
    method getRuntimeContext (line 28) | private org.apache.flink.api.common.functions.RuntimeContext getRuntim...
    method open (line 32) | @Override
    method accumulate (line 38) | @Override
    method retract (line 279) | @Override
    method merge (line 287) | @Override
    method setAccumulators (line 294) | @Override
    method resetAccumulators (line 372) | @Override
    method getAccumulators (line 402) | @Override
    method createAccumulators (line 455) | @Override
    method getValue (line 508) | @Override
    method cleanup (line 590) | @Override
    method close (line 596) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_02_count_distinct/CountDistinctGroupAggTest.java
  class CountDistinctGroupAggTest (line 15) | public class CountDistinctGroupAggTest {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_02_count_distinct/GroupAggsHandler$17.java
  class GroupAggsHandler$17 (line 4) | public final class GroupAggsHandler$17 implements org.apache.flink.table...
    method GroupAggsHandler$17 (line 19) | public GroupAggsHandler$17(Object[] references) throws Exception {
    method getRuntimeContext (line 24) | private org.apache.flink.api.common.functions.RuntimeContext getRuntim...
    method open (line 28) | @Override
    method accumulate (line 40) | @Override
    method retract (line 108) | @Override
    method merge (line 116) | @Override
    method setAccumulators (line 123) | @Override
    method resetAccumulators (line 143) | @Override
    method getAccumulators (line 154) | @Override
    method createAccumulators (line 182) | @Override
    method getValue (line 211) | @Override
    method cleanup (line 229) | @Override
    method close (line 237) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_03_grouping_sets/GroupingSetsEqualsGroupAggUnionAllGroupAggTest2.java
  class GroupingSetsEqualsGroupAggUnionAllGroupAggTest2 (line 15) | public class GroupingSetsEqualsGroupAggUnionAllGroupAggTest2 {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_03_grouping_sets/GroupingSetsGroupAggTest.java
  class GroupingSetsGroupAggTest (line 15) | public class GroupingSetsGroupAggTest {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_03_grouping_sets/GroupingSetsGroupAggTest2.java
  class GroupingSetsGroupAggTest2 (line 15) | public class GroupingSetsGroupAggTest2 {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_03_grouping_sets/StreamExecExpand$20.java
  class StreamExecExpand$20 (line 4) | public class StreamExecExpand$20 extends org.apache.flink.table.runtime....
    method StreamExecExpand$20 (line 14) | public StreamExecExpand$20(
    method open (line 30) | @Override
    method processElement (line 36) | @Override
    method close (line 149) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_04_cube/CubeGroupAggTest.java
  class CubeGroupAggTest (line 15) | public class CubeGroupAggTest {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_04_cube/CubeGroupAggTest2.java
  class CubeGroupAggTest2 (line 15) | public class CubeGroupAggTest2 {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_05_rollup/RollUpGroupAggTest.java
  class RollUpGroupAggTest (line 15) | public class RollUpGroupAggTest {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_05_rollup/RollUpGroupAggTest2.java
  class RollUpGroupAggTest2 (line 15) | public class RollUpGroupAggTest2 {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/TumbleWindow2GroupAggTest.java
  class TumbleWindow2GroupAggTest (line 9) | public class TumbleWindow2GroupAggTest {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/TumbleWindowTest.java
  class TumbleWindowTest (line 7) | public class TumbleWindowTest {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/TumbleWindowTest2.java
  class TumbleWindowTest2 (line 7) | public class TumbleWindowTest2 {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/TumbleWindowTest3.java
  class TumbleWindowTest3 (line 7) | public class TumbleWindowTest3 {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/TumbleWindowTest4.java
  class TumbleWindowTest4 (line 7) | public class TumbleWindowTest4 {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/TumbleWindowTest5.java
  class TumbleWindowTest5 (line 17) | public class TumbleWindowTest5 {
    method main (line 19) | public static void main(String[] args) throws Exception {
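The TumbleWindowTest classes above exercise Flink SQL tumbling-window aggregation. The core bucketing idea behind a tumble window can be sketched in plain Java (this is an illustrative reimplementation of the standard formula, not code from the repo): each event timestamp maps to exactly one non-overlapping window whose start is the timestamp rounded down to a multiple of the window size.

```java
// Sketch: event-time bucketing for a tumbling window.
// windowStart(ts, size) = ts - (ts % size), for non-negative timestamps.
public class TumbleAssignSketch {

    // Inclusive window start (ms) for a given event timestamp and window size.
    static long windowStart(long timestampMs, long sizeMs) {
        return timestampMs - (timestampMs % sizeMs);
    }

    // Exclusive window end (ms); the window covers [start, end).
    static long windowEnd(long timestampMs, long sizeMs) {
        return windowStart(timestampMs, sizeMs) + sizeMs;
    }

    public static void main(String[] args) {
        long size = 60_000L; // a 1-minute tumble
        // 125000 ms falls in the window [120000, 180000)
        System.out.println(windowStart(125_000L, size));
        // 119999 ms falls in the previous window [60000, 120000)
        System.out.println(windowStart(119_999L, size));
    }
}
```

In the SQL tests themselves the same assignment is driven by the `TUMBLE` window with an event-time attribute and watermark, rather than computed by hand.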

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/global_agg/GlobalWindowAggsHandler$232.java
  class GlobalWindowAggsHandler$232 (line 4) | public final class GlobalWindowAggsHandler$232
    method GlobalWindowAggsHandler$232 (line 34) | public GlobalWindowAggsHandler$232(Object[] references) throws Excepti...
    method getRuntimeContext (line 42) | private org.apache.flink.api.common.functions.RuntimeContext getRuntim...
    method open (line 46) | @Override
    method accumulate (line 61) | @Override
    method retract (line 312) | @Override
    method merge (line 320) | @Override
    method setAccumulators (line 618) | @Override
    method getAccumulators (line 702) | @Override
    method createAccumulators (line 758) | @Override
    method getValue (line 815) | @Override
    method cleanup (line 877) | @Override
    method close (line 887) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/global_agg/LocalWindowAggsHandler$162.java
  class LocalWindowAggsHandler$162 (line 4) | public final class LocalWindowAggsHandler$162
    method LocalWindowAggsHandler$162 (line 30) | public LocalWindowAggsHandler$162(Object[] references) throws Exception {
    method getRuntimeContext (line 36) | private org.apache.flink.api.common.functions.RuntimeContext getRuntim...
    method open (line 40) | @Override
    method accumulate (line 48) | @Override
    method retract (line 299) | @Override
    method merge (line 307) | @Override
    method setAccumulators (line 606) | @Override
    method getAccumulators (line 684) | @Override
    method createAccumulators (line 740) | @Override
    method getValue (line 797) | @Override
    method cleanup (line 859) | @Override
    method close (line 866) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/global_agg/StateWindowAggsHandler$300.java
  class StateWindowAggsHandler$300 (line 4) | public final class StateWindowAggsHandler$300
    method StateWindowAggsHandler$300 (line 35) | public StateWindowAggsHandler$300(Object[] references) throws Exception {
    method getRuntimeContext (line 42) | private org.apache.flink.api.common.functions.RuntimeContext getRuntim...
    method open (line 46) | @Override
    method accumulate (line 65) | @Override
    method retract (line 316) | @Override
    method merge (line 324) | @Override
    method setAccumulators (line 623) | @Override
    method getAccumulators (line 707) | @Override
    method createAccumulators (line 763) | @Override
    method getValue (line 820) | @Override
    method cleanup (line 882) | @Override
    method close (line 892) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/local_agg/KeyProjection$89.java
  class KeyProjection$89 (line 4) | public class KeyProjection$89 implements
    method KeyProjection$89 (line 12) | public KeyProjection$89(Object[] references) throws Exception {
    method apply (line 16) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_01_tumble_window/local_agg/LocalWindowAggsHandler$88.java
  class LocalWindowAggsHandler$88 (line 4) | public final class LocalWindowAggsHandler$88
    method LocalWindowAggsHandler$88 (line 30) | public LocalWindowAggsHandler$88(Object[] references) throws Exception {
    method getRuntimeContext (line 36) | private org.apache.flink.api.common.functions.RuntimeContext getRuntim...
    method open (line 40) | @Override
    method accumulate (line 48) | @Override
    method retract (line 299) | @Override
    method merge (line 307) | @Override
    method setAccumulators (line 605) | @Override
    method getAccumulators (line 683) | @Override
    method createAccumulators (line 739) | @Override
    method getValue (line 796) | @Override
    method cleanup (line 843) | @Override
    method close (line 850) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_02_cumulate_window/CumulateWindowGroupingSetsBigintTest.java
  class CumulateWindowGroupingSetsBigintTest (line 7) | public class CumulateWindowGroupingSetsBigintTest {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_02_cumulate_window/CumulateWindowGroupingSetsTest.java
  class CumulateWindowGroupingSetsTest (line 7) | public class CumulateWindowGroupingSetsTest {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_02_cumulate_window/CumulateWindowTest.java
  class CumulateWindowTest (line 7) | public class CumulateWindowTest {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_02_cumulate_window/TumbleWindowEarlyFireTest.java
  class TumbleWindowEarlyFireTest (line 9) | public class TumbleWindowEarlyFireTest {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_02_cumulate_window/cumulate/global_agg/KeyProjection$301.java
  class KeyProjection$301 (line 4) | public class KeyProjection$301 implements
    method KeyProjection$301 (line 12) | public KeyProjection$301(Object[] references) throws Exception {
    method apply (line 16) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_02_cumulate_window/cumulate/local_agg/KeyProjection$89.java
  class KeyProjection$89 (line 4) | public class KeyProjection$89 implements
    method KeyProjection$89 (line 12) | public KeyProjection$89(Object[] references) throws Exception {
    method apply (line 16) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_02_cumulate_window/earlyfire/GroupAggsHandler$210.java
  class GroupAggsHandler$210 (line 7) | public final class GroupAggsHandler$210 implements org.apache.flink.tabl...
    method GroupAggsHandler$210 (line 61) | public GroupAggsHandler$210(java.lang.Object[] references) throws Exce...
    method getRuntimeContext (line 74) | private org.apache.flink.api.common.functions.RuntimeContext getRuntim...
    method open (line 78) | @Override
    method accumulate (line 133) | @Override
    method retract (line 437) | @Override
    method merge (line 774) | @Override
    method setAccumulators (line 781) | @Override
    method resetAccumulators (line 945) | @Override
    method getAccumulators (line 1000) | @Override
    method createAccumulators (line 1087) | @Override
    method getValue (line 1183) | @Override
    method cleanup (line 1331) | @Override
    method close (line 1345) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_03_hop_window/HopWindowGroupWindowAggTest.java
  class HopWindowGroupWindowAggTest (line 9) | public class HopWindowGroupWindowAggTest {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_01_row_number/RowNumberOrderByBigintTest.java
  class RowNumberOrderByBigintTest (line 16) | public class RowNumberOrderByBigintTest {
    method main (line 18) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_01_row_number/RowNumberOrderByStringTest.java
  class RowNumberOrderByStringTest (line 16) | public class RowNumberOrderByStringTest {
    method main (line 18) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_01_row_number/RowNumberOrderByUnixTimestampTest.java
  class RowNumberOrderByUnixTimestampTest (line 16) | public class RowNumberOrderByUnixTimestampTest {
    method main (line 18) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_01_row_number/RowNumberWithoutPartitionKeyTest.java
  class RowNumberWithoutPartitionKeyTest (line 16) | public class RowNumberWithoutPartitionKeyTest {
    method main (line 18) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_01_row_number/RowNumberWithoutRowNumberEqual1Test.java
  class RowNumberWithoutRowNumberEqual1Test (line 9) | public class RowNumberWithoutRowNumberEqual1Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_01_row_number/Scalar_UDF.java
  class Scalar_UDF (line 7) | public class Scalar_UDF extends ScalarFunction {
    method open (line 9) | @Override
    method eval (line 16) | public int eval(Long id, int remainder) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_02_agg/RangeIntervalProctimeTest.java
  class RangeIntervalProctimeTest (line 8) | public class RangeIntervalProctimeTest {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_02_agg/RangeIntervalRowtimeAscendingTest.java
  class RangeIntervalRowtimeAscendingTest (line 8) | public class RangeIntervalRowtimeAscendingTest {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_02_agg/RangeIntervalRowtimeBoundedOutOfOrdernessTest.java
  class RangeIntervalRowtimeBoundedOutOfOrdernessTest (line 8) | public class RangeIntervalRowtimeBoundedOutOfOrdernessTest {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_02_agg/RangeIntervalRowtimeStrictlyAscendingTest.java
  class RangeIntervalRowtimeStrictlyAscendingTest (line 8) | public class RangeIntervalRowtimeStrictlyAscendingTest {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_05_over/_02_agg/RowIntervalTest.java
  class RowIntervalTest (line 8) | public class RowIntervalTest {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_01_regular_joins/_01_inner_join/ConditionFunction$4.java
  class ConditionFunction$4 (line 4) | public class ConditionFunction$4 extends org.apache.flink.api.common.fun...
    method ConditionFunction$4 (line 8) | public ConditionFunction$4(Object[] references) throws Exception {
    method open (line 12) | @Override
    method apply (line 17) | @Override
    method close (line 24) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_01_regular_joins/_01_inner_join/_01_InnerJoinsTest.java
  class _01_InnerJoinsTest (line 9) | public class _01_InnerJoinsTest {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_01_regular_joins/_01_inner_join/_02_InnerJoinsOnNotEqualTest.java
  class _02_InnerJoinsOnNotEqualTest (line 9) | public class _02_InnerJoinsOnNotEqualTest {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_01_regular_joins/_02_outer_join/_01_LeftJoinsTest.java
  class _01_LeftJoinsTest (line 9) | public class _01_LeftJoinsTest {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_01_regular_joins/_02_outer_join/_02_RightJoinsTest.java
  class _02_RightJoinsTest (line 9) | public class _02_RightJoinsTest {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_01_regular_joins/_02_outer_join/_03_FullJoinsTest.java
  class _03_FullJoinsTest (line 9) | public class _03_FullJoinsTest {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_01_proctime/Interval_Full_Joins_ProcesingTime_Test.java
  class Interval_Full_Joins_ProcesingTime_Test (line 9) | public class Interval_Full_Joins_ProcesingTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_01_proctime/Interval_Inner_Joins_ProcesingTime_Test.java
  class Interval_Inner_Joins_ProcesingTime_Test (line 9) | public class Interval_Inner_Joins_ProcesingTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_01_proctime/Interval_Left_Joins_ProcesingTime_Test.java
  class Interval_Left_Joins_ProcesingTime_Test (line 9) | public class Interval_Left_Joins_ProcesingTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_01_proctime/Interval_Right_Joins_ProcesingTime_Test.java
  class Interval_Right_Joins_ProcesingTime_Test (line 9) | public class Interval_Right_Joins_ProcesingTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_02_row_time/Interval_Full_JoinsOnNotEqual_EventTime_Test.java
  class Interval_Full_JoinsOnNotEqual_EventTime_Test (line 9) | public class Interval_Full_JoinsOnNotEqual_EventTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_02_row_time/Interval_Full_Joins_EventTime_Test.java
  class Interval_Full_Joins_EventTime_Test (line 9) | public class Interval_Full_Joins_EventTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_02_row_time/Interval_Inner_Joins_EventTime_Test.java
  class Interval_Inner_Joins_EventTime_Test (line 9) | public class Interval_Inner_Joins_EventTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_02_row_time/Interval_Left_Joins_EventTime_Test.java
  class Interval_Left_Joins_EventTime_Test (line 9) | public class Interval_Left_Joins_EventTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_02_row_time/Interval_Right_Joins_EventTime_Test.java
  class Interval_Right_Joins_EventTime_Test (line 9) | public class Interval_Right_Joins_EventTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_03_temporal_join/_01_proctime/Temporal_Join_ProcesingTime_Test.java
  class Temporal_Join_ProcesingTime_Test (line 9) | public class Temporal_Join_ProcesingTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_03_temporal_join/_02_row_time/Temporal_Join_EventTime_Test.java
  class Temporal_Join_EventTime_Test (line 9) | public class Temporal_Join_EventTime_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_04_lookup_join/_01_redis/RedisBatchLookupTest2.java
  class RedisBatchLookupTest2 (line 12) | public class RedisBatchLookupTest2 {
    method main (line 14) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_04_lookup_join/_01_redis/RedisDemo.java
  class RedisDemo (line 16) | public class RedisDemo {
    method main (line 18) | public static void main(String[] args) {
    method singleConnect (line 24) | public static void singleConnect() {
    method poolConnect (line 42) | public static void poolConnect() {
    method pipeline (line 52) | public static void pipeline() {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_04_lookup_join/_01_redis/RedisLookupTest.java
  class RedisLookupTest (line 22) | public class RedisLookupTest {
    method main (line 24) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 79) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 83) | @Override
      method cancel (line 96) | @Override
      method getProducedType (line 101) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_04_lookup_join/_01_redis/RedisLookupTest2.java
  class RedisLookupTest2 (line 12) | public class RedisLookupTest2 {
    method main (line 14) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_05_array_expansion/_01_ArrayExpansionTest.java
  class _01_ArrayExpansionTest (line 9) | public class _01_ArrayExpansionTest {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_06_table_function/_01_inner_join/TableFunctionInnerJoin_Test.java
  class TableFunctionInnerJoin_Test (line 11) | public class TableFunctionInnerJoin_Test {
    method main (line 13) | public static void main(String[] args) throws Exception {
    class UserProfileTableFunction (line 54) | public static class UserProfileTableFunction extends TableFunction<Int...
      method eval (line 56) | public void eval(long userId) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_06_joins/_06_table_function/_01_inner_join/TableFunctionInnerJoin_WithEmptyTableFunction_Test.java
  class TableFunctionInnerJoin_WithEmptyTableFunction_Test (line 17) | public class TableFunctionInnerJoin_WithEmptyTableFunction_Test {
    method main (line 19) | public static void main(String[] args) throws Exception {
    class UserProfile_EmptyTableFunction (line 90) | public static class UserProfile_EmptyTableFunction extends TableFuncti...
      method eval (line 92) | public void eval(long userId) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_07_deduplication/DeduplicationProcessingTimeTest.java
  class DeduplicationProcessingTimeTest (line 7) | public class DeduplicationProcessingTimeTest {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_07_deduplication/DeduplicationProcessingTimeTest1.java
  class DeduplicationProcessingTimeTest1 (line 7) | public class DeduplicationProcessingTimeTest1 {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_07_deduplication/DeduplicationRowTimeTest.java
  class DeduplicationRowTimeTest (line 7) | public class DeduplicationRowTimeTest {
    method main (line 9) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_08_datastream_trans/AlertExample.java
  class AlertExample (line 12) | @Slf4j
    method main (line 15) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_08_datastream_trans/AlertExampleRetract.java
  class AlertExampleRetract (line 13) | @Slf4j
    method main (line 16) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_08_datastream_trans/AlertExampleRetractError.java
  class AlertExampleRetractError (line 12) | @Slf4j
    method main (line 15) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_08_datastream_trans/Test.java
  class Test (line 15) | public class Test {
    method main (line 17) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 51) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 55) | @Override
      method cancel (line 70) | @Override
      method getProducedType (line 75) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_09_set_operations/Except_Test.java
  class Except_Test (line 9) | public class Except_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_09_set_operations/Exist_Test.java
  class Exist_Test (line 9) | public class Exist_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_09_set_operations/In_Test.java
  class In_Test (line 9) | public class In_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_09_set_operations/Intersect_Test.java
  class Intersect_Test (line 9) | public class Intersect_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_09_set_operations/UnionAll_Test.java
  class UnionAll_Test (line 9) | public class UnionAll_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_09_set_operations/Union_Test.java
  class Union_Test (line 9) | public class Union_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_10_order_by/OrderBy_with_time_attr_Test.java
  class OrderBy_with_time_attr_Test (line 9) | public class OrderBy_with_time_attr_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_10_order_by/OrderBy_without_time_attr_Test.java
  class OrderBy_without_time_attr_Test (line 9) | public class OrderBy_without_time_attr_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_11_limit/Limit_Test.java
  class Limit_Test (line 9) | public class Limit_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_12_topn/TopN_Test.java
  class TopN_Test (line 9) | public class TopN_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_13_window_topn/WindowTopN_Test.java
  class WindowTopN_Test (line 9) | public class WindowTopN_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_14_retract/Retract_Test.java
  class Retract_Test (line 9) | public class Retract_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_15_exec_options/Default_Parallelism_Test.java
  class Default_Parallelism_Test (line 9) | public class Default_Parallelism_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_15_exec_options/Idle_Timeout_Test.java
  class Idle_Timeout_Test (line 9) | public class Idle_Timeout_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_15_exec_options/State_Ttl_Test.java
  class State_Ttl_Test (line 9) | public class State_Ttl_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_16_optimizer_options/Agg_OnePhase_Strategy_window_Test.java
  class Agg_OnePhase_Strategy_window_Test (line 9) | public class Agg_OnePhase_Strategy_window_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_16_optimizer_options/Agg_TwoPhase_Strategy_unbounded_Test.java
  class Agg_TwoPhase_Strategy_unbounded_Test (line 9) | public class Agg_TwoPhase_Strategy_unbounded_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_16_optimizer_options/Agg_TwoPhase_Strategy_window_Test.java
  class Agg_TwoPhase_Strategy_window_Test (line 9) | public class Agg_TwoPhase_Strategy_window_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_16_optimizer_options/DistinctAgg_Split_One_Distinct_Key_Test.java
  class DistinctAgg_Split_One_Distinct_Key_Test (line 9) | public class DistinctAgg_Split_One_Distinct_Key_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_16_optimizer_options/DistinctAgg_Split_Two_Distinct_Key_Test.java
  class DistinctAgg_Split_Two_Distinct_Key_Test (line 9) | public class DistinctAgg_Split_Two_Distinct_Key_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_17_table_options/Dml_Syc_False_Test.java
  class Dml_Syc_False_Test (line 10) | public class Dml_Syc_False_Test {
    method main (line 12) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_17_table_options/Dml_Syc_True_Test.java
  class Dml_Syc_True_Test (line 10) | public class Dml_Syc_True_Test {
    method main (line 12) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_17_table_options/TimeZone_window_Test.java
  class TimeZone_window_Test (line 9) | public class TimeZone_window_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_18_performance_tuning/Count_Distinct_Filter_Test.java
  class Count_Distinct_Filter_Test (line 10) | public class Count_Distinct_Filter_Test {
    method main (line 12) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/Utils.java
  class Utils (line 5) | public class Utils {
    method format (line 7) | public static String format(String sql) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_01_ddl/HiveDDLTest.java
  class HiveDDLTest (line 24) | public class HiveDDLTest {
    method main (line 26) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/HiveDMLBetweenAndTest.java
  class HiveDMLBetweenAndTest (line 27) | public class HiveDMLBetweenAndTest {
    method main (line 29) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/HiveDMLTest.java
  class HiveDMLTest (line 25) | public class HiveDMLTest {
    method main (line 27) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/HiveTest2.java
  class HiveTest2 (line 22) | public class HiveTest2 {
    method main (line 24) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/_01_hive_dialect/HiveDMLTest.java
  class HiveDMLTest (line 23) | public class HiveDMLTest {
    method main (line 25) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/_02_with_as/HIveWIthAsTest.java
  class HIveWIthAsTest (line 28) | public class HIveWIthAsTest {
    method main (line 30) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/_03_substr/HiveSubstrTest.java
  class HiveSubstrTest (line 28) | public class HiveSubstrTest {
    method main (line 30) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/_04_tumble_window/Test.java
  class Test (line 27) | public class Test {
    method main (line 44) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/_04_tumble_window/Test1.java
  class Test1 (line 27) | public class Test1 {
    method main (line 44) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/_04_tumble_window/Test2_BIGINT_SOURCE.java
  class Test2_BIGINT_SOURCE (line 27) | public class Test2_BIGINT_SOURCE {
    method main (line 44) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/_04_tumble_window/Test3.java
  class Test3 (line 27) | public class Test3 {
    method main (line 44) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/_04_tumble_window/Test5.java
  class Test5 (line 27) | public class Test5 {
    method main (line 44) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/_05_batch_to_datastream/Test.java
  class Test (line 29) | public class Test {
    method main (line 31) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_02_dml/_06_select_where/Test.java
  class Test (line 34) | public class Test {
    method main (line 36) | public static void main(String[] args) throws NoSuchFieldException, Il...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/HiveModuleV2.java
  class HiveModuleV2 (line 24) | public class HiveModuleV2 implements Module {
    method HiveModuleV2 (line 67) | public HiveModuleV2() {
    method HiveModuleV2 (line 71) | public HiveModuleV2(String hiveVersion) {
    method listFunctions (line 82) | @Override
    method getFunctionDefinition (line 95) | @Override
    method getHiveVersion (line 127) | public String getHiveVersion() {
    method registryHiveUDF (line 133) | public void registryHiveUDF(String hiveUDFName, String hiveUDFClassNam...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/HiveUDFRegistryTest.java
  class HiveUDFRegistryTest (line 25) | public class HiveUDFRegistryTest {
    method main (line 27) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/HiveUDFRegistryUnloadTest.java
  class HiveUDFRegistryUnloadTest (line 25) | public class HiveUDFRegistryUnloadTest {
    method main (line 27) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_01_GenericUDAFResolver2/HiveUDAF_hive_module_registry_Test.java
  class HiveUDAF_hive_module_registry_Test (line 17) | public class HiveUDAF_hive_module_registry_Test {
    method main (line 19) | public static void main(String[] args) throws IOException {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_01_GenericUDAFResolver2/HiveUDAF_sql_registry_create_function_Test.java
  class HiveUDAF_sql_registry_create_function_Test (line 17) | public class HiveUDAF_sql_registry_create_function_Test {
    method main (line 19) | public static void main(String[] args) throws ClassNotFoundException, ...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_01_GenericUDAFResolver2/HiveUDAF_sql_registry_create_temporary_function_Test.java
  class HiveUDAF_sql_registry_create_temporary_function_Test (line 17) | public class HiveUDAF_sql_registry_create_temporary_function_Test {
    method main (line 19) | public static void main(String[] args) throws ClassNotFoundException, ...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_01_GenericUDAFResolver2/TestHiveUDAF.java
  class TestHiveUDAF (line 15) | public class TestHiveUDAF implements GenericUDAFResolver2 {
    method getEvaluator (line 17) | public GenericUDAFEvaluator getEvaluator(TypeInfo[] parameters) throws...
    method getEvaluator (line 22) | public GenericUDAFEvaluator getEvaluator(GenericUDAFParameterInfo para...
    class InneGenericUDAFEvaluatorr (line 28) | public static class InneGenericUDAFEvaluatorr extends GenericUDAFEvalu...
      method init (line 31) | @Override
      class StringAgg (line 38) | static class StringAgg implements AggregationBuffer {
      method getNewAggregationBuffer (line 42) | @Override
      method reset (line 48) | @Override
      method iterate (line 54) | @Override
      method terminatePartial (line 63) | @Override
      method merge (line 68) | @Override
      method terminate (line 77) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_02_GenericUDTF/HiveUDTF_hive_module_registry_Test.java
  class HiveUDTF_hive_module_registry_Test (line 17) | public class HiveUDTF_hive_module_registry_Test {
    method main (line 19) | public static void main(String[] args) throws IOException {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_02_GenericUDTF/HiveUDTF_sql_registry_create_function_Test.java
  class HiveUDTF_sql_registry_create_function_Test (line 17) | public class HiveUDTF_sql_registry_create_function_Test {
    method main (line 19) | public static void main(String[] args) throws ClassNotFoundException, ...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_02_GenericUDTF/HiveUDTF_sql_registry_create_temporary_function_Test.java
  class HiveUDTF_sql_registry_create_temporary_function_Test (line 17) | public class HiveUDTF_sql_registry_create_temporary_function_Test {
    method main (line 19) | public static void main(String[] args) throws ClassNotFoundException, ...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_02_GenericUDTF/TestHiveUDTF.java
  class TestHiveUDTF (line 13) | public class TestHiveUDTF extends GenericUDTF {
    method initialize (line 15) | @Override
    method process (line 27) | @Override
    method close (line 35) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_03_built_in_udf/_01_get_json_object/HiveUDF_get_json_object_Test.java
  class HiveUDF_get_json_object_Test (line 28) | public class HiveUDF_get_json_object_Test {
    method main (line 30) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_03_built_in_udf/_02_rlike/HiveUDF_rlike_Test.java
  class HiveUDF_rlike_Test (line 27) | public class HiveUDF_rlike_Test {
    method main (line 29) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_04_GenericUDF/HiveUDF_hive_module_registry_Test.java
  class HiveUDF_hive_module_registry_Test (line 17) | public class HiveUDF_hive_module_registry_Test {
    method main (line 19) | public static void main(String[] args) throws IOException {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_04_GenericUDF/HiveUDF_sql_registry_create_function_Test.java
  class HiveUDF_sql_registry_create_function_Test (line 17) | public class HiveUDF_sql_registry_create_function_Test {
    method main (line 19) | public static void main(String[] args) throws ClassNotFoundException, ...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_04_GenericUDF/HiveUDF_sql_registry_create_temporary_function_Test.java
  class HiveUDF_sql_registry_create_temporary_function_Test (line 17) | public class HiveUDF_sql_registry_create_temporary_function_Test {
    method main (line 19) | public static void main(String[] args) throws ClassNotFoundException, ...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_03_hive_udf/_04_GenericUDF/TestGenericUDF.java
  class TestGenericUDF (line 12) | public class TestGenericUDF extends GenericUDF {
    method initialize (line 18) | @Override
    method evaluate (line 26) | @Override
    method getDisplayString (line 31) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_04_flink_udf/FlinkUDAF_Test.java
  class FlinkUDAF_Test (line 3) | public class FlinkUDAF_Test {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_04_flink_udf/FlinkUDF_Test.java
  class FlinkUDF_Test (line 3) | public class FlinkUDF_Test {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_04_flink_udf/FlinkUDTF_Test.java
  class FlinkUDTF_Test (line 3) | public class FlinkUDTF_Test {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_08/batch/_05_test/_01_batch_to_datastream/Test.java
  class Test (line 15) | public class Test {
    method main (line 17) | public static void main(String[] args) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_01_hive_udf/_01_GenericUDF/HiveUDF_sql_registry_create_function_Test.java
  class HiveUDF_sql_registry_create_function_Test (line 15) | public class HiveUDF_sql_registry_create_function_Test {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_01_hive_udf/_01_GenericUDF/HiveUDF_sql_registry_create_function_with_hive_catalog_Test.java
  class HiveUDF_sql_registry_create_function_with_hive_catalog_Test (line 15) | public class HiveUDF_sql_registry_create_function_with_hive_catalog_Test {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_01_hive_udf/_01_GenericUDF/HiveUDF_sql_registry_create_temporary_function_Test.java
  class HiveUDF_sql_registry_create_temporary_function_Test (line 15) | public class HiveUDF_sql_registry_create_temporary_function_Test {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_01_hive_udf/_01_GenericUDF/HiveUDF_sql_registry_create_temporary_function_with_hive_catalog_Test.java
  class HiveUDF_sql_registry_create_temporary_function_with_hive_catalog_Test (line 15) | public class HiveUDF_sql_registry_create_temporary_function_with_hive_ca...
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_01_hive_udf/_01_GenericUDF/TestGenericUDF.java
  class TestGenericUDF (line 12) | public class TestGenericUDF extends GenericUDF {
    method initialize (line 18) | @Override
    method evaluate (line 26) | @Override
    method getDisplayString (line 31) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_02_stream_hive_udf/HiveUDF_Error_Test.java
  class HiveUDF_Error_Test (line 15) | public class HiveUDF_Error_Test {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_02_stream_hive_udf/HiveUDF_create_temporary_error_Test.java
  class HiveUDF_create_temporary_error_Test (line 15) | public class HiveUDF_create_temporary_error_Test {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_02_stream_hive_udf/HiveUDF_hive_module_registry_Test.java
  class HiveUDF_hive_module_registry_Test (line 6) | public class HiveUDF_hive_module_registry_Test {
    method main (line 8) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_02_stream_hive_udf/HiveUDF_load_first_Test.java
  class HiveUDF_load_first_Test (line 17) | public class HiveUDF_load_first_Test {
    method main (line 19) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_02_stream_hive_udf/HiveUDF_load_second_Test.java
  class HiveUDF_load_second_Test (line 15) | public class HiveUDF_load_second_Test {
    method main (line 17) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_02_stream_hive_udf/TestGenericUDF.java
  class TestGenericUDF (line 12) | public class TestGenericUDF extends GenericUDF {
    method initialize (line 18) | @Override
    method evaluate (line 26) | @Override
    method getDisplayString (line 31) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_02_stream_hive_udf/UserDefinedSource.java
  class UserDefinedSource (line 11) | public class UserDefinedSource extends RichSourceFunction<RowData> {
    method UserDefinedSource (line 17) | public UserDefinedSource(DeserializationSchema<RowData> dser) {
    method run (line 21) | @Override
    method cancel (line 36) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_03_advanced_type_inference/AdvancedFunctionsExample.java
  class AdvancedFunctionsExample (line 11) | public class AdvancedFunctionsExample {
    method main (line 13) | public static void main(String[] args) throws Exception {
    method executeLastDatedValueFunction (line 28) | private static void executeLastDatedValueFunction(TableEnvironment env) {
    method executeInternalRowMergerFunction (line 56) | private static void executeInternalRowMergerFunction(TableEnvironment ...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_03_advanced_type_inference/InternalRowMergerFunction.java
  class InternalRowMergerFunction (line 26) | public class InternalRowMergerFunction extends ScalarFunction {
    method getTypeInference (line 32) | @Override
    method eval (line 103) | public RowData eval(RowData r1, RowData r2) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_03_advanced_type_inference/LastDatedValueFunction.java
  class LastDatedValueFunction (line 16) | public class LastDatedValueFunction<T>
    method getTypeInference (line 34) | @Override
    class Accumulator (line 77) | public static class Accumulator<T> {
    method createAccumulator (line 82) | @Override
    method accumulate (line 91) | public void accumulate(Accumulator<T> acc, T input, LocalDate date) {
    method getValue (line 98) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_04_udf/UDAF_Test.java
  class UDAF_Test (line 22) | public class UDAF_Test {
    method main (line 24) | public static void main(String[] args) throws Exception {
    class Sentence (line 62) | @Data
      method compareTo (line 71) | public int compareTo(Sentence s) {
    class CollectList1 (line 76) | public static class CollectList1 extends AggregateFunction<Sentence, S...
      method getValue (line 78) | @Override
      method createAccumulator (line 83) | @Override
      method accumulate (line 88) | public void accumulate(Sentence list, String msgid, Integer type, St...
      method merge (line 92) | public void merge(Sentence list, Iterable<Sentence> it) {
    class CollectList (line 110) | public static class CollectList extends AggregateFunction<List<Sentenc...
      method getValue (line 112) | @Override
      method createAccumulator (line 117) | @Override
      method accumulate (line 122) | public void accumulate(List<Sentence> list, String msgid, Integer ty...
      method merge (line 126) | public void merge(List<Sentence> list, Iterable<List<Sentence>> it) {
      method getAccumulatorType (line 132) | @Override
      method getResultType (line 140) | @Override
    class ToJson (line 148) | public static class ToJson extends ScalarFunction {
      method eval (line 149) | public String eval(List<String> in) {
    class CollectList2 (line 159) | @FunctionHint(
      method accumulate (line 167) | public void accumulate(TreeSetAccumulator acc, String value){
      method getValue (line 176) | @Override
      method createAccumulator (line 181) | @Override
    class TreeSetAccumulator (line 187) | public static class TreeSetAccumulator<T extends Comparable<?>>
      method add (line 194) | @Override
      method getLocalValue (line 199) | @Override
      method resetLocal (line 204) | @Override
      method merge (line 209) | @Override
      method clone (line 214) | @Override
      method toString (line 221) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_05_scalar_function/ExplodeUDTF.java
  class ExplodeUDTF (line 9) | public class ExplodeUDTF extends TableFunction<String> {
    method eval (line 11) | public void eval(@DataTypeHint("RAW") Object test) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_05_scalar_function/ExplodeUDTFV2.java
  class ExplodeUDTFV2 (line 6) | public class ExplodeUDTFV2 extends TableFunction<String[]> {
    method eval (line 8) | public void eval(String worlds) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_05_scalar_function/GetMapValue.java
  class GetMapValue (line 8) | public class GetMapValue extends ScalarFunction {
    method eval (line 10) | public String eval(@DataTypeHint("RAW") Object map, String key) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_05_scalar_function/GetSetValue.java
  class GetSetValue (line 8) | public class GetSetValue extends ScalarFunction {
    method eval (line 10) | public String eval(@DataTypeHint("RAW") Object set) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_05_scalar_function/ScalarFunctionTest.java
  class ScalarFunctionTest (line 9) | public class ScalarFunctionTest {
    method main (line 12) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_05_scalar_function/ScalarFunctionTest2.java
  class ScalarFunctionTest2 (line 9) | public class ScalarFunctionTest2 {
    method main (line 12) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_05_scalar_function/SetStringUDF.java
  class SetStringUDF (line 13) | public class SetStringUDF extends ScalarFunction {
    method eval (line 15) | @DataTypeHint("RAW")
    method getResultType (line 20) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_09/udf/_05_scalar_function/TableFunctionTest2.java
  class TableFunctionTest2 (line 9) | public class TableFunctionTest2 {
    method main (line 12) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_10_share/A.java
  class A (line 17) | public class A {
    method main (line 19) | public static void main(String[] args) throws Exception {
    class UserDefinedSource (line 77) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 81) | @Override
      method cancel (line 94) | @Override
      method getProducedType (line 99) | @Override

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_11_explain/Explain_Test.java
  class Explain_Test (line 9) | public class Explain_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_12_data_type/_01_interval/Timestamp3_Interval_To_Test.java
  class Timestamp3_Interval_To_Test (line 9) | public class Timestamp3_Interval_To_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_12_data_type/_01_interval/Timestamp_ltz3_Interval_To_Test.java
  class Timestamp_ltz3_Interval_To_Test (line 9) | public class Timestamp_ltz3_Interval_To_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_12_data_type/_02_user_defined/User.java
  class User (line 7) | public class User {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_12_data_type/_02_user_defined/UserDefinedDataTypes_Test.java
  class UserDefinedDataTypes_Test (line 9) | public class UserDefinedDataTypes_Test {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_12_data_type/_02_user_defined/UserDefinedDataTypes_Test2.java
  class UserDefinedDataTypes_Test2 (line 9) | public class UserDefinedDataTypes_Test2 {
    method main (line 11) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_12_data_type/_02_user_defined/UserScalarFunction.java
  class UserScalarFunction (line 7) | public class UserScalarFunction extends ScalarFunction {
    method eval (line 10) | public User eval(long i) {
    method eval (line 27) | public String eval(User i) {

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_12_data_type/_03_raw/RawScalarFunction.java
  class RawScalarFunction (line 11) | public class RawScalarFunction extends ScalarFunction {
    method eval (line 14) | public User eval(long i) {
    method eval (line 31) | @DataTypeHint(value = "RAW", bridgedTo = String.class, rawSerializer =...

FILE: flink-examples-1.13/src/main/java/flink/examples/sql/_12_data_type/_03_raw/Raw_DataTypes_Test2.java
  class Raw_DataTypes_Test2 (line 11) | public class Raw_DataTypes_Test2 {
    method main (line 13) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.13/src/test/java/flink/examples/sql/_06/calcite/CalciteTest.java
  class CalciteTest (line 15) | public class CalciteTest {
    method main (line 17) | public static void main(String[] args) {
    method config (line 26) | public static Frameworks.ConfigBuilder config() {

FILE: flink-examples-1.13/src/test/java/flink/examples/sql/_07/query/_06_joins/JaninoCompileTest.java
  class JaninoCompileTest (line 8) | public class JaninoCompileTest {
    method main (line 10) | public static void main(String[] args) throws Exception {

FILE: flink-examples-1.14/src/main/java/flink/examples/sql/_08/batch/HiveModuleV2.java
  class HiveModuleV2 (line 24) | public class HiveModuleV2 implements Module {
    method HiveModuleV2 (line 67) | public HiveModuleV2() {
    method HiveModuleV2 (line 71) | public HiveModuleV2(String hiveVersion) {
    method listFunctions (line 82) | @Override
    method getFunctionDefinition (line 95) | @Override
    method getHiveVersion (line 127) | public String getHiveVersion() {
    method registryHiveUDF (line 133) | public void registryHiveUDF(String hiveUDFName, String hiveUDFClassNam...

FILE: flink-examples-1.14/src/main/java/flink/examples/sql/_08/batch/Test.java
  class Test (line 33) | public class Test {
    method main (line 35) | public static void main(String[] args) {
    class UserDefinedSource (line 100) | private static class UserDefinedSource implements SourceFunction<Row>,...
      method run (line 104) | @Override
      method cancel (line 123) | @Override
      method getProducedType (line 128) | @Override
Condensed preview — 393 files, each showing path, character count, and a content snippet.
[
  {
    "path": ".gitignore",
    "chars": 146,
    "preview": "HELP.md\ntarget/\n!.mvn/wrapper/maven-wrapper.jar\n!**/src/main/**\n#**/src/test/**\n.idea/\n*.iml\n*.DS_Store\n\n### IntelliJ ID"
  },
  {
    "path": "README.md",
    "chars": 7531,
    "preview": "# 1.友情提示\n\n> 1. 联系我:如要有问题咨询,请联系我(公众号:[`大数据羊说`](#32公众号),备注来自`GitHub`)\n> 2. 该仓库会持续更新 flink 教程福利干货,麻烦路过的各位亲给这个项目点个 `star`,太不"
  },
  {
    "path": "flink-examples-1.10/pom.xml",
    "chars": 17729,
    "preview": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\"\n         xmlns:xsi=\"http://www"
  },
  {
    "path": "flink-examples-1.10/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_01_outer_join/WindowJoinFunction$46.java",
    "chars": 2059,
    "preview": "package flink.examples.sql._07.query._06_joins._02_interval_joins._01_outer_join;\n\n\npublic class WindowJoinFunction$46\n "
  },
  {
    "path": "flink-examples-1.10/src/main/java/flink/examples/sql/_07/query/_06_joins/_02_interval_joins/_01_outer_join/_06_Interval_Outer_Joins_EventTime_Test.java",
    "chars": 6471,
    "preview": "package flink.examples.sql._07.query._06_joins._02_interval_joins._01_outer_join;\n\nimport java.util.concurrent.TimeUnit;"
  },
  {
    "path": "flink-examples-1.12/.gitignore",
    "chars": 146,
    "preview": "HELP.md\ntarget/\n!.mvn/wrapper/maven-wrapper.jar\n!**/src/main/**\n#**/src/test/**\n.idea/\n*.iml\n*.DS_Store\n\n### IntelliJ ID"
  },
  {
    "path": "flink-examples-1.12/pom.xml",
    "chars": 20117,
    "preview": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\"\n         xmlns:xsi=\"http://www"
  },
  {
    "path": "flink-examples-1.12/src/main/java/flink/examples/datastream/_07/query/_04_window/_04_TumbleWindowTest.java",
    "chars": 2482,
    "preview": "package flink.examples.datastream._07.query._04_window;\n\nimport org.apache.flink.api.java.functions.KeySelector;\nimport "
  },
  {
    "path": "flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest.java",
    "chars": 4588,
    "preview": "package flink.examples.sql._07.query._04_window_agg;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org.apache.flink.api"
  },
  {
    "path": "flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest_GroupingWindowAggsHandler$59.java",
    "chars": 16669,
    "preview": "package flink.examples.sql._07.query._04_window_agg;\n\n\npublic final class _04_TumbleWindowTest_GroupingWindowAggsHandler"
  },
  {
    "path": "flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest_KeyProjection$69.java",
    "chars": 1656,
    "preview": "package flink.examples.sql._07.query._04_window_agg;\n\n\npublic final class _04_TumbleWindowTest_KeyProjection$69 implemen"
  },
  {
    "path": "flink-examples-1.12/src/main/java/flink/examples/sql/_07/query/_04_window_agg/_04_TumbleWindowTest_WatermarkGenerator$6.java",
    "chars": 1311,
    "preview": "package flink.examples.sql._07.query._04_window_agg;\n\n\npublic final class _04_TumbleWindowTest_WatermarkGenerator$6\n    "
  },
  {
    "path": "flink-examples-1.13/.gitignore",
    "chars": 146,
    "preview": "HELP.md\ntarget/\n!.mvn/wrapper/maven-wrapper.jar\n!**/src/main/**\n#**/src/test/**\n.idea/\n*.iml\n*.DS_Store\n\n### IntelliJ ID"
  },
  {
    "path": "flink-examples-1.13/pom.xml",
    "chars": 25308,
    "preview": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\"\n         xmlns:xsi=\"http://www"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/core/source/JaninoUtils.java",
    "chars": 580,
    "preview": "package flink.core.source;\n\nimport org.codehaus.janino.SimpleCompiler;\n\nimport lombok.extern.slf4j.Slf4j;\n\n\n@Slf4j\npubli"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/core/source/SourceFactory.java",
    "chars": 2674,
    "preview": "package flink.core.source;\n\nimport java.io.IOException;\n\nimport org.apache.flink.api.common.serialization.Deserializatio"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/FlinkEnvUtils.java",
    "chars": 10595,
    "preview": "package flink.examples;\n\nimport java.io.IOException;\nimport java.util.Optional;\nimport java.util.concurrent.TimeUnit;\n\ni"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/JacksonUtils.java",
    "chars": 2234,
    "preview": "package flink.examples;\n\nimport static com.fasterxml.jackson.core.JsonParser.Feature.ALLOW_COMMENTS;\nimport static com.f"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/codegen/JaninoUtils.java",
    "chars": 1490,
    "preview": "package flink.examples.datastream._01.bytedance.split.codegen;\n\nimport org.codehaus.janino.SimpleCompiler;\n\nimport flink"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/codegen/benchmark/Benchmark.java",
    "chars": 3266,
    "preview": "package flink.examples.datastream._01.bytedance.split.codegen.benchmark;\n\nimport org.codehaus.groovy.control.CompilerCon"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/job/SplitExampleJob.java",
    "chars": 5008,
    "preview": "package flink.examples.datastream._01.bytedance.split.job;\n\nimport java.util.Date;\nimport java.util.concurrent.TimeUnit;"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/job/start.sh",
    "chars": 791,
    "preview": "# 1.kafka 初始化\n\ncd /kafka-bin-目录\n\n# 启动 kafka server\n./kafka-server-start /usr/local/etc/kafka/server.properties &\n\n# 创建 3"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/KafkaProducerCenter.java",
    "chars": 2767,
    "preview": "package flink.examples.datastream._01.bytedance.split.kafka;\n\nimport java.util.Properties;\nimport java.util.concurrent.C"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/demo/Application.java",
    "chars": 921,
    "preview": "package flink.examples.datastream._01.bytedance.split.kafka.demo;\n\n\npublic class Application {\n\n    private String topic"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/demo/ConsumerThread.java",
    "chars": 2092,
    "preview": "package flink.examples.datastream._01.bytedance.split.kafka.demo;\n\nimport java.time.Duration;\nimport java.util.Collectio"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/kafka/demo/ProducerThread.java",
    "chars": 2167,
    "preview": "package flink.examples.datastream._01.bytedance.split.kafka.demo;\n\nimport java.util.Properties;\nimport java.util.concurr"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/ClientLogSink.java",
    "chars": 225,
    "preview": "package flink.examples.datastream._01.bytedance.split.model;\n\nimport lombok.Builder;\nimport lombok.Data;\n\n\n@Data\n@Builde"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/ClientLogSource.java",
    "chars": 278,
    "preview": "package flink.examples.datastream._01.bytedance.split.model;\n\nimport lombok.Builder;\nimport lombok.Data;\n\n\n@Data\n@Builde"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/DynamicProducerRule.java",
    "chars": 1268,
    "preview": "package flink.examples.datastream._01.bytedance.split.model;\n\n\nimport flink.examples.datastream._01.bytedance.split.code"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/model/Evaluable.java",
    "chars": 147,
    "preview": "package flink.examples.datastream._01.bytedance.split.model;\n\n\npublic interface Evaluable {\n\n    boolean eval(ClientLogS"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/zkconfigcenter/ZkBasedConfigCenter.java",
    "chars": 5359,
    "preview": "package flink.examples.datastream._01.bytedance.split.zkconfigcenter;\n\nimport java.lang.reflect.Type;\nimport java.util.H"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/zkconfigcenter/new.json",
    "chars": 150,
    "preview": "{\"1\":{\"condition\":\"1==1\",\"targetTopic\":\"tuzisir1\"},\"2\":{\"condition\":\"1!=1\",\"targetTopic\":\"tuzisir2\"},\"3\":{\"condition\":\"1"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_01/bytedance/split/zkconfigcenter/old.json",
    "chars": 101,
    "preview": "{\"1\":{\"condition\":\"1==1\",\"targetTopic\":\"tuzisir1\"},\"2\":{\"condition\":\"1!=1\",\"targetTopic\":\"tuzisir2\"}}"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_02/DataStreamTest.java",
    "chars": 8453,
    "preview": "package flink.examples.datastream._02;\n\nimport java.io.IOException;\nimport java.util.Properties;\nimport java.util.concur"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_02/DataStreamTest1.java",
    "chars": 6645,
    "preview": "//package flink.examples.datastream._02;\n//\n//import java.io.IOException;\n//import java.util.Properties;\n//import java.u"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/enums_state/EnumsStateTest.java",
    "chars": 1188,
    "preview": "package flink.examples.datastream._03.enums_state;\n\nimport org.apache.flink.api.common.typeinfo.TypeInformation;\nimport "
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/enums_state/SenerioTest.java",
    "chars": 7177,
    "preview": "package flink.examples.datastream._03.enums_state;\n\nimport java.util.HashMap;\nimport java.util.Map;\nimport java.util.fun"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/StateExamplesTest.java",
    "chars": 8664,
    "preview": "package flink.examples.datastream._03.state;\n\nimport java.util.LinkedList;\nimport java.util.List;\n\nimport org.apache.fli"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_01_broadcast_state/BroadcastStateTest.java",
    "chars": 7793,
    "preview": "package flink.examples.datastream._03.state._01_broadcast_state;\n\nimport java.util.ArrayList;\nimport java.util.List;\nimp"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/CreateStateBackendTest.java",
    "chars": 6732,
    "preview": "//package flink.examples.datastream._03.state._03_rocksdb;\n//\n//import java.util.LinkedList;\n//import java.util.List;\n//"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/GettingStartDemo.java",
    "chars": 1505,
    "preview": "package flink.examples.datastream._03.state._03_rocksdb;\n\nimport org.rocksdb.Options;\nimport org.rocksdb.RocksDB;\nimport"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/Rocksdb_OperatorAndKeyedState_StateStorageDIr_Test.java",
    "chars": 7484,
    "preview": "package flink.examples.datastream._03.state._03_rocksdb;\n\nimport java.util.List;\n\nimport org.apache.flink.api.common.sta"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/keyed_state/RocksBackendKeyedMapStateTest.java",
    "chars": 7314,
    "preview": "package flink.examples.datastream._03.state._03_rocksdb.keyed_state;\n\nimport java.util.LinkedList;\nimport java.util.List"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/keyed_state/RocksBackendKeyedValueStateTest.java",
    "chars": 4678,
    "preview": "package flink.examples.datastream._03.state._03_rocksdb.keyed_state;\n\nimport java.util.LinkedList;\nimport java.util.List"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/operator_state/KeyedStreamOperatorListStateTest.java",
    "chars": 4695,
    "preview": "package flink.examples.datastream._03.state._03_rocksdb.operator_state;\n\nimport java.util.List;\n\nimport org.apache.flink"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_03_rocksdb/operator_state/RocksBackendOperatorListStateTest.java",
    "chars": 3902,
    "preview": "package flink.examples.datastream._03.state._03_rocksdb.operator_state;\n\nimport java.util.List;\n\nimport org.apache.flink"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_04_filesystem/keyed_state/FsStateBackendKeyedMapStateTest.java",
    "chars": 4804,
    "preview": "package flink.examples.datastream._03.state._04_filesystem.keyed_state;\n\nimport java.util.LinkedList;\nimport java.util.L"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_04_filesystem/operator_state/FsStateBackendOperatorListStateTest.java",
    "chars": 3939,
    "preview": "package flink.examples.datastream._03.state._04_filesystem.operator_state;\n\nimport java.util.List;\n\nimport org.apache.fl"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_03/state/_05_memory/keyed_state/MemoryStateBackendKeyedMapStateTest.java",
    "chars": 4921,
    "preview": "package flink.examples.datastream._03.state._05_memory.keyed_state;\n\nimport java.util.LinkedList;\nimport java.util.List;"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_04/keyed_co_process/HashMapTest.java",
    "chars": 522,
    "preview": "package flink.examples.datastream._04.keyed_co_process;\n\n\nimport java.util.HashMap;\nimport java.util.Map.Entry;\n\npublic "
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_04/keyed_co_process/_04_KeyedCoProcessFunctionTest.java",
    "chars": 10763,
    "preview": "package flink.examples.datastream._04.keyed_co_process;\n\nimport java.util.Map.Entry;\nimport java.util.concurrent.TimeUni"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_05_ken/_01_watermark/WatermarkTest.java",
    "chars": 4303,
    "preview": "package flink.examples.datastream._05_ken._01_watermark;\n\nimport java.util.HashSet;\nimport java.util.Set;\nimport java.ut"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_06_test/_01_event_proctime/OneJobWIthProcAndEventTimeWIndowTest.java",
    "chars": 5108,
    "preview": "package flink.examples.datastream._06_test._01_event_proctime;\n\nimport java.util.HashSet;\nimport java.util.Set;\nimport j"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_06_test/_01_event_proctime/OneJobWIthTimerTest.java",
    "chars": 3866,
    "preview": "package flink.examples.datastream._06_test._01_event_proctime;\n\nimport org.apache.flink.api.java.functions.KeySelector;\n"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_07_lambda_error/LambdaErrorTest.java",
    "chars": 2003,
    "preview": "package flink.examples.datastream._07_lambda_error;\n\nimport org.apache.flink.streaming.api.functions.source.SourceFuncti"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_08_late_record/LatenessTest.java",
    "chars": 4353,
    "preview": "package flink.examples.datastream._08_late_record;\n\nimport org.apache.flink.api.common.functions.FlatMapFunction;\nimport"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_09_join/_01_window_join/_01_Window_Join_Test.java",
    "chars": 2678,
    "preview": "//package flink.examples.datastream._09_join._01_window_join;\n//\n//import org.apache.flink.api.common.functions.FlatJoin"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_09_join/_02_connect/_01_Connect_Test.java",
    "chars": 2775,
    "preview": "package flink.examples.datastream._09_join._02_connect;\n\nimport org.apache.flink.api.common.state.MapState;\nimport org.a"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/datastream/_10_agg/AggTest.java",
    "chars": 3662,
    "preview": "package flink.examples.datastream._10_agg;\n\nimport org.apache.flink.api.common.functions.AggregateFunction;\nimport org.a"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/practice/_01/dau/_01_DataStream_Session_Window.java",
    "chars": 3037,
    "preview": "package flink.examples.practice._01.dau;\n\nimport java.util.Arrays;\n\nimport org.apache.flink.api.java.tuple.Tuple3;\nimpor"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/question/datastream/_01/kryo_protobuf_no_more_bytes_left/KryoProtobufNoMoreBytesLeftTest.java",
    "chars": 1969,
    "preview": "package flink.examples.question.datastream._01.kryo_protobuf_no_more_bytes_left;\n\nimport java.lang.reflect.Method;\n\nimpo"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/question/sql/_01/lots_source_fields_poor_performance/EmbeddedKafka.java",
    "chars": 1341,
    "preview": "//package flink.examples.question.sql._01.lots_source_fields_poor_performance;\n//\n//import static net.mguenther.kafka.ju"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/question/sql/_01/lots_source_fields_poor_performance/_01_DataGenSourceTest.java",
    "chars": 5813,
    "preview": "package flink.examples.question.sql._01.lots_source_fields_poor_performance;\n\nimport java.util.Arrays;\nimport java.util."
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/question/sql/_01/lots_source_fields_poor_performance/_01_JsonSourceTest.java",
    "chars": 6856,
    "preview": "package flink.examples.question.sql._01.lots_source_fields_poor_performance;\n\nimport java.util.Arrays;\nimport java.util."
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/CompletableFutureTest.java",
    "chars": 923,
    "preview": "package flink.examples.runtime._01.future;\n\nimport java.util.concurrent.CompletableFuture;\n\n\npublic class CompletableFut"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/CompletableFutureTest4.java",
    "chars": 1098,
    "preview": "package flink.examples.runtime._01.future;\n\nimport java.util.concurrent.CompletableFuture;\n\n\npublic class CompletableFut"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/CompletableFuture_AnyOf_Test3.java",
    "chars": 2050,
    "preview": "package flink.examples.runtime._01.future;\n\nimport java.util.concurrent.CompletableFuture;\n\n\npublic class CompletableFut"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/CompletableFuture_ThenApplyAsync_Test2.java",
    "chars": 1114,
    "preview": "package flink.examples.runtime._01.future;\n\nimport java.util.concurrent.CompletableFuture;\n\n\npublic class CompletableFut"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/CompletableFuture_ThenComposeAsync_Test2.java",
    "chars": 1155,
    "preview": "package flink.examples.runtime._01.future;\n\nimport java.util.concurrent.CompletableFuture;\n\n\npublic class CompletableFut"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/runtime/_01/future/FutureTest.java",
    "chars": 974,
    "preview": "package flink.examples.runtime._01.future;\n\nimport java.util.concurrent.Callable;\nimport java.util.concurrent.ExecutionE"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/runtime/_04/statebackend/CancelAndRestoreWithCheckpointTest.java",
    "chars": 6076,
    "preview": "package flink.examples.runtime._04.statebackend;\n\nimport java.util.Arrays;\nimport java.util.concurrent.TimeUnit;\n\nimport"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/CountDistinctErrorTest.java",
    "chars": 3547,
    "preview": "package flink.examples.sql._01.countdistincterror;\n\nimport java.util.Arrays;\n\nimport org.apache.flink.api.java.tuple.Tup"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/CountDistinctErrorTest2.java",
    "chars": 3667,
    "preview": "package flink.examples.sql._01.countdistincterror;\n\nimport java.util.Arrays;\n\nimport org.apache.flink.api.java.tuple.Tup"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/CountDistinctErrorTest3.java",
    "chars": 3572,
    "preview": "package flink.examples.sql._01.countdistincterror;\n\nimport java.util.Arrays;\n\nimport org.apache.flink.api.java.tuple.Tup"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/udf/Mod_UDF.java",
    "chars": 255,
    "preview": "package flink.examples.sql._01.countdistincterror.udf;\n\nimport org.apache.flink.table.functions.ScalarFunction;\n\n\npublic"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/udf/StatusMapper1_UDF.java",
    "chars": 607,
    "preview": "package flink.examples.sql._01.countdistincterror.udf;\n\nimport org.apache.flink.table.functions.ScalarFunction;\n\n\npublic"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_01/countdistincterror/udf/StatusMapper_UDF.java",
    "chars": 670,
    "preview": "package flink.examples.sql._01.countdistincterror.udf;\n\nimport org.apache.flink.table.functions.TableFunction;\n\n\npublic "
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_02/timezone/TimeZoneTest.java",
    "chars": 2964,
    "preview": "package flink.examples.sql._02.timezone;\n\nimport java.util.Arrays;\n\nimport org.apache.flink.api.java.tuple.Tuple3;\nimpor"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_02/timezone/TimeZoneTest2.java",
    "chars": 5128,
    "preview": "package flink.examples.sql._02.timezone;\n\nimport java.util.Arrays;\n\nimport org.apache.flink.api.java.tuple.Tuple3;\nimpor"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_02/timezone/TimeZoneTest3.java",
    "chars": 2430,
    "preview": "package flink.examples.sql._02.timezone;\n\nimport flink.examples.FlinkEnvUtils;\nimport flink.examples.FlinkEnvUtils.Flink"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/CreateViewTest.java",
    "chars": 3660,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUtils;\nimport flink"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/DataStreamSourceEventTimeTest.java",
    "chars": 3212,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport org.apache.flink.api.common.typeinfo.TypeInformation;\nimport org.apa"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/DataStreamSourceProcessingTimeTest.java",
    "chars": 2776,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport org.apache.flink.api.common.typeinfo.TypeInformation;\nimport org.apa"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/KafkaSourceTest.java",
    "chars": 1607,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport org.apache.flink.configuration.Configuration;\nimport org.apache.flin"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/RedisLookupTest.java",
    "chars": 3522,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport org.apache.flink.api.common.typeinfo.TypeInformation;\nimport org.apa"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/RedisSinkTest.java",
    "chars": 3046,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport org.apache.flink.api.common.typeinfo.TypeInformation;\nimport org.apa"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/SocketSourceTest.java",
    "chars": 2256,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport org.apache.flink.streaming.api.environment.StreamExecutionEnvironmen"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/TableApiKafkaSourceTest.java",
    "chars": 2772,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport org.apache.flink.api.common.typeinfo.TypeInformation;\nimport org.apa"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/UpsertKafkaSinkProtobufFormatSupportTest.java",
    "chars": 3598,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport org.apache.flink.configuration.Configuration;\n\nimport flink.examples"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/UpsertKafkaSinkTest.java",
    "chars": 3464,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport org.apache.flink.configuration.Configuration;\n\nimport flink.examples"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/UserDefinedSourceTest.java",
    "chars": 1469,
    "preview": "package flink.examples.sql._03.source_sink;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUtils;\nimport flink"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/sink/Abilities_SinkFunction.java",
    "chars": 1517,
    "preview": "package flink.examples.sql._03.source_sink.abilities.sink;\n\nimport org.apache.flink.api.common.functions.util.PrintSinkO"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/sink/Abilities_TableSink.java",
    "chars": 2967,
    "preview": "package flink.examples.sql._03.source_sink.abilities.sink;\n\nimport java.util.HashMap;\nimport java.util.List;\nimport java"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/sink/Abilities_TableSinkFactory.java",
    "chars": 2286,
    "preview": "package flink.examples.sql._03.source_sink.abilities.sink;\n\nimport static org.apache.flink.configuration.ConfigOptions.k"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/sink/_01_SupportsWritingMetadata_Test.java",
    "chars": 1785,
    "preview": "package flink.examples.sql._03.source_sink.abilities.sink;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUtil"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/Abilities_SourceFunction.java",
    "chars": 1988,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport org.apache.flink.api.common.serialization.Deseriali"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/Abilities_TableSource.java",
    "chars": 8262,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport java.util.HashMap;\nimport java.util.LinkedList;\nimp"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/Abilities_TableSourceFactory.java",
    "chars": 3580,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport java.util.HashSet;\nimport java.util.Set;\n\nimport or"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_01_SupportsFilterPushDown_Test.java",
    "chars": 1571,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUt"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_02_SupportsLimitPushDown_Test.java",
    "chars": 1587,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUt"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_03_SupportsPartitionPushDown_Test.java",
    "chars": 1688,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUt"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_04_SupportsProjectionPushDown_JDBC_Test.java",
    "chars": 1550,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUt"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_04_SupportsProjectionPushDown_Test.java",
    "chars": 1663,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUt"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_05_SupportsReadingMetadata_Test.java",
    "chars": 1654,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUt"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_06_SupportsWatermarkPushDown_Test.java",
    "chars": 1907,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUt"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/_07_SupportsSourceWatermark_Test.java",
    "chars": 2234,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUt"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/Before_Abilities_SourceFunction.java",
    "chars": 2023,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source.before;\n\nimport org.apache.flink.api.common.serialization.De"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/Before_Abilities_TableSource.java",
    "chars": 4209,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source.before;\n\nimport org.apache.flink.api.common.eventtime.Waterm"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/Before_Abilities_TableSourceFactory.java",
    "chars": 3608,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source.before;\n\nimport java.util.HashSet;\nimport java.util.Set;\n\nim"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_01_Before_SupportsFilterPushDown_Test.java",
    "chars": 1606,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source.before;\n\nimport java.util.Arrays;\n\nimport flink.examples.Fli"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_02_Before_SupportsLimitPushDown_Test.java",
    "chars": 1622,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source.before;\n\nimport java.util.Arrays;\n\nimport flink.examples.Fli"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_03_Before_SupportsPartitionPushDown_Test.java",
    "chars": 1723,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source.before;\n\nimport java.util.Arrays;\n\nimport flink.examples.Fli"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_04_Before_SupportsProjectionPushDown_Test.java",
    "chars": 1698,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source.before;\n\nimport java.util.Arrays;\n\nimport flink.examples.Fli"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_05_Before_SupportsReadingMetadata_Test.java",
    "chars": 1562,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source.before;\n\nimport java.util.Arrays;\n\nimport flink.examples.Fli"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_06_Before_SupportsWatermarkPushDown_Test.java",
    "chars": 1765,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source.before;\n\nimport java.util.Arrays;\n\nimport flink.examples.Fli"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/abilities/source/before/_07_Before_SupportsSourceWatermark_Test.java",
    "chars": 2207,
    "preview": "package flink.examples.sql._03.source_sink.abilities.source.before;\n\nimport java.util.Arrays;\n\nimport flink.examples.Fli"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/ddl/TableApiDDLTest.java",
    "chars": 3802,
    "preview": "package flink.examples.sql._03.source_sink.ddl;\n\nimport org.apache.flink.api.common.typeinfo.TypeInformation;\nimport org"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/container/RedisCommandsContainer.java",
    "chars": 1241,
    "preview": "/*\n * Licensed to the Apache Software Foundation (ASF) under one or more\n * contributor license agreements.  See the NOT"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/container/RedisCommandsContainerBuilder.java",
    "chars": 2250,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.container;\n\nimport java.util.Objects;\n\nimport org.apache.commons."
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/container/RedisContainer.java",
    "chars": 3437,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.container;\n\nimport java.io.Closeable;\nimport java.io.IOException;"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/demo/RedisDemo.java",
    "chars": 1208,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.demo;\n\nimport java.util.HashMap;\n\nimport com.google.gson.Gson;\n\ni"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/mapper/LookupRedisMapper.java",
    "chars": 1265,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.mapper;\n\n\nimport java.io.IOException;\n\nimport org.apache.flink.ap"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/mapper/RedisCommand.java",
    "chars": 469,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.mapper;\n\nimport org.apache.flink.streaming.connectors.redis.commo"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/mapper/RedisCommandDescription.java",
    "chars": 980,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.mapper;\n\nimport org.apache.flink.streaming.connectors.redis.commo"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/mapper/SetRedisMapper.java",
    "chars": 797,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.mapper;\n\nimport org.apache.flink.streaming.connectors.redis.commo"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/options/RedisLookupOptions.java",
    "chars": 4728,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.options;\n\nimport java.io.Serializable;\n\n\npublic class RedisLookup"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/options/RedisOptions.java",
    "chars": 8113,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.options;\n\nimport static flink.examples.sql._03.source_sink.table."
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/options/RedisWriteOptions.java",
    "chars": 3470,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.options;\n\nimport org.apache.flink.configuration.ConfigOption;\nimp"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v1/RedisDynamicTableFactory.java",
    "chars": 4401,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.v1;\n\nimport static flink.examples.sql._03.source_sink.table.redis"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v1/sink/RedisDynamicTableSink.java",
    "chars": 2284,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.v1.sink;//package flink.examples.sql._03.source_sink.table.redis."
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v1/source/RedisDynamicTableSource.java",
    "chars": 2719,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.v1.source;\n\nimport static flink.examples.sql._03.source_sink.tabl"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v1/source/RedisRowDataLookupFunction.java",
    "chars": 6602,
    "preview": "/*\n * Licensed to the Apache Software Foundation (ASF) under one\n * or more contributor license agreements.  See the NOT"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v2/RedisDynamicTableFactory.java",
    "chars": 5177,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.v2;\n\nimport static flink.examples.sql._03.source_sink.table.redis"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v2/sink/RedisDynamicTableSink.java",
    "chars": 3743,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.v2.sink;\n\nimport javax.annotation.Nullable;\n\nimport org.apache.fl"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v2/source/RedisDynamicTableSource.java",
    "chars": 3982,
    "preview": "package flink.examples.sql._03.source_sink.table.redis.v2.source;\n\nimport static flink.examples.sql._03.source_sink.tabl"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v2/source/RedisRowDataBatchLookupFunction.java",
    "chars": 10639,
    "preview": "/*\n * Licensed to the Apache Software Foundation (ASF) under one\n * or more contributor license agreements.  See the NOT"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/redis/v2/source/RedisRowDataLookupFunction.java",
    "chars": 8323,
    "preview": "/*\n * Licensed to the Apache Software Foundation (ASF) under one\n * or more contributor license agreements.  See the NOT"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/socket/SocketDynamicTableFactory.java",
    "chars": 3221,
    "preview": "package flink.examples.sql._03.source_sink.table.socket;\n\nimport java.util.HashSet;\nimport java.util.Set;\n\nimport org.ap"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/socket/SocketDynamicTableSource.java",
    "chars": 2468,
    "preview": "package flink.examples.sql._03.source_sink.table.socket;\n\nimport org.apache.flink.api.common.serialization.Deserializati"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/socket/SocketSourceFunction.java",
    "chars": 2162,
    "preview": "package flink.examples.sql._03.source_sink.table.socket;\n\nimport java.io.InputStream;\nimport java.net.InetSocketAddress;"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/user_defined/UserDefinedDynamicTableFactory.java",
    "chars": 2736,
    "preview": "package flink.examples.sql._03.source_sink.table.user_defined;\n\nimport java.util.HashSet;\nimport java.util.Set;\n\nimport "
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/user_defined/UserDefinedDynamicTableSource.java",
    "chars": 4868,
    "preview": "package flink.examples.sql._03.source_sink.table.user_defined;\n\nimport java.util.HashMap;\nimport java.util.List;\nimport "
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_03/source_sink/table/user_defined/UserDefinedSource.java",
    "chars": 1019,
    "preview": "package flink.examples.sql._03.source_sink.table.user_defined;\n\nimport org.apache.flink.api.common.serialization.Deseria"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_04/type/BlinkPlannerTest.java",
    "chars": 3230,
    "preview": "package flink.examples.sql._04.type;\n\nimport java.util.Arrays;\n\nimport org.apache.flink.api.java.tuple.Tuple3;\nimport or"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_04/type/JavaEnvTest.java",
    "chars": 2814,
    "preview": "package flink.examples.sql._04.type;//package flink.examples.sql._04.type;\n//\n//\n//import java.util.Arrays;\n//\n//import "
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_04/type/OldPlannerTest.java",
    "chars": 3226,
    "preview": "package flink.examples.sql._04.type;\n\nimport java.util.Arrays;\n\nimport org.apache.flink.api.java.tuple.Tuple3;\nimport or"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/ProtobufFormatTest.java",
    "chars": 2122,
    "preview": "package flink.examples.sql._05.format.formats;\n\nimport org.apache.flink.configuration.Configuration;\nimport org.apache.f"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/SocketWriteTest.java",
    "chars": 1235,
    "preview": "package flink.examples.sql._05.format.formats;\n\nimport java.io.IOException;\nimport java.net.ServerSocket;\nimport java.ne"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/csv/ChangelogCsvDeserializer.java",
    "chars": 2713,
    "preview": "package flink.examples.sql._05.format.formats.csv;\n\nimport java.util.List;\nimport java.util.regex.Pattern;\n\nimport org.a"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/csv/ChangelogCsvFormat.java",
    "chars": 2170,
    "preview": "package flink.examples.sql._05.format.formats.csv;\n\nimport java.util.List;\n\nimport org.apache.flink.api.common.serializa"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/csv/ChangelogCsvFormatFactory.java",
    "chars": 1919,
    "preview": "package flink.examples.sql._05.format.formats.csv;\n\nimport java.util.Collections;\nimport java.util.HashSet;\nimport java."
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/descriptors/Protobuf.java",
    "chars": 1896,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.descriptors;\n\nimport java.util.Map;\n\nimport org.apache.flink.anno"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/descriptors/ProtobufValidator.java",
    "chars": 1582,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.descriptors;\n\nimport org.apache.flink.table.api.ValidationExcepti"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufDeserializationSchema.java",
    "chars": 4084,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.row;\n\nimport java.io.IOException;\nimport java.io.ObjectInputStrea"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufRowDeserializationSchema.java",
    "chars": 16147,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.row;\n\n\nimport java.io.IOException;\nimport java.io.ObjectInputStre"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufRowFormatFactory.java",
    "chars": 5090,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.row;\n\nimport java.io.InputStream;\nimport java.util.ArrayList;\nimp"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufRowSerializationSchema.java",
    "chars": 15043,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.row;\n\n\nimport static com.google.protobuf.Descriptors.FieldDescrip"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufSerializationSchema.java",
    "chars": 356,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.row;\n\nimport org.apache.flink.api.common.serialization.Serializat"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufUtils.java",
    "chars": 3480,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.row;\n\nimport java.io.ByteArrayOutputStream;\nimport java.io.InputS"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/typeutils/ProtobufSchemaConverter.java",
    "chars": 9701,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.row.typeutils;\n\nimport java.util.List;\n\nimport org.apache.flink.a"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/ProtobufFormatFactory.java",
    "chars": 3264,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.rowdata;\n\nimport static flink.examples.sql._05.format.formats.pro"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/ProtobufOptions.java",
    "chars": 991,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.rowdata;\n\nimport org.apache.flink.configuration.ConfigOption;\nimp"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/ProtobufRowDataDeserializationSchema.java",
    "chars": 6393,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.rowdata;\n\nimport static java.lang.String.format;\n\nimport java.io."
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/ProtobufRowDataSerializationSchema.java",
    "chars": 368,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.rowdata;\n\nimport org.apache.flink.api.common.serialization.Serial"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/ProtobufToRowDataConverters.java",
    "chars": 18592,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.rowdata;\n\nimport java.io.Serializable;\nimport java.math.BigDecima"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/rowdata/RowDataToProtobufConverters.java",
    "chars": 111,
    "preview": "package flink.examples.sql._05.format.formats.protobuf.rowdata;\n\n\npublic class RowDataToProtobufConverters {\n}\n"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/utils/MoreRunnables.java",
    "chars": 339,
    "preview": "package flink.examples.sql._05.format.formats.utils;\n\npublic class MoreRunnables {\n\n\n    public static <EXCEPTION extend"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/utils/MoreSuppliers.java",
    "chars": 417,
    "preview": "package flink.examples.sql._05.format.formats.utils;\n\n\npublic class MoreSuppliers {\n\n    private MoreSuppliers() {\n     "
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/utils/ThrowableRunable.java",
    "chars": 177,
    "preview": "package flink.examples.sql._05.format.formats.utils;\n\n@FunctionalInterface\npublic interface ThrowableRunable<EXCEPTION e"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/utils/ThrowableSupplier.java",
    "chars": 182,
    "preview": "package flink.examples.sql._05.format.formats.utils;\n\n@FunctionalInterface\npublic interface ThrowableSupplier<OUT, EXCEP"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/CalciteTest.java",
    "chars": 481,
    "preview": "package flink.examples.sql._06.calcite;\n\nimport org.apache.calcite.sql.SqlNode;\nimport org.apache.calcite.sql.parser.Sql"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/ParserTest.java",
    "chars": 3554,
    "preview": "package flink.examples.sql._06.calcite;\n\nimport java.util.Arrays;\n\nimport org.apache.flink.api.java.tuple.Tuple3;\nimport"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/JavaccCodeGenTest.java",
    "chars": 527,
    "preview": "package flink.examples.sql._06.calcite.javacc;\n\n\n\npublic class JavaccCodeGenTest {\n\n    public static void main(String[]"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/Simple1Test.java",
    "chars": 246,
    "preview": "package flink.examples.sql._06.calcite.javacc;\n\n\nimport flink.examples.sql._06.calcite.javacc.generatedcode.Simple1;\n\n\np"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/ParseException.java",
    "chars": 6281,
    "preview": "package flink.examples.sql._06.calcite.javacc.generatedcode;/* Generated By:JavaCC: Do not edit this line. ParseExceptio"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/Simple1.java",
    "chars": 7320,
    "preview": "package flink.examples.sql._06.calcite.javacc.generatedcode;/* Simple1.java */\n/* Generated By:JavaCC: Do not edit this "
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/Simple1Constants.java",
    "chars": 489,
    "preview": "package flink.examples.sql._06.calcite.javacc.generatedcode;/* Generated By:JavaCC: Do not edit this line. Simple1Consta"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/Simple1TokenManager.java",
    "chars": 6152,
    "preview": "package flink.examples.sql._06.calcite.javacc.generatedcode;/* Simple1TokenManager.java */\n/* Generated By:JavaCC: Do no"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/SimpleCharStream.java",
    "chars": 12411,
    "preview": "package flink.examples.sql._06.calcite.javacc.generatedcode;/* Generated By:JavaCC: Do not edit this line. SimpleCharStr"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/Token.java",
    "chars": 4130,
    "preview": "package flink.examples.sql._06.calcite.javacc.generatedcode;/* Generated By:JavaCC: Do not edit this line. Token.java Ve"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/javacc/generatedcode/TokenMgrError.java",
    "chars": 4489,
    "preview": "package flink.examples.sql._06.calcite.javacc.generatedcode;/* Generated By:JavaCC: Do not edit this line. TokenMgrError"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereHiveDialect.java",
    "chars": 2417,
    "preview": "package flink.examples.sql._07.query._01_select_where;\n\nimport org.apache.flink.api.common.typeinfo.TypeInformation;\nimp"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereTest.java",
    "chars": 3879,
    "preview": "package flink.examples.sql._07.query._01_select_where;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org.apache.flink.a"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereTest2.java",
    "chars": 2498,
    "preview": "package flink.examples.sql._07.query._01_select_where;\n\nimport org.apache.flink.api.common.typeinfo.TypeInformation;\nimp"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereTest3.java",
    "chars": 3818,
    "preview": "package flink.examples.sql._07.query._01_select_where;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org.apache.flink.a"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereTest4.java",
    "chars": 3506,
    "preview": "package flink.examples.sql._07.query._01_select_where;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org.apache.flink.a"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/SelectWhereTest5.java",
    "chars": 1521,
    "preview": "package flink.examples.sql._07.query._01_select_where;\n\nimport java.util.Arrays;\n\nimport flink.examples.FlinkEnvUtils;\ni"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_01_select_where/StreamExecCalc$10.java",
    "chars": 2800,
    "preview": "package flink.examples.sql._07.query._01_select_where;\n\npublic class StreamExecCalc$10 extends org.apache.flink.table.ru"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_02_select_distinct/GroupAggsHandler$5.java",
    "chars": 2443,
    "preview": "package flink.examples.sql._07.query._02_select_distinct;\n\n\npublic final class GroupAggsHandler$5 implements org.apache."
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_02_select_distinct/KeyProjection$0.java",
    "chars": 1290,
    "preview": "package flink.examples.sql._07.query._02_select_distinct;\n\n\npublic class KeyProjection$0 implements\n        org.apache.f"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_02_select_distinct/SelectDistinctTest.java",
    "chars": 4003,
    "preview": "package flink.examples.sql._07.query._02_select_distinct;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org.apache.flin"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_02_select_distinct/SelectDistinctTest2.java",
    "chars": 2718,
    "preview": "package flink.examples.sql._07.query._02_select_distinct;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org.apache.flin"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_01_group_agg/GroupAggMiniBatchTest.java",
    "chars": 3956,
    "preview": "package flink.examples.sql._07.query._03_group_agg._01_group_agg;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org.apa"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_01_group_agg/GroupAggTest.java",
    "chars": 4169,
    "preview": "package flink.examples.sql._07.query._03_group_agg._01_group_agg;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org.apa"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_01_group_agg/GroupAggsHandler$39.java",
    "chars": 13882,
    "preview": "package flink.examples.sql._07.query._03_group_agg._01_group_agg;\n\n\npublic final class GroupAggsHandler$39 implements or"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_02_count_distinct/CountDistinctGroupAggTest.java",
    "chars": 2944,
    "preview": "package flink.examples.sql._07.query._03_group_agg._02_count_distinct;\n\nimport java.util.concurrent.TimeUnit;\n\nimport or"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_02_count_distinct/GroupAggsHandler$17.java",
    "chars": 6821,
    "preview": "package flink.examples.sql._07.query._03_group_agg._02_count_distinct;\n\n\npublic final class GroupAggsHandler$17 implemen"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_03_grouping_sets/GroupingSetsEqualsGroupAggUnionAllGroupAggTest2.java",
    "chars": 4121,
    "preview": "package flink.examples.sql._07.query._03_group_agg._03_grouping_sets;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_03_grouping_sets/GroupingSetsGroupAggTest.java",
    "chars": 3206,
    "preview": "package flink.examples.sql._07.query._03_group_agg._03_grouping_sets;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org"
  },
  {
    "path": "flink-examples-1.13/src/main/java/flink/examples/sql/_07/query/_03_group_agg/_03_grouping_sets/GroupingSetsGroupAggTest2.java",
    "chars": 2967,
    "preview": "package flink.examples.sql._07.query._03_group_agg._03_grouping_sets;\n\nimport java.util.concurrent.TimeUnit;\n\nimport org"
  }
]

// ... and 193 more files (download for full content)
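The index above is a JSON array of `path`/`chars`/`preview` records. As a minimal sketch of how such an index can be consumed programmatically (the two sample records reuse real paths from the listing; embedding them inline is just to keep the snippet self-contained):

```python
import json

# Two records in the same shape as the index above, embedded inline
# so the example runs without the downloaded file.
sample = """[
  {"path": "flink-examples-1.13/src/main/java/flink/examples/sql/_05/format/formats/protobuf/row/ProtobufUtils.java",
   "chars": 3480,
   "preview": "package flink.examples.sql._05.format.formats.protobuf.row;"},
  {"path": "flink-examples-1.13/src/main/java/flink/examples/sql/_06/calcite/CalciteTest.java",
   "chars": 481,
   "preview": "package flink.examples.sql._06.calcite;"}
]"""

entries = json.loads(sample)

# Filter entries whose path mentions a topic of interest and total their sizes.
protobuf_files = [e for e in entries if "protobuf" in e["path"]]
total_chars = sum(e["chars"] for e in protobuf_files)
print(len(protobuf_files), total_chars)  # prints: 1 3480
```

The same filter applied to the full downloaded index would let an agent pull out, say, only the Calcite/JavaCC files or only the `_07/query` SQL examples before feeding them to a model.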

About this extraction

This page contains the full source code of the yangyichao-mango/flink-study GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 393 files (1.5 MB), approximately 379.8k tokens, and a symbol index with 1459 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.
