丁云鹏 4 months ago
parent
commit
79423cf50a
38 files changed, 22 additions and 2837 deletions
  1. + 0 - 149  recommend-model-produce/src/main/python/models/dnn/README.md
  2. + 0 - 13  recommend-model-produce/src/main/python/models/dnn/__init__.py
  3. + 0 - 53  recommend-model-produce/src/main/python/models/dnn/benchmark.yaml
  4. + 0 - 5  recommend-model-produce/src/main/python/models/dnn/benchmark_data.sh
  5. + 0 - 55  recommend-model-produce/src/main/python/models/dnn/benchmark_gpubox.yaml
  6. + 0 - 93  recommend-model-produce/src/main/python/models/dnn/benchmark_reader.py
  7. + 0 - 63  recommend-model-produce/src/main/python/models/dnn/config.yaml
  8. + 0 - 55  recommend-model-produce/src/main/python/models/dnn/config_bigdata.yaml
  9. + 0 - 52  recommend-model-produce/src/main/python/models/dnn/config_cpups.yaml
  10. + 0 - 55  recommend-model-produce/src/main/python/models/dnn/config_gpubox.yaml
  11. + 0 - 54  recommend-model-produce/src/main/python/models/dnn/config_queuedataset.yaml
  12. + 0 - 55  recommend-model-produce/src/main/python/models/dnn/config_to_static.yaml
  13. + 0 - 84  recommend-model-produce/src/main/python/models/dnn/criteo_reader.py
  14. + 0 - 80  recommend-model-produce/src/main/python/models/dnn/data/sample_data/train/sample_train.txt
  15. + 0 - 96  recommend-model-produce/src/main/python/models/dnn/dygraph_model.py
  16. + 0 - 142  recommend-model-produce/src/main/python/models/dnn/net.py
  17. + 0 - 49  recommend-model-produce/src/main/python/models/dnn/online.yaml
  18. + 0 - 95  recommend-model-produce/src/main/python/models/dnn/queuedataset_reader.py
  19. + 0 - 129  recommend-model-produce/src/main/python/models/dnn/static_model.py
  20. + 0 - 138  recommend-model-produce/src/main/python/models/dnn/static_model_lod.py
  21. + 0 - 45  recommend-model-produce/src/main/python/models/dssm/bq_reader_infer.py
  22. + 0 - 48  recommend-model-produce/src/main/python/models/dssm/bq_reader_train.py
  23. + 0 - 82  recommend-model-produce/src/main/python/models/dssm/bq_reader_train_insid.py
  24. + 0 - 41  recommend-model-produce/src/main/python/models/dssm/config.yaml
  25. + 0 - 42  recommend-model-produce/src/main/python/models/dssm/config_bigdata.yaml
  26. + 9 - 7  recommend-model-produce/src/main/python/models/dssm/config_infer.yaml
  27. + 0 - 70  recommend-model-produce/src/main/python/models/dssm/config_online.yaml
  28. + 13 - 9  recommend-model-produce/src/main/python/models/dssm/config_trainer.yaml
  29. + 0 - 93  recommend-model-produce/src/main/python/models/dssm/dygraph_model.py
  30. + 0 - 146  recommend-model-produce/src/main/python/models/dssm/net_bak.py
  31. + 0 - 19  recommend-model-produce/src/main/python/models/dssm/run.sh
  32. + 0 - 112  recommend-model-produce/src/main/python/models/dssm/static_model_bak.py
  33. + 0 - 78  recommend-model-produce/src/main/python/models/dssm/transform.py
  34. + 0 - 100  recommend-model-produce/src/main/python/models/wide_and_deep_dataset/data/part-0
  35. + 0 - 100  recommend-model-produce/src/main/python/models/wide_and_deep_dataset/data/part-1
  36. + 0 - 202  recommend-model-produce/src/main/python/models/wide_and_deep_dataset/model.py
  37. + 0 - 47  recommend-model-produce/src/main/python/models/wide_and_deep_dataset/reader.py
  38. + 0 - 81  recommend-model-produce/src/main/python/models/wide_and_deep_dataset/train.py

+ 0 - 149
recommend-model-produce/src/main/python/models/dnn/README.md

@@ -1,149 +0,0 @@
-# DNN-based Click-Through Rate Prediction Model
-
-**[AI Studio online runtime environment](https://aistudio.baidu.com/aistudio/projectdetail/3240347)**
-
-The brief directory structure of this example is as follows:
-
-```
-├── data # sample data
-    ├── sample_data # sample data
-        ├── train
-            ├── sample_train.txt # sample training data
-├── __init__.py
-├── README.md # documentation
-├── config.yaml # sample-data configuration
-├── config_bigdata.yaml # full-dataset configuration
-├── net.py # core model network (shared by the dynamic and static graphs)
-├── criteo_reader.py # data reader
-├── static_model.py # builds the static graph
-├── dygraph_model.py # builds the dynamic graph
-├── benchmark.yaml # benchmark configuration
-├── benchmark_data.sh # benchmark data preparation
-├── benchmark_reader.py # benchmark data reader
-```
-
-Note: before reading this example, we recommend that you first read the following:
-
-[PaddleRec getting-started tutorial](https://github.com/PaddlePaddle/PaddleRec/blob/master/README.md)
-
-## Contents
-
-- [Model overview](#model-overview)
-- [Data preparation](#data-preparation)
-- [Runtime environment](#runtime-environment)
-- [Quick start](#quick-start)
-- [Model network](#model-network)
-- [Reproducing results](#reproducing-results)
-- [Advanced usage](#advanced-usage)
-- [FAQ](#faq)
-
-
-## Model overview
-CTR (Click-Through Rate) is a key metric in recommender systems and computational advertising, and estimating it underpins decisions such as which products to recommend or which ads to serve. In short, CTR prediction estimates, for each ad impression, whether the user will click. A CTR model combines a wide range of factors and features, is trained on large amounts of historical data, and ultimately supports business decisions.
-
-## Data preparation
-
-The training and test datasets are the Criteo dataset used in the [Display Advertising Challenge](https://www.kaggle.com/c/criteo-display-ad-challenge/). It has two parts: the training set contains a portion of Criteo's traffic over a period of time, and the test set contains the ad click traffic from the day following the training data.
-Each line of data has the following format:
-```
-<label> <integer feature 1> ... <integer feature 13> <categorical feature 1> ... <categorical feature 26>
-```
-Here ```<label>``` indicates whether the ad was clicked: 1 for clicked, 0 for not clicked. ```<integer feature>``` denotes a numerical (continuous) feature; there are 13 of them. ```<categorical feature>``` denotes a categorical (discrete) feature; there are 26 of them. Adjacent features are separated by ```\t```, and missing features are left empty. The ```<label>``` column has been removed from the test set.
-Sample data for a quick run is provided under the model's data directory; to use the full dataset, see the [Reproducing results](#reproducing-results) section below.
-
-## Runtime environment
-PaddlePaddle>=2.0
-
-python 2.7/3.5/3.6/3.7
-
-os : windows/linux/macos
-
-## Quick start
-Sample data is provided so you can try the model quickly, and the commands can be executed from any directory. To run quickly from the DNN model directory:
-```bash
-# enter the model directory
-# cd models/rank/dnn # can be run from any directory
-# dynamic-graph training
-python -u ../../../tools/trainer.py -m config.yaml # use config_bigdata.yaml for the full dataset
-# dynamic-graph inference
-python -u ../../../tools/infer.py -m config.yaml
-
-# static-graph training
-python -u ../../../tools/static_trainer.py -m config.yaml # use config_bigdata.yaml for the full dataset
-# static-graph inference
-python -u ../../../tools/static_infer.py -m config.yaml
-```
-## Model network
-### Input declarations
-As introduced in the data preparation section, the Criteo dataset contains both continuous and discrete (sparse) data, so the CTR-DNN model has three kinds of inputs: `dense_input` for the continuous data, whose dimension is set by the hyperparameter `dense_input_dim` and whose values are normalized floats; `sparse_inputs` for the discrete data, which in the Criteo dataset has 26 slots, so we create 26 integer sparse inputs named `1`~`26`; and finally each sample's `label`, an integer indicating whether it was clicked, with 0 for a negative sample and 1 for a positive sample.
-
-### CTR-DNN model network
-
-The CTR-DNN network is fairly straightforward: it is essentially a binary classification task; see `net.py` for the code. The model consists of an `Embedding` layer, four `FC` layers, and the corresponding loss and AUC computation for the classification task.
-
-#### Embedding layer
-First, the Embedding layer: its input is `sparse_input`, and it is defined by the hyperparameters `sparse_feature_number` and `sparse_feature_dim`. The `is_sparse` parameter deserves special mention: when `is_sparse=True` is set, the computation graph treats the parameter as sparse, and both backward updates and distributed communication are performed sparsely, which greatly improves efficiency while keeping results identical.
-
-After each sparse input passes through the Embedding layer, the outputs are collected into a list to make the subsequent concat convenient.
-
-```
-self.embedding = paddle.nn.Embedding(
-            self.sparse_feature_number,
-            self.sparse_feature_dim,
-            sparse=True,
-            weight_attr=paddle.ParamAttr(
-                name="SparseFeatFactors",
-                initializer=paddle.nn.initializer.Uniform()))
-```
-
-#### FC layers
-The values looked up from the embedding table for the discrete inputs are `concat`ed with the continuous inputs into a single tensor, which is the raw input to the fully connected layers. We use 4 FC layers; each layer's output dimension is set by the hyperparameter `fc_sizes`, each layer is followed by a `relu` activation, and each layer is initialized from a normal distribution whose standard deviation is inversely proportional to the square root of the previous layer's output dimension.
-```
-sizes = [sparse_feature_dim * num_field + dense_feature_dim
-            ] + self.layer_sizes + [2]
-acts = ["relu" for _ in range(len(self.layer_sizes))] + [None]
-self._mlp_layers = []
-for i in range(len(self.layer_sizes) + 1):
-    linear = paddle.nn.Linear(
-        in_features=sizes[i],
-        out_features=sizes[i + 1],
-        weight_attr=paddle.ParamAttr(
-            initializer=paddle.nn.initializer.Normal(
-                std=1.0 / math.sqrt(sizes[i]))))
-    self.add_sublayer('linear_%d' % i, linear)
-    self._mlp_layers.append(linear)
-    if acts[i] == 'relu':
-        act = paddle.nn.ReLU()
-        self.add_sublayer('act_%d' % i, act)
-        self._mlp_layers.append(act)
-
-```
-#### Loss and AUC computation
-- The prediction is produced by an FC layer with output shape 2 and a softmax activation, which gives the probability of each sample belonging to the positive or negative class.
-- Each sample's loss is given by the cross-entropy.
-- We also compute the AUC of the predictions.
-
-### Reproducing results
-To help users run every model quickly, we provide sample data under each model directory. To reproduce the results in this README, follow the steps below.
-On the full dataset the model achieves the following metrics:
-| Model | auc | batch_size | epoch_num | Time per epoch |
-| :------ | :------ | :------ | :------ | :------ |
-| dnn | 0.795+ | 512 | 4 | ~3 hours |
-
-1. Make sure your current directory is PaddleRec/models/rank/dnn
-2. Go to the paddlerec/datasets/criteo directory and run the script below; it downloads our preprocessed full Criteo dataset from a domestic mirror and extracts it to the target folder.
-``` bash
-cd ../../../datasets/criteo
-sh run.sh
-```
-3. Switch back to the model directory and run on the full dataset
-```bash
-cd - # switch back to the model directory
-# dynamic-graph training
-python -u ../../../tools/trainer.py -m config_bigdata.yaml # run on the full dataset
-python -u ../../../tools/infer.py -m config_bigdata.yaml # run on the full dataset
-```
-
-## Advanced usage
-
-## FAQ
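The raw Criteo row format described in this deleted README (a label, 13 tab-separated integer features with empty fields for missing values, then 26 categorical features) can be sketched in a few lines. This is a minimal illustration, not code from the commit; the min/range constants are copied from the benchmark_reader.py deleted later in this commit, and the sample row is made up:

```python
# Min and range constants for the 13 continuous Criteo features
# (copied from the deleted benchmark_reader.py).
cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]

def normalize_dense(fields):
    """Scale the 13 raw integer features; empty (missing) fields become 0.0."""
    return [0.0 if raw == "" else (float(raw) - cont_min_[i]) / cont_diff_[i]
            for i, raw in enumerate(fields)]

# A made-up row: <label> + 13 integer features + 26 categorical features, tab-separated.
row = "0\t1\t5\t\t2" + "\t" * 35
fields = row.split("\t")
label, dense = int(fields[0]), normalize_dense(fields[1:14])
# dense[0] == (1 - 0) / 20 == 0.05; the missing third feature stays 0.0
```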

+ 0 - 13
recommend-model-produce/src/main/python/models/dnn/__init__.py

@@ -1,13 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.

+ 0 - 53
recommend-model-produce/src/main/python/models/dnn/benchmark.yaml

@@ -1,53 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-hyper_parameters:
-  optimizer:
-    class: Adam
-    learning_rate: 0.0001
-    adam_lazy_mode: True
-  sparse_inputs_slots: 27
-  sparse_feature_number: 1000001
-  sparse_feature_dim: 10
-  dense_input_dim: 13
-  fc_sizes: [400, 400, 400]
-
-runner:
-  epochs: 15
-  print_interval: 100
-
-  geo_step: 400
-  sync_mode: "async"  # sync / async /geo / heter
-  thread_num: 16
-  use_gpu: 0
-  
-  model_path: "static_model.py"
-  reader_type: "QueueDataset"  # DataLoader / QueueDataset / RecDataset
-  pipe_command: "python benchmark_reader.py"
-  dataset_debug: False
-  split_file_list: False
-
-  train_batch_size: 1000
-  train_data_dir: "train_data"
-  train_reader_path: "benchmark_reader"
-  model_save_path: "model"
-
-  infer_batch_size: 1000
-  test_data_dir: "test_data"
-  infer_reader_path: "benchmark_reader"
-  infer_load_path: "model"
-  infer_start_epoch: 0
-  infer_end_epoch: 1
-
-  

+ 0 - 5
recommend-model-produce/src/main/python/models/dnn/benchmark_data.sh

@@ -1,5 +0,0 @@
-echo "Begin DownLoad Criteo Data"
-wget --no-check-certificate https://paddlerec.bj.bcebos.com/benchmark/criteo_benchmark_data.tar.gz
-echo "Begin Unzip Criteo Data"
-tar -xf criteo_benchmark_data.tar.gz
-echo "Get Criteo Data Success"

+ 0 - 55
recommend-model-produce/src/main/python/models/dnn/benchmark_gpubox.yaml

@@ -1,55 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-# workspace
-#workspace: "models/rank/dnn"
-
-
-runner:
-  train_data_dir: "train_data"
-  train_reader_path: "criteo_reader" # importlib format
-  use_gpu: True
-  use_auc: True
-  train_batch_size: 2048
-  epochs: 3
-  print_interval: 10
-  model_save_path: "output_model_dnn_queue"
-
-  sync_mode: "gpubox"
-  thread_num: 30
-  reader_type: "InmemoryDataset"  # DataLoader / QueueDataset / RecDataset / InmemoryDataset
-  pipe_command: "python models/rank/dnn/benchmark_reader.py"
-  dataset_debug: False
-  split_file_list: False
-
-  infer_batch_size: 2
-  infer_reader_path: "criteo_reader" # importlib format
-  test_data_dir: "data/sample_data/train"
-  infer_load_path: "output_model_dnn_queue"
-  infer_start_epoch: 0
-  infer_end_epoch: 3
-# hyper parameters of user-defined network
-hyper_parameters:
-  # optimizer config
-  optimizer:
-    class: Adam
-    learning_rate: 0.001
-    strategy: async
-  # user-defined <key, value> pairs
-  sparse_inputs_slots: 27
-  sparse_feature_number: 1024
-  sparse_feature_dim: 11
-  dense_input_dim: 13
-  fc_sizes: [512, 256, 128, 32]
-  distributed_embedding: 0

+ 0 - 93
recommend-model-produce/src/main/python/models/dnn/benchmark_reader.py

@@ -1,93 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-import sys
-import yaml
-import six
-import os
-import copy
-import xxhash
-import paddle.distributed.fleet as fleet
-import logging
-
-cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
-cont_max_ = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
-cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
-hash_dim_ = 1000001
-continuous_range_ = range(1, 14)
-categorical_range_ = range(14, 40)
-
-logging.basicConfig(
-    format='%(asctime)s - %(levelname)s - %(message)s', level=logging.INFO)
-logger = logging.getLogger(__name__)
-
-
-class Reader(fleet.MultiSlotDataGenerator):
-    def init(self, config):
-        self.config = config
-
-    def line_process(self, line):
-        features = line.rstrip('\n').split('\t')
-        dense_feature = []
-        sparse_feature = []
-        for idx in continuous_range_:
-            if features[idx] == "":
-                dense_feature.append(0.0)
-            else:
-                dense_feature.append(
-                    (float(features[idx]) - cont_min_[idx - 1]) /
-                    cont_diff_[idx - 1])
-        for idx in categorical_range_:
-            sparse_feature.append([
-                xxhash.xxh32(str(idx) + features[idx]).intdigest() % hash_dim_
-            ])
-        label = [int(features[0])]
-        return [label] + sparse_feature + [dense_feature]
-
-    def generate_sample(self, line):
-        "Dataset Generator"
-
-        def reader():
-            input_data = self.line_process(line)
-            feature_name = ["dense_input"]
-            for idx in categorical_range_:
-                feature_name.append("C" + str(idx - 13))
-            feature_name.append("label")
-            yield zip(feature_name, input_data)
-
-        return reader
-
-    def dataloader(self, file_list):
-        "DataLoader Pyreader Generator"
-
-        def reader():
-            for file in file_list:
-                with open(file, 'r') as f:
-                    for line in f:
-                        input_data = self.line_process(line)
-                        yield input_data
-
-        return reader
-
-
-if __name__ == "__main__":
-    yaml_path = sys.argv[1]
-    utils_path = sys.argv[2]
-    sys.path.append(utils_path)
-    import common_ps
-    yaml_helper = common_ps.YamlHelper()
-    config = yaml_helper.load_yaml(yaml_path)
-
-    r = Reader()
-    r.init(config)
-    r.run_from_stdin()
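The categorical hashing in `line_process` above maps each "<slot index> + raw value" string into `hash_dim_` buckets so arbitrary strings become bounded embedding row ids. A standalone sketch of the same scheme, using stdlib `hashlib` as a stand-in for the `xxhash.xxh32` call in the deleted reader (bucket values therefore differ from the original; the input string is an invented example):

```python
import hashlib

HASH_DIM = 1000001  # hash_dim_ in the deleted benchmark_reader.py

def hash_feature(slot_idx, raw_value):
    # Map an arbitrary categorical string to a bounded embedding row id.
    # Stand-in for xxhash.xxh32(str(slot_idx) + raw_value).intdigest() % HASH_DIM.
    key = (str(slot_idx) + raw_value).encode("utf-8")
    return int.from_bytes(hashlib.md5(key).digest()[:4], "little") % HASH_DIM

bucket = hash_feature(14, "68fd1e64")  # deterministic, always in [0, HASH_DIM)
```

Prefixing the slot index means the same raw string in different slots lands in different buckets, which keeps the 26 slots' feature spaces separate inside one shared embedding table.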

+ 0 - 63
recommend-model-produce/src/main/python/models/dnn/config.yaml

@@ -1,63 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-# workspace
-#workspace: "models/rank/dnn"
-
-
-runner:
-  train_data_dir: "data/sample_data/train"
-  train_reader_path: "criteo_reader" # importlib format
-  use_gpu: False
-  use_auc: False
-  train_batch_size: 2
-  epochs: 1
-  print_interval: 2
-  model_save_path: "output_model_dnn"
-  infer_batch_size: 2
-  infer_reader_path: "criteo_reader" # importlib format
-  test_data_dir: "data/sample_data/train"
-  infer_load_path: "output_model_dnn"
-  infer_start_epoch: 0
-  infer_end_epoch: 3
-  num_workers: 0
-  #use inference save model
-  # model_init_path: "output_model_dnn/2" # init model
-  use_inference: True
-  save_inference_feed_varnames: ["1"]
-  save_inference_fetch_varnames: ["1","2","3","4","5","6","7","8","9","10","11","12","13","14","15","16","17","18","19","20","21","22","23","24","25","26","dense_input","label", "SparseFeatFactors"]
-
-  # distribute_config
-  sync_mode: "async"
-  split_file_list: False
-  thread_num: 1
-
-  upload_oss: True
-  oss_object_name: "dyp/dnn.tar.gz"
-
-
-# hyper parameters of user-defined network
-hyper_parameters:
-  # optimizer config
-  optimizer:
-    class: Adam
-    learning_rate: 0.001
-    strategy: async
-  # user-defined <key, value> pairs
-  sparse_inputs_slots: 27
-  sparse_feature_number: 1000001
-  sparse_feature_dim: 9
-  dense_input_dim: 13
-  fc_sizes: [256, 256, 128]
-  distributed_embedding: 0

+ 0 - 55
recommend-model-produce/src/main/python/models/dnn/config_bigdata.yaml

@@ -1,55 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-# workspace
-#workspace: "models/rank/dnn"
-
-
-runner:
-  train_data_dir: "../../../datasets/criteo/slot_train_data_full"
-  train_reader_path: "criteo_reader" # importlib format
-  use_gpu: False
-  use_auc: True
-  train_batch_size: 512
-  epochs: 4
-  print_interval: 10
-  model_save_path: "output_model_dnn_all"
-  infer_reader_path: "criteo_reader" # importlib format
-  test_data_dir: "../../../datasets/criteo/slot_test_data_full"
-  infer_batch_size: 512
-  infer_load_path: "output_model_dnn_all"
-  infer_start_epoch: 0
-  infer_end_epoch: 4
-  num_workers: 0
-
-  #thread_num: 5
-  #reader_type: "QueueDataset"  # DataLoader / QueueDataset / RecDataset
-  #pipe_command: "python3.7 queuedataset_reader.py"
-  #dataset_debug: False
-  #split_file_list: False
-
-# hyper parameters of user-defined network
-hyper_parameters:
-  # optimizer config
-  optimizer:
-    class: Adam
-    learning_rate: 0.001
-    strategy: async
-  # user-defined <key, value> pairs
-  sparse_inputs_slots: 27
-  sparse_feature_number: 1000001
-  sparse_feature_dim: 9
-  dense_input_dim: 13
-  fc_sizes: [512, 256, 128, 32]
-  distributed_embedding: 0

+ 0 - 52
recommend-model-produce/src/main/python/models/dnn/config_cpups.yaml

@@ -1,52 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-hyper_parameters:
-  optimizer:
-    learning_rate: 0.0001
-    class: Adam
-    strategy: async
-  dense_input_dim: 13
-  sparse_inputs_slots: 27
-  sparse_feature_number: 1000001
-  sparse_feature_dim: 10
-  fc_sizes: [400, 400, 400]
-
-runner:
-  epochs: 15
-  print_interval: 100
-
-  geo_step: 400
-  sync_mode: "async"  # sync / async /geo / heter
-  thread_num: 16
-  use_gpu: False
-  use_auc: True
-  
-  model_path: "static_model.py"
-  reader_type: "QueueDataset"  # DataLoader / QueueDataset / RecDataset
-  pipe_command: "python models/rank/dnn/queuedataset_reader.py"
-  dataset_debug: False
-  split_file_list: False
-
-  train_batch_size: 1000
-  train_data_dir: "data/sample_data/train"
-  train_reader_path: "criteo_reader"
-  model_save_path: "model"
-
-  infer_reader_type: "DataLoader"  # DataLoader / QueueDataset / RecDataset
-  infer_batch_size: 1000
-  test_data_dir: "data/sample_data/train"
-  infer_reader_path: "criteo_reader"
-  infer_load_path: "model"
-  infer_start_epoch: 0
-  infer_end_epoch: 1

+ 0 - 55
recommend-model-produce/src/main/python/models/dnn/config_gpubox.yaml

@@ -1,55 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-# workspace
-#workspace: "models/rank/dnn"
-
-
-runner:
-  train_data_dir: "data/sample_data/train"
-  train_reader_path: "criteo_reader" # importlib format
-  use_gpu: True
-  use_auc: True
-  train_batch_size: 32
-  epochs: 3
-  print_interval: 10
-  model_save_path: "output_model_dnn_queue"
-
-  sync_mode: "gpubox"
-  thread_num: 30
-  reader_type: "InmemoryDataset"  # DataLoader / QueueDataset / RecDataset / InmemoryDataset
-  pipe_command: "python models/rank/dnn/queuedataset_reader.py"
-  dataset_debug: False
-  split_file_list: False
-
-  infer_batch_size: 2
-  infer_reader_path: "criteo_reader" # importlib format
-  test_data_dir: "data/sample_data/train"
-  infer_load_path: "output_model_dnn_queue"
-  infer_start_epoch: 0
-  infer_end_epoch: 3
-# hyper parameters of user-defined network
-hyper_parameters:
-  # optimizer config
-  optimizer:
-    class: Adam
-    learning_rate: 0.001
-    strategy: async
-  # user-defined <key, value> pairs
-  sparse_inputs_slots: 27
-  sparse_feature_number: 1024
-  sparse_feature_dim: 9
-  dense_input_dim: 13
-  fc_sizes: [512, 256, 128, 32]
-  distributed_embedding: 0

+ 0 - 54
recommend-model-produce/src/main/python/models/dnn/config_queuedataset.yaml

@@ -1,54 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-# workspace
-#workspace: "models/rank/dnn"
-
-
-runner:
-  train_data_dir: "data/sample_data/train"
-  train_reader_path: "criteo_reader" # importlib format
-  use_gpu: False
-  use_auc: True
-  train_batch_size: 2
-  epochs: 3
-  print_interval: 10
-  model_save_path: "output_model_dnn_queue"
-
-  thread_num: 1
-  reader_type: "QueueDataset"  # DataLoader / QueueDataset / RecDataset
-  pipe_command: "python3.7 queuedataset_reader.py"
-  dataset_debug: False
-  split_file_list: False
-
-  infer_batch_size: 2
-  infer_reader_path: "criteo_reader" # importlib format
-  test_data_dir: "data/sample_data/train"
-  infer_load_path: "output_model_dnn_queue"
-  infer_start_epoch: 0
-  infer_end_epoch: 3
-# hyper parameters of user-defined network
-hyper_parameters:
-  # optimizer config
-  optimizer:
-    class: Adam
-    learning_rate: 0.001
-    strategy: async
-  # user-defined <key, value> pairs
-  sparse_inputs_slots: 27
-  sparse_feature_number: 1000001
-  sparse_feature_dim: 9
-  dense_input_dim: 13
-  fc_sizes: [512, 256, 128, 32]
-  distributed_embedding: 0

+ 0 - 55
recommend-model-produce/src/main/python/models/dnn/config_to_static.yaml

@@ -1,55 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-# workspace
-#workspace: "models/rank/dnn"
-
-
-runner:
-  train_data_dir: "data/sample_data/train"
-  train_reader_path: "criteo_reader" # importlib format
-  model_init_path: "output_model_dnn/0" # model_init
-  use_gpu: False
-  use_auc: True
-  train_batch_size: 2
-  epochs: 1
-  print_interval: 2
-  model_save_path: "output_model_dnn2" # save path
-  infer_batch_size: 2
-  infer_reader_path: "criteo_reader" # importlib format
-  test_data_dir: "data/sample_data/train"
-  infer_load_path: "output_model_dnn"
-  infer_start_epoch: 0
-  infer_end_epoch: 3
-
-  # distribute_config
-  sync_mode: "async"
-  split_file_list: False
-  thread_num: 1
-
-
-# hyper parameters of user-defined network
-hyper_parameters:
-  # optimizer config
-  optimizer:
-    class: Adam
-    learning_rate: 0.001
-    strategy: async
-  # user-defined <key, value> pairs
-  sparse_inputs_slots: 27
-  sparse_feature_number: 1000001
-  sparse_feature_dim: 9
-  dense_input_dim: 13
-  fc_sizes: [512, 256, 128, 32]
-  distributed_embedding: 0

+ 0 - 84
recommend-model-produce/src/main/python/models/dnn/criteo_reader.py

@@ -1,84 +0,0 @@
-#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import print_function
-import numpy as np
-
-from paddle.io import IterableDataset
-
-
-class RecDataset(IterableDataset):
-    def __init__(self, file_list, config):
-        super(RecDataset, self).__init__()
-        self.file_list = file_list
-        self.init()
-
-    def init(self):
-        from operator import mul
-        padding = 0
-        sparse_slots = "click 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26"
-        self.sparse_slots = sparse_slots.strip().split(" ")
-        self.dense_slots = ["dense_feature"]
-        self.dense_slots_shape = [13]
-        self.slots = self.sparse_slots + self.dense_slots
-        self.slot2index = {}
-        self.visit = {}
-        for i in range(len(self.slots)):
-            self.slot2index[self.slots[i]] = i
-            self.visit[self.slots[i]] = False
-        self.padding = padding
-
-    def __iter__(self):
-        full_lines = []
-        self.data = []
-        for file in self.file_list:
-            with open(file, "r") as rf:
-                for l in rf:
-                    line = l.strip().split(" ")
-                    output = [(i, []) for i in self.slots]
-                    for i in line:
-                        slot_feasign = i.split(":")
-                        slot = slot_feasign[0]
-                        if slot not in self.slots:
-                            continue
-                        if slot in self.sparse_slots:
-                            feasign = int(slot_feasign[1])
-                        else:
-                            feasign = float(slot_feasign[1])
-                        output[self.slot2index[slot]][1].append(feasign)
-                        self.visit[slot] = True
-                    for i in self.visit:
-                        slot = i
-                        if not self.visit[slot]:
-                            if i in self.dense_slots:
-                                output[self.slot2index[i]][1].extend(
-                                    [self.padding] *
-                                    self.dense_slots_shape[self.slot2index[i]])
-                            else:
-                                output[self.slot2index[i]][1].extend(
-                                    [self.padding])
-                        else:
-                            self.visit[slot] = False
-                    # sparse
-                    output_list = []
-                    for key, value in output[:-1]:
-                        output_list.append(np.array(value).astype('int64'))
-                    # dense
-                    
-                    print("output_list", output_list)
-
-                    output_list.append(
-                        np.array(output[-1][1]).astype("float32"))
-                    # list
-                    yield output_list

+ 0 - 80
recommend-model-produce/src/main/python/models/dnn/data/sample_data/train/sample_train.txt

@@ -1,80 +0,0 @@
-click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.05 dense_feature:0.08 dense_feature:0.207421875 dense_feature:0.028 dense_feature:0.35 dense_feature:0.08 dense_feature:0.082 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.08 1:737395 2:210498 3:903564 4:286224 5:286835 6:906818 7:906116 8:67180 9:27346 10:51086 11:142177 12:95024 13:157883 14:873363 15:600281 16:812592 17:228085 18:35900 19:880474 20:984402 21:100885 22:26235 23:410878 24:798162 25:499868 26:306163
-click:1 dense_feature:0.0 dense_feature:0.932006633499 dense_feature:0.02 dense_feature:0.14 dense_feature:0.0395625 dense_feature:0.328 dense_feature:0.98 dense_feature:0.12 dense_feature:1.886 dense_feature:0.0 dense_feature:1.8 dense_feature:0.0 dense_feature:0.14 1:715353 2:761523 3:432904 4:892267 5:515218 6:948614 7:266726 8:67180 9:27346 10:266081 11:286126 12:789480 13:49621 14:255651 15:47663 16:79797 17:342789 18:616331 19:880474 20:984402 21:242209 22:26235 23:669531 24:26284 25:269955 26:187951
-click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.08 dense_feature:0.06 dense_feature:0.14125 dense_feature:0.076 dense_feature:0.05 dense_feature:0.22 dense_feature:0.208 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.06 1:737395 2:952384 3:511141 4:271077 5:286835 6:948614 7:903547 8:507110 9:27346 10:56047 11:612953 12:747707 13:977426 14:671506 15:158148 16:833738 17:342789 18:427155 19:880474 20:537425 21:916237 22:26235 23:468277 24:676936 25:751788 26:363967
-click:0 dense_feature:0.0 dense_feature:0.124378109453 dense_feature:0.02 dense_feature:0.04 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.08 dense_feature:0.024 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.04 1:210127 2:286436 3:183920 4:507656 5:286835 6:906818 7:199553 8:67180 9:502607 10:708281 11:809876 12:888238 13:375164 14:202774 15:459895 16:475933 17:555571 18:847163 19:26230 20:26229 21:808836 22:191474 23:410878 24:315120 25:26224 26:26223
-click:0 dense_feature:0.1 dense_feature:0.0149253731343 dense_feature:0.34 dense_feature:0.32 dense_feature:0.016421875 dense_feature:0.098 dense_feature:0.04 dense_feature:0.96 dense_feature:0.202 dense_feature:0.1 dense_feature:0.2 dense_feature:0.0 dense_feature:0.32 1:230803 2:817085 3:539110 4:388629 5:286835 6:948614 7:586040 8:67180 9:27346 10:271155 11:176640 12:827381 13:36881 14:202774 15:397299 16:411672 17:342789 18:474060 19:880474 20:984402 21:216871 22:26235 23:761351 24:787115 25:884722 26:904135
-click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.13 dense_feature:0.04 dense_feature:0.246203125 dense_feature:0.108 dense_feature:0.05 dense_feature:0.04 dense_feature:0.03 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:737395 2:64837 3:259267 4:336976 5:515218 6:154084 7:847938 8:67180 9:27346 10:708281 11:776766 12:964800 13:324323 14:873363 15:212708 16:637238 17:681378 18:895034 19:673458 20:984402 21:18600 22:26235 23:410878 24:787115 25:884722 26:355412
-click:0 dense_feature:0.0 dense_feature:0.028192371476 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0245625 dense_feature:0.016 dense_feature:0.04 dense_feature:0.12 dense_feature:0.016 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:554760 3:661483 4:263696 5:938478 6:906818 7:786926 8:67180 9:27346 10:245862 11:668197 12:745676 13:432600 14:413795 15:751427 16:272410 17:342789 18:422136 19:26230 20:26229 21:452501 22:26235 23:51381 24:776636 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:1.95 dense_feature:0.28 dense_feature:0.092828125 dense_feature:0.57 dense_feature:0.06 dense_feature:0.4 dense_feature:0.4 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.4 1:371155 2:817085 3:773609 4:555449 5:938478 6:906818 7:166117 8:507110 9:27346 10:545822 11:316654 12:172765 13:989600 14:255651 15:792372 16:606361 17:342789 18:566554 19:880474 20:984402 21:235256 22:191474 23:700326 24:787115 25:884722 26:569095
-click:0 dense_feature:0.0 dense_feature:0.0912106135987 dense_feature:0.01 dense_feature:0.02 dense_feature:0.06625 dense_feature:0.018 dense_feature:0.05 dense_feature:0.06 dense_feature:0.098 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.04 1:230803 2:531472 3:284417 4:661677 5:938478 6:553107 7:21150 8:49466 9:27346 10:526914 11:164508 12:631773 13:882348 14:873363 15:523948 16:687081 17:342789 18:271301 19:26230 20:26229 21:647160 22:26235 23:410878 24:231695 25:26224 26:26223
-click:1 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.01 dense_feature:0.02 dense_feature:0.02153125 dense_feature:0.092 dense_feature:0.05 dense_feature:0.68 dense_feature:0.472 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.02 1:737395 2:532829 3:320762 4:887282 5:286835 6:25207 7:640357 8:67180 9:27346 10:695831 11:739268 12:835325 13:402539 14:873363 15:125813 16:168896 17:342789 18:374414 19:26230 20:26229 21:850229 22:26235 23:410878 24:480027 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.05 dense_feature:0.04 dense_feature:0.086125 dense_feature:0.098 dense_feature:0.15 dense_feature:0.06 dense_feature:0.228 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.04 1:210127 2:999497 3:646348 4:520638 5:938478 6:906818 7:438398 8:67180 9:27346 10:975902 11:532544 12:708828 13:815045 14:255651 15:896230 16:663630 17:342789 18:820094 19:687226 20:537425 21:481536 22:26235 23:761351 24:888170 25:250729 26:381125
-click:1 dense_feature:0.1 dense_feature:0.00331674958541 dense_feature:0.02 dense_feature:0.02 dense_feature:0.00078125 dense_feature:0.002 dense_feature:0.73 dense_feature:0.08 dense_feature:0.254 dense_feature:0.1 dense_feature:1.4 dense_feature:0.0 dense_feature:0.02 1:715353 2:342833 3:551901 4:73418 5:286835 6:446063 7:219517 8:67180 9:27346 10:668726 11:40711 12:921745 13:361076 14:15048 15:214564 16:400893 17:228085 18:393370 19:26230 20:26229 21:383046 22:26235 23:700326 24:369764 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.142620232172 dense_feature:0.04 dense_feature:0.1 dense_feature:0.08853125 dense_feature:0.028 dense_feature:0.01 dense_feature:0.1 dense_feature:0.028 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.1 1:737395 2:583707 3:519411 4:19103 5:286835 6:906818 7:801403 8:67180 9:27346 10:35743 11:626052 12:142351 13:988058 14:873363 15:617333 16:850339 17:276641 18:696084 19:26230 20:26229 21:121620 22:191474 23:468277 24:18340 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00995024875622 dense_feature:0.0 dense_feature:0.22 dense_feature:0.0251875 dense_feature:0.0 dense_feature:0.0 dense_feature:0.8 dense_feature:0.182 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.84 1:737395 2:19359 3:166075 4:381832 5:286835 6:446063 7:816009 8:67180 9:27346 10:708281 11:619790 12:524128 13:826787 14:202774 15:371495 16:392894 17:644532 18:271180 19:26230 20:26229 21:349978 22:26235 23:761351 24:517170 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.0149253731343 dense_feature:0.52 dense_feature:0.1 dense_feature:6.25153125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.3 dense_feature:0.03 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.1 1:230803 2:24784 3:519411 4:19103 5:843054 6:948614 7:529143 8:67180 9:502607 10:708281 11:430027 12:142351 13:529101 14:202774 15:618316 16:850339 17:644532 18:95370 19:880474 20:31181 21:121620 22:26235 23:744389 24:18340 25:269955 26:683431
-click:0 dense_feature:0.0 dense_feature:0.0480928689884 dense_feature:0.12 dense_feature:0.22 dense_feature:0.541703125 dense_feature:1.062 dense_feature:0.01 dense_feature:0.24 dense_feature:0.054 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.22 1:737395 2:378661 3:21539 4:552097 5:286835 6:553107 7:512138 8:67180 9:27346 10:708281 11:91094 12:516991 13:150114 14:873363 15:450569 16:353024 17:228085 18:539379 19:26230 20:26229 21:410733 22:26235 23:700326 24:272703 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.016583747927 dense_feature:0.06 dense_feature:0.0 dense_feature:0.209625 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.09 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:737395 2:750131 3:807749 4:905739 5:286835 6:906818 7:11935 8:67180 9:27346 10:708281 11:505199 12:285350 13:724106 14:255651 15:625913 16:511836 17:644532 18:102288 19:26230 20:26229 21:726818 22:179327 23:744389 24:176417 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.05 dense_feature:0.14 dense_feature:0.226703125 dense_feature:0.12 dense_feature:0.05 dense_feature:0.14 dense_feature:0.112 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.14 1:736218 2:690313 3:757279 4:763330 5:286835 6:553107 7:89560 8:642551 9:27346 10:128328 11:281593 12:246510 13:200341 14:255651 15:899145 16:807138 17:342789 18:659853 19:26230 20:26229 21:399608 22:26235 23:669531 24:787115 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.3 dense_feature:0.2 dense_feature:0.021296875 dense_feature:0.83 dense_feature:0.2 dense_feature:0.56 dense_feature:1.122 dense_feature:0.0 dense_feature:0.5 dense_feature:0.0 dense_feature:0.2 1:715353 2:283434 3:523722 4:590869 5:286835 6:948614 7:25472 8:67180 9:27346 10:340404 11:811342 12:679454 13:897590 14:813514 15:578769 16:962576 17:342789 18:267210 19:310188 20:537425 21:746185 22:179327 23:761351 24:416923 25:253255 26:249672
-click:1 dense_feature:0.05 dense_feature:0.0149253731343 dense_feature:0.03 dense_feature:0.24 dense_feature:0.0 dense_feature:0.008 dense_feature:0.4 dense_feature:0.62 dense_feature:0.82 dense_feature:0.1 dense_feature:1.4 dense_feature:0.0 dense_feature:0.08 1:715353 2:532829 3:716475 4:940968 5:286835 6:948614 7:38171 8:67180 9:27346 10:619455 11:515541 12:779426 13:711791 14:255651 15:881750 16:408550 17:342789 18:612540 19:26230 20:26229 21:23444 22:26235 23:410878 24:88425 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.11 dense_feature:0.08 dense_feature:0.135265625 dense_feature:0.426 dense_feature:0.06 dense_feature:0.06 dense_feature:0.42 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.08 1:737395 2:817085 3:506158 4:48876 5:286835 6:948614 7:95506 8:67180 9:27346 10:75825 11:220591 12:613471 13:159874 14:255651 15:121379 16:889290 17:681378 18:532453 19:880474 20:537425 21:717912 22:26235 23:270873 24:450199 25:884722 26:382723
-click:0 dense_feature:0.0 dense_feature:0.0829187396352 dense_feature:0.0 dense_feature:0.0 dense_feature:0.555859375 dense_feature:0.318 dense_feature:0.03 dense_feature:0.0 dense_feature:0.02 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:715353 2:465222 3:974451 4:892661 5:938478 6:948614 7:651987 8:67180 9:27346 10:708281 11:229311 12:545057 13:875629 14:149134 15:393524 16:213237 17:681378 18:540092 19:26230 20:26229 21:483290 22:26235 23:700326 24:946673 25:26224 26:26223
-click:1 dense_feature:0.05 dense_feature:0.854063018242 dense_feature:0.01 dense_feature:0.04 dense_feature:0.000171875 dense_feature:0.004 dense_feature:0.01 dense_feature:0.04 dense_feature:0.004 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:737395 2:99294 3:681584 4:398205 5:914075 6:906818 7:620358 8:67180 9:27346 10:147441 11:364583 12:535262 13:516341 14:813514 15:281303 16:714384 17:276641 18:443922 19:26230 20:26229 21:948746 22:26235 23:700326 24:928903 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.45190625 dense_feature:0.048 dense_feature:0.01 dense_feature:0.16 dense_feature:0.044 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:792512 3:676584 4:995262 5:938478 6:906818 7:888723 8:67180 9:27346 10:708281 11:310529 12:951172 13:885793 14:873363 15:62698 16:672021 17:276641 18:11502 19:880474 20:984402 21:501083 22:191474 23:744389 24:398029 25:218743 26:991064
-click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.51 dense_feature:0.0 dense_feature:0.2689375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.02 dense_feature:0.006 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:230803 2:239052 3:323170 4:474182 5:140103 6:553107 7:757837 8:524745 9:27346 10:743444 11:883533 12:123023 13:621127 14:255651 15:570872 16:883618 17:924903 18:984920 19:964183 20:984402 21:260134 22:179327 23:410878 24:787860 25:269955 26:949924
-click:0 dense_feature:0.0 dense_feature:0.273631840796 dense_feature:0.0 dense_feature:0.0 dense_feature:0.066453125 dense_feature:0.052 dense_feature:0.04 dense_feature:0.06 dense_feature:0.01 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:531472 3:747313 4:362684 5:843054 6:553107 7:863980 8:718499 9:27346 10:881217 11:371751 12:168971 13:290788 14:202774 15:316669 16:269663 17:342789 18:136775 19:26230 20:26229 21:76865 22:26235 23:761351 24:441421 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.116086235489 dense_feature:0.43 dense_feature:0.36 dense_feature:0.000953125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.36 dense_feature:0.036 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.36 1:737395 2:24784 3:677469 4:820784 5:286835 6:553107 7:715520 8:718499 9:27346 10:708281 11:670424 12:122926 13:724619 14:873363 15:845517 16:488791 17:644532 18:183573 19:880474 20:31181 21:46761 22:26235 23:700326 24:629361 25:269955 26:862373
-click:0 dense_feature:2.55 dense_feature:0.0348258706468 dense_feature:0.01 dense_feature:0.38 dense_feature:0.001453125 dense_feature:0.046 dense_feature:1.11 dense_feature:0.44 dense_feature:2.312 dense_feature:0.2 dense_feature:1.1 dense_feature:0.0 dense_feature:0.46 1:594517 2:194636 3:496284 4:323209 5:286835 6:553107 7:259696 8:760861 9:27346 10:698046 11:478868 12:576074 13:635369 14:201966 15:926692 16:972906 17:342789 18:409802 19:26230 20:26229 21:395694 22:26235 23:410878 24:844671 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.144278606965 dense_feature:0.43 dense_feature:0.22 dense_feature:0.00309375 dense_feature:0.15 dense_feature:0.14 dense_feature:0.54 dense_feature:0.152 dense_feature:0.0 dense_feature:0.2 dense_feature:0.1 dense_feature:0.22 1:737395 2:239052 3:456744 4:736474 5:286835 6:948614 7:13277 8:67180 9:27346 10:958384 11:778183 12:497627 13:136915 14:201966 15:757961 16:747483 17:228085 18:984920 19:905920 20:537425 21:472149 22:179327 23:410878 24:709155 25:269955 26:618673
-click:0 dense_feature:0.0 dense_feature:0.0132669983416 dense_feature:0.4 dense_feature:0.3 dense_feature:0.36440625 dense_feature:1.492 dense_feature:0.07 dense_feature:0.3 dense_feature:1.048 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.3 1:737395 2:19959 3:661391 4:748753 5:286835 6:948614 7:848540 8:67180 9:27346 10:708281 11:703964 12:72024 13:336272 14:255651 15:835686 16:703858 17:342789 18:274368 19:26230 20:26229 21:765452 22:26235 23:700326 24:815200 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.0116086235489 dense_feature:0.01 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:210127 2:662691 3:334228 4:857003 5:286835 6:25207 7:280499 8:67180 9:502607 10:708281 11:195094 12:870026 13:783566 14:873363 15:139595 16:214259 17:555571 18:208248 19:880474 20:984402 21:471770 22:26235 23:744389 24:507551 25:383787 26:797121
-click:1 dense_feature:0.0 dense_feature:0.0348258706468 dense_feature:0.03 dense_feature:0.02 dense_feature:0.066140625 dense_feature:0.006 dense_feature:0.17 dense_feature:0.02 dense_feature:0.236 dense_feature:0.0 dense_feature:0.5 dense_feature:0.0 dense_feature:0.02 1:230803 2:999497 3:25361 4:892267 5:286835 6:906818 7:356528 8:67180 9:27346 10:5856 11:157692 12:554754 13:442501 14:255651 15:896230 16:248781 17:342789 18:820094 19:905920 20:984402 21:916436 22:26235 23:669531 24:26284 25:884722 26:187951
-click:0 dense_feature:0.0 dense_feature:4.62852404643 dense_feature:0.07 dense_feature:0.0 dense_feature:0.022671875 dense_feature:0.0 dense_feature:0.01 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:624252 2:344887 3:238747 4:308366 5:286835 6:553107 7:69291 8:67180 9:27346 10:781054 11:258240 12:546906 13:772337 14:873363 15:807640 16:525695 17:276641 18:613203 19:438655 20:984402 21:415123 22:191474 23:700326 24:729290 25:218743 26:953507
-click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.06 dense_feature:0.02 dense_feature:0.06878125 dense_feature:0.044 dense_feature:0.01 dense_feature:0.22 dense_feature:0.044 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:737395 2:7753 3:871178 4:183530 5:286835 6:906818 7:273988 8:507110 9:27346 10:708281 11:942072 12:775997 13:612590 14:873363 15:669921 16:639940 17:681378 18:421122 19:880474 20:984402 21:410471 22:26235 23:410878 24:228420 25:269955 26:616000
-click:0 dense_feature:0.0 dense_feature:0.212271973466 dense_feature:0.02 dense_feature:0.28 dense_feature:0.113421875 dense_feature:0.06 dense_feature:0.02 dense_feature:0.28 dense_feature:0.194 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.28 1:210127 2:228963 3:692240 4:389834 5:938478 6:948614 7:125690 8:507110 9:27346 10:708281 11:549232 12:308284 13:262461 14:255651 15:629185 16:280660 17:276641 18:886164 19:26230 20:26229 21:367919 22:191474 23:700326 24:520083 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:1.01658374793 dense_feature:0.01 dense_feature:0.02 dense_feature:0.11759375 dense_feature:0.08 dense_feature:0.02 dense_feature:0.02 dense_feature:0.024 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:230803 2:7753 3:194720 4:831884 5:286835 6:553107 7:620358 8:67180 9:27346 10:843010 11:424144 12:615986 13:516341 14:813514 15:782575 16:775856 17:342789 18:421122 19:880474 20:984402 21:110090 22:191474 23:700326 24:784174 25:269955 26:101161
-click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.59 dense_feature:0.06 dense_feature:0.04321875 dense_feature:0.192 dense_feature:0.02 dense_feature:0.08 dense_feature:0.014 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.06 1:230803 2:532829 3:26258 4:853241 5:938478 6:948614 7:877607 8:67180 9:27346 10:613723 11:246387 12:538673 13:377975 14:873363 15:659013 16:601478 17:681378 18:199271 19:26230 20:26229 21:300137 22:26235 23:410878 24:372458 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.06135986733 dense_feature:0.0 dense_feature:0.0 dense_feature:0.294671875 dense_feature:0.212 dense_feature:0.26 dense_feature:0.0 dense_feature:0.034 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:154478 3:982044 4:501457 5:819883 6:906818 7:445051 8:67180 9:27346 10:976970 11:783630 12:609883 13:358461 14:15048 15:409791 16:756307 17:342789 18:480228 19:26230 20:26229 21:845147 22:26235 23:669531 24:124290 25:26224 26:26223
-click:1 dense_feature:0.05 dense_feature:0.537313432836 dense_feature:0.0 dense_feature:0.02 dense_feature:0.018578125 dense_feature:0.016 dense_feature:0.16 dense_feature:0.22 dense_feature:0.192 dense_feature:0.1 dense_feature:0.3 dense_feature:0.0 dense_feature:0.02 1:737395 2:194636 3:274597 4:418981 5:286835 6:553107 7:553528 8:67180 9:27346 10:901359 11:110700 12:108037 13:915461 14:255651 15:951604 16:421384 17:342789 18:728110 19:26230 20:26229 21:772733 22:191474 23:761351 24:844671 25:26224 26:26223
-click:0 dense_feature:0.1 dense_feature:0.00663349917081 dense_feature:0.16 dense_feature:0.26 dense_feature:0.00509375 dense_feature:0.122 dense_feature:0.03 dense_feature:0.94 dense_feature:0.526 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:1.1 1:210127 2:344887 3:343793 4:917598 5:286835 6:948614 7:220413 8:67180 9:27346 10:912799 11:370606 12:722621 13:569604 14:255651 15:499545 16:159495 17:342789 18:613203 19:305384 20:984402 21:844602 22:26235 23:410878 24:695516 25:218743 26:729263
-click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.09 dense_feature:0.16 dense_feature:0.11221875 dense_feature:0.51 dense_feature:0.09 dense_feature:0.48 dense_feature:0.088 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.16 1:737395 2:532829 3:579624 4:980109 5:286835 6:948614 7:927736 8:67180 9:27346 10:970644 11:931289 12:377125 13:539272 14:873363 15:555779 16:405069 17:342789 18:701770 19:26230 20:26229 21:201088 22:26235 23:410878 24:113994 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.182421227197 dense_feature:0.01 dense_feature:0.02 dense_feature:0.000109375 dense_feature:0.978 dense_feature:0.01 dense_feature:0.02 dense_feature:0.062 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:478318 2:158107 3:508317 4:452336 5:286835 6:948614 7:620358 8:67180 9:27346 10:147441 11:364583 12:34025 13:516341 14:873363 15:502825 16:683439 17:681378 18:889198 19:26230 20:26229 21:234451 22:26235 23:700326 24:256238 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.469320066335 dense_feature:0.2 dense_feature:0.2 dense_feature:0.0705 dense_feature:0.102 dense_feature:0.05 dense_feature:0.22 dense_feature:0.194 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.2 1:715353 2:846239 3:573061 4:508181 5:286835 6:553107 7:892443 8:718499 9:27346 10:639370 11:866496 12:791636 13:895012 14:873363 15:362079 16:16082 17:228085 18:994402 19:880474 20:984402 21:35513 22:26235 23:669531 24:520197 25:934391 26:625657
-click:0 dense_feature:0.0 dense_feature:0.0729684908789 dense_feature:0.06 dense_feature:0.04 dense_feature:5.620296875 dense_feature:0.0 dense_feature:0.0 dense_feature:0.06 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.04 1:399845 2:239052 3:334610 4:593315 5:286835 6:948614 7:751495 8:67180 9:502607 10:111048 11:244081 12:115252 13:915518 14:873363 15:817451 16:296052 17:276641 18:984920 19:774721 20:984402 21:930636 22:26235 23:700326 24:975048 25:269955 26:266439
-click:1 dense_feature:0.05 dense_feature:0.0265339966833 dense_feature:0.07 dense_feature:0.22 dense_feature:1.5625e-05 dense_feature:0.008 dense_feature:0.04 dense_feature:0.36 dense_feature:0.088 dense_feature:0.1 dense_feature:0.3 dense_feature:0.0 dense_feature:0.08 1:737395 2:64837 3:534435 4:555449 5:286835 6:25207 7:661236 8:67180 9:27346 10:708281 11:785752 12:47348 13:524553 14:117289 15:776971 16:293528 17:681378 18:102169 19:758208 20:31181 21:27506 22:26235 23:410878 24:787115 25:884722 26:605635
-click:1 dense_feature:0.1 dense_feature:0.0464344941957 dense_feature:0.0 dense_feature:0.04 dense_feature:0.00059375 dense_feature:0.004 dense_feature:0.02 dense_feature:0.04 dense_feature:0.004 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:230803 2:7753 3:529866 4:437169 5:938478 6:948614 7:17274 8:67180 9:27346 10:461781 11:452641 12:302471 13:49621 14:873363 15:543432 16:858509 17:681378 18:402164 19:880474 20:984402 21:650184 22:191474 23:410878 24:492581 25:269955 26:217228
-click:0 dense_feature:0.55 dense_feature:0.00829187396352 dense_feature:0.03 dense_feature:0.0 dense_feature:0.0014375 dense_feature:0.004 dense_feature:0.36 dense_feature:0.0 dense_feature:0.042 dense_feature:0.1 dense_feature:0.4 dense_feature:0.0 dense_feature:0.0 1:26973 2:817085 3:961160 4:355882 5:843054 6:906818 7:417593 8:67180 9:27346 10:708281 11:402889 12:899379 13:552051 14:202774 15:532679 16:545549 17:342789 18:562805 19:880474 20:31181 21:355920 22:26235 23:700326 24:787115 25:884722 26:115004
-click:1 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.01 dense_feature:0.02 dense_feature:0.089296875 dense_feature:0.362 dense_feature:0.23 dense_feature:0.04 dense_feature:0.338 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.02 1:230803 2:977337 3:853759 4:880273 5:515218 6:25207 7:414263 8:437731 9:27346 10:205124 11:108170 12:676869 13:388798 14:255651 15:247232 16:172895 17:228085 18:543219 19:26230 20:26229 21:860937 22:179327 23:669531 24:959959 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.0945273631841 dense_feature:0.62 dense_feature:0.24 dense_feature:0.11840625 dense_feature:0.368 dense_feature:0.07 dense_feature:0.24 dense_feature:0.144 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.48 1:737395 2:532829 3:805087 4:186661 5:286835 6:154084 7:468059 8:718499 9:27346 10:708281 11:968875 12:8177 13:47822 14:255651 15:979316 16:956543 17:342789 18:541633 19:26230 20:26229 21:646669 22:26235 23:410878 24:184909 25:26224 26:26223
-click:0 dense_feature:0.3 dense_feature:0.00497512437811 dense_feature:0.12 dense_feature:0.12 dense_feature:0.002890625 dense_feature:0.074 dense_feature:0.06 dense_feature:0.14 dense_feature:0.074 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.74 1:737395 2:64837 3:967865 4:249418 5:938478 6:948614 7:228716 8:67180 9:27346 10:627362 11:722606 12:193782 13:348283 14:255651 15:928582 16:221557 17:342789 18:895034 19:384556 20:984402 21:475712 22:26235 23:410878 24:492875 25:884722 26:468964
-click:0 dense_feature:0.0 dense_feature:0.177446102819 dense_feature:0.01 dense_feature:0.02 dense_feature:0.041859375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.16 dense_feature:0.036 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.02 1:154064 2:834620 3:25206 4:25205 5:938478 6:948614 7:134101 8:92608 9:27346 10:708281 11:505199 12:25711 13:724106 14:671506 15:42927 16:25723 17:644532 18:1957 19:26230 20:26229 21:26236 22:26235 23:744389 24:26233 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:5.61691542289 dense_feature:0.0 dense_feature:0.1 dense_feature:0.043796875 dense_feature:0.302 dense_feature:0.13 dense_feature:0.22 dense_feature:0.3 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.22 1:154184 2:19359 3:166075 4:381832 5:286835 6:906818 7:348227 8:49466 9:27346 10:645596 11:951584 12:524128 13:277250 14:255651 15:853732 16:392894 17:342789 18:619939 19:26230 20:26229 21:349978 22:26235 23:700326 24:517170 25:26224 26:26223
-click:1 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.093234375 dense_feature:0.022 dense_feature:0.04 dense_feature:0.02 dense_feature:0.02 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.0 1:715353 2:485136 3:386313 4:208181 5:286835 6:25207 7:227715 8:49466 9:27346 10:437476 11:733250 12:721260 13:389832 14:255651 15:47178 16:761962 17:342789 18:813169 19:26230 20:26229 21:464938 22:26235 23:410878 24:833196 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.134328358209 dense_feature:0.0 dense_feature:0.14 dense_feature:0.00015625 dense_feature:0.0 dense_feature:0.0 dense_feature:0.14 dense_feature:0.014 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.14 1:737395 2:488655 3:221719 4:442408 5:286835 6:25207 7:898902 8:718499 9:27346 10:457066 11:290973 12:533168 13:949027 14:873363 15:270294 16:934635 17:924903 18:763017 19:880474 20:31181 21:517486 22:26235 23:410878 24:588215 25:499868 26:980179
-click:1 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.023578125 dense_feature:0.0 dense_feature:0.04 dense_feature:0.0 dense_feature:0.046 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.0 1:737395 2:729012 3:691820 4:351286 5:938478 6:553107 7:21150 8:67180 9:27346 10:947459 11:164508 12:205079 13:882348 14:255651 15:178324 16:282716 17:342789 18:193902 19:880474 20:31181 21:604480 22:191474 23:669531 24:727223 25:499868 26:236426
-click:1 dense_feature:0.1 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.00859375 dense_feature:0.006 dense_feature:1.55 dense_feature:0.16 dense_feature:0.06 dense_feature:0.2 dense_feature:1.6 dense_feature:0.0 dense_feature:0.0 1:712372 2:235347 3:483718 4:382039 5:914075 6:906818 7:727609 8:154004 9:27346 10:116648 11:40711 12:658199 13:361076 14:15048 15:15058 16:644988 17:342789 18:544170 19:26230 20:26229 21:251535 22:26235 23:700326 24:114111 25:26224 26:26223
-click:1 dense_feature:0.25 dense_feature:0.192371475954 dense_feature:0.06 dense_feature:0.36 dense_feature:0.0 dense_feature:0.02 dense_feature:0.09 dense_feature:0.42 dense_feature:0.042 dense_feature:0.2 dense_feature:0.3 dense_feature:0.3 dense_feature:0.0 1:737395 2:288975 3:885137 4:368487 5:515218 6:906818 7:569753 8:799133 9:27346 10:635043 11:883202 12:780104 13:492605 14:873363 15:234451 16:94894 17:796504 18:653705 19:880474 20:984402 21:400692 22:26235 23:410878 24:767424 25:934391 26:958132
-click:1 dense_feature:0.15 dense_feature:0.0398009950249 dense_feature:0.02 dense_feature:0.04 dense_feature:1.5625e-05 dense_feature:0.0 dense_feature:0.06 dense_feature:0.04 dense_feature:0.026 dense_feature:0.1 dense_feature:0.3 dense_feature:0.0 dense_feature:0.0 1:715353 2:532829 3:721632 4:377785 5:286835 6:553107 7:959856 8:718499 9:27346 10:737746 11:432444 12:706936 13:169268 14:873363 15:896219 16:461005 17:342789 18:286597 19:26230 20:26229 21:602049 22:26235 23:700326 24:510447 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.05 dense_feature:0.08 dense_feature:0.155421875 dense_feature:0.55 dense_feature:0.08 dense_feature:0.24 dense_feature:1.73 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.08 1:737395 2:288975 3:385122 4:57409 5:286835 6:25207 7:339181 8:67180 9:27346 10:284863 11:531306 12:229544 13:32168 14:117289 15:632422 16:615549 17:342789 18:240865 19:880474 20:984402 21:253725 22:26235 23:410878 24:837371 25:934391 26:948190
-click:0 dense_feature:0.0 dense_feature:0.0398009950249 dense_feature:0.06 dense_feature:0.12 dense_feature:0.11359375 dense_feature:0.55 dense_feature:0.03 dense_feature:0.12 dense_feature:0.186 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.12 1:737395 2:158107 3:738359 4:343895 5:286835 6:948614 7:513189 8:760861 9:27346 10:741641 11:214926 12:142871 13:753229 14:873363 15:502825 16:864586 17:681378 18:889198 19:26230 20:26229 21:368414 22:191474 23:410878 24:256238 25:26224 26:26223
-click:1 dense_feature:0.25 dense_feature:0.00663349917081 dense_feature:0.03 dense_feature:0.04 dense_feature:7.8125e-05 dense_feature:0.0 dense_feature:0.48 dense_feature:0.06 dense_feature:0.004 dense_feature:0.2 dense_feature:1.3 dense_feature:0.0 dense_feature:0.0 1:737395 2:414770 3:100889 4:981572 5:286835 6:446063 7:600430 8:507110 9:27346 10:566014 11:40711 12:330691 13:361076 14:15048 15:176957 16:759140 17:342789 18:212244 19:26230 20:26229 21:688637 22:26235 23:634287 24:762432 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.04 dense_feature:0.02 dense_feature:0.109765625 dense_feature:0.202 dense_feature:0.13 dense_feature:0.02 dense_feature:0.078 dense_feature:0.0 dense_feature:0.1 dense_feature:0.1 dense_feature:0.02 1:737395 2:7753 3:871178 4:183530 5:286835 6:948614 7:358953 8:718499 9:27346 10:837400 11:432444 12:775997 13:169268 14:255651 15:250644 16:639940 17:342789 18:421122 19:880474 20:984402 21:410471 22:26235 23:410878 24:228420 25:269955 26:870795
-click:0 dense_feature:0.05 dense_feature:0.162520729685 dense_feature:0.28 dense_feature:0.16 dense_feature:0.001046875 dense_feature:0.028 dense_feature:1.03 dense_feature:0.84 dense_feature:0.534 dense_feature:0.1 dense_feature:2.3 dense_feature:0.0 dense_feature:0.28 1:737395 2:334074 3:108983 4:898979 5:286835 6:948614 7:600430 8:718499 9:27346 10:668726 11:40711 12:62821 13:361076 14:202774 15:722413 16:688170 17:342789 18:746785 19:957809 20:984402 21:96056 22:191474 23:410878 24:703372 25:129305 26:591537
-click:0 dense_feature:0.2 dense_feature:0.0945273631841 dense_feature:0.02 dense_feature:0.18 dense_feature:0.021078125 dense_feature:0.046 dense_feature:0.52 dense_feature:0.44 dense_feature:0.18 dense_feature:0.1 dense_feature:0.8 dense_feature:0.0 dense_feature:0.22 1:663372 2:532829 3:714247 4:673800 5:286835 6:906818 7:219517 8:67180 9:27346 10:161916 11:40711 12:441505 13:361076 14:255651 15:992961 16:137571 17:796504 18:395194 19:26230 20:26229 21:800938 22:179327 23:410878 24:719782 25:26224 26:26223
-click:1 dense_feature:0.15 dense_feature:0.24543946932 dense_feature:0.0 dense_feature:0.12 dense_feature:0.0001875 dense_feature:0.004 dense_feature:0.08 dense_feature:0.12 dense_feature:0.072 dense_feature:0.1 dense_feature:0.4 dense_feature:0.0 dense_feature:0.04 1:663372 2:70321 3:202829 4:415480 5:286835 6:553107 7:32934 8:67180 9:27346 10:1873 11:699999 12:55775 13:371214 14:873363 15:685332 16:719499 17:342789 18:135819 19:26230 20:26229 21:973542 22:852086 23:410878 24:635223 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.0679933665008 dense_feature:0.02 dense_feature:0.02 dense_feature:0.20015625 dense_feature:0.016 dense_feature:0.03 dense_feature:0.02 dense_feature:0.014 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:737395 2:229199 3:956202 4:475901 5:286835 6:948614 7:614385 8:718499 9:27346 10:171202 11:670646 12:566018 13:386065 14:873363 15:936716 16:825279 17:681378 18:758631 19:26230 20:26229 21:113534 22:26235 23:410878 24:551443 25:26224 26:26223
-click:1 dense_feature:0.05 dense_feature:0.00497512437811 dense_feature:0.04 dense_feature:0.22 dense_feature:0.015921875 dense_feature:0.022 dense_feature:0.04 dense_feature:0.4 dense_feature:0.182 dense_feature:0.1 dense_feature:0.2 dense_feature:0.0 dense_feature:0.22 1:737395 2:64837 3:751736 4:291977 5:286835 6:25207 7:377931 8:718499 9:27346 10:724396 11:433484 12:517940 13:439712 14:201966 15:628624 16:780717 17:342789 18:895034 19:880474 20:31181 21:463725 22:26235 23:410878 24:787115 25:884722 26:164940
-click:1 dense_feature:0.0 dense_feature:0.00995024875622 dense_feature:0.15 dense_feature:0.48 dense_feature:0.051375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.06 dense_feature:0.556 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.5 1:737395 2:532829 3:158777 4:112926 5:286835 6:948614 7:764249 8:67180 9:27346 10:795273 11:330644 12:524443 13:78129 14:873363 15:127209 16:146094 17:342789 18:976129 19:26230 20:26229 21:901094 22:26235 23:410878 24:259263 25:26224 26:26223
-click:1 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:1.75 dense_feature:0.0 dense_feature:0.922828125 dense_feature:1.078 dense_feature:0.0 dense_feature:0.0 dense_feature:0.112 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:26973 2:62956 3:428206 4:935291 5:286835 6:446063 7:360307 8:437731 9:502607 10:957425 11:626052 12:641189 13:988058 14:217110 15:637914 16:293992 17:342789 18:832710 19:774721 20:537425 21:516798 22:191474 23:700326 24:204648 25:884722 26:776972
-click:1 dense_feature:1.95 dense_feature:0.00829187396352 dense_feature:0.08 dense_feature:0.1 dense_feature:0.01878125 dense_feature:0.044 dense_feature:0.42 dense_feature:0.24 dense_feature:0.358 dense_feature:0.1 dense_feature:0.2 dense_feature:0.1 dense_feature:0.26 1:737395 2:638265 3:526671 4:362576 5:938478 6:948614 7:999918 8:67180 9:27346 10:806276 11:181589 12:688684 13:367155 14:255651 15:709602 16:386859 17:228085 18:204112 19:668832 20:537425 21:541553 22:191474 23:410878 24:606704 25:49230 26:68113
-click:0 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.38159375 dense_feature:0.022 dense_feature:0.18 dense_feature:0.0 dense_feature:0.016 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:841163 3:284187 4:385559 5:286835 6:446063 7:311604 8:67180 9:27346 10:38910 11:76230 12:520869 13:429321 14:255651 15:296507 16:542357 17:342789 18:377250 19:880474 20:31181 21:325494 22:26235 23:410878 24:26284 25:499868 26:467348
-click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.08 dense_feature:0.0 dense_feature:0.077125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.03 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:737395 2:238813 3:821667 4:209184 5:286835 6:906818 7:261420 8:67180 9:27346 10:748867 11:277196 12:790086 13:495408 14:873363 15:572266 16:281532 17:342789 18:99340 19:880474 20:537425 21:815896 22:26235 23:669531 24:17430 25:734238 26:251811
-click:0 dense_feature:0.0 dense_feature:0.210613598673 dense_feature:0.01 dense_feature:0.0 dense_feature:0.041375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.08 dense_feature:0.026 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:737395 2:532829 3:559456 4:565823 5:286835 6:948614 7:48897 8:67180 9:27346 10:708281 11:214000 12:431427 13:477774 14:873363 15:637383 16:678446 17:276641 18:849284 19:26230 20:26229 21:758879 22:26235 23:410878 24:399458 25:26224 26:26223
-click:1 dense_feature:0.2 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.00440625 dense_feature:0.036 dense_feature:0.04 dense_feature:0.3 dense_feature:0.03 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:715353 2:532829 3:967094 4:707735 5:286835 6:948614 7:555710 8:154004 9:27346 10:708281 11:514992 12:158604 13:780149 14:255651 15:285282 16:149708 17:342789 18:553067 19:26230 20:26229 21:229985 22:26235 23:700326 24:777746 25:26224 26:26223
-click:1 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.23178125 dense_feature:0.222 dense_feature:0.06 dense_feature:0.0 dense_feature:0.408 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.0 1:715353 2:227084 3:456811 4:828682 5:286835 6:948614 7:406567 8:67180 9:27346 10:66123 11:598531 12:527138 13:731439 14:813514 15:35257 16:43339 17:342789 18:918487 19:26230 20:26229 21:580653 22:26235 23:410878 24:495283 25:26224 26:26223
-click:0 dense_feature:0.15 dense_feature:0.462686567164 dense_feature:0.08 dense_feature:0.22 dense_feature:0.00015625 dense_feature:0.022 dense_feature:0.03 dense_feature:0.52 dense_feature:0.022 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.22 1:576931 2:99294 3:263211 4:501662 5:938478 6:154084 7:128918 8:67180 9:27346 10:912799 11:801006 12:506258 13:378182 14:201966 15:150934 16:240427 17:681378 18:393279 19:26230 20:26229 21:152038 22:26235 23:700326 24:551443 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.181484375 dense_feature:0.06 dense_feature:0.01 dense_feature:0.0 dense_feature:0.056 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:230803 2:283434 3:367596 4:197992 5:938478 6:948614 7:268098 8:67180 9:27346 10:870993 11:632267 12:139817 13:718764 14:255651 15:884839 16:80117 17:276641 18:556463 19:880474 20:537425 21:271358 22:26235 23:410878 24:488077 25:253255 26:584828
-click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.0 dense_feature:0.16 dense_feature:4.790078125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.28 dense_feature:0.016 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.2 1:737395 2:532829 3:158777 4:112926 5:286835 6:948614 7:277312 8:67180 9:502607 10:708281 11:755513 12:524443 13:4029 14:873363 15:503814 16:146094 17:644532 18:121590 19:26230 20:26229 21:901094 22:191474 23:744389 24:259263 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:3.30845771144 dense_feature:0.0 dense_feature:0.04 dense_feature:0.022671875 dense_feature:0.062 dense_feature:0.01 dense_feature:0.4 dense_feature:0.062 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:663372 2:529436 3:511823 4:942782 5:286835 6:906818 7:190054 8:67180 9:27346 10:708281 11:32527 12:494263 13:652478 14:873363 15:616057 16:17325 17:342789 18:325238 19:26230 20:26229 21:256747 22:179327 23:410878 24:169709 25:26224 26:26223
-click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.01 dense_feature:0.16 dense_feature:0.206765625 dense_feature:0.328 dense_feature:0.13 dense_feature:0.16 dense_feature:0.176 dense_feature:0.0 dense_feature:0.7 dense_feature:0.0 dense_feature:0.16 1:737395 2:552854 3:606082 4:267619 5:286835 6:948614 7:918889 8:67180 9:27346 10:708281 11:400024 12:972010 13:66330 14:255651 15:432931 16:650209 17:506108 18:212910 19:26230 20:26229 21:107726 22:26235 23:410878 24:718419 25:26224 26:26223
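The deleted sample rows above follow the Criteo slot format: a `click` label, 13 normalized `dense_feature` values, then 26 sparse slots written as `slot_id:feature_sign` (the leading `-` on each row is the diff deletion marker, not part of the data). A minimal framework-free sketch of parsing one row — the helper name `parse_row` is ours, not from the repo:

```python
def parse_row(line):
    """Split a Criteo-style sample row into (label, dense, sparse)."""
    label = None
    dense = []   # normalized continuous features, 13 per full row
    sparse = {}  # slot id (1..26) -> hashed feature sign
    for token in line.strip().split(" "):
        key, _, value = token.partition(":")
        if key == "click":
            label = int(value)
        elif key == "dense_feature":
            dense.append(float(value))
        else:
            sparse[int(key)] = int(value)
    return label, dense, sparse

# abbreviated row for illustration
label, dense, sparse = parse_row(
    "click:1 dense_feature:0.05 dense_feature:0.28 1:737395 2:334074")
```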

+ 0 - 96
recommend-model-produce/src/main/python/models/dnn/dygraph_model.py

@@ -1,96 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import paddle
-import paddle.nn as nn
-import paddle.nn.functional as F
-import math
-
-import net
-
-
-class DygraphModel():
-    # define model
-    def create_model(self, config):
-        sparse_feature_number = config.get(
-            "hyper_parameters.sparse_feature_number")
-        sparse_feature_dim = config.get("hyper_parameters.sparse_feature_dim")
-        fc_sizes = config.get("hyper_parameters.fc_sizes")
-        sparse_fea_num = config.get('hyper_parameters.sparse_fea_num')
-        dense_feature_dim = config.get('hyper_parameters.dense_input_dim')
-        sparse_input_slot = config.get('hyper_parameters.sparse_inputs_slots')
-
-        dnn_model = net.DNNLayer(sparse_feature_number, sparse_feature_dim,
-                                 dense_feature_dim, sparse_input_slot - 1,
-                                 fc_sizes)
-        return dnn_model
-
-    # define feeds which convert numpy of batch data to paddle.tensor
-    def create_feeds(self, batch_data, config):
-        dense_feature_dim = config.get('hyper_parameters.dense_input_dim')
-        sparse_tensor = []
-        for b in batch_data[:-1]:
-            sparse_tensor.append(
-                paddle.to_tensor(b.numpy().astype('int64').reshape(-1, 1)))
-        dense_tensor = paddle.to_tensor(batch_data[-1].numpy().astype(
-            'float32').reshape(-1, dense_feature_dim))
-        label = sparse_tensor[0]
-        return label, sparse_tensor[1:], dense_tensor
-
-    # define loss function by predicts and label
-    def create_loss(self, raw_predict_2d, label):
-        cost = paddle.nn.functional.cross_entropy(
-            input=raw_predict_2d, label=label)
-        avg_cost = paddle.mean(x=cost)
-        return avg_cost
-
-    # define optimizer
-    def create_optimizer(self, dy_model, config):
-        lr = config.get("hyper_parameters.optimizer.learning_rate", 0.001)
-        optimizer = paddle.optimizer.Adam(
-            learning_rate=lr, parameters=dy_model.parameters())
-        return optimizer
-
-    # define metrics such as auc/acc
-    # multi-task need to define multi metric
-    def create_metrics(self):
-        metrics_list_name = ["auc"]
-        auc_metric = paddle.metric.Auc("ROC")
-        metrics_list = [auc_metric]
-        return metrics_list, metrics_list_name
-
-    # construct train forward phase
-    def train_forward(self, dy_model, metrics_list, batch_data, config):
-        label, sparse_tensor, dense_tensor = self.create_feeds(batch_data,
-                                                               config)
-
-        raw_pred_2d = dy_model.forward(sparse_tensor, dense_tensor)
-        loss = self.create_loss(raw_pred_2d, label)
-        # update metrics
-        predict_2d = paddle.nn.functional.softmax(raw_pred_2d)
-        metrics_list[0].update(preds=predict_2d.numpy(), labels=label.numpy())
-
-        print_dict = {'loss': loss}
-        # print_dict = None
-        return loss, metrics_list, print_dict
-
-    def infer_forward(self, dy_model, metrics_list, batch_data, config):
-        label, sparse_tensor, dense_tensor = self.create_feeds(batch_data,
-                                                               config)
-
-        raw_pred_2d = dy_model.forward(sparse_tensor, dense_tensor)
-        # update metrics
-        predict_2d = paddle.nn.functional.softmax(raw_pred_2d)
-        metrics_list[0].update(preds=predict_2d.numpy(), labels=label.numpy())
-        return metrics_list, None
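`train_forward` above feeds the raw 2-class logits to cross-entropy for the loss and to softmax for the AUC metric. A framework-free sketch of that arithmetic, with our own helper names:

```python
import math

def softmax(logits):
    # subtract the max for numerical stability before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, label):
    # negative log-probability of the true class
    return -math.log(softmax(logits)[label])

probs = softmax([2.0, 0.0])       # 2-class "raw_pred_2d" for one example
loss = cross_entropy([2.0, 0.0], 1)
```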

+ 0 - 142
recommend-model-produce/src/main/python/models/dnn/net.py

@@ -1,142 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import paddle
-import paddle.nn as nn
-import paddle.nn.functional as F
-import math
-
-
-class DNNLayer(nn.Layer):
-    def __init__(self,
-                 sparse_feature_number,
-                 sparse_feature_dim,
-                 dense_feature_dim,
-                 num_field,
-                 layer_sizes,
-                 sync_mode=None):
-        super(DNNLayer, self).__init__()
-        self.sync_mode = sync_mode
-        self.sparse_feature_number = sparse_feature_number
-        self.sparse_feature_dim = sparse_feature_dim
-        self.dense_feature_dim = dense_feature_dim
-        self.num_field = num_field
-        self.layer_sizes = layer_sizes
-
-        use_sparse = True
-        if paddle.is_compiled_with_custom_device('npu'):
-            use_sparse = False
-
-        self.embedding = paddle.nn.Embedding(
-            self.sparse_feature_number,
-            self.sparse_feature_dim,
-            sparse=use_sparse,
-            weight_attr=paddle.ParamAttr(
-                name="SparseFeatFactors",
-                initializer=paddle.nn.initializer.Uniform()))
-
-        sizes = [sparse_feature_dim * num_field + dense_feature_dim
-                 ] + self.layer_sizes + [2]
-        acts = ["relu" for _ in range(len(self.layer_sizes))] + [None]
-        self._mlp_layers = []
-        for i in range(len(layer_sizes) + 1):
-            linear = paddle.nn.Linear(
-                in_features=sizes[i],
-                out_features=sizes[i + 1],
-                weight_attr=paddle.ParamAttr(
-                    initializer=paddle.nn.initializer.Normal(
-                        std=1.0 / math.sqrt(sizes[i]))))
-            self.add_sublayer('linear_%d' % i, linear)
-            self._mlp_layers.append(linear)
-            if acts[i] == 'relu':
-                act = paddle.nn.ReLU()
-                self.add_sublayer('act_%d' % i, act)
-                self._mlp_layers.append(act)
-
-    def forward(self, sparse_inputs, dense_inputs, show_click=None):
-
-        sparse_embs = []
-        for s_input in sparse_inputs:
-            if self.sync_mode == "gpubox":
-                emb = paddle.static.nn.sparse_embedding(
-                    input=s_input,
-                    size=[
-                        self.sparse_feature_number, self.sparse_feature_dim + 2
-                    ],
-                    param_attr=paddle.ParamAttr(name="embedding"))
-                emb = paddle.static.nn.continuous_value_model(emb, show_click,
-                                                              False)
-            else:
-                emb = self.embedding(s_input)
-            emb = paddle.reshape(emb, shape=[-1, self.sparse_feature_dim])
-            sparse_embs.append(emb)
-
-        y_dnn = paddle.concat(x=sparse_embs + [dense_inputs], axis=1)
-
-        for n_layer in self._mlp_layers:
-            y_dnn = n_layer(y_dnn)
-
-        return y_dnn
-
-
-class StaticDNNLayer(nn.Layer):
-    def __init__(self, sparse_feature_number, sparse_feature_dim,
-                 dense_feature_dim, num_field, layer_sizes):
-        super(StaticDNNLayer, self).__init__()
-        self.sparse_feature_number = sparse_feature_number
-        self.sparse_feature_dim = sparse_feature_dim
-        self.dense_feature_dim = dense_feature_dim
-        self.num_field = num_field
-        self.layer_sizes = layer_sizes
-
-        #self.embedding = paddle.nn.Embedding(
-        #    self.sparse_feature_number,
-        #    self.sparse_feature_dim,
-        #    sparse=True,
-        #    weight_attr=paddle.ParamAttr(
-        #        name="SparseFeatFactors",
-        #        initializer=paddle.nn.initializer.Uniform()))
-
-        sizes = [sparse_feature_dim * num_field + dense_feature_dim
-                 ] + self.layer_sizes + [2]
-        acts = ["relu" for _ in range(len(self.layer_sizes))] + [None]
-        self._mlp_layers = []
-        for i in range(len(layer_sizes) + 1):
-            linear = paddle.nn.Linear(
-                in_features=sizes[i],
-                out_features=sizes[i + 1],
-                weight_attr=paddle.ParamAttr(
-                    initializer=paddle.nn.initializer.Normal(
-                        std=1.0 / math.sqrt(sizes[i]))))
-            self.add_sublayer('linear_%d' % i, linear)
-            self._mlp_layers.append(linear)
-            if acts[i] == 'relu':
-                act = paddle.nn.ReLU()
-                self.add_sublayer('act_%d' % i, act)
-                self._mlp_layers.append(act)
-
-    def forward(self, sparse_embs, dense_inputs):
-
-        #sparse_embs = []
-        #for s_input in sparse_inputs:
-        #    emb = self.embedding(s_input)
-        #    emb = paddle.reshape(emb, shape=[-1, self.sparse_feature_dim])
-        #    sparse_embs.append(emb)
-
-        y_dnn = paddle.concat(x=sparse_embs + [dense_inputs], axis=1)
-
-        for n_layer in self._mlp_layers:
-            y_dnn = n_layer(y_dnn)
-
-        return y_dnn
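The `sizes` list built in `DNNLayer.__init__` stitches the concatenated embedding/dense width to the hidden layers and the 2-class output. With the hyper-parameters used in this commit's configs (26 sparse slots of dim 10, 13 dense features, `fc_sizes: [400, 400, 400]`), a quick sketch of that computation:

```python
sparse_feature_dim = 10
num_field = 26        # sparse_inputs_slots - 1: the label slot is excluded
dense_feature_dim = 13
layer_sizes = [400, 400, 400]

# same expression as DNNLayer.__init__:
# input width, then hidden widths, then the 2-class output
sizes = [sparse_feature_dim * num_field + dense_feature_dim] + layer_sizes + [2]
```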

+ 0 - 49
recommend-model-produce/src/main/python/models/dnn/online.yaml

@@ -1,49 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-hyper_parameters:
-  optimizer:
-    class: Adam
-    learning_rate: 0.0001
-    adam_lazy_mode: True
-  sparse_inputs_slots: 27
-  sparse_feature_number: 1000001
-  sparse_feature_dim: 10
-  dense_input_dim: 13
-  fc_sizes: [400, 400, 400]
-
-runner:
-  train_data_dir: ["data/sample_data/train/"]
-  days: "{20191225..20191227}"
-  pass_per_day: 24
-
-  train_batch_size: 12
-  train_thread_num: 16
-  geo_step: 400
-  sync_mode: "async"  # sync / async /geo / heter
-
-  pipe_command: "python benchmark_reader.py"
-  print_interval: 100
-
-  use_gpu: 0
-
-  model_path: "static_model.py"
-  dataset_debug: False
-  model_save_path: "model"
-
-  # knock-in and knock-out
-  # create_num_threshold: 1      # knock-in
-  # max_keep_days: 60            # knock-out
-
-  
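online.yaml is loaded through a YamlHelper and queried with dotted keys such as `hyper_parameters.optimizer.learning_rate`. A plausible sketch of that flattening — the `flatten` helper here is our assumption about the behavior, not the repo's actual YamlHelper:

```python
def flatten(cfg, prefix=""):
    """Flatten nested dicts into {'a.b.c': value} dotted keys."""
    flat = {}
    for key, value in cfg.items():
        dotted = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, dotted + "."))
        else:
            flat[dotted] = value
    return flat

cfg = flatten({"hyper_parameters": {"optimizer": {"learning_rate": 0.0001},
                                    "dense_input_dim": 13}})
lr = cfg.get("hyper_parameters.optimizer.learning_rate")
```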

+ 0 - 95
recommend-model-produce/src/main/python/models/dnn/queuedataset_reader.py

@@ -1,95 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-import sys
-import yaml
-import six
-import os
-import copy
-import paddle.distributed.fleet as fleet
-import logging
-import numpy as np
-
-logging.basicConfig(
-    format='%(asctime)s - %(levelname)s - %(message)s', level=logging.INFO)
-logger = logging.getLogger(__name__)
-
-
-class Reader(fleet.MultiSlotDataGenerator):
-    def init(self, config):
-        self.config = config
-        padding = 0
-        sparse_slots = "click 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26"
-        self.sparse_slots = sparse_slots.strip().split(" ")
-        self.dense_slots = ["dense_feature"]
-        self.dense_slots_shape = [13]
-        self.slots = self.sparse_slots + self.dense_slots
-        self.slot2index = {}
-        self.visit = {}
-        for i in range(len(self.slots)):
-            self.slot2index[self.slots[i]] = i
-            self.visit[self.slots[i]] = False
-        self.padding = padding
-        logger.info("pipe init success")
-
-    def line_process(self, line):
-        line = line.strip().split(" ")
-        output = [(i, []) for i in self.slots]
-        for i in line:
-            slot_feasign = i.split(":")
-            slot = slot_feasign[0]
-            if slot not in self.slots:
-                continue
-            if slot in self.sparse_slots:
-                feasign = int(slot_feasign[1])
-            else:
-                feasign = float(slot_feasign[1])
-            output[self.slot2index[slot]][1].append(feasign)
-            self.visit[slot] = True
-        for i in self.visit:
-            slot = i
-            if not self.visit[slot]:
-                if i in self.dense_slots:
-                    output[self.slot2index[i]][1].extend(
-                        [self.padding] *
-                        self.dense_slots_shape[self.slot2index[i]])
-                else:
-                    output[self.slot2index[i]][1].extend([self.padding])
-            else:
-                self.visit[slot] = False
-
-        return output
-        #return [label] + sparse_feature + [dense_feature]
-    def generate_sample(self, line):
-        r"Dataset Generator"
-
-        def reader():
-            output_dict = self.line_process(line)
-            # {key, value} dict format: {'labels': [1], 'sparse_slot1': [2, 3], 'sparse_slot2': [4, 5, 6, 8], 'dense_slot': [1,2,3,4]} 
-            # dict must match static_model.create_feed()
-            yield output_dict
-
-        return reader
-
-
-if __name__ == "__main__":
-    yaml_path = sys.argv[1]
-    utils_path = sys.argv[2]
-    sys.path.append(utils_path)
-    import common_ps
-    yaml_helper = common_ps.YamlHelper()
-    config = yaml_helper.load_yaml(yaml_path)
-
-    r = Reader()
-    r.init(config)
-    r.run_from_stdin()
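`line_process` above buckets each `slot:value` token into the configured slots and pads any slot that never appears. A condensed framework-free sketch of that bucketing, with our own names:

```python
def bucket_line(line, sparse_slots, dense_slots, dense_shape, padding=0):
    """Bucket 'slot:value' tokens by slot; pad slots absent from the line."""
    slots = sparse_slots + dense_slots
    output = {slot: [] for slot in slots}
    for token in line.strip().split(" "):
        slot, _, value = token.partition(":")
        if slot not in output:
            continue
        output[slot].append(int(value) if slot in sparse_slots else float(value))
    # pad empty slots, as the reader does: 1 value for sparse, full width for dense
    for i, slot in enumerate(slots):
        if not output[slot]:
            width = dense_shape[i - len(sparse_slots)] if slot in dense_slots else 1
            output[slot] = [padding] * width
    return output

out = bucket_line("click:1 1:737395 dense_feature:0.5",
                  ["click", "1", "2"], ["dense_feature"], [13])
```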

+ 0 - 129
recommend-model-produce/src/main/python/models/dnn/static_model.py

@@ -1,129 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import math
-import paddle
-
-from net import DNNLayer, StaticDNNLayer
-
-
-class StaticModel():
-    def __init__(self, config):
-        self.cost = None
-        self.infer_target_var = None
-        self.config = config
-        self._init_hyper_parameters()
-        self.sync_mode = config.get("runner.sync_mode")
-
-    def _init_hyper_parameters(self):
-        self.is_distributed = False
-        self.distributed_embedding = False
-
-        if self.config.get("hyper_parameters.distributed_embedding", 0) == 1:
-            self.distributed_embedding = True
-
-        self.sparse_feature_number = self.config.get(
-            "hyper_parameters.sparse_feature_number")
-        self.sparse_feature_dim = self.config.get(
-            "hyper_parameters.sparse_feature_dim")
-        self.sparse_inputs_slots = self.config.get(
-            "hyper_parameters.sparse_inputs_slots")
-        self.dense_input_dim = self.config.get(
-            "hyper_parameters.dense_input_dim")
-        self.learning_rate = self.config.get(
-            "hyper_parameters.optimizer.learning_rate")
-        self.fc_sizes = self.config.get("hyper_parameters.fc_sizes")
-
-    def create_feeds(self, is_infer=False):
-        dense_input = paddle.static.data(
-            name="dense_input",
-            shape=[None, self.dense_input_dim],
-            dtype="float32")
-
-        # sparse_input_ids = [
-        #     paddle.static.data(
-        #         name="C" + str(i), shape=[None, 1], dtype="int64")
-        #     for i in range(1, self.sparse_inputs_slots)
-        # ]
-
-        sparse_input_ids = [
-            paddle.static.data(
-                name=str(i), shape=[None, 1], dtype="int64")
-            for i in range(1, self.sparse_inputs_slots)
-        ]
-
-        label = paddle.static.data(
-            name="label", shape=[None, 1], dtype="int64")
-
-        feeds_list = [label] + sparse_input_ids + [dense_input]
-        return feeds_list
-
-    def net(self, input, is_infer=False):
-        self.label_input = input[0]
-        self.sparse_inputs = input[1:self.sparse_inputs_slots]
-        self.dense_input = input[-1]
-        sparse_number = self.sparse_inputs_slots - 1
-
-        dnn_model = DNNLayer(
-            self.sparse_feature_number,
-            self.sparse_feature_dim,
-            self.dense_input_dim,
-            sparse_number,
-            self.fc_sizes,
-            sync_mode=self.sync_mode)
-
-        self.cast_label = paddle.cast(self.label_input, dtype='float32')
-        ones = paddle.full_like(
-            self.label_input, fill_value=1, dtype="float32")
-        show_click = paddle.cast(
-            paddle.concat(
-                [ones, self.cast_label], axis=1), dtype='float32')
-        show_click.stop_gradient = True
-        raw_predict_2d = dnn_model.forward(self.sparse_inputs,
-                                           self.dense_input, show_click)
-
-        predict_2d = paddle.nn.functional.softmax(raw_predict_2d)
-
-        self.predict = predict_2d
-
-        auc, batch_auc, [
-            self.batch_stat_pos, self.batch_stat_neg, self.stat_pos,
-            self.stat_neg
-        ] = paddle.static.auc(input=self.predict,
-                              label=self.label_input,
-                              num_thresholds=2**12,
-                              slide_steps=20)
-        self.inference_target_var = auc
-        if is_infer:
-            fetch_dict = {'auc': auc}
-            return fetch_dict
-
-        cost = paddle.nn.functional.cross_entropy(
-            input=raw_predict_2d, label=self.label_input)
-        avg_cost = paddle.mean(x=cost)
-        self._cost = avg_cost
-
-        fetch_dict = {'cost': avg_cost, 'auc': auc}
-        return fetch_dict
-
-    def create_optimizer(self, strategy=None):
-        optimizer = paddle.optimizer.Adam(
-            learning_rate=self.learning_rate, lazy_mode=True)
-        if strategy is not None:
-            import paddle.distributed.fleet as fleet
-            optimizer = fleet.distributed_optimizer(optimizer, strategy)
-        optimizer.minimize(self._cost)
-
-    def infer_net(self, input):
-        return self.net(input, is_infer=True)
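The `show_click` tensor built in `net` above pairs a constant show of 1 with the clicked label for each example, which `continuous_value_model` consumes in gpubox mode. A plain-Python sketch of that construction, with our own names:

```python
def build_show_click(labels):
    """One [show, click] row per example: show is always 1, click is the label."""
    return [[1.0, float(label)] for label in labels]

rows = build_show_click([0, 1, 1])
```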

+ 0 - 138
recommend-model-produce/src/main/python/models/dnn/static_model_lod.py

@@ -1,138 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import math
-import paddle
-
-from net import DNNLayer, StaticDNNLayer
-
-
-class StaticModel():
-    def __init__(self, config):
-        self.cost = None
-        self.infer_target_var = None
-        self.config = config
-        self._init_hyper_parameters()
-
-    def _init_hyper_parameters(self):
-        self.is_distributed = False
-        self.distributed_embedding = False
-
-        if self.config.get("hyper_parameters.distributed_embedding", 0) == 1:
-            self.distributed_embedding = True
-
-        self.sparse_feature_number = self.config.get(
-            "hyper_parameters.sparse_feature_number")
-        self.sparse_feature_dim = self.config.get(
-            "hyper_parameters.sparse_feature_dim")
-        self.sparse_inputs_slots = self.config.get(
-            "hyper_parameters.sparse_inputs_slots")
-        self.dense_input_dim = self.config.get(
-            "hyper_parameters.dense_input_dim")
-        self.learning_rate = self.config.get(
-            "hyper_parameters.optimizer.learning_rate")
-        self.fc_sizes = self.config.get("hyper_parameters.fc_sizes")
-
-    def create_feeds(self, is_infer=False):
-        dense_input = paddle.static.data(
-            name="dense_input",
-            shape=[None, self.dense_input_dim],
-            dtype="float32")
-
-        sparse_input_ids = [
-            paddle.static.data(
-                name="C" + str(i), shape=[None, 1], lod_level=1, dtype="int64")
-            for i in range(1, self.sparse_inputs_slots)
-        ]
-
-        label = paddle.static.data(
-            name="label", shape=[None, 1], dtype="int64")
-
-        feeds_list = [label] + sparse_input_ids + [dense_input]
-        return feeds_list
-
-    def net(self, input, is_infer=False):
-        self.label_input = input[0]
-        self.sparse_inputs = input[1:self.sparse_inputs_slots]
-        self.dense_input = input[-1]
-        sparse_number = self.sparse_inputs_slots - 1
-
-        def embedding_layer(input):
-            if self.distributed_embedding:
-                emb = paddle.static.nn.sparse_embedding(
-                    input=input,
-                    size=[
-                        self.sparse_feature_number, self.sparse_feature_dim
-                    ],
-                    param_attr=paddle.ParamAttr(
-                        name="SparseFeatFactors",
-                        initializer=paddle.nn.initializer.Uniform()))
-            else:
-                paddle.static.Print(input)
-
-                emb = paddle.static.nn.embedding(
-                    input=input,
-                    is_sparse=True,
-                    is_distributed=self.is_distributed,
-                    size=[
-                        self.sparse_feature_number, self.sparse_feature_dim
-                    ],
-                    param_attr=paddle.ParamAttr(
-                        name="SparseFeatFactors",
-                        initializer=paddle.nn.initializer.Uniform()))
-            emb_sum = paddle.static.nn.sequence_pool(
-                input=emb, pool_type='sum')
-            squeeze_emb_sum = paddle.squeeze(emb_sum, axis=1)
-            #return emb_sum
-            return squeeze_emb_sum
-
-        sparse_embs = list(map(embedding_layer, self.sparse_inputs))
-
-        dnn_model = StaticDNNLayer(
-            self.sparse_feature_number, self.sparse_feature_dim,
-            self.dense_input_dim, sparse_number, self.fc_sizes)
-
-        raw_predict_2d = dnn_model.forward(sparse_embs, self.dense_input)
-
-        predict_2d = paddle.nn.functional.softmax(raw_predict_2d)
-
-        self.predict = predict_2d
-
-        auc, batch_auc, _ = paddle.static.auc(input=self.predict,
-                                              label=self.label_input,
-                                              num_thresholds=2**12,
-                                              slide_steps=20)
-        self.inference_target_var = auc
-        if is_infer:
-            fetch_dict = {'auc': auc}
-            return fetch_dict
-
-        cost = paddle.nn.functional.cross_entropy(
-            input=raw_predict_2d, label=self.label_input)
-        avg_cost = paddle.mean(x=cost)
-        self._cost = avg_cost
-
-        fetch_dict = {'cost': avg_cost, 'auc': auc}
-        return fetch_dict
-
-    def create_optimizer(self, strategy=None):
-        optimizer = paddle.optimizer.Adam(
-            learning_rate=self.learning_rate, lazy_mode=True)
-        if strategy is not None:
-            import paddle.distributed.fleet as fleet
-            optimizer = fleet.distributed_optimizer(optimizer, strategy)
-        optimizer.minimize(self._cost)
-
-    def infer_net(self, input):
-        return self.net(input, is_infer=True)
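The removed static net's `embedding_layer` does three things per sparse slot: look up one embedding row per feature id, sum-pool the variable-length sequence, and squeeze away the extra axis. A dependency-free sketch of that pooling step (the toy embedding table below is a made-up assumption, not the trained `SparseFeatFactors` parameter):

```python
# Hypothetical 3-row, 2-dim embedding table standing in for SparseFeatFactors.
EMB = {0: [1.0, 2.0], 1: [3.0, 4.0], 2: [5.0, 6.0]}

def sum_pool_embeddings(ids):
    """Rough equivalent of embedding lookup + sequence_pool(pool_type='sum')
    followed by the squeeze: returns a single 1-D pooled vector."""
    dim = len(next(iter(EMB.values())))
    pooled = [0.0] * dim
    for i in ids:
        # element-wise accumulation of this id's embedding row
        pooled = [p + r for p, r in zip(pooled, EMB[i])]
    return pooled

pooled = sum_pool_embeddings([0, 2])  # [1+5, 2+6] -> [6.0, 8.0]
```

The squeeze matters because `sequence_pool` keeps a length-1 sequence axis; the downstream FC layers expect a flat `[batch, dim]` tensor.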

+ 0 - 45
recommend-model-produce/src/main/python/models/dssm/bq_reader_infer.py

@@ -1,45 +0,0 @@
-#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import print_function
-import numpy as np
-
-from paddle.io import IterableDataset
-
-
-class RecDataset(IterableDataset):
-    def __init__(self, file_list, config):
-        super(RecDataset, self).__init__()
-        self.file_list = file_list
-        # feature_dim is used in __iter__ but was never set anywhere;
-        # the config key below is an assumption
-        self.feature_dim = config.get("hyper_parameters.feature_dim")
-
-    def __iter__(self):
-        for file in self.file_list:
-            with open(file, "r") as rf:
-                for line in rf:
-                    sample_values = line.rstrip('\n').split('    ')
-                    sample_id, left_features = sample_values
-                    # parse the left-video feature vector
-                    left_features = [float(x) for x in left_features.split(',')]
-                    # validate the feature dimension
-                    if len(left_features) != self.feature_dim:
-                        continue  # skip malformed rows; a bare return would end iteration
-
-                    # build the output list
-                    output = []
-                    output.append(("sample_id", [sample_id]))  # sample ID
-                    output.append(("left_features", left_features))  # left-video features
-
-                    yield output
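The per-line parsing the removed `bq_reader_infer.py` attempted can be sketched as a standalone function. The tab delimiter and the `"sample_id<TAB>comma-separated floats"` layout here are assumptions (the deleted reader split on a run of spaces), so treat this as an illustration, not the confirmed data format:

```python
def parse_infer_line(line, feature_dim):
    """Parse one inference row into the (name, values) pairs the reader yields.
    Returns None for rows whose feature vector has the wrong dimension."""
    sample_id, left = line.rstrip('\n').split('\t')  # delimiter is an assumption
    left_features = [float(x) for x in left.split(',')]
    if len(left_features) != feature_dim:
        return None  # caller should skip, not abort
    return [("sample_id", [sample_id]), ("left_features", left_features)]

row = parse_infer_line("vid_42\t0.5,1.5,2.0\n", feature_dim=3)
```

Keeping the parse in a pure function like this makes the dimension check testable outside the `IterableDataset` machinery.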

+ 0 - 48
recommend-model-produce/src/main/python/models/dssm/bq_reader_train.py

@@ -1,48 +0,0 @@
-#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import print_function
-import numpy as np
-
-from paddle.io import IterableDataset
-
-
-class RecDataset(IterableDataset):
-    def __init__(self, file_list, config):
-        super(RecDataset, self).__init__()
-        self.file_list = file_list
-
-    def __iter__(self):
-        for file in self.file_list:
-            with open(file, "r") as rf:
-                for line in rf:
-                    output_list = []
-                    features = line.rstrip('\n').split('\t')
-                    query = [
-                        float(feature) for feature in features[0].split(',')
-                    ]
-                    output_list.append(np.array(query).astype('float32'))
-                    pos_doc = [
-                        float(feature) for feature in features[1].split(',')
-                    ]
-                    output_list.append(np.array(pos_doc).astype('float32'))
-
-                    for i in range(len(features) - 2):
-                        output_list.append(
-                            np.array([
-                                float(feature)
-                                for feature in features[i + 2].split(',')
-                            ]).astype('float32'))
-                    yield output_list
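The train reader above consumes tab-separated fields where each field is a comma-separated float vector: field 0 is the query, field 1 the positive doc, and any remaining fields are negative docs. A minimal sketch of that parse, without the numpy conversion:

```python
def parse_train_line(line):
    """Split one training row into query, positive doc, and negative docs,
    mirroring the field layout the removed bq_reader_train.py expects."""
    fields = line.rstrip('\n').split('\t')
    vectors = [[float(v) for v in f.split(',')] for f in fields]
    query, pos_doc, neg_docs = vectors[0], vectors[1], vectors[2:]
    return query, pos_doc, neg_docs

q, p, negs = parse_train_line("1,2\t3,4\t5,6\t7,8")
```

In the real reader each vector is additionally wrapped with `np.array(...).astype('float32')` before being yielded.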

+ 0 - 82
recommend-model-produce/src/main/python/models/dssm/bq_reader_train_insid.py

@@ -1,82 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-import sys
-import yaml
-import six
-import os
-import copy
-import paddle.distributed.fleet as fleet
-import logging
-
-logging.basicConfig(
-    format='%(asctime)s - %(levelname)s - %(message)s', level=logging.INFO)
-logger = logging.getLogger(__name__)
-
-
-class Reader(fleet.MultiSlotStringDataGenerator):
-    def init(self, config):
-        self.config = config
-        self.neg_num = self.config.get("hyper_parameters.neg_num")
-
-    def line_process(self, line):
-        data = line.rstrip('\n').split('\t')
-        ins_id = [data[0]]
-        content = [data[1]]
-        features = data[2:]
-        query = features[0].split(',')
-        pos_doc = features[1].split(',')
-
-        neg_doc_list = []
-        for i in range(self.neg_num):
-            neg_doc_list.append(features[i + 2].split(','))
-
-        return [ins_id, content, query, pos_doc] + neg_doc_list
-
-    def generate_sample(self, line):
-        "Dataset Generator"
-
-        def reader():
-            input_data = self.line_process(line)
-            feature_name = ["insid", "content", "query", "pos_doc"]
-            for i in range(self.neg_num):
-                feature_name.append("neg_doc_{}".format(i))
-            yield zip(feature_name, input_data)
-
-        return reader
-
-    def dataloader(self, file_list):
-        "DataLoader Pyreader Generator"
-
-        def reader():
-            for file in file_list:
-                with open(file, 'r') as f:
-                    for line in f:
-                        input_data = self.line_process(line)
-                        yield input_data
-
-        return reader
-
-
-if __name__ == "__main__":
-    yaml_path = sys.argv[1]
-    utils_path = sys.argv[2]
-    sys.path.append(utils_path)
-    import common_ps
-    yaml_helper = common_ps.YamlHelper()
-    config = yaml_helper.load_yaml(yaml_path)
-
-    r = Reader()
-    r.init(config)
-    # r.init(None)
-    r.run_from_stdin()
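The pipe reader above feeds string slots to the fleet dataset: each line carries `ins_id`, `content`, then comma-separated id lists. A standalone sketch of its `line_process` (the `neg_num=1` default matches the config files in this commit, but is otherwise an assumption):

```python
def line_process(line, neg_num=1):
    """Split one training row into the string slots the removed
    fleet.MultiSlotStringDataGenerator reader emits."""
    data = line.rstrip('\n').split('\t')
    ins_id, content = [data[0]], [data[1]]  # single-element string slots
    features = data[2:]
    query, pos_doc = features[0].split(','), features[1].split(',')
    neg_docs = [features[i + 2].split(',') for i in range(neg_num)]
    return [ins_id, content, query, pos_doc] + neg_docs

slots = line_process("id1\ttext\t1,2\t3,4\t5,6")
```

Note the slots stay as strings end to end; with `MultiSlotStringDataGenerator` the framework, not the reader, handles any numeric conversion.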

+ 0 - 41
recommend-model-produce/src/main/python/models/dssm/config.yaml

@@ -1,41 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-runner:
-  train_data_dir: "data/train"
-  train_reader_path: "bq_reader_train"  # importlib format
-  train_batch_size: 8
-  model_save_path: "output_model_dssm"
-
-  use_gpu: False
-  epochs: 1
-  print_interval: 10
-  
-  test_data_dir: "data/test"
-  infer_reader_path: "bq_reader_infer"  # importlib format
-  infer_batch_size: 1
-  infer_load_path: "output_model_dssm"
-  infer_start_epoch: 0
-  infer_end_epoch: 1
-
-hyper_parameters:
-  optimizer:
-    class: adam
-    learning_rate: 0.001
-    strategy: sync
-  trigram_d: 2900
-  neg_num: 1
-  slice_end: 8
-  fc_sizes: [300, 300, 128]
-  fc_acts: ['relu', 'relu', 'relu']

+ 0 - 42
recommend-model-produce/src/main/python/models/dssm/config_bigdata.yaml

@@ -1,42 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-
-runner:
-  train_data_dir: "../../../datasets/BQ_dssm/big_train"
-  train_reader_path: "bq_reader_train"  # importlib format
-  train_batch_size: 128
-  model_save_path: "output_model_all_dssm"
-
-  use_gpu: False
-  epochs: 1
-  print_interval: 1
-  
-  test_data_dir: "../../../datasets/BQ_dssm/big_test"
-  infer_reader_path: "bq_reader_infer"  # importlib format
-  infer_batch_size: 1
-  infer_load_path: "output_model_all_dssm"
-  infer_start_epoch: 0
-  infer_end_epoch: 1
-
-hyper_parameters:
-  optimizer:
-    class: adam
-    learning_rate: 0.001
-    strategy: sync
-  trigram_d: 5913
-  neg_num: 1
-  slice_end: 128
-  fc_sizes: [300, 300, 128]
-  fc_acts: ['relu', 'relu', 'relu']

+ 9 - 7
recommend-model-produce/src/main/python/models/dssm/config_ps.yaml → recommend-model-produce/src/main/python/models/dssm/config_infer.yaml

@@ -14,26 +14,29 @@
 
 runner:
   train_data_dir: "/dw/recommend/model/55_dssm_i2i_traindata/"
-  train_reader_path: "bq_reader_train"  # importlib format
-  train_batch_size: 64
+  train_reader_path: "bq_reader_train_ps"  # importlib format
+  train_batch_size: 1024
   model_save_path: "output_model_dssm"
+  split_file_list: true
 
   reader_type: "QueueDataset"  # DataLoader / QueueDataset / RecDataset
   pipe_command: "python bq_reader_train_ps.py"
   thread_num: 1
   sync_mode: "async"
 
+
   use_gpu: False
-  epochs: 5
+  epochs: 3
   print_interval: 1
   
-  test_data_dir: "data/test"
-  infer_reader_path: "bq_reader_infer"  # importlib format
+  test_data_dir: "/dw/recommend/model/56_dssm_i2i_itempredData/20241206/part-00016.gz"
+  infer_reader_path: "bq_reader_train_ps"  # importlib format
   infer_batch_size: 1
-  infer_load_path: "output_model_dssm"
+  infer_load_path: "/app/output_model_dssm"
   infer_start_epoch: 0
   infer_end_epoch: 1
   infer_reader_type: "QueueDataset"
+  
   oss_object_name: "dyp/dssm.tar.gz"
 
 hyper_parameters:
@@ -43,7 +46,6 @@ hyper_parameters:
     strategy: sync
   trigram_d: 2900
   neg_num: 1
-  is_infer: True
   slice_end: 8
   fc_sizes: [300, 300, 128]
   fc_acts: ['relu', 'relu', 'relu']

+ 0 - 70
recommend-model-produce/src/main/python/models/dssm/config_online.yaml

@@ -1,70 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-runner:
-  train_data_dir: "data/train_with_insid"
-  test_data_dir: "data/test_with_insid"
-  # train_reader_path: "bq_reader_train"  # importlib format
-  days: "{20210803..20210804}"
-  pass_per_day: 1
-  train_batch_size: 8
-  test_batch_size: 8
-  model_save_path: "output_model_dssm"
-
-  reader_type: "InmemoryDataset"  # DataLoader / QueueDataset / RecDataset / InmemoryDataset
-  pipe_command: "python3 bq_reader_train_insid.py"
-
-  sync_mode: "async"
-  # thread_num: 1
-  train_thread_num: 1
-  test_thread_num: 1
-
-  use_gpu: False
-  epochs: 1
-  print_interval: 1
-
-  dataset_debug: False
-
-  # when you need to prune net, please set need_prune to True,
-  # and need to set prune_feed_vars and prune_target_var in static_model.py
-  need_prune: True
-
-  parse_ins_id: True
-  parse_content: True
-  
-  # when you need to dump fields and params in training, please set need_train_dump to True,
-  # and need to set train_dump_fields and train_dump_params in static_model.py
-  need_train_dump: True
-  # train_dump_fields_dir: "afs:/xxx"
-  train_dump_fields_dir: "./train_dump_data"
-
-  # when you need to dump fields in inference, please set need_infer_dump to True,
-  # and need to set infer_dump_fields in static_model.py
-  need_infer_dump: True
-  # infer_dump_fields_dir: "afs:/xxx"
-  infer_dump_fields_dir: "./infer_dump_data"
-
-  fs_name: "afs://xxx"
-  fs_ugi: "xxx,xxx"
-  
-hyper_parameters:
-  optimizer:
-    class: adam
-    learning_rate: 0.001
-    strategy: sync
-  trigram_d: 2900
-  neg_num: 1
-  slice_end: 8
-  fc_sizes: [300, 300, 128]
-  fc_acts: ['relu', 'relu', 'relu']

+ 13 - 9
recommend-model-produce/src/main/python/models/dssm/config_ps_bak.yaml → recommend-model-produce/src/main/python/models/dssm/config_trainer.yaml

@@ -13,27 +13,32 @@
 # limitations under the License.
 
 runner:
-  train_data_dir: "data/train"
-  train_reader_path: "bq_reader_train"  # importlib format
-  train_batch_size: 8
-  model_save_path: "output_model_dssm_debug"
+  train_data_dir: "/dw/recommend/model/55_dssm_i2i_traindata/"
+  train_reader_path: "bq_reader_train_ps"  # importlib format
+  train_batch_size: 1024
+  model_save_path: "output_model_dssm"
+  split_file_list: true
 
   reader_type: "QueueDataset"  # DataLoader / QueueDataset / RecDataset
   pipe_command: "python bq_reader_train_ps.py"
   thread_num: 1
   sync_mode: "async"
 
+
   use_gpu: False
-  epochs: 5
+  epochs: 3
   print_interval: 1
   
-  test_data_dir: "data/test"
-  infer_reader_path: "bq_reader_infer"  # importlib format
+  test_data_dir: "/dw/recommend/model/56_dssm_i2i_itempredData/20241206/part-00016.gz"
+  infer_reader_path: "bq_reader_train_ps"  # importlib format
   infer_batch_size: 1
-  infer_load_path: "output_model_dssm"
+  infer_load_path: "/app/output_model_dssm"
   infer_start_epoch: 0
   infer_end_epoch: 1
   infer_reader_type: "QueueDataset"
+  
+  oss_object_name: "dyp/dssm.tar.gz"
+
 hyper_parameters:
   optimizer:
     class: adam
@@ -41,7 +46,6 @@ hyper_parameters:
     strategy: sync
   trigram_d: 2900
   neg_num: 1
-  is_infer: True
   slice_end: 8
   fc_sizes: [300, 300, 128]
   fc_acts: ['relu', 'relu', 'relu']

+ 0 - 93
recommend-model-produce/src/main/python/models/dssm/dygraph_model.py

@@ -1,93 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import paddle
-import paddle.nn as nn
-import paddle.nn.functional as F
-import math
-import net
-
-
-class DygraphModel():
-    # define model
-    def create_model(self, config):
-        trigram_d = config.get('hyper_parameters.trigram_d', None)
-        neg_num = config.get('hyper_parameters.neg_num', None)
-        slice_end = config.get('hyper_parameters.slice_end', None)
-        fc_sizes = config.get('hyper_parameters.fc_sizes', None)
-        fc_acts = config.get('hyper_parameters.fc_acts', None)
-
-        DSSM_model = net.DSSMLayer(trigram_d, neg_num, slice_end, fc_sizes,
-                                   fc_acts)
-        return DSSM_model
-
-    # define feeds which convert numpy of batch data to paddle.tensor 
-    def create_feeds_train(self, batch_data, trigram_d):
-        query = paddle.to_tensor(batch_data[0].numpy().astype('float32')
-                                 .reshape(-1, trigram_d))
-        doc_pos = paddle.to_tensor(batch_data[1].numpy().astype('float32')
-                                   .reshape(-1, trigram_d))
-        doc_negs = []
-        for ele in batch_data[2:]:
-            doc_negs.append(
-                paddle.to_tensor(ele.numpy().astype('float32').reshape(
-                    -1, trigram_d)))
-        return [query, doc_pos] + doc_negs
-
-    def create_feeds_infer(self, batch_data, trigram_d):
-        query = paddle.to_tensor(batch_data[0].numpy().astype('float32')
-                                 .reshape(-1, trigram_d))
-        doc_pos = paddle.to_tensor(batch_data[1].numpy().astype('float32')
-                                   .reshape(-1, trigram_d))
-        return [query, doc_pos]
-
-    # define loss function by predicts and label
-    def create_loss(self, hit_prob):
-        loss = -paddle.sum(paddle.log(hit_prob), axis=-1)
-        avg_cost = paddle.mean(x=loss)
-        return avg_cost
-
-    # define optimizer 
-    def create_optimizer(self, dy_model, config):
-        lr = config.get("hyper_parameters.optimizer.learning_rate", 0.001)
-        optimizer = paddle.optimizer.Adam(
-            learning_rate=lr, parameters=dy_model.parameters())
-        return optimizer
-
-    # define metrics such as auc/acc
-    # multi-task need to define multi metric
-    def create_metrics(self):
-        metrics_list_name = []
-        metrics_list = []
-        return metrics_list, metrics_list_name
-
-    # construct train forward phase  
-    def train_forward(self, dy_model, metrics_list, batch_data, config):
-        trigram_d = config.get('hyper_parameters.trigram_d', None)
-        inputs = self.create_feeds_train(batch_data, trigram_d)
-
-        R_Q_D_p, hit_prob = dy_model.forward(inputs, False)
-        loss = self.create_loss(hit_prob)
-        # update metrics
-        print_dict = {"loss": loss}
-        return loss, metrics_list, print_dict
-
-    def infer_forward(self, dy_model, metrics_list, batch_data, config):
-        trigram_d = config.get('hyper_parameters.trigram_d', None)
-        inputs = self.create_feeds_infer(batch_data, trigram_d)
-
-        R_Q_D_p, hit_prob = dy_model.forward(inputs, True)
-        # update metrics
-        print_dict = {"query_doc_sim": R_Q_D_p}
-        return metrics_list, print_dict
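The `create_loss` in the removed dygraph model is the negative log-likelihood of the hit probability, averaged over the batch. A plain-Python sketch of the same arithmetic (Paddle's `paddle.sum`/`paddle.mean` replaced by list operations):

```python
import math

def dssm_loss(hit_prob):
    """-sum(log(hit_prob)) per sample, then batch mean, as in create_loss."""
    per_sample = [-math.log(p) for p in hit_prob]
    return sum(per_sample) / len(per_sample)

loss = dssm_loss([1.0, 1.0])  # perfect hit probabilities give zero loss
```

The loss goes to zero only when every hit probability is 1.0, i.e. the softmax over query-doc similarities puts all mass on the positive doc.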

+ 0 - 146
recommend-model-produce/src/main/python/models/dssm/net_bak.py

@@ -1,146 +0,0 @@
-import paddle
-import paddle.nn as nn
-import paddle.nn.functional as F
-import numpy as np
-
-class DSSMLayer(nn.Layer):
-    def __init__(self, feature_nums=[5,5,5,5,5], embedding_dim=8, output_dim=16, 
-                 hidden_layers=[40, 32], hidden_acts=["relu", "relu"]):
-        super(DSSMLayer, self).__init__()
-        
-        self.feature_num = len(feature_nums)
-        self.embedding_dim = embedding_dim
-        self.output_dim = output_dim
-        # the first layer's input size is the concatenation of all feature embeddings
-        self.hidden_layers = [self.feature_num * embedding_dim] + hidden_layers + [output_dim]
-        self.hidden_acts = hidden_acts
-
-        # create an Embedding layer of the matching size for each feature
-        self.left_embeddings = nn.LayerList([
-            nn.Embedding(
-                num_embeddings=feature_nums[i],
-                embedding_dim=embedding_dim,
-                weight_attr=paddle.ParamAttr(
-                    initializer=paddle.nn.initializer.XavierNormal()
-                )
-            ) for i in range(self.feature_num)
-        ])
-
-        self.right_embeddings = nn.LayerList([
-            nn.Embedding(
-                num_embeddings=feature_nums[i], 
-                embedding_dim=embedding_dim,
-                weight_attr=paddle.ParamAttr(
-                    initializer=paddle.nn.initializer.XavierNormal()
-                )
-            ) for i in range(self.feature_num)
-        ])
-
-        # left video tower
-        self._left_tower = []
-        for i in range(len(self.hidden_layers) - 1):
-            linear = paddle.nn.Linear(
-                in_features=self.hidden_layers[i],
-                out_features=self.hidden_layers[i + 1],
-                weight_attr=paddle.ParamAttr(
-                    initializer=paddle.nn.initializer.XavierNormal()
-                ),
-                bias_attr=paddle.ParamAttr(
-                    initializer=paddle.nn.initializer.Constant(value=0.0)
-                )
-            )
-            self.add_sublayer('left_linear_%d' % i, linear)
-            self._left_tower.append(linear)
-            
-            if i < len(hidden_acts) and self.hidden_acts[i] == "relu":
-                act = paddle.nn.ReLU()
-                self.add_sublayer('left_act_%d' % i, act)
-                self._left_tower.append(act)
-
-        # right video tower
-        self._right_tower = []
-        for i in range(len(self.hidden_layers) - 1):
-            linear = paddle.nn.Linear(
-                in_features=self.hidden_layers[i],
-                out_features=self.hidden_layers[i + 1],
-                weight_attr=paddle.ParamAttr(
-                    initializer=paddle.nn.initializer.XavierNormal()
-                ),
-                bias_attr=paddle.ParamAttr(
-                    initializer=paddle.nn.initializer.Constant(value=0.0)
-                )
-            )
-            self.add_sublayer('right_linear_%d' % i, linear)
-            self._right_tower.append(linear)
-            
-            if i < len(hidden_acts) and self.hidden_acts[i] == "relu":
-                act = paddle.nn.ReLU()
-                self.add_sublayer('right_act_%d' % i, act)
-                self._right_tower.append(act)
-
-    def _process_features(self, features, embeddings):
-        # convert each feature into its embedding
-        embedded_features = []
-        for i in range(self.feature_num):
-            feature = paddle.slice(
-                features, 
-                axes=[1], 
-                starts=[i], 
-                ends=[i+1]
-            )
-            feature = paddle.cast(feature, dtype='int64')
-            embedded = embeddings[i](feature)
-
-            embedded_features.append(embedded)
-
-        # concatenate all the embeddings
-        return paddle.concat(embedded_features, axis=1)
-
-    def forward(self, left_features, right_features):
-        # get the feature representations of the two videos
-        left_vec, right_vec = self.get_vectors(left_features, right_features)
-
-        # compute the similarity
-        sim_score = F.cosine_similarity(
-            left_vec, 
-            right_vec, 
-            axis=1
-        ).reshape([-1, 1])
-
-        return sim_score, left_vec, right_vec
-
-    def get_vectors(self, left_features, right_features):
-        """获取两个视频的16维特征向量"""
-        # 处理左视频特征
-        
-        left_embedded = self._process_features(left_features, self.left_embeddings)
-        
-        # left_vec = left_embedded
-        left_vec = paddle.reshape(left_embedded, [-1, self.feature_num * self.embedding_dim])
-              
-        
-
-        
-        
-        
-        for i, layer in enumerate(self._left_tower):
-            left_vec = layer(left_vec)
-
-        
-        # 处理右视频特征
-        right_embedded = self._process_features(right_features, self.right_embeddings)
-        # right_vec = right_embedded
-        right_vec = paddle.reshape(right_embedded, [-1, self.feature_num * self.embedding_dim])
-
-        for layer in self._right_tower:
-            right_vec = layer(right_vec)
-
-        # make sure the outputs are L2-normalized
-        left_vec = F.normalize(left_vec, p=2, axis=1)
-        right_vec = F.normalize(right_vec, p=2, axis=1)
-
-        return left_vec, right_vec
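Because both tower outputs are L2-normalized, the cosine similarity the `forward` pass computes reduces to a plain dot product. A minimal sketch of that identity (pure Python in place of `F.normalize` and `F.cosine_similarity`):

```python
import math

def l2_normalize(v):
    """Scale a vector to unit L2 norm, like F.normalize(v, p=2)."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def cosine(a, b):
    """Cosine similarity; on unit vectors this is just the dot product."""
    return sum(x * y for x, y in zip(l2_normalize(a), l2_normalize(b)))

score = cosine([3.0, 4.0], [6.0, 8.0])  # same direction -> similarity 1.0
```

This is also why the vectors are usable directly for approximate-nearest-neighbor retrieval: inner-product search over normalized vectors ranks the same as cosine similarity.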
-

+ 0 - 19
recommend-model-produce/src/main/python/models/dssm/run.sh

@@ -1,19 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#!/bin/bash
-echo "................run................."
-python -u ../../../tools/trainer.py -m config_bigdata.yaml
-python -u ../../../tools/infer.py -m config_bigdata.yaml &> result.txt
-python transform.py
-python ../../../tools/cal_pos_neg.py pair.txt

+ 0 - 112
recommend-model-produce/src/main/python/models/dssm/static_model_bak.py

@@ -1,112 +0,0 @@
-import math
-import paddle
-from net import DSSMLayer
-
-class StaticModel():
-    def __init__(self, config):
-        self.cost = None
-        self.config = config
-        self._init_hyper_parameters()
-
-    def _init_hyper_parameters(self):
-        # initialize hyper-parameters
-        self.feature_nums = self.config.get("hyper_parameters.feature_nums", [5,5,5,5,5])
-        self.embedding_dim = self.config.get("hyper_parameters.embedding_dim", 8)
-        self.output_dim = self.config.get("hyper_parameters.output_dim", 16)
-        self.hidden_layers = self.config.get("hyper_parameters.hidden_layers", [40, 32])
-        self.hidden_acts = self.config.get("hyper_parameters.hidden_acts", ["relu", "relu"])
-        self.learning_rate = self.config.get("hyper_parameters.optimizer.learning_rate", 0.001)
-        self.margin = self.config.get("hyper_parameters.margin", 0.3)  # margin used in the loss
-        self.feature_num = len(self.feature_nums)
-
-    def create_feeds(self, is_infer=False):
-        # define the input placeholders
-        # sample_id = paddle.static.data(
-        #    name="sample_id", shape=[-1, 1], dtype='int64')
-        feeds_list = []
-        if not is_infer:
-            label = paddle.static.data(
-                name="label", shape=[-1, 1], dtype='float32')
-            feeds_list.append(label)
-                    
-        left_features = paddle.static.data(
-            name="left_features", shape=[-1, self.feature_num], dtype='float32')
-        feeds_list.append(left_features)
-        right_features = paddle.static.data(
-            name="right_features", shape=[-1, self.feature_num], dtype='float32')
-        feeds_list.append(right_features)
-
-
-    def net(self, input, is_infer=False):
-        # build the model instance
-        dssm_model = DSSMLayer(
-            feature_nums=self.feature_nums,
-            embedding_dim=self.embedding_dim,
-            output_dim=self.output_dim,
-            hidden_layers=self.hidden_layers,
-            hidden_acts=self.hidden_acts
-        )
-
-        if is_infer:
-            left_features, right_features = input
-        else:
-            label, left_features, right_features = input
-
-        # get the similarity score and the feature vectors
-        sim_score, left_vec, right_vec = dssm_model(left_features, right_features)
-
-        self.inference_target_var = sim_score
-        self.left_vector = left_vec
-        self.right_vector = right_vec
-
-        if is_infer:
-            fetch_dict = {
-                'similarity': sim_score,
-                'left_vector': left_vec,
-                'right_vector': right_vec
-            }
-            return fetch_dict
-
-        # compute the loss: binary cross-entropy with a margin
-        pos_mask = paddle.cast(label > 0.5, 'float32')
-        neg_mask = 1.0 - pos_mask
-        
-        positive_loss = -pos_mask * paddle.log(paddle.clip(sim_score, 1e-8, 1.0))
-        negative_loss = -neg_mask * paddle.log(paddle.clip(1 - sim_score + self.margin, 1e-8, 1.0))
-        
-        loss = positive_loss + negative_loss
-        avg_cost = paddle.mean(loss)
-        
-        self._cost = avg_cost
-
-        # compute accuracy
-        predictions = paddle.cast(sim_score > 0.5, 'float32')
-        accuracy = paddle.mean(paddle.cast(paddle.equal(predictions, label), 'float32'))
-
-        fetch_dict = {
-            'loss': avg_cost,
-            'accuracy': accuracy,
-            #'similarity': sim_score,
-            #'left_vector': left_vec,
-            #'right_vector': right_vec
-        }
-        return fetch_dict
-
-    def create_optimizer(self, strategy=None):
-        optimizer = paddle.optimizer.Adam(
-            learning_rate=self.learning_rate)
-        if strategy is not None:
-            import paddle.distributed.fleet as fleet
-            optimizer = fleet.distributed_optimizer(optimizer, strategy)
-        optimizer.minimize(self._cost)
-
-    def infer_net(self, input):
-        return self.net(input, is_infer=True)
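The margin loss in the removed `static_model_bak.py` charges positives `-log(sim)` and negatives `-log(1 - sim + margin)`, with both arguments clipped to `[1e-8, 1.0]`; note the margin is applied only to the negative term. A plain-Python sketch of that per-sample loss:

```python
import math

def margin_bce(sim, label, margin=0.3):
    """Per-sample margin BCE mirroring the removed StaticModel loss:
    positives pay -log(sim), negatives pay -log(1 - sim + margin)."""
    clip = lambda x: min(max(x, 1e-8), 1.0)  # paddle.clip(x, 1e-8, 1.0)
    pos = 1.0 if label > 0.5 else 0.0
    return (-pos * math.log(clip(sim))
            - (1.0 - pos) * math.log(clip(1.0 - sim + margin)))

loss = margin_bce(0.5, 1.0)  # a lukewarm positive still pays -log(0.5)
```

Because of the clip, a negative with `sim <= margin` pays zero loss: the model is not pushed to separate pairs that are already farther apart than the margin.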

+ 0 - 78
recommend-model-produce/src/main/python/models/dssm/transform.py

@@ -1,78 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import random
-import numpy as np
-import sklearn.metrics
-
-filename = './result.txt'
-f = open(filename, "r")
-lines = f.readlines()
-f.close()
-result = []
-for line in lines:
-    if "query_doc_sim" in str(line):
-        result.append(line)
-result = result[:-1]
-f = open(filename, "w")
-for i in range(len(result)):
-    f.write(str(result[i]))
-f.close()
-
-label = []
-filename = '../../../datasets/BQ_dssm/label.txt'
-f = open(filename, "r")
-#f.readline()
-num = 0
-for line in f.readlines():
-    num = num + 1
-    line = line.strip()
-    label.append(line)
-f.close()
-print(num)
-
-filename = './result.txt'
-sim = []
-for line in open(filename):
-    line = line.strip().split(",")
-    line[3] = line[3].split(":")
-    line = line[3][1].strip(" ")
-    line = line.strip("[")
-    line = line.strip("]")
-    sim.append(float(line))
-
-filename = '../../../datasets/BQ_dssm/big_test/test.txt'
-f = open(filename, "r")
-#f.readline()
-query = []
-for line in f.readlines():
-    line = line.strip().split("\t")
-    query.append(line[0])
-f.close()
-
-
-def takeFirst(x):
-    return x[0]
-
-
-filename = 'pair.txt'
-line = []
-print(len(query), len(sim), len(label))
-for i in range(len(sim)):
-    line.append([str(query[i]), str(sim[i]), str(label[i])])
-line.sort(key=takeFirst)
-f = open(filename, "w")
-for i in line:
-    f.write(i[0] + "\t" + i[1] + "\t" + i[2] + "\n")
-f.close()

+ 0 - 100
recommend-model-produce/src/main/python/models/wide_and_deep_dataset/data/part-0

@@ -1,100 +0,0 @@
-0	2	0			957	45	2	36	45	1	1			05db9164	532da141			25c83c98	7e0ccccf	a24c8c8e	0b153874	a73ee510	2e8e8e87	41b3f655		ce5114a2	cfef1c29	abd8f51e		07c540c4	bdc06043				c9d4222a	32c7478e			
-0	0	53	7	7	3975	118	7	42	125	0	2		7	05db9164	421b43cd	a5866a2d	29998ed1	25c83c98	7e0ccccf	dc7659bd	0b153874	a73ee510	03e48276	e51ddf94	6aaba33c	3516f6e6	b28479f6	2d0bb053	b041b04a	e5ba7672	2804effd			723b4dfd		3a171ecb	b34f3128		
-0		6	2	0	31		0	0	0		0		0	05db9164	207b2d81	45be6cde	12f4ddf4	25c83c98	7e0ccccf	38eb9cf4	5b392875	a73ee510	f6f942d1	7f8ffe57	cc2a128e	46f42a63	b28479f6	899da9d5	d604ae35	1e88c74f	25c88e42	21ddcdc9	a458ea53	0bede81d	ad3062eb	32c7478e	8539ffbf	001f3601	c38250b6
-0	0	0	1	1	15395	386	5	1	286	0	5		1	5a9ed9b0	8db5bc37	da5fcf9d	9ee19407	43b19349	6f6d9be8	ff6b8352	0b153874	a73ee510	66581c5b	15eced00	8db6c305	bd251a95	64c94865	b4a59bc6	46ee2cca	e5ba7672	821c30b8			4f5bc5bf		32c7478e	7cff0908		
-0	1	0	1	5	122	5	4	15	16	1	2	0	5	05db9164	90081f33	a32e5301	05bbc70d	25c83c98	fe6b92e5	dda1fed2	0b153874	a73ee510	4ce8f99f	7f8ffe57	6c339dfb	46f42a63	64c94865	98995c3b	a4eb16fd	e5ba7672	7181ccc8			3404a71d		3a171ecb	cd7bf52a		
-1	1	0	15	10	1432	89	2	37	69	1	2		10	be589b51	38d50e09	e65d85e1	837cde00	43b19349	7e0ccccf	051659fe	0b153874	a73ee510	df41254d	84eb4210	3bcba45c	5fffcbe5	b28479f6	7501d6be	9cbaae80	07c540c4	f855e3f0	21ddcdc9	5840adea	6ba01fc4		32c7478e	afb24112	001f3601	aa5f0a15
-1	8	1	1	0	28	0	8	0	0	1	1		0	5a9ed9b0	4f25e98b	c273ad95	c91dd596	25c83c98	fbad5c96	b3ddf65a	0b153874	a73ee510	43a17f3d	e973bfd7	3ce47b07	439cd4cc	051219e6	b6dcf31f	8ceec535	e5ba7672	7ef5affa	5b885066	a458ea53	77fad21b	ad3062eb	32c7478e	8eb4ac8d	001f3601	e9e9ecb0
-0	1	33		7	1252	34	1	13	63	1	1		8	5a9ed9b0	b56822db	3f7d0844	da13cca9	25c83c98	13718bbd	4aa938fc	0b153874	a73ee510	451bd4e4	7e40f08a	7dcc3969	1aa94af3	07d13a8f	e2876dac	354a37d9	e5ba7672	38dce391	21ddcdc9	b1252a9d	7b877178		32c7478e	f2e9f0dd	001f3601	6c27a535
-0	0	2			3746	162	3	3	113	0	1			05db9164	38d50e09	1475848f	486da071	5a3e1872	13718bbd	83be4a6a	37e4aa92	a73ee510	802a7512	d9b1e3ff	5dddd37d	cd98af01	07d13a8f	e24ff4c6	bbc2ca43	07c540c4	f855e3f0	21ddcdc9	5840adea	e8ef1267		3a171ecb	b2f178a3	001f3601	c3546e32
-0		345	3	4	35324	219	14	5	202		1		4	05db9164	2c16a946	dafcaa19	ec142e08	25c83c98	7e0ccccf	ca280131	0b153874	a73ee510	3b08e48b	868a9e47	f57cde97	fc5dea81	07d13a8f	18231224	e496e010	e5ba7672	74ef3502			c606b8ba		423fab69	9117a34a		
-0		47	2	4	2751	6	22	5	6		3	0	4	68fd1e64	dde11b16	63f98b5e	8cc7b33b	25c83c98	fe6b92e5	57ef5d40	1f89b562	a73ee510	cce72b25	6ae20392	7becd6e7	78644930	b28479f6	53c5f305	eca39129	e5ba7672	43dfe9bd			cf158609	ad3062eb	dbb486d7	10b3e56d		
-0		8	30	7	2828		0	38	49		0		10	68fd1e64	b56822db	7da86e4b	b733e495	25c83c98	13718bbd	dcc1b63d	5b392875	a73ee510	46a7deb3	731cd88c	ed397d6b	34d253f7	b28479f6	a9d1ba1a	056d8866	1e88c74f	38dce391	21ddcdc9	b1252a9d	deaf6b52		32c7478e	d9556584	001f3601	6c27a535
-0	0	75			28293	227	4	6	370	0	3			98237733	a0baa1e8	91405292	84e93084	4cf72387	fbad5c96	ad3508b1	5b392875	a73ee510	79f15f43	ad757a5a	f2ae7fb8	93b18cb5	32813e21	cb1612e3	89f3e9bc	e5ba7672	3735c118			0b446b76	ad3062eb	c7dc6720	80c73723		
-0	1	185	1	0	0	0	1	0	0	1	1		0	87552397	b56822db	7da86e4b	b733e495	25c83c98	7e0ccccf	003baf94	0b153874	a73ee510	bde51b15	e973bfd7	ed397d6b	439cd4cc	b28479f6	a9d1ba1a	056d8866	d4bb7bd8	38dce391	21ddcdc9	b1252a9d	deaf6b52	ad3062eb	bcdee96c	d9556584	001f3601	6c27a535
-0	0	-1			3827	17	10	4	46	0	1	1		05db9164	8084ee93	02cf9876	c18be181	25c83c98	fbad5c96	07d03e2a	0b153874	a73ee510	49d1ad89	7f8ffe57	8fe001f4	46f42a63	07d13a8f	422c8577	36103458	3486227d	52e44668			e587c466		32c7478e	3b183c5c		
-1		0	12	8	2551	49	19	8	27		1	0	8	68fd1e64	d7988e72	897faa69	ddf2b58b	25c83c98	7e0ccccf	7df3abb9	0b153874	a73ee510	3b08e48b	83dba508	54943d67	09cd9f24	07d13a8f	194c42a4	0c1c9401	3486227d	0f2f9850	21ddcdc9	a458ea53	2bbcc5d0	c9d4222a	32c7478e	3fdb382b	445bbe3b	49d68486
-0	0	2	2	1	7367	680	1	22	191	0	1		1	68fd1e64	8db5bc37	6fdb098b	f181ea30	25c83c98	fe6b92e5	18ffd618	37e4aa92	a73ee510	e286f1e6	fd7856c1	2bdfd009	6a430a5b	64c94865	007fa274	aaede3da	e5ba7672	e7c97dee			fcd1a962		3a171ecb	b7c6f617		
-0		-1			39870			0						05db9164	b961056b	961d73a9	06b1cf6e	25c83c98	7e0ccccf	49042125	0b153874	7cc72ec2	7e0d83d4	ba1ff80a	d4fbf673	b95f83fa	ab7390e9	dd244129	20340c29	1e88c74f	8222ff64			d4fa2b9c		3a171ecb	0ff91809		
-0		606	21	3	1211		0	41	41		0		9	5bfa8ab5	207b2d81	e48e5552	9ddc492e	25c83c98		49eb0b1a	0b153874	a73ee510	b14d9951	2839b07a	e5e1ca92	383a5973	b28479f6	3c767806	29fd6b7b	1e88c74f	395856b0	21ddcdc9	b1252a9d	e5191f27		bcdee96c	c23c2e19	001f3601	3d04cc90
-0	2	1	2	1	159	7	7	7	58	2	4		1	17f69355	f3b07830	3f6f79a2	5327e675	4cf72387	7e0ccccf	38ae26b9	5b392875	a73ee510	42635bfd	aadb87b9	2cc77b94	e9332a03	64c94865	b64212a7	87343f95	e5ba7672	048d01f4			6549ede4		3a171ecb	c657e6e5		
-1	7	1	5	7	0	0	13	9	10	1	2		0	05db9164	6887a43c	7eb22712	ef4fd7f1	25c83c98	fe6b92e5	c0251c88	5b392875	a73ee510	c510044d	ada36e89	c71493ed	c63ea0b4	cfef1c29	9221b8f3	92e4b1e3	e5ba7672	08ed8a1c	21ddcdc9	b1252a9d	9adaf9fb	c9d4222a	55dd3565	b43c75ff	445bbe3b	7826e9ae
-0		-1			126865		0	3	1		0			05db9164	38a947a1	353686f2	f7263320	43b19349	fbad5c96	a2f7459e	0b153874	7cc72ec2	d3787b55	15eced00	317bfd7d	bd251a95	07d13a8f	2ab464b7	1689e4de	e5ba7672	90b7bec5			dc55d6df	ad3062eb	3a171ecb	aa0115d2		
-0	0	616	1	6	2473	88	2	49	71	0	1		20	05db9164	c76014f5	df00c4f0	031bba14	b0530c50	7e0ccccf	0808742e	5b392875	a73ee510	ca82d9dd	2115d03b	336bb1d7	40dfba03	b28479f6	a46c3543	631f0045	07c540c4	93b0d1d7			1fe472e2		32c7478e	89bd83a1		
-1		563	3	2	12500	47	4	2	50		1		2	05db9164	403ea497	2cbec47f	3e2bfbda	25c83c98		f33e4fa1	37e4aa92	a73ee510	e029047e	7b5deffb	21a23bfe	269889be	b28479f6	91f74a64	587267a3	e5ba7672	a78bd508	21ddcdc9	5840adea	c2a93b37		32c7478e	1793a828	e8b83407	2fede552
-0	1	16	13	5	187	32	1	32	32	1	1	0	5	05db9164	4f25e98b	557a2bcb	2e17d6f6	25c83c98	7e0ccccf	ee805808	6c41e35e	a73ee510	0f1ee62c	0601c4d9	2c934d21	c4bf1f3a	07d13a8f	5cedaf14	949bfd42	d4bb7bd8	c04ce6df	5e89f4c8	b1252a9d	20562a99	ad3062eb	32c7478e	9117a34a	001f3601	54ca28ff
-0	0	21	2	2	53	21	8	2	36	0	1		2	05db9164	c1384774	24c93e37	d772d0ec	25c83c98	fbad5c96	e31fb017	0b153874	a73ee510	3b08e48b	d0727572	5662d3e8	16aa5daa	b28479f6	59e23b95	3f6a5fd0	776ce399	658dca4c	21ddcdc9	b1252a9d	3ba1c760	ad3062eb	32c7478e	ecc32110	ea9a246c	36e3666f
-0		1	3	4	211885		0	10	2		0		4	241546e0	d833535f	ad4b77ff	d16679b9	25c83c98	fbad5c96	74752b9b	0b153874	7cc72ec2	8b324596	3db1963c	a2f4e8b5	5dd86246	07d13a8f	943169c2	89052618	e5ba7672	281769c2			d4703ebd	ad3062eb	be7c41b4	aee52b6f		
-0		94			9679	0	1	0	0		1			68fd1e64	65ad571e	b72b7c0f	6c83c769	4cf72387	fe6b92e5	c9a05643	062b5529	a73ee510	157764b8	21c80bff	9c080521	4e215042	ad1cc976	49522e06	7ab65ac2	d4bb7bd8	6e284837			85ce55bd		c3dc6cef	5fca8ab2		
-0	0	21		4	2383	41	2	19	21	0	1	0	4	05db9164	8084ee93	02cf9876	c18be181	25c83c98	3bf701e7	d20b75da	0b153874	a73ee510	efea433b	ab147b82	8fe001f4	66110d1b	1adce6ef	5d922427	36103458	27c07bd6	003d4f4f			e587c466		bcdee96c	3b183c5c		
-0	0	4	19	31	2508	106	1	16	103	0	1		31	41edac3d	a5b69ae3	f84d40ab	78a5bd6a	25c83c98	7e0ccccf	354d03e6	0b153874	a73ee510	255f3655	3bcfd189	55ffe9ca	077640f4	1adce6ef	603a2e9e	4c63632d	d4bb7bd8	a1654f4f	21ddcdc9	5840adea	901b12ea		32c7478e	08b0ce98	2bf691b1	984e0db0
-0		1	136	19	11309	157	6	29	72		2		22	05db9164	58e67aaf	08083030	b7fbbe67	25c83c98	fbad5c96	7d63bf49	5b392875	a18233ea	c9f77507	c389b738	246cf651	d7ccab4e	07d13a8f	10935a85	b7b7ce5d	d4bb7bd8	c21c3e4c	3aae8792	a458ea53	8d0ca54e		c7dc6720	bf282e2d	9b3e8820	d576a861
-0	0	274	3	8	56	36	4	16	24	0	2		8	09ca0b81	4f25e98b	5caa38df	25f92f9d	25c83c98	fe6b92e5	c480abf1	0b153874	a73ee510	32390b96	df29f7bb	dd0abd1f	67b031b4	1adce6ef	fb2772ea	ec486b81	e5ba7672	bc5a0ff7	af7c4727	a458ea53	c7d0e43d	ad3062eb	bcdee96c	9dd84531	001f3601	ae47080f
-0	0	-1			3951	117	1	21	39	0	1	0		05db9164	403ea497	2cbec47f	3e2bfbda	4cf72387		7b26d3fe	0b153874	a73ee510	fa7d0797	043725ae	21a23bfe	7f0d7407	b28479f6	91f74a64	587267a3	d4bb7bd8	a78bd508	21ddcdc9	a458ea53	c2a93b37		3a171ecb	1793a828	e8b83407	2fede552
-0		1			4031		0	0	2		0			05db9164	e112a9de	9db30a48	b3dbc908	25c83c98	7e0ccccf	952279fd	5b392875	a73ee510	0cd894de	c22febf3	2598d8eb	5a504385	ad1cc976	f1e1df0a	9ab4d6b1	e5ba7672	fdbdefe6			bbf96cac	c9d4222a	c3dc6cef	8f079aa5		
-0		-1						0						05db9164	d833535f	77f2f2e5	d16679b9	25c83c98	fe6b92e5	970f01b2	37e4aa92	7cc72ec2	3b08e48b	36bccca0	9f32b866	80467802	07d13a8f	943169c2	31ca40b6	2005abd1	281769c2			dfcfc3fa		3a171ecb	aee52b6f		
-0	7	-1	327	3	1414	97	30	15	37	0	3		3	8cf07265	80e26c9b	9ecdca34	85dd697c	25c83c98	7e0ccccf	89391314	0b153874	a73ee510	d8a1c4f1	608452cc	f6b6edb8	cbb8fa8b	07d13a8f	e8f4b767	2d0bbe92	e5ba7672	005c6740	21ddcdc9	b1252a9d	eccbec78	c9d4222a	32c7478e	1793a828	e8b83407	9904c656
-0		1	8	4	1094		0	48	352		0		15	5a9ed9b0	08d6d899	9143c832	f56b7dd5	f281d2a7	7e0ccccf	76fc09f3	0b153874	a73ee510	3b08e48b	0b60ef54	ae1bb660	0721132d	07d13a8f	41f10449	bad5ee18	1e88c74f	698d1c68			0429f84b		be7c41b4	c0d61a5c		
-0	2	1	10		1186	5	40	6	275	1	13	0		87552397	78ccd99e	c109d265	52e9fdb4	25c83c98	fbad5c96	81f7f73c	5b392875	a73ee510	ebcc4ac8	6263d404	7b2da99a	aa1eb12e	051219e6	9917ad07	f85ed40e	e5ba7672	e7e991cb	f44bef3c	a458ea53	0f53d505		32c7478e	02666583	001f3601	86884549
-0	2	-1			99	1	2	1	1	1	1			05db9164	537e899b	5037b88e	9dde01fd	25c83c98	fe6b92e5	d5276ad8	0b153874	a73ee510	3b08e48b	1c80d81c	680d7261	0b1e410e	07d13a8f	14be02cc	c0673b44	07c540c4	65979fb7			e049c839		32c7478e	6095f986		
-1	4	1		5	1	0	4	45	44	2	2		0	05db9164	5dac953d	d032c263	c18be181	4cf72387	7e0ccccf	9b98e9fc	0b153874	a73ee510	2462946f	7f8ffe57	dfbb09fb	46f42a63	64c94865	ed807c25	84898b2a	e5ba7672	b79acaab			0014c32a		32c7478e	3b183c5c		
-1	1	27	2	4	1	2	21	8	66	1	3		2	05db9164	38a947a1	4470baf4	8c8a4c47	384874ce	fbad5c96	10cfa4ce	0b153874	a73ee510	72ce33ff	d0c3ead8	bb669e25	a7de95c2	b28479f6	8691120a	2b2ce127	e5ba7672	b133fcd4			2b796e4a		bcdee96c	8d365d3b		
-0	0	59		1	25398		0	19	10	0	0		1	68fd1e64	a796837e	af48fb08	e3cc371a	25c83c98	7e0ccccf	9a68af50	0b153874	7cc72ec2	b2ebcf4d	c4bd1c72	c78e8461	bcfc54a9	07d13a8f	870efc17	c4de5bba	07c540c4	9de259c3			7a593b43	c9d4222a	c7dc6720	8fc66e78		
-0		14	7	3	43236	78	0	15	52		0		3	05db9164	f8c8e8f8			89ff5705	7e0ccccf	1e3bdb1b	0b153874	a73ee510	8a99abc1	4352b29b		5f4de855	b28479f6	b15b8172		e5ba7672	d2f0bce2	21ddcdc9	5840adea			bcdee96c		f55c04b6	56be3401
-1	7	2	1	1	201	1	10	7	21	2	4		1	291b7ba2	333137d9	d577be04	26d1c179	25c83c98		e3e366c8	0b153874	a73ee510	e1a2ef0f	c6efad65	0a665a51	63db155e	1adce6ef	63ac89c1	bc19fa9c	e5ba7672	c61e82d7	21ddcdc9	b1252a9d	3e120d5e		32c7478e	98276f90	445bbe3b	38a0a8f1
-0	6	6	17	5	37	5	6	5	5	1	1		5	05db9164	65ad571e	06007265	7eafc40d	25c83c98	fe6b92e5	dcdd8d42	0b153874	a73ee510	e1a2ef0f	c1700682	6ed9dde9	0c66bf77	ad1cc976	49522e06	2759daf4	e5ba7672	6e284837			9d0b86ed		c3dc6cef	5fca8ab2		
-0	2	55	26	19	30	39	2	22	20	1	1		19	50d4de26	0b8e9caf	f95a3480	24031442	25c83c98	fe6b92e5	649c7ded	0b153874	a73ee510	60dca7a3	175d5d07	d49d44e6	21e58fe4	b28479f6	5340cb84	ecbb1f29	e5ba7672	ca6a63cf			0c6b0a44		bcdee96c	08b0ce98		
-1	1	1451	2	2	24	0	8	30	85	1	5	3	0	05db9164	38a947a1	353686f2	f7263320	4cf72387	fbad5c96	a5785c33	0b153874	a73ee510	3b08e48b	b38835a9	317bfd7d	5aceb3b4	07d13a8f	7722bd91	1689e4de	27c07bd6	90b7bec5			dc55d6df		423fab69	aa0115d2		
-1		0	37	30	36	77	12	30	157		5	3	28	68fd1e64	6e638bbc	5c51b5ba	1dae9b6c	4cf72387	fe6b92e5	6d0ca8d7	0b153874	a73ee510	361eec86	6939835e	27bdd67e	dc1d72e4	07d13a8f	1f29ec61	5bc896cd	27c07bd6	3cb7e3f0	21ddcdc9	5840adea	3fa701f0		55dd3565	8d653a3e	445bbe3b	8addf025
-1		-1	2		83214	104	0	0	1		0			05db9164	4f25e98b	21259ece	9ceceafa	25c83c98	fe6b92e5	019bb335	0b153874	7cc72ec2	c6577552	e2a3d92c	911f7581	a984ac48	1adce6ef	17d9b759	80216f56	e5ba7672	7ef5affa	21ddcdc9	b1252a9d	1a4b9964		3a171ecb	0ce9d40e	e8b83407	396dce83
-0	1	21	52	0	0	47	1	0	0	1	1		0	05db9164	38d50e09	948ee031	b7ab56a2	25c83c98	7e0ccccf	a86d9649	361384ce	a73ee510	42635bfd	aadb87b9	42bee2f2	e9332a03	b28479f6	06373944	67b3c631	d4bb7bd8	fffe2a63	21ddcdc9	b1252a9d	bd074856		3a171ecb	df487a73	001f3601	c27f155b
-0		-1			37586	55	1	4	37		1			5a9ed9b0	404660bb	2e4b14c7	f3608b1e	25c83c98	3bf701e7	0d339a25	37e4aa92	a73ee510	0ccaf4c7	7d756b25	2098d925	6f833c7a	b28479f6	abcca5c1	f7d36847	07c540c4	4b17f8a2	21ddcdc9	5840adea	2bce7b1d		32c7478e	f9f7eb22	f0f449dd	b2a97390
-0	0	3320	4	1	4122	83	6	26	70	0	1		3	8cf07265	78ccd99e	45bd7955	6cfd27d8	25c83c98	13718bbd	a4756aa0	5b392875	a73ee510	2f0da49f	3bfee234	3815b09c	888b8320	cfef1c29	798a3785	4b1d7076	e5ba7672	e7e991cb	9437f62f	b1252a9d	a9f61713		93bad2c0	3d146fbb	f0f449dd	91a61c29
-1	3	1	4	6	243	6	31	13	116	1	4		6	24eda356	fdbd6890	4948e114	77199c76	4cf72387	fbad5c96	62fc022b	0b153874	a73ee510	5612701e	e09c447b	72a05bfb	8dab0422	64c94865	0b2c122e	6e957363	e5ba7672	1910e2e3			ab5b7fc6	c9d4222a	423fab69	0ee8c452		
-0	2	2	25	14	1272	58	22	47	137	1	4	1	14	68fd1e64	0468d672	92c34e5f	6917d100	25c83c98	7e0ccccf	c63eac71	0b153874	a73ee510	d7a30208	2010b191	efc34af7	d48de876	1adce6ef	4f3b3616	d3408fd5	3486227d	9880032b	21ddcdc9	5840adea	8e7a84bb		423fab69	3f7eb911	ea9a246c	cdebf969
-0	2	0	11	13	172	29	2	24	23	1	1	1	23	05db9164	38a947a1	d0828830	09801e29	25c83c98	fe6b92e5	19e63407	37e4aa92	a73ee510	3b08e48b	00c11834	27e9858f	d1155458	b28479f6	d92de4ea	e294c66c	3486227d	09953728			0346d048		32c7478e	776eb2c5		
-1	0	49	143	1	5353	8	3	1	75	0	2		1	05db9164	89ddfee8	c2b008c5	e31f97bb	25c83c98	fbad5c96	1c86e0eb	a25968f2	a73ee510	34ccc264	755e4a50	e657c595	5978055e	1adce6ef	34cce7d2	0d7e5968	e5ba7672	5bb2ec8e	21ddcdc9	a458ea53	2c613179		423fab69	8eb162c5	f0f449dd	47ee0e11
-0		0	11	2	4236		0	12	130		0		11	5a9ed9b0	5b7b33dc			b2241560	7e0ccccf	d1208de2	0b153874	a73ee510	7ca23b4a	eb9e7931		837d93f2	07d13a8f	247f84ab		1e88c74f	3cbc29b4					55dd3565			
-1		2	2	1	16008		0	2	1		0		1	05db9164	38a947a1	f1722731	bef1cf93	25c83c98		17f619bc	0b153874	a73ee510	f90f47c5	e7c049c2	1b5efd69	36b96ed0	64c94865	51c5d5ca	f9e62e71	d4bb7bd8	be5810bd			6b23ba2d		32c7478e	043a382b		
-0	0	35	18	9	2687		0	30	90	0	0		9	5a9ed9b0	38d50e09	873cec9e	faeb53d1	25c83c98	7e0ccccf	64917feb	0b153874	a73ee510	3b08e48b	f045731b	1bad82f2	252ee845	b28479f6	06373944	754f444e	07c540c4	fffe2a63	21ddcdc9	b1252a9d	362ad5a2		32c7478e	df487a73	001f3601	c27f155b
-1	5	50	2	4	18	1	5	4	4	2	2		1	05db9164	4f25e98b	e5ecc1d4	26aac878	25c83c98	7e0ccccf	5f8e3e72	0b153874	a73ee510	3b08e48b	81029038	94073dd1	6d0b1734	b28479f6	df2f73e9	9f50aa07	07c540c4	bc5a0ff7	712d530c	b1252a9d	ef95f922		3a171ecb	a3bd4d33	001f3601	6b5cead0
-0		13	1	1	1020		0	1	1		0		1	05db9164	08d6d899	d158b948	cd08b588	25c83c98	fbad5c96	f8077d16	0b153874	a73ee510	3b08e48b	d24aec2b	5ea2e48b	f6224065	b28479f6	bfef54b3	6d922e3b	776ce399	87c6f83c			15fce809		bcdee96c	f96a556f		
-1	0	-1	13	0	5699	216	2	6	56	0	2		2	05db9164	6e638bbc	e1266b28	09e3cd5a	25c83c98		505ca254	0b153874	a73ee510	f6e4bc4d	4968ae8f	eb8ded57	18e370a6	07d13a8f	1f29ec61	3d9023a4	e5ba7672	3cb7e3f0	21ddcdc9	b1252a9d	31b4af04		32c7478e	8d653a3e	445bbe3b	8e1ae331
-1	17	89	26	5	1175	45	75	10	374	1	14		5	05db9164	71ca0a25	c86b2d8d	657dc3b9	25c83c98	fe6b92e5	d0792267	0b153874	a73ee510	7c0a503a	9700edac	1ca7a526	672d927b	b28479f6	a67c19b7	ba46c3a1	e5ba7672	9bf8ffef	21ddcdc9	b1252a9d	eb0fc6f8		32c7478e	df487a73	e8b83407	c27f155b
-0		2	5	0	104594			2					2	39af2607	80e26c9b	f57d3f44	b6951e6b	25c83c98		fa2da417	0b153874	7cc72ec2	d33462a3	358a1187	d120ba45	3966c8cd	07d13a8f	f3635baf	0cf975bf	d4bb7bd8	f54016b9	21ddcdc9	b1252a9d	c136e191		32c7478e	1793a828	e8b83407	66045105
-0	1	-1			118	0	4	2	13	1	2	2		3560b08b	5dac953d	d032c263	c18be181	25c83c98	7e0ccccf	61f42546	0b153874	a73ee510	3b08e48b	e0e79bd6	dfbb09fb	96fa211f	64c94865	ed807c25	84898b2a	27c07bd6	b79acaab			0014c32a		3a171ecb	3b183c5c		
-0		5	31	3	18756		0	11	2		0		4	87552397	04e09220	b1ecc6c4	5dff9b29	25c83c98	fbad5c96	2da1e879	5b392875	a73ee510	b7efa269	a05a0d99	2436ff75	e55dbe27	07d13a8f	f6b23a53	f4ead43c	1e88c74f	6fc84bfb			4f1aa25f		32c7478e	ded4aac9		
-0		25	16	20	8789	20	1	22	20		1		20	05db9164	537e899b	5037b88e	9dde01fd	25c83c98	fbad5c96	0d59e258	0b153874	a73ee510	5612701e	b9ec9192	680d7261	df5886ca	07d13a8f	6d68e99c	c0673b44	d4bb7bd8	b34aa802			e049c839		c7dc6720	6095f986		
-0	6	55	1	1	0	33	13	29	86	2	5		0	05db9164	421b43cd	3956eff2	29998ed1	25c83c98	fe6b92e5	38eb9cf4	0b153874	a73ee510	441dd290	7f8ffe57	6aaba33c	46f42a63	b28479f6	2d0bb053	b041b04a	e5ba7672	2804effd			723b4dfd		32c7478e	b34f3128		
-0		0		2	249180		0	2	50		0		2	05db9164	38a947a1	b9279298	40f36a12	384874ce	fe6b92e5	76d84582	0b153874	7cc72ec2	39cda501	7c53dc69	4c7c8101	4fd35e8f	1adce6ef	3ea7817e	4961b392	e5ba7672	bdd21ce2			2688e7ed		423fab69	9d70bc85		
-0	5	99	31	0	754	13	42	20	1068	1	8	5	13	5a9ed9b0	942f9a8d	4a75b52b	c6fdc148	25c83c98		d3f2ae29	0b153874	a73ee510	7f79890b	c4adf918	8eb3f772	85dbe138	1adce6ef	ae97ecc3	8213a764	8efede7f	1f868fdd	21ddcdc9	a458ea53	74be63ef		32c7478e	9af06ad9	9d93af03	cdfe5ab7
-0		-1			35745		0	0	2		0			39af2607	6887a43c	6d0ceb43	8d164e53	25c83c98	7e0ccccf	838c8fbe	0b153874	7cc72ec2	3b08e48b	f72fff3d	9fa694f3	03f77fd2	07d13a8f	eb1997cb	a54711b4	776ce399	570391ac	21ddcdc9	b1252a9d	78766d37		be7c41b4	9e0bee34	445bbe3b	df909817
-0	1	72	10	14	14	12	1	14	14	1	1		11	05db9164	421b43cd	7bd61a3f	29998ed1	25c83c98	fe6b92e5	a6a575e6	0b153874	a73ee510	45ab2c55	4829f487	6aaba33c	2180053c	b28479f6	2d0bb053	b041b04a	d4bb7bd8	2804effd			723b4dfd	ad3062eb	bcdee96c	b34f3128		
-0		1	1	3	5087	33	1	14	15		1	1	3	05db9164	68b3edbf	b00d1501	d16679b9	25c83c98	7e0ccccf	862c6367	0b153874	a73ee510	230a3832	6514ea2d	e0d76380	4738a95a	b28479f6	f511c49f	1203a270	3486227d	752d8b8a			73d06dde		32c7478e	aee52b6f		
-1	13	251	14	10	7	1	39	38	172	3	10		1	68fd1e64	89ddfee8	39eef0e8	13508380	25c83c98	fbad5c96	ad3508b1	5b392875	a73ee510	07704244	ad757a5a	4594f341	93b18cb5	07d13a8f	59a58e86	02882e54	e5ba7672	ae46962e	1d1eb838	b1252a9d	7b69ac9f		423fab69	45ab94c8	f0f449dd	c84c4aec
-1	0	1	7	26	0	243	25	15	754	0	6	0	0	68fd1e64	942f9a8d	d024aa4a	ca155841	4cf72387	fbad5c96	3f4ec687	0b153874	a73ee510	0e9ead52	c4adf918	08623920	85dbe138	b28479f6	ac182643	0ffc495e	27c07bd6	1f868fdd	f44bef3c	a458ea53	89883ec0	ad3062eb	32c7478e	e4c356ec	9d93af03	b775f5c2
-0	5	0	26	3	60	3	36	6	74	2	10		3	5a9ed9b0	58e67aaf	f1a75345	715dbf7b	4cf72387	fbad5c96	45e063a0	0b153874	a73ee510	27f4bf82	da89cb9b	d145dc65	165642be	b28479f6	62eca3c0	2ebf54b4	e5ba7672	c21c3e4c	338f20de	a458ea53	5bd3d286		32c7478e	bc8b14b9	9b3e8820	cdd2b5b7
-0		1	3		45904	111	0	1	27		0	0		05db9164	89ddfee8	3863b7f1	4daf48e1	25c83c98	fbad5c96	66acf824	0b153874	7cc72ec2	0ed4b00d	e192b186	dca65903	7df3a6c1	07d13a8f	4df3da6b	8784f12f	8efede7f	5bb2ec8e	3014a4b1	b1252a9d	d754f116	ad3062eb	423fab69	16291dd7	f0f449dd	e98cbe6a
-0		-1	2	2	2908	603	0	0	104		0		2	f473b8dc	38a947a1	223b0e16	ca55061c	25c83c98	7e0ccccf	eac6dc30	49dd1874	a73ee510	980d90f4	df29f7bb	156f99ef	67b031b4	1adce6ef	0e78291e	5fbf4a84	d4bb7bd8	1999bae9			deb9605d		32c7478e	e448275f		
-0		5	14	19	3708		0	35	376		0	0	19	05db9164	a796837e	08de7b18	97ce69e9	4cf72387	fe6b92e5	82f666b6	0b153874	a73ee510	03e48276	e51ddf94	c5011072	3516f6e6	cfef1c29	f0bf9094	5a9431f3	3486227d	1cdbd1c5			e754c5e1		3a171ecb	8fc66e78		
-1	3	3	4	7	102	30	4	23	23	1	2		23	8cf07265	26ece8a8	8c6bfe29	1e0ec6a2	25c83c98	fbad5c96	5c8931c6	0b153874	a73ee510	456b972a	77e7d573	1a614fd0	857a4197	07d13a8f	102fc449	d0b4477d	e5ba7672	87fd936e			b193bbca	ad3062eb	423fab69	5a456be6		
-1		12	2	2	7230	12	24	3	21		1		2	68fd1e64	80e26c9b	d3837635	230f1f17	25c83c98		f2d80b52	0b153874	a73ee510	4549ea1f	1bb4f435	b8b324f1	e8d4ea40	07d13a8f	f3635baf	1fb7f493	e5ba7672	f54016b9	21ddcdc9	a458ea53	90c2e498		32c7478e	1793a828	e8b83407	8efc26f8
-1		2	7	3	2934	48	4	4	117		3		3	05db9164	e5fb1af3	77d9caa7	932c3d89	25c83c98	7e0ccccf	ec874408	37e4aa92	a73ee510	5e2b2f1d	c6dfa670	21ca81df	3a4e700b	07d13a8f	b5de5956	cbe07a5c	e5ba7672	13145934	a34d2cf6	a458ea53	cbc662a7		3a171ecb	45a3e015	010f6491	b62a4ef5
-0		1	15	5	3699		0	36	77		0	0	5	05db9164	207b2d81	8a48553d	1e10bd9f	25c83c98	7e0ccccf	7f9907fe	5b392875	a73ee510	200e383b	a7b606c4	6803e296	eae197fd	b28479f6	3c767806	ff48ade9	e5ba7672	395856b0	21ddcdc9	b1252a9d	c3d093fb		3a171ecb	84a27184	001f3601	a30a3fb0
-0	4	175	12	2	923	55	8	48	103	1	2		2	5bfa8ab5	38a947a1	223b0e16	ca55061c	25c83c98	7e0ccccf	ade953a9	5b392875	a73ee510	4072f40f	29e4ad33	156f99ef	80467802	1adce6ef	0e78291e	5fbf4a84	e5ba7672	1999bae9			deb9605d		32c7478e	e448275f		
-0	11	1331	2	2	1296	8	11	13	54	1	1	1	4	68fd1e64	c8687797	5c7d8ff6	902872c9	25c83c98	fbad5c96	d20b4953	0b153874	a73ee510	fbbf2c95	46febd4d	79b87c55	949ea585	b28479f6	dc96c4b0	5627d7e0	3486227d	a7e06874	21ddcdc9	b1252a9d	4063500f	ad3062eb	32c7478e	54baf4d1	010f6491	ba676e3c
-1	3	4		2	1136	2	3	2	2	1	1		2	05db9164	f234d60e	01daaa01	1258049c	43b19349	fbad5c96	fae8ca82	0b153874	a73ee510	9a2a80f7	46d4b56a	ed98b1fb	ed738fad	07d13a8f	40fcbacb	4bf7ec4d	07c540c4	d942f032			8818bdec		3a171ecb	e4ef8e56		
-1	41	14	3	1	1	1	41	1	1	1	1		1	8cf07265	73a46ff0	85a07101	501abd52	0942e0a7	7e0ccccf	4ebdc6e2	0b153874	a73ee510	6417eabb	74475d27	593290d6	403e1842	1adce6ef	d57668e2	15b684be	e5ba7672	da507f45	21ddcdc9	5840adea	f4d7cf94		423fab69	b34f3128	ea9a246c	3090e38b
-1	0	19	2	1	0	63	2	3	7	0	2		0	05db9164	e18b1e61			384874ce	7e0ccccf	f417bf96	6c41e35e	a73ee510	3b08e48b	0ec1e215		44af41ef	07d13a8f	1d432c1e		e5ba7672	b2879faf				ad3062eb	3a171ecb			
-0		0	5	4	11141	218	1	24	217		1		4	68fd1e64	58e67aaf	2113709c	3bfbb842	4cf72387	fe6b92e5	cc8ce7f3	1f89b562	a73ee510	3b08e48b	b6ac69d0	27302de8	e987b058	07d13a8f	10935a85	7958d3dc	d4bb7bd8	c21c3e4c	55dd3565	a458ea53	192551e4		3a171ecb	48056b77	9b3e8820	76415198
-1		57	3	1	21443	49	8	1	38		1		1	5bfa8ab5	c5c1d6ae	bb85179d	98cd0302	25c83c98	fbad5c96	6855ef53	0b153874	a73ee510	175d6c71	b7094596	4750f0d1	1f9d2c38	07d13a8f	b25845fd	130b2582	3486227d	561cabfe	21ddcdc9	5840adea	ffbb089f		32c7478e	1026f362	7a402766	46f2af91
-0	0	0	10	5	1673	91	15	33	256	0	5		5	75ac2fe6	04e09220	b1ecc6c4	5dff9b29	25c83c98	7e0ccccf	63282fe3	0b153874	a73ee510	b95c890d	e6959f26	2436ff75	b57fa159	07d13a8f	f6b23a53	f4ead43c	8efede7f	6fc84bfb			4f1aa25f	ad3062eb	423fab69	ded4aac9		
-1		1	3	5	2985	13	1	5	7		1		5	05db9164	5dac953d	d032c263	c18be181	4cf72387	7e0ccccf	78c0b2ff	1f89b562	a73ee510	3b08e48b	eb4a9b83	dfbb09fb	c0bc5873	1adce6ef	32330105	84898b2a	d4bb7bd8	24de59c1			0014c32a		3a171ecb	3b183c5c		
-0		4	6	1	29743			21				0	1	05db9164	08d6d899	0bab1155	60d5f5a7	25c83c98	7e0ccccf	d6293852	0b153874	a73ee510	3b08e48b	c6cb726f	1d00cbc4	176d07bc	07d13a8f	41f10449	b93ac0ad	d4bb7bd8	698d1c68			bf8efd4c		72592995	f96a556f		
-0		1	1	3	5364	5	1	4	5		1		3	05db9164	207b2d81	057e845b	786673ae	25c83c98	6f6d9be8	f2a82962	0b153874	a73ee510	0ff7e0c6	c255f829	03aa3022	fe528cd1	b28479f6	899da9d5	dc377037	d4bb7bd8	25c88e42	21ddcdc9	a458ea53	907b8dff		32c7478e	7a8e7ed6	001f3601	9042adf0
-0	0	142	1	14	1559	85	13	5	267	0	3	0	14	5a9ed9b0	8ab240be	429e8271	c450716c	25c83c98	fe6b92e5	6fadbb76	1f89b562	a73ee510	fa7d0797	b5939c49	5c3be1d3	377af8aa	b28479f6	b4316eb3	4c0566cc	8efede7f	807ea8b0	21ddcdc9	5840adea	858c4106		32c7478e	2f0b2844	e8b83407	aa5f0a15
-1		1	5	9	8445	16	1	8	9		1	0	9	8cf07265	46bbf321	c5d94b65	5cc8f91d	384874ce	7e0ccccf	099d72d1	5b392875	a73ee510	230a3832	a6f5e788	75c79158	beaa48ab	243a4e68	bcdb9b50	208d4baf	3486227d	ce4d072d			6a909d9a		3a171ecb	1f68c81f		
-0		0	16	3	8297	46	7	5	8		1		3	be589b51	1cfdf714	2acc1a0e	1f2b62a4	4cf72387	7e0ccccf	b4ecbce4	0b153874	a73ee510	3b08e48b	8d68f0f6	cfe25cb7	4e9bebb4	687dfaf4	a54fca2b	446fa98b	e5ba7672	e88ffc9d	6f62a118	a458ea53	ff8c5410		3a171ecb	5029cba6	cb079c2d	a2de1476
-0	3	0	13	8	0	0	4	11	18	1	2		0	05db9164	0468d672	7b8d300a	c619d132	4cf72387	7e0ccccf	0fdf56d6	5b392875	a73ee510	42429aab	6241e24a	bb19e5a1	8c1a3ad8	b28479f6	234191d3	08042d48	e5ba7672	9880032b	21ddcdc9	5840adea	63e3637a	c9d4222a	bcdee96c	d33a0d83	ea9a246c	984e0db0
-0		0	9		326904			6						87552397	207b2d81	365d1d63	e9370452	25c83c98	fbad5c96	b10436ef	0b153874	7cc72ec2	2fed7fc5	4dee99ee	8ff467ea	d299b0dc	07d13a8f	0c67c4ca	acf5f625	07c540c4	395856b0	21ddcdc9	a458ea53	b6af5d81		3a171ecb	27e81296	001f3601	e1572e3b
-1	0	1	1		2708	27	12	0	37	0	4			5bfa8ab5	6c713117	f9513969	63bb9eb1	43b19349	fbad5c96	adbcc874	1f89b562	a73ee510	fa7d0797	46031dab	8ab52742	377af8aa	07d13a8f	78ebcaf1	0c98c1fc	e5ba7672	bf6b118a	21ddcdc9	b1252a9d	45664d1d		32c7478e	40de02ec	445bbe3b	b025bfb1
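The deleted `part-0` rows follow the Criteo display-ads layout: a 0/1 click label, 13 integer features, and 26 hashed categorical features, tab-separated, with any field allowed to be empty. A hedged parser sketch for one such row (the 13/26 column split is an assumption inferred from the rows above, not taken from an accompanying schema):

```python
def parse_criteo_row(row):
    """Split one tab-separated Criteo-style row into
    (label, dense, sparse). Empty fields become None.

    Assumed layout: 1 label + 13 integer features + 26 hashed
    categorical features = 40 tab-separated columns.
    """
    fields = row.rstrip("\n").split("\t")
    label = int(fields[0])
    dense = [int(v) if v else None for v in fields[1:14]]
    sparse = [v if v else None for v in fields[14:40]]
    return label, dense, sparse

if __name__ == "__main__":
    # Synthetic 40-column row: label 1, one dense value, one hash, rest empty.
    sample = "\t".join(["1", "3"] + [""] * 12 + ["05db9164"] + [""] * 25)
    label, dense, sparse = parse_criteo_row(sample)
    print(label, dense[0], sparse[0], len(sparse))    # 1 3 05db9164 26
```

Keeping empty fields as `None` (rather than dropping them) preserves column positions, which matters because each categorical slot maps to its own embedding table in the wide-and-deep model.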

+ 0 - 100
recommend-model-produce/src/main/python/models/wide_and_deep_dataset/data/part-1

@@ -1,100 +0,0 @@
-0	2	0			957	45	2	36	45	1	1			05db9164	532da141			25c83c98	7e0ccccf	a24c8c8e	0b153874	a73ee510	2e8e8e87	41b3f655		ce5114a2	cfef1c29	abd8f51e		07c540c4	bdc06043				c9d4222a	32c7478e			
-0	0	53	7	7	3975	118	7	42	125	0	2		7	05db9164	421b43cd	a5866a2d	29998ed1	25c83c98	7e0ccccf	dc7659bd	0b153874	a73ee510	03e48276	e51ddf94	6aaba33c	3516f6e6	b28479f6	2d0bb053	b041b04a	e5ba7672	2804effd			723b4dfd		3a171ecb	b34f3128		
-0		6	2	0	31		0	0	0		0		0	05db9164	207b2d81	45be6cde	12f4ddf4	25c83c98	7e0ccccf	38eb9cf4	5b392875	a73ee510	f6f942d1	7f8ffe57	cc2a128e	46f42a63	b28479f6	899da9d5	d604ae35	1e88c74f	25c88e42	21ddcdc9	a458ea53	0bede81d	ad3062eb	32c7478e	8539ffbf	001f3601	c38250b6
-0	0	0	1	1	15395	386	5	1	286	0	5		1	5a9ed9b0	8db5bc37	da5fcf9d	9ee19407	43b19349	6f6d9be8	ff6b8352	0b153874	a73ee510	66581c5b	15eced00	8db6c305	bd251a95	64c94865	b4a59bc6	46ee2cca	e5ba7672	821c30b8			4f5bc5bf		32c7478e	7cff0908		
-0	1	0	1	5	122	5	4	15	16	1	2	0	5	05db9164	90081f33	a32e5301	05bbc70d	25c83c98	fe6b92e5	dda1fed2	0b153874	a73ee510	4ce8f99f	7f8ffe57	6c339dfb	46f42a63	64c94865	98995c3b	a4eb16fd	e5ba7672	7181ccc8			3404a71d		3a171ecb	cd7bf52a		
-1	1	0	15	10	1432	89	2	37	69	1	2		10	be589b51	38d50e09	e65d85e1	837cde00	43b19349	7e0ccccf	051659fe	0b153874	a73ee510	df41254d	84eb4210	3bcba45c	5fffcbe5	b28479f6	7501d6be	9cbaae80	07c540c4	f855e3f0	21ddcdc9	5840adea	6ba01fc4		32c7478e	afb24112	001f3601	aa5f0a15
-1	8	1	1	0	28	0	8	0	0	1	1		0	5a9ed9b0	4f25e98b	c273ad95	c91dd596	25c83c98	fbad5c96	b3ddf65a	0b153874	a73ee510	43a17f3d	e973bfd7	3ce47b07	439cd4cc	051219e6	b6dcf31f	8ceec535	e5ba7672	7ef5affa	5b885066	a458ea53	77fad21b	ad3062eb	32c7478e	8eb4ac8d	001f3601	e9e9ecb0
-0	1	33		7	1252	34	1	13	63	1	1		8	5a9ed9b0	b56822db	3f7d0844	da13cca9	25c83c98	13718bbd	4aa938fc	0b153874	a73ee510	451bd4e4	7e40f08a	7dcc3969	1aa94af3	07d13a8f	e2876dac	354a37d9	e5ba7672	38dce391	21ddcdc9	b1252a9d	7b877178		32c7478e	f2e9f0dd	001f3601	6c27a535
-0	0	2			3746	162	3	3	113	0	1			05db9164	38d50e09	1475848f	486da071	5a3e1872	13718bbd	83be4a6a	37e4aa92	a73ee510	802a7512	d9b1e3ff	5dddd37d	cd98af01	07d13a8f	e24ff4c6	bbc2ca43	07c540c4	f855e3f0	21ddcdc9	5840adea	e8ef1267		3a171ecb	b2f178a3	001f3601	c3546e32
-0		345	3	4	35324	219	14	5	202		1		4	05db9164	2c16a946	dafcaa19	ec142e08	25c83c98	7e0ccccf	ca280131	0b153874	a73ee510	3b08e48b	868a9e47	f57cde97	fc5dea81	07d13a8f	18231224	e496e010	e5ba7672	74ef3502			c606b8ba		423fab69	9117a34a		
-0		47	2	4	2751	6	22	5	6		3	0	4	68fd1e64	dde11b16	63f98b5e	8cc7b33b	25c83c98	fe6b92e5	57ef5d40	1f89b562	a73ee510	cce72b25	6ae20392	7becd6e7	78644930	b28479f6	53c5f305	eca39129	e5ba7672	43dfe9bd			cf158609	ad3062eb	dbb486d7	10b3e56d		
-0		8	30	7	2828		0	38	49		0		10	68fd1e64	b56822db	7da86e4b	b733e495	25c83c98	13718bbd	dcc1b63d	5b392875	a73ee510	46a7deb3	731cd88c	ed397d6b	34d253f7	b28479f6	a9d1ba1a	056d8866	1e88c74f	38dce391	21ddcdc9	b1252a9d	deaf6b52		32c7478e	d9556584	001f3601	6c27a535
-0	0	75			28293	227	4	6	370	0	3			98237733	a0baa1e8	91405292	84e93084	4cf72387	fbad5c96	ad3508b1	5b392875	a73ee510	79f15f43	ad757a5a	f2ae7fb8	93b18cb5	32813e21	cb1612e3	89f3e9bc	e5ba7672	3735c118			0b446b76	ad3062eb	c7dc6720	80c73723		
-0	1	185	1	0	0	0	1	0	0	1	1		0	87552397	b56822db	7da86e4b	b733e495	25c83c98	7e0ccccf	003baf94	0b153874	a73ee510	bde51b15	e973bfd7	ed397d6b	439cd4cc	b28479f6	a9d1ba1a	056d8866	d4bb7bd8	38dce391	21ddcdc9	b1252a9d	deaf6b52	ad3062eb	bcdee96c	d9556584	001f3601	6c27a535
-0	0	-1			3827	17	10	4	46	0	1	1		05db9164	8084ee93	02cf9876	c18be181	25c83c98	fbad5c96	07d03e2a	0b153874	a73ee510	49d1ad89	7f8ffe57	8fe001f4	46f42a63	07d13a8f	422c8577	36103458	3486227d	52e44668			e587c466		32c7478e	3b183c5c		
-1		0	12	8	2551	49	19	8	27		1	0	8	68fd1e64	d7988e72	897faa69	ddf2b58b	25c83c98	7e0ccccf	7df3abb9	0b153874	a73ee510	3b08e48b	83dba508	54943d67	09cd9f24	07d13a8f	194c42a4	0c1c9401	3486227d	0f2f9850	21ddcdc9	a458ea53	2bbcc5d0	c9d4222a	32c7478e	3fdb382b	445bbe3b	49d68486
-0	0	2	2	1	7367	680	1	22	191	0	1		1	68fd1e64	8db5bc37	6fdb098b	f181ea30	25c83c98	fe6b92e5	18ffd618	37e4aa92	a73ee510	e286f1e6	fd7856c1	2bdfd009	6a430a5b	64c94865	007fa274	aaede3da	e5ba7672	e7c97dee			fcd1a962		3a171ecb	b7c6f617		
-0		-1			39870			0						05db9164	b961056b	961d73a9	06b1cf6e	25c83c98	7e0ccccf	49042125	0b153874	7cc72ec2	7e0d83d4	ba1ff80a	d4fbf673	b95f83fa	ab7390e9	dd244129	20340c29	1e88c74f	8222ff64			d4fa2b9c		3a171ecb	0ff91809		
-0		606	21	3	1211		0	41	41		0		9	5bfa8ab5	207b2d81	e48e5552	9ddc492e	25c83c98		49eb0b1a	0b153874	a73ee510	b14d9951	2839b07a	e5e1ca92	383a5973	b28479f6	3c767806	29fd6b7b	1e88c74f	395856b0	21ddcdc9	b1252a9d	e5191f27		bcdee96c	c23c2e19	001f3601	3d04cc90
-0	2	1	2	1	159	7	7	7	58	2	4		1	17f69355	f3b07830	3f6f79a2	5327e675	4cf72387	7e0ccccf	38ae26b9	5b392875	a73ee510	42635bfd	aadb87b9	2cc77b94	e9332a03	64c94865	b64212a7	87343f95	e5ba7672	048d01f4			6549ede4		3a171ecb	c657e6e5		
-1	7	1	5	7	0	0	13	9	10	1	2		0	05db9164	6887a43c	7eb22712	ef4fd7f1	25c83c98	fe6b92e5	c0251c88	5b392875	a73ee510	c510044d	ada36e89	c71493ed	c63ea0b4	cfef1c29	9221b8f3	92e4b1e3	e5ba7672	08ed8a1c	21ddcdc9	b1252a9d	9adaf9fb	c9d4222a	55dd3565	b43c75ff	445bbe3b	7826e9ae
-0		-1			126865		0	3	1		0			05db9164	38a947a1	353686f2	f7263320	43b19349	fbad5c96	a2f7459e	0b153874	7cc72ec2	d3787b55	15eced00	317bfd7d	bd251a95	07d13a8f	2ab464b7	1689e4de	e5ba7672	90b7bec5			dc55d6df	ad3062eb	3a171ecb	aa0115d2		
-0	0	616	1	6	2473	88	2	49	71	0	1		20	05db9164	c76014f5	df00c4f0	031bba14	b0530c50	7e0ccccf	0808742e	5b392875	a73ee510	ca82d9dd	2115d03b	336bb1d7	40dfba03	b28479f6	a46c3543	631f0045	07c540c4	93b0d1d7			1fe472e2		32c7478e	89bd83a1		
-1		563	3	2	12500	47	4	2	50		1		2	05db9164	403ea497	2cbec47f	3e2bfbda	25c83c98		f33e4fa1	37e4aa92	a73ee510	e029047e	7b5deffb	21a23bfe	269889be	b28479f6	91f74a64	587267a3	e5ba7672	a78bd508	21ddcdc9	5840adea	c2a93b37		32c7478e	1793a828	e8b83407	2fede552
-0	1	16	13	5	187	32	1	32	32	1	1	0	5	05db9164	4f25e98b	557a2bcb	2e17d6f6	25c83c98	7e0ccccf	ee805808	6c41e35e	a73ee510	0f1ee62c	0601c4d9	2c934d21	c4bf1f3a	07d13a8f	5cedaf14	949bfd42	d4bb7bd8	c04ce6df	5e89f4c8	b1252a9d	20562a99	ad3062eb	32c7478e	9117a34a	001f3601	54ca28ff
-0	0	21	2	2	53	21	8	2	36	0	1		2	05db9164	c1384774	24c93e37	d772d0ec	25c83c98	fbad5c96	e31fb017	0b153874	a73ee510	3b08e48b	d0727572	5662d3e8	16aa5daa	b28479f6	59e23b95	3f6a5fd0	776ce399	658dca4c	21ddcdc9	b1252a9d	3ba1c760	ad3062eb	32c7478e	ecc32110	ea9a246c	36e3666f
-0		1	3	4	211885		0	10	2		0		4	241546e0	d833535f	ad4b77ff	d16679b9	25c83c98	fbad5c96	74752b9b	0b153874	7cc72ec2	8b324596	3db1963c	a2f4e8b5	5dd86246	07d13a8f	943169c2	89052618	e5ba7672	281769c2			d4703ebd	ad3062eb	be7c41b4	aee52b6f		
-0		94			9679	0	1	0	0		1			68fd1e64	65ad571e	b72b7c0f	6c83c769	4cf72387	fe6b92e5	c9a05643	062b5529	a73ee510	157764b8	21c80bff	9c080521	4e215042	ad1cc976	49522e06	7ab65ac2	d4bb7bd8	6e284837			85ce55bd		c3dc6cef	5fca8ab2		
-0	0	21		4	2383	41	2	19	21	0	1	0	4	05db9164	8084ee93	02cf9876	c18be181	25c83c98	3bf701e7	d20b75da	0b153874	a73ee510	efea433b	ab147b82	8fe001f4	66110d1b	1adce6ef	5d922427	36103458	27c07bd6	003d4f4f			e587c466		bcdee96c	3b183c5c		
-0	0	4	19	31	2508	106	1	16	103	0	1		31	41edac3d	a5b69ae3	f84d40ab	78a5bd6a	25c83c98	7e0ccccf	354d03e6	0b153874	a73ee510	255f3655	3bcfd189	55ffe9ca	077640f4	1adce6ef	603a2e9e	4c63632d	d4bb7bd8	a1654f4f	21ddcdc9	5840adea	901b12ea		32c7478e	08b0ce98	2bf691b1	984e0db0
-0		1	136	19	11309	157	6	29	72		2		22	05db9164	58e67aaf	08083030	b7fbbe67	25c83c98	fbad5c96	7d63bf49	5b392875	a18233ea	c9f77507	c389b738	246cf651	d7ccab4e	07d13a8f	10935a85	b7b7ce5d	d4bb7bd8	c21c3e4c	3aae8792	a458ea53	8d0ca54e		c7dc6720	bf282e2d	9b3e8820	d576a861
-0	0	274	3	8	56	36	4	16	24	0	2		8	09ca0b81	4f25e98b	5caa38df	25f92f9d	25c83c98	fe6b92e5	c480abf1	0b153874	a73ee510	32390b96	df29f7bb	dd0abd1f	67b031b4	1adce6ef	fb2772ea	ec486b81	e5ba7672	bc5a0ff7	af7c4727	a458ea53	c7d0e43d	ad3062eb	bcdee96c	9dd84531	001f3601	ae47080f
-0	0	-1			3951	117	1	21	39	0	1	0		05db9164	403ea497	2cbec47f	3e2bfbda	4cf72387		7b26d3fe	0b153874	a73ee510	fa7d0797	043725ae	21a23bfe	7f0d7407	b28479f6	91f74a64	587267a3	d4bb7bd8	a78bd508	21ddcdc9	a458ea53	c2a93b37		3a171ecb	1793a828	e8b83407	2fede552
-0		1			4031		0	0	2		0			05db9164	e112a9de	9db30a48	b3dbc908	25c83c98	7e0ccccf	952279fd	5b392875	a73ee510	0cd894de	c22febf3	2598d8eb	5a504385	ad1cc976	f1e1df0a	9ab4d6b1	e5ba7672	fdbdefe6			bbf96cac	c9d4222a	c3dc6cef	8f079aa5		
-0		-1						0						05db9164	d833535f	77f2f2e5	d16679b9	25c83c98	fe6b92e5	970f01b2	37e4aa92	7cc72ec2	3b08e48b	36bccca0	9f32b866	80467802	07d13a8f	943169c2	31ca40b6	2005abd1	281769c2			dfcfc3fa		3a171ecb	aee52b6f		
-0	7	-1	327	3	1414	97	30	15	37	0	3		3	8cf07265	80e26c9b	9ecdca34	85dd697c	25c83c98	7e0ccccf	89391314	0b153874	a73ee510	d8a1c4f1	608452cc	f6b6edb8	cbb8fa8b	07d13a8f	e8f4b767	2d0bbe92	e5ba7672	005c6740	21ddcdc9	b1252a9d	eccbec78	c9d4222a	32c7478e	1793a828	e8b83407	9904c656
-0		1	8	4	1094		0	48	352		0		15	5a9ed9b0	08d6d899	9143c832	f56b7dd5	f281d2a7	7e0ccccf	76fc09f3	0b153874	a73ee510	3b08e48b	0b60ef54	ae1bb660	0721132d	07d13a8f	41f10449	bad5ee18	1e88c74f	698d1c68			0429f84b		be7c41b4	c0d61a5c		
-0	2	1	10		1186	5	40	6	275	1	13	0		87552397	78ccd99e	c109d265	52e9fdb4	25c83c98	fbad5c96	81f7f73c	5b392875	a73ee510	ebcc4ac8	6263d404	7b2da99a	aa1eb12e	051219e6	9917ad07	f85ed40e	e5ba7672	e7e991cb	f44bef3c	a458ea53	0f53d505		32c7478e	02666583	001f3601	86884549
-0	2	-1			99	1	2	1	1	1	1			05db9164	537e899b	5037b88e	9dde01fd	25c83c98	fe6b92e5	d5276ad8	0b153874	a73ee510	3b08e48b	1c80d81c	680d7261	0b1e410e	07d13a8f	14be02cc	c0673b44	07c540c4	65979fb7			e049c839		32c7478e	6095f986		
-1	4	1		5	1	0	4	45	44	2	2		0	05db9164	5dac953d	d032c263	c18be181	4cf72387	7e0ccccf	9b98e9fc	0b153874	a73ee510	2462946f	7f8ffe57	dfbb09fb	46f42a63	64c94865	ed807c25	84898b2a	e5ba7672	b79acaab			0014c32a		32c7478e	3b183c5c		
-1	1	27	2	4	1	2	21	8	66	1	3		2	05db9164	38a947a1	4470baf4	8c8a4c47	384874ce	fbad5c96	10cfa4ce	0b153874	a73ee510	72ce33ff	d0c3ead8	bb669e25	a7de95c2	b28479f6	8691120a	2b2ce127	e5ba7672	b133fcd4			2b796e4a		bcdee96c	8d365d3b		
-0	0	59		1	25398		0	19	10	0	0		1	68fd1e64	a796837e	af48fb08	e3cc371a	25c83c98	7e0ccccf	9a68af50	0b153874	7cc72ec2	b2ebcf4d	c4bd1c72	c78e8461	bcfc54a9	07d13a8f	870efc17	c4de5bba	07c540c4	9de259c3			7a593b43	c9d4222a	c7dc6720	8fc66e78		
-0		14	7	3	43236	78	0	15	52		0		3	05db9164	f8c8e8f8			89ff5705	7e0ccccf	1e3bdb1b	0b153874	a73ee510	8a99abc1	4352b29b		5f4de855	b28479f6	b15b8172		e5ba7672	d2f0bce2	21ddcdc9	5840adea			bcdee96c		f55c04b6	56be3401
-1	7	2	1	1	201	1	10	7	21	2	4		1	291b7ba2	333137d9	d577be04	26d1c179	25c83c98		e3e366c8	0b153874	a73ee510	e1a2ef0f	c6efad65	0a665a51	63db155e	1adce6ef	63ac89c1	bc19fa9c	e5ba7672	c61e82d7	21ddcdc9	b1252a9d	3e120d5e		32c7478e	98276f90	445bbe3b	38a0a8f1
-0	6	6	17	5	37	5	6	5	5	1	1		5	05db9164	65ad571e	06007265	7eafc40d	25c83c98	fe6b92e5	dcdd8d42	0b153874	a73ee510	e1a2ef0f	c1700682	6ed9dde9	0c66bf77	ad1cc976	49522e06	2759daf4	e5ba7672	6e284837			9d0b86ed		c3dc6cef	5fca8ab2		
-0	2	55	26	19	30	39	2	22	20	1	1		19	50d4de26	0b8e9caf	f95a3480	24031442	25c83c98	fe6b92e5	649c7ded	0b153874	a73ee510	60dca7a3	175d5d07	d49d44e6	21e58fe4	b28479f6	5340cb84	ecbb1f29	e5ba7672	ca6a63cf			0c6b0a44		bcdee96c	08b0ce98		
-1	1	1451	2	2	24	0	8	30	85	1	5	3	0	05db9164	38a947a1	353686f2	f7263320	4cf72387	fbad5c96	a5785c33	0b153874	a73ee510	3b08e48b	b38835a9	317bfd7d	5aceb3b4	07d13a8f	7722bd91	1689e4de	27c07bd6	90b7bec5			dc55d6df		423fab69	aa0115d2		
-1		0	37	30	36	77	12	30	157		5	3	28	68fd1e64	6e638bbc	5c51b5ba	1dae9b6c	4cf72387	fe6b92e5	6d0ca8d7	0b153874	a73ee510	361eec86	6939835e	27bdd67e	dc1d72e4	07d13a8f	1f29ec61	5bc896cd	27c07bd6	3cb7e3f0	21ddcdc9	5840adea	3fa701f0		55dd3565	8d653a3e	445bbe3b	8addf025
-1		-1	2		83214	104	0	0	1		0			05db9164	4f25e98b	21259ece	9ceceafa	25c83c98	fe6b92e5	019bb335	0b153874	7cc72ec2	c6577552	e2a3d92c	911f7581	a984ac48	1adce6ef	17d9b759	80216f56	e5ba7672	7ef5affa	21ddcdc9	b1252a9d	1a4b9964		3a171ecb	0ce9d40e	e8b83407	396dce83
-0	1	21	52	0	0	47	1	0	0	1	1		0	05db9164	38d50e09	948ee031	b7ab56a2	25c83c98	7e0ccccf	a86d9649	361384ce	a73ee510	42635bfd	aadb87b9	42bee2f2	e9332a03	b28479f6	06373944	67b3c631	d4bb7bd8	fffe2a63	21ddcdc9	b1252a9d	bd074856		3a171ecb	df487a73	001f3601	c27f155b
-0		-1			37586	55	1	4	37		1			5a9ed9b0	404660bb	2e4b14c7	f3608b1e	25c83c98	3bf701e7	0d339a25	37e4aa92	a73ee510	0ccaf4c7	7d756b25	2098d925	6f833c7a	b28479f6	abcca5c1	f7d36847	07c540c4	4b17f8a2	21ddcdc9	5840adea	2bce7b1d		32c7478e	f9f7eb22	f0f449dd	b2a97390
-0	0	3320	4	1	4122	83	6	26	70	0	1		3	8cf07265	78ccd99e	45bd7955	6cfd27d8	25c83c98	13718bbd	a4756aa0	5b392875	a73ee510	2f0da49f	3bfee234	3815b09c	888b8320	cfef1c29	798a3785	4b1d7076	e5ba7672	e7e991cb	9437f62f	b1252a9d	a9f61713		93bad2c0	3d146fbb	f0f449dd	91a61c29
-1	3	1	4	6	243	6	31	13	116	1	4		6	24eda356	fdbd6890	4948e114	77199c76	4cf72387	fbad5c96	62fc022b	0b153874	a73ee510	5612701e	e09c447b	72a05bfb	8dab0422	64c94865	0b2c122e	6e957363	e5ba7672	1910e2e3			ab5b7fc6	c9d4222a	423fab69	0ee8c452		
-0	2	2	25	14	1272	58	22	47	137	1	4	1	14	68fd1e64	0468d672	92c34e5f	6917d100	25c83c98	7e0ccccf	c63eac71	0b153874	a73ee510	d7a30208	2010b191	efc34af7	d48de876	1adce6ef	4f3b3616	d3408fd5	3486227d	9880032b	21ddcdc9	5840adea	8e7a84bb		423fab69	3f7eb911	ea9a246c	cdebf969
-0	2	0	11	13	172	29	2	24	23	1	1	1	23	05db9164	38a947a1	d0828830	09801e29	25c83c98	fe6b92e5	19e63407	37e4aa92	a73ee510	3b08e48b	00c11834	27e9858f	d1155458	b28479f6	d92de4ea	e294c66c	3486227d	09953728			0346d048		32c7478e	776eb2c5		
-1	0	49	143	1	5353	8	3	1	75	0	2		1	05db9164	89ddfee8	c2b008c5	e31f97bb	25c83c98	fbad5c96	1c86e0eb	a25968f2	a73ee510	34ccc264	755e4a50	e657c595	5978055e	1adce6ef	34cce7d2	0d7e5968	e5ba7672	5bb2ec8e	21ddcdc9	a458ea53	2c613179		423fab69	8eb162c5	f0f449dd	47ee0e11
-0		0	11	2	4236		0	12	130		0		11	5a9ed9b0	5b7b33dc			b2241560	7e0ccccf	d1208de2	0b153874	a73ee510	7ca23b4a	eb9e7931		837d93f2	07d13a8f	247f84ab		1e88c74f	3cbc29b4					55dd3565			
-1		2	2	1	16008		0	2	1		0		1	05db9164	38a947a1	f1722731	bef1cf93	25c83c98		17f619bc	0b153874	a73ee510	f90f47c5	e7c049c2	1b5efd69	36b96ed0	64c94865	51c5d5ca	f9e62e71	d4bb7bd8	be5810bd			6b23ba2d		32c7478e	043a382b		
-0	0	35	18	9	2687		0	30	90	0	0		9	5a9ed9b0	38d50e09	873cec9e	faeb53d1	25c83c98	7e0ccccf	64917feb	0b153874	a73ee510	3b08e48b	f045731b	1bad82f2	252ee845	b28479f6	06373944	754f444e	07c540c4	fffe2a63	21ddcdc9	b1252a9d	362ad5a2		32c7478e	df487a73	001f3601	c27f155b
-1	5	50	2	4	18	1	5	4	4	2	2		1	05db9164	4f25e98b	e5ecc1d4	26aac878	25c83c98	7e0ccccf	5f8e3e72	0b153874	a73ee510	3b08e48b	81029038	94073dd1	6d0b1734	b28479f6	df2f73e9	9f50aa07	07c540c4	bc5a0ff7	712d530c	b1252a9d	ef95f922		3a171ecb	a3bd4d33	001f3601	6b5cead0
-0		13	1	1	1020		0	1	1		0		1	05db9164	08d6d899	d158b948	cd08b588	25c83c98	fbad5c96	f8077d16	0b153874	a73ee510	3b08e48b	d24aec2b	5ea2e48b	f6224065	b28479f6	bfef54b3	6d922e3b	776ce399	87c6f83c			15fce809		bcdee96c	f96a556f		
-1	0	-1	13	0	5699	216	2	6	56	0	2		2	05db9164	6e638bbc	e1266b28	09e3cd5a	25c83c98		505ca254	0b153874	a73ee510	f6e4bc4d	4968ae8f	eb8ded57	18e370a6	07d13a8f	1f29ec61	3d9023a4	e5ba7672	3cb7e3f0	21ddcdc9	b1252a9d	31b4af04		32c7478e	8d653a3e	445bbe3b	8e1ae331
-1	17	89	26	5	1175	45	75	10	374	1	14		5	05db9164	71ca0a25	c86b2d8d	657dc3b9	25c83c98	fe6b92e5	d0792267	0b153874	a73ee510	7c0a503a	9700edac	1ca7a526	672d927b	b28479f6	a67c19b7	ba46c3a1	e5ba7672	9bf8ffef	21ddcdc9	b1252a9d	eb0fc6f8		32c7478e	df487a73	e8b83407	c27f155b
-0		2	5	0	104594			2					2	39af2607	80e26c9b	f57d3f44	b6951e6b	25c83c98		fa2da417	0b153874	7cc72ec2	d33462a3	358a1187	d120ba45	3966c8cd	07d13a8f	f3635baf	0cf975bf	d4bb7bd8	f54016b9	21ddcdc9	b1252a9d	c136e191		32c7478e	1793a828	e8b83407	66045105
-0	1	-1			118	0	4	2	13	1	2	2		3560b08b	5dac953d	d032c263	c18be181	25c83c98	7e0ccccf	61f42546	0b153874	a73ee510	3b08e48b	e0e79bd6	dfbb09fb	96fa211f	64c94865	ed807c25	84898b2a	27c07bd6	b79acaab			0014c32a		3a171ecb	3b183c5c		
-0		5	31	3	18756		0	11	2		0		4	87552397	04e09220	b1ecc6c4	5dff9b29	25c83c98	fbad5c96	2da1e879	5b392875	a73ee510	b7efa269	a05a0d99	2436ff75	e55dbe27	07d13a8f	f6b23a53	f4ead43c	1e88c74f	6fc84bfb			4f1aa25f		32c7478e	ded4aac9		
-0		25	16	20	8789	20	1	22	20		1		20	05db9164	537e899b	5037b88e	9dde01fd	25c83c98	fbad5c96	0d59e258	0b153874	a73ee510	5612701e	b9ec9192	680d7261	df5886ca	07d13a8f	6d68e99c	c0673b44	d4bb7bd8	b34aa802			e049c839		c7dc6720	6095f986		
-0	6	55	1	1	0	33	13	29	86	2	5		0	05db9164	421b43cd	3956eff2	29998ed1	25c83c98	fe6b92e5	38eb9cf4	0b153874	a73ee510	441dd290	7f8ffe57	6aaba33c	46f42a63	b28479f6	2d0bb053	b041b04a	e5ba7672	2804effd			723b4dfd		32c7478e	b34f3128		
-0		0		2	249180		0	2	50		0		2	05db9164	38a947a1	b9279298	40f36a12	384874ce	fe6b92e5	76d84582	0b153874	7cc72ec2	39cda501	7c53dc69	4c7c8101	4fd35e8f	1adce6ef	3ea7817e	4961b392	e5ba7672	bdd21ce2			2688e7ed		423fab69	9d70bc85		
-0	5	99	31	0	754	13	42	20	1068	1	8	5	13	5a9ed9b0	942f9a8d	4a75b52b	c6fdc148	25c83c98		d3f2ae29	0b153874	a73ee510	7f79890b	c4adf918	8eb3f772	85dbe138	1adce6ef	ae97ecc3	8213a764	8efede7f	1f868fdd	21ddcdc9	a458ea53	74be63ef		32c7478e	9af06ad9	9d93af03	cdfe5ab7
-0		-1			35745		0	0	2		0			39af2607	6887a43c	6d0ceb43	8d164e53	25c83c98	7e0ccccf	838c8fbe	0b153874	7cc72ec2	3b08e48b	f72fff3d	9fa694f3	03f77fd2	07d13a8f	eb1997cb	a54711b4	776ce399	570391ac	21ddcdc9	b1252a9d	78766d37		be7c41b4	9e0bee34	445bbe3b	df909817
-0	1	72	10	14	14	12	1	14	14	1	1		11	05db9164	421b43cd	7bd61a3f	29998ed1	25c83c98	fe6b92e5	a6a575e6	0b153874	a73ee510	45ab2c55	4829f487	6aaba33c	2180053c	b28479f6	2d0bb053	b041b04a	d4bb7bd8	2804effd			723b4dfd	ad3062eb	bcdee96c	b34f3128		
-0		1	1	3	5087	33	1	14	15		1	1	3	05db9164	68b3edbf	b00d1501	d16679b9	25c83c98	7e0ccccf	862c6367	0b153874	a73ee510	230a3832	6514ea2d	e0d76380	4738a95a	b28479f6	f511c49f	1203a270	3486227d	752d8b8a			73d06dde		32c7478e	aee52b6f		
-1	13	251	14	10	7	1	39	38	172	3	10		1	68fd1e64	89ddfee8	39eef0e8	13508380	25c83c98	fbad5c96	ad3508b1	5b392875	a73ee510	07704244	ad757a5a	4594f341	93b18cb5	07d13a8f	59a58e86	02882e54	e5ba7672	ae46962e	1d1eb838	b1252a9d	7b69ac9f		423fab69	45ab94c8	f0f449dd	c84c4aec
-1	0	1	7	26	0	243	25	15	754	0	6	0	0	68fd1e64	942f9a8d	d024aa4a	ca155841	4cf72387	fbad5c96	3f4ec687	0b153874	a73ee510	0e9ead52	c4adf918	08623920	85dbe138	b28479f6	ac182643	0ffc495e	27c07bd6	1f868fdd	f44bef3c	a458ea53	89883ec0	ad3062eb	32c7478e	e4c356ec	9d93af03	b775f5c2
-0	5	0	26	3	60	3	36	6	74	2	10		3	5a9ed9b0	58e67aaf	f1a75345	715dbf7b	4cf72387	fbad5c96	45e063a0	0b153874	a73ee510	27f4bf82	da89cb9b	d145dc65	165642be	b28479f6	62eca3c0	2ebf54b4	e5ba7672	c21c3e4c	338f20de	a458ea53	5bd3d286		32c7478e	bc8b14b9	9b3e8820	cdd2b5b7
-0		1	3		45904	111	0	1	27		0	0		05db9164	89ddfee8	3863b7f1	4daf48e1	25c83c98	fbad5c96	66acf824	0b153874	7cc72ec2	0ed4b00d	e192b186	dca65903	7df3a6c1	07d13a8f	4df3da6b	8784f12f	8efede7f	5bb2ec8e	3014a4b1	b1252a9d	d754f116	ad3062eb	423fab69	16291dd7	f0f449dd	e98cbe6a
-0		-1	2	2	2908	603	0	0	104		0		2	f473b8dc	38a947a1	223b0e16	ca55061c	25c83c98	7e0ccccf	eac6dc30	49dd1874	a73ee510	980d90f4	df29f7bb	156f99ef	67b031b4	1adce6ef	0e78291e	5fbf4a84	d4bb7bd8	1999bae9			deb9605d		32c7478e	e448275f		
-0		5	14	19	3708		0	35	376		0	0	19	05db9164	a796837e	08de7b18	97ce69e9	4cf72387	fe6b92e5	82f666b6	0b153874	a73ee510	03e48276	e51ddf94	c5011072	3516f6e6	cfef1c29	f0bf9094	5a9431f3	3486227d	1cdbd1c5			e754c5e1		3a171ecb	8fc66e78		
-1	3	3	4	7	102	30	4	23	23	1	2		23	8cf07265	26ece8a8	8c6bfe29	1e0ec6a2	25c83c98	fbad5c96	5c8931c6	0b153874	a73ee510	456b972a	77e7d573	1a614fd0	857a4197	07d13a8f	102fc449	d0b4477d	e5ba7672	87fd936e			b193bbca	ad3062eb	423fab69	5a456be6		
-1		12	2	2	7230	12	24	3	21		1		2	68fd1e64	80e26c9b	d3837635	230f1f17	25c83c98		f2d80b52	0b153874	a73ee510	4549ea1f	1bb4f435	b8b324f1	e8d4ea40	07d13a8f	f3635baf	1fb7f493	e5ba7672	f54016b9	21ddcdc9	a458ea53	90c2e498		32c7478e	1793a828	e8b83407	8efc26f8
-1		2	7	3	2934	48	4	4	117		3		3	05db9164	e5fb1af3	77d9caa7	932c3d89	25c83c98	7e0ccccf	ec874408	37e4aa92	a73ee510	5e2b2f1d	c6dfa670	21ca81df	3a4e700b	07d13a8f	b5de5956	cbe07a5c	e5ba7672	13145934	a34d2cf6	a458ea53	cbc662a7		3a171ecb	45a3e015	010f6491	b62a4ef5
-0		1	15	5	3699		0	36	77		0	0	5	05db9164	207b2d81	8a48553d	1e10bd9f	25c83c98	7e0ccccf	7f9907fe	5b392875	a73ee510	200e383b	a7b606c4	6803e296	eae197fd	b28479f6	3c767806	ff48ade9	e5ba7672	395856b0	21ddcdc9	b1252a9d	c3d093fb		3a171ecb	84a27184	001f3601	a30a3fb0
-0	4	175	12	2	923	55	8	48	103	1	2		2	5bfa8ab5	38a947a1	223b0e16	ca55061c	25c83c98	7e0ccccf	ade953a9	5b392875	a73ee510	4072f40f	29e4ad33	156f99ef	80467802	1adce6ef	0e78291e	5fbf4a84	e5ba7672	1999bae9			deb9605d		32c7478e	e448275f		
-0	11	1331	2	2	1296	8	11	13	54	1	1	1	4	68fd1e64	c8687797	5c7d8ff6	902872c9	25c83c98	fbad5c96	d20b4953	0b153874	a73ee510	fbbf2c95	46febd4d	79b87c55	949ea585	b28479f6	dc96c4b0	5627d7e0	3486227d	a7e06874	21ddcdc9	b1252a9d	4063500f	ad3062eb	32c7478e	54baf4d1	010f6491	ba676e3c
-1	3	4		2	1136	2	3	2	2	1	1		2	05db9164	f234d60e	01daaa01	1258049c	43b19349	fbad5c96	fae8ca82	0b153874	a73ee510	9a2a80f7	46d4b56a	ed98b1fb	ed738fad	07d13a8f	40fcbacb	4bf7ec4d	07c540c4	d942f032			8818bdec		3a171ecb	e4ef8e56		
-1	41	14	3	1	1	1	41	1	1	1	1		1	8cf07265	73a46ff0	85a07101	501abd52	0942e0a7	7e0ccccf	4ebdc6e2	0b153874	a73ee510	6417eabb	74475d27	593290d6	403e1842	1adce6ef	d57668e2	15b684be	e5ba7672	da507f45	21ddcdc9	5840adea	f4d7cf94		423fab69	b34f3128	ea9a246c	3090e38b
-1	0	19	2	1	0	63	2	3	7	0	2		0	05db9164	e18b1e61			384874ce	7e0ccccf	f417bf96	6c41e35e	a73ee510	3b08e48b	0ec1e215		44af41ef	07d13a8f	1d432c1e		e5ba7672	b2879faf				ad3062eb	3a171ecb			
-0		0	5	4	11141	218	1	24	217		1		4	68fd1e64	58e67aaf	2113709c	3bfbb842	4cf72387	fe6b92e5	cc8ce7f3	1f89b562	a73ee510	3b08e48b	b6ac69d0	27302de8	e987b058	07d13a8f	10935a85	7958d3dc	d4bb7bd8	c21c3e4c	55dd3565	a458ea53	192551e4		3a171ecb	48056b77	9b3e8820	76415198
-1		57	3	1	21443	49	8	1	38		1		1	5bfa8ab5	c5c1d6ae	bb85179d	98cd0302	25c83c98	fbad5c96	6855ef53	0b153874	a73ee510	175d6c71	b7094596	4750f0d1	1f9d2c38	07d13a8f	b25845fd	130b2582	3486227d	561cabfe	21ddcdc9	5840adea	ffbb089f		32c7478e	1026f362	7a402766	46f2af91
-0	0	0	10	5	1673	91	15	33	256	0	5		5	75ac2fe6	04e09220	b1ecc6c4	5dff9b29	25c83c98	7e0ccccf	63282fe3	0b153874	a73ee510	b95c890d	e6959f26	2436ff75	b57fa159	07d13a8f	f6b23a53	f4ead43c	8efede7f	6fc84bfb			4f1aa25f	ad3062eb	423fab69	ded4aac9		
-1		1	3	5	2985	13	1	5	7		1		5	05db9164	5dac953d	d032c263	c18be181	4cf72387	7e0ccccf	78c0b2ff	1f89b562	a73ee510	3b08e48b	eb4a9b83	dfbb09fb	c0bc5873	1adce6ef	32330105	84898b2a	d4bb7bd8	24de59c1			0014c32a		3a171ecb	3b183c5c		
-0		4	6	1	29743			21				0	1	05db9164	08d6d899	0bab1155	60d5f5a7	25c83c98	7e0ccccf	d6293852	0b153874	a73ee510	3b08e48b	c6cb726f	1d00cbc4	176d07bc	07d13a8f	41f10449	b93ac0ad	d4bb7bd8	698d1c68			bf8efd4c		72592995	f96a556f		
-0		1	1	3	5364	5	1	4	5		1		3	05db9164	207b2d81	057e845b	786673ae	25c83c98	6f6d9be8	f2a82962	0b153874	a73ee510	0ff7e0c6	c255f829	03aa3022	fe528cd1	b28479f6	899da9d5	dc377037	d4bb7bd8	25c88e42	21ddcdc9	a458ea53	907b8dff		32c7478e	7a8e7ed6	001f3601	9042adf0
-0	0	142	1	14	1559	85	13	5	267	0	3	0	14	5a9ed9b0	8ab240be	429e8271	c450716c	25c83c98	fe6b92e5	6fadbb76	1f89b562	a73ee510	fa7d0797	b5939c49	5c3be1d3	377af8aa	b28479f6	b4316eb3	4c0566cc	8efede7f	807ea8b0	21ddcdc9	5840adea	858c4106		32c7478e	2f0b2844	e8b83407	aa5f0a15
-1		1	5	9	8445	16	1	8	9		1	0	9	8cf07265	46bbf321	c5d94b65	5cc8f91d	384874ce	7e0ccccf	099d72d1	5b392875	a73ee510	230a3832	a6f5e788	75c79158	beaa48ab	243a4e68	bcdb9b50	208d4baf	3486227d	ce4d072d			6a909d9a		3a171ecb	1f68c81f		
-0		0	16	3	8297	46	7	5	8		1		3	be589b51	1cfdf714	2acc1a0e	1f2b62a4	4cf72387	7e0ccccf	b4ecbce4	0b153874	a73ee510	3b08e48b	8d68f0f6	cfe25cb7	4e9bebb4	687dfaf4	a54fca2b	446fa98b	e5ba7672	e88ffc9d	6f62a118	a458ea53	ff8c5410		3a171ecb	5029cba6	cb079c2d	a2de1476
-0	3	0	13	8	0	0	4	11	18	1	2		0	05db9164	0468d672	7b8d300a	c619d132	4cf72387	7e0ccccf	0fdf56d6	5b392875	a73ee510	42429aab	6241e24a	bb19e5a1	8c1a3ad8	b28479f6	234191d3	08042d48	e5ba7672	9880032b	21ddcdc9	5840adea	63e3637a	c9d4222a	bcdee96c	d33a0d83	ea9a246c	984e0db0
-0		0	9		326904			6						87552397	207b2d81	365d1d63	e9370452	25c83c98	fbad5c96	b10436ef	0b153874	7cc72ec2	2fed7fc5	4dee99ee	8ff467ea	d299b0dc	07d13a8f	0c67c4ca	acf5f625	07c540c4	395856b0	21ddcdc9	a458ea53	b6af5d81		3a171ecb	27e81296	001f3601	e1572e3b
-1	0	1	1		2708	27	12	0	37	0	4			5bfa8ab5	6c713117	f9513969	63bb9eb1	43b19349	fbad5c96	adbcc874	1f89b562	a73ee510	fa7d0797	46031dab	8ab52742	377af8aa	07d13a8f	78ebcaf1	0c98c1fc	e5ba7672	bf6b118a	21ddcdc9	b1252a9d	45664d1d		32c7478e	40de02ec	445bbe3b	b025bfb1

+ 0 - 202
recommend-model-produce/src/main/python/models/wide_and_deep_dataset/model.py

@@ -1,202 +0,0 @@
-import paddle
-import paddle.nn as nn
-import paddle.nn.functional as F
-import math
-
-
-
-class WideDeepLayer(nn.Layer):
-    def __init__(self, sparse_feature_number, sparse_feature_dim,
-                 dense_feature_dim, num_field, layer_sizes):
-        super(WideDeepLayer, self).__init__()
-        self.sparse_feature_number = sparse_feature_number
-        self.sparse_feature_dim = sparse_feature_dim
-        self.dense_feature_dim = dense_feature_dim
-        self.num_field = num_field
-        self.layer_sizes = layer_sizes
-
-    def forward(self, sparse_inputs, dense_inputs):
-        # wide part: a single linear unit over the dense features.
-        # Note: paddle.static.nn.fc takes x/activation keyword
-        # arguments, not the legacy input/act.
-        wide_output = paddle.static.nn.fc(
-            x=dense_inputs,
-            size=1,
-            weight_attr=paddle.ParamAttr(
-                name='wide_w',
-                initializer=paddle.nn.initializer.TruncatedNormal(
-                    mean=0.0, std=1.0 / math.sqrt(self.dense_feature_dim))),
-            bias_attr=paddle.ParamAttr(
-                name='wide_b',
-                initializer=paddle.nn.initializer.Constant(0.0))
-        )
-
-        # deep part
-        sparse_embs = []
-        for i, s_input in enumerate(sparse_inputs):
-            emb = paddle.static.nn.embedding(
-                input=s_input,
-                size=[self.sparse_feature_number, self.sparse_feature_dim],
-                param_attr=paddle.ParamAttr(
-                    name=f"embedding_{i}",
-                    initializer=paddle.nn.initializer.Uniform()))
-            emb = paddle.reshape(emb, shape=[-1, self.sparse_feature_dim])
-            sparse_embs.append(emb)
-
-        deep_output = paddle.concat(x=sparse_embs + [dense_inputs], axis=1)
-        
-        # build the deep MLP: embeddings + dense -> hidden layers -> one logit
-        sizes = [self.sparse_feature_dim * self.num_field + self.dense_feature_dim] + self.layer_sizes + [1]
-        acts = ["relu" for _ in range(len(self.layer_sizes))] + [None]
-        
-        for i in range(len(sizes) - 1):
-            deep_output = paddle.static.nn.fc(
-                x=deep_output,  # keyword is x, not the legacy input
-                size=sizes[i + 1],
-                activation=acts[i],  # keyword is activation, not the legacy act
-                weight_attr=paddle.ParamAttr(
-                    name=f'fc_{i}_w',
-                    initializer=paddle.nn.initializer.Normal(
-                        std=1.0 / math.sqrt(sizes[i]))),
-                bias_attr=paddle.ParamAttr(
-                    name=f'fc_{i}_b',
-                    initializer=paddle.nn.initializer.Constant(0.0))
-            )
-
-        prediction = paddle.add(x=wide_output, y=deep_output)
-        pred = F.sigmoid(prediction)
-        return pred
-
-
-class WideDeepModel:
-    def __init__(self, sparse_feature_number=1000001, sparse_inputs_slots=27, sparse_feature_dim=10, dense_input_dim=13, fc_sizes=[400, 400, 400]):
-        self.sparse_feature_number = sparse_feature_number
-        self.sparse_inputs_slots = sparse_inputs_slots
-        self.sparse_feature_dim = sparse_feature_dim
-        self.dense_input_dim = dense_input_dim
-        self.fc_sizes = fc_sizes
-
-        self._metrics = {}
-
-    def acc_metrics(self, pred, label):
-        correct_cnt = paddle.static.create_global_var(
-            name="right_cnt", persistable=True, dtype='float32', shape=[1], value=0)
-        total_cnt = paddle.static.create_global_var(
-            name="total_cnt", persistable=True, dtype='float32', shape=[1], value=0)
-
-        batch_cnt = paddle.sum(
-            paddle.full(shape=[paddle.shape(label)[0], 1], fill_value=1.0))
-        batch_accuracy = paddle.static.accuracy(input=pred, label=label)
-        batch_correct = batch_cnt * batch_accuracy
-
-        paddle.assign(correct_cnt + batch_correct, correct_cnt)
-        paddle.assign(total_cnt + batch_cnt, total_cnt)
-        accuracy = correct_cnt / total_cnt
-
-        self._metrics["acc"] = {}
-        self._metrics["acc"]["result"] = accuracy
-        self._metrics["acc"]["state"] = {
-            "total": (total_cnt, "float32"), "correct": (correct_cnt, "float32")}
-
-    def auc_metrics(self, pred, label):
-        auc, batch_auc, [batch_stat_pos, batch_stat_neg, stat_pos, stat_neg] = paddle.static.auc(input=pred,
-                                                                                                 label=label,
-                                                                                                 num_thresholds=2**12,
-                                                                                                 slide_steps=20)
-
-        self._metrics["auc"] = {}
-        self._metrics["auc"]["result"] = auc
-        self._metrics["auc"]["state"] = {"stat_pos": (
-            stat_pos, "int64"), "stat_neg": (stat_neg, "int64")}
-
-    def mae_metrics(self, pred, label):
-        abserr = paddle.static.create_global_var(
-            name="abserr", persistable=True, dtype='float32', shape=[1], value=0)
-        total_cnt = paddle.static.create_global_var(
-            name="total_cnt", persistable=True, dtype='float32', shape=[1], value=0)
-
-        batch_cnt = paddle.sum(
-            paddle.full(shape=[paddle.shape(label)[0], 1], fill_value=1.0))
-        batch_abserr = paddle.nn.functional.l1_loss(
-            pred, label, reduction='sum')
-
-        paddle.assign(abserr + batch_abserr, abserr)
-        paddle.assign(total_cnt + batch_cnt, total_cnt)
-        mae = abserr / total_cnt
-
-        self._metrics["mae"] = {}
-        self._metrics["mae"]["result"] = mae
-        self._metrics["mae"]["state"] = {
-            "total": (total_cnt, "float32"), "abserr": (abserr, "float32")}
-
-    def mse_metrics(self, pred, label):
-        sqrerr = paddle.static.create_global_var(
-            name="sqrerr", persistable=True, dtype='float32', shape=[1], value=0)
-        total_cnt = paddle.static.create_global_var(
-            name="total_cnt", persistable=True, dtype='float32', shape=[1], value=0)
-
-        batch_cnt = paddle.sum(
-            paddle.full(shape=[paddle.shape(label)[0], 1], fill_value=1.0))
-        batch_sqrerr = paddle.nn.functional.mse_loss(
-            pred, label, reduction='sum')
-
-        paddle.assign(sqrerr + batch_sqrerr, sqrerr)
-        paddle.assign(total_cnt + batch_cnt, total_cnt)
-        mse = sqrerr / total_cnt
-        rmse = paddle.sqrt(mse)
-
-        self._metrics["mse"] = {}
-        self._metrics["mse"]["result"] = mse
-        self._metrics["mse"]["state"] = {
-            "total": (total_cnt, "float32"), "sqrerr": (sqrerr, "float32")}
-
-        self._metrics["rmse"] = {}
-        self._metrics["rmse"]["result"] = rmse
-        self._metrics["rmse"]["state"] = {
-            "total": (total_cnt, "float32"), "sqrerr": (sqrerr, "float32")}
-
-    def net(self, is_train=True):
-        dense_input = paddle.static.data(name="dense_input", shape=[
-                                         None, self.dense_input_dim], dtype="float32")
-
-        sparse_inputs = [
-            paddle.static.data(name="C" + str(i),
-                               shape=[None, 1],
-                               lod_level=1,
-                               dtype="int64") for i in range(1, self.sparse_inputs_slots)
-        ]
-
-        label_input = paddle.static.data(
-            name="label", shape=[None, 1], dtype="int64")
-
-        self.inputs = [dense_input] + sparse_inputs + [label_input]
-
-        wide_deep_model = WideDeepLayer(self.sparse_feature_number, self.sparse_feature_dim,
-                                        self.dense_input_dim, self.sparse_inputs_slots - 1, self.fc_sizes)
-
-        pred = wide_deep_model.forward(sparse_inputs, dense_input)
-        predict_2d = paddle.concat(x=[1 - pred, pred], axis=1)
-        label_float = paddle.cast(label_input, dtype="float32")
-
-        with paddle.utils.unique_name.guard():
-            self.acc_metrics(pred, label_input)
-            self.auc_metrics(predict_2d, label_input)
-            self.mae_metrics(pred, label_float)
-            self.mse_metrics(pred, label_float)
-
-        # loss
-        cost = paddle.nn.functional.log_loss(input=pred, label=label_float)
-        avg_cost = paddle.mean(x=cost)
-        self.loss = avg_cost
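
For orientation, the layer widths that `WideDeepLayer.forward` ends up building under `WideDeepModel`'s defaults can be computed directly. A minimal sketch reproducing the `sizes`/`acts` computation (all numbers come from the default hyper-parameters above: 27 slots, embedding dim 10, 13 dense features, `fc_sizes=[400, 400, 400]`):

```python
# Layer widths implied by WideDeepModel's defaults: 27 slots -> 26
# categorical fields C1..C26, each embedded to 10 dims, plus 13 dense features.
sparse_inputs_slots = 27
sparse_feature_dim = 10
dense_input_dim = 13
fc_sizes = [400, 400, 400]

num_field = sparse_inputs_slots - 1                          # 26 sparse fields
deep_in = sparse_feature_dim * num_field + dense_input_dim   # 26*10 + 13 = 273
sizes = [deep_in] + fc_sizes + [1]                           # fc widths, logit last
acts = ["relu"] * len(fc_sizes) + [None]                     # no activation on the logit

print(sizes)  # [273, 400, 400, 400, 1]
print(acts)   # ['relu', 'relu', 'relu', None]
```

The final width-1 layer with no activation produces the deep logit that is added to `wide_output` before the sigmoid.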

+ 0 - 47
recommend-model-produce/src/main/python/models/wide_and_deep_dataset/reader.py

@@ -1,47 +0,0 @@
-import paddle
-import paddle.distributed.fleet as fleet
-import os
-import sys
-
-cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
-cont_max_ = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
-cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
-hash_dim_ = 1024
-continuous_range_ = range(1, 14)
-categorical_range_ = range(14, 40)
-
-
-class WideDeepDatasetReader(fleet.MultiSlotDataGenerator):
-
-    def line_process(self, line):
-        features = line.rstrip('\n').split('\t')
-        dense_feature = []
-        sparse_feature = []
-        for idx in continuous_range_:
-            if features[idx] == "":
-                dense_feature.append(0.0)
-            else:
-                dense_feature.append(
-                    (float(features[idx]) - cont_min_[idx - 1]) / cont_diff_[idx - 1])
-        for idx in categorical_range_:
-            sparse_feature.append(
-                [hash(str(idx) + features[idx]) % hash_dim_])
-        label = [int(features[0])]
-        return [dense_feature] + sparse_feature + [label]
-    
-    def generate_sample(self, line):
-        def wd_reader():
-            input_data = self.line_process(line)
-            feature_name = ["dense_input"]
-            for idx in categorical_range_:
-                feature_name.append("C" + str(idx - 13))
-            feature_name.append("label")
-            yield zip(feature_name, input_data)
-        
-        return wd_reader
-
-if __name__ == "__main__":
-    my_data_generator = WideDeepDatasetReader()
-    #my_data_generator.set_batch(16)
-
-    my_data_generator.run_from_stdin()
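
The per-line Criteo preprocessing in `line_process` can be exercised without Paddle. A minimal standalone sketch using the same constants as `reader.py` (the sample line below is synthetic, not one of the real rows above):

```python
# Standalone version of WideDeepDatasetReader.line_process: the 13 continuous
# features are shifted/scaled (empty fields -> 0.0), the 26 categorical
# features are hashed into hash_dim_ buckets, and the label is column 0.
cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
hash_dim_ = 1024

def line_process(line):
    features = line.rstrip('\n').split('\t')
    dense_feature = [
        0.0 if features[idx] == "" else
        (float(features[idx]) - cont_min_[idx - 1]) / cont_diff_[idx - 1]
        for idx in range(1, 14)
    ]
    sparse_feature = [[hash(str(idx) + features[idx]) % hash_dim_]
                      for idx in range(14, 40)]
    label = [int(features[0])]
    return [dense_feature] + sparse_feature + [label]

# synthetic line: label 1, first integer feature 2, the rest empty,
# then 26 identical categorical tokens
sample = "\t".join(["1", "2"] + [""] * 12 + ["05db9164"] * 26)
out = line_process(sample)
```

Each processed sample is 28 slots — one 13-dim dense vector, 26 single-id sparse slots, and the label — matching the `feature_name` list that `generate_sample` zips against.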

+ 0 - 81
recommend-model-produce/src/main/python/models/wide_and_deep_dataset/train.py

@@ -1,81 +0,0 @@
-from paddle.distributed.fleet.utils.ps_util import DistributedInfer
-import paddle.distributed.fleet as fleet
-import numpy as np
-from model import WideDeepModel
-from reader import WideDeepDatasetReader 
-import os
-import sys
-
-import paddle
-paddle.enable_static()
-
-
-def distributed_training(exe, train_model, train_data_path="./data", batch_size=4, epoch_num=1):
-
-    # To use InMemoryDataset instead, call load_into_memory() before
-    # train_from_dataset and release_memory() after it, e.g.:
-    #   dataset = paddle.distributed.InMemoryDataset()
-    #   dataset.load_into_memory()
-    #   ... train_from_dataset ...
-    #   dataset.release_memory()
-
-    dataset = paddle.distributed.QueueDataset()
-    thread_num = 1
-    dataset.init(use_var=train_model.inputs, pipe_command="python reader.py", batch_size=batch_size, thread_num=thread_num)
-
-    train_files_list = [os.path.join(train_data_path, x)
-                          for x in os.listdir(train_data_path)]
-    
-    for epoch_id in range(epoch_num):
-        dataset.set_filelist(train_files_list)
-        exe.train_from_dataset(paddle.static.default_main_program(),
-                               dataset,
-                               paddle.static.global_scope(), 
-                               debug=True, 
-                               fetch_list=[train_model.loss],
-                               fetch_info=["loss"],
-                               print_period=1)
-
-
-def clear_metric_state(model, place):
-    for metric_name in model._metrics:
-        for _, state_var_tuple in model._metrics[metric_name]["state"].items():
-            var = paddle.static.global_scope().find_var(
-                state_var_tuple[0].name)
-            if var is None:
-                continue
-            var = var.get_tensor()
-            data_zeros = np.zeros(var._get_dims()).astype(state_var_tuple[1])
-            var.set(data_zeros, place)
-
-
-fleet.init(is_collective=False)
-
-model = WideDeepModel()
-model.net(is_train=True)
-
-strategy = fleet.DistributedStrategy()
-strategy.a_sync = True
-
-optimizer = paddle.optimizer.SGD(learning_rate=0.0001)
-
-optimizer = fleet.distributed_optimizer(optimizer, strategy)
-
-optimizer.minimize(model.loss)
-
-
-if fleet.is_server():
-    fleet.init_server()
-    fleet.run_server()
-
-if fleet.is_worker():
-    place = paddle.CPUPlace()
-    exe = paddle.static.Executor(place)
-
-    exe.run(paddle.static.default_startup_program())
-
-    fleet.init_worker()
-
-    distributed_training(exe, model)
-    clear_metric_state(model, place)
-
-    fleet.stop_worker()