Use the TensorFlow template application

Introduction

The TensorFlow template application is generic TensorFlow application code that users can run directly without writing any TensorFlow code themselves. Training data is typically in a dense CSV format, a sparse LIBSVM format, or image files, all of which can be converted into TFRecords. The model itself is built from the template code, and different models can be produced simply by passing different combinations of parameters. Users can download the template or submit it directly through the Xiaomi Cloud-ml service, and train a model without writing a single line of code.

CSV data

If the data is dense, or even image data, it can be expressed in CSV format. Use a Python script or Spark to convert it into TFRecords and save the result locally or to FDS.
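A minimal sketch of such a conversion script is shown below. It assumes the last CSV column holds the label; the file names and the feature keys "label" and "features" are illustrative assumptions and may differ from the conversion scripts shipped with the repository.

# convert_csv_to_tfrecords.py -- minimal sketch (TensorFlow 1.x API)
import csv
import tensorflow as tf

def convert(csv_file, tfrecords_file):
    writer = tf.python_io.TFRecordWriter(tfrecords_file)
    with open(csv_file) as f:
        for row in csv.reader(f):
            label = int(row[-1])                        # assumption: label is the last column
            features = [float(v) for v in row[:-1]]     # remaining columns are dense features
            example = tf.train.Example(features=tf.train.Features(feature={
                "label": tf.train.Feature(int64_list=tf.train.Int64List(value=[label])),
                "features": tf.train.Feature(float_list=tf.train.FloatList(value=features)),
            }))
            writer.write(example.SerializeToString())
    writer.close()

if __name__ == "__main__":
    # Example file names only; replace with your own CSV data
    convert("cancer_train.csv", "cancer_train.csv.tfrecords")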

Then train with the dense_classifier.py template. The template address is https://github.com/tobegit3hub/deep_recommend_system/blob/master/dense_classifier.py.

Cancer data

By default, the Cancer dataset is used, with a feature_size of 9 and a label_size of 2. Users can set parameters such as the epoch number, learning rate, optimizer, model type (e.g. DNN), and the number and size of network layers.

./dense_classifier.py --batch_size 1024 --epoch_number 1000 --step_to_validate 10 --optimizer adagrad --model dnn --model_network "128 32 8"

Iris data

To use the iris dataset, you must specify a feature_size of 4 and a label_size of 3.

./dense_classifier.py --train_tfrecords_file ./data/iris/iris_train.csv.tfrecords --validate_tfrecords_file ./data/iris/iris_test.csv.tfrecords --feature_size 4 --label_size 3

Lung cancer data

To use the lung cancer dataset, you must specify a feature_size of 262144 and a label_size of 2. You can also specify a CNN model.

./dense_classifier.py --train_tfrecords_file ./data/lung/fa7a21165ae152b13def786e6afc3edf.dcm.csv.tfrecords --validate_tfrecords_file ./data/lung/fa7a21165ae152b13def786e6afc3edf.dcm.csv.tfrecords --feature_size 262144 --label_size 2 --batch_size 2 --validate_batch_size 2 --epoch_number -1 --model cnn

LIBSVM data

If the data can be expressed in LIBSVM format, you can use Python scripts or Spark to convert it to TFRecords and save it locally or to FDS. Then, use the sparse_classifier.py training model. The template address is https://github.com/tobegit3hub/deep_recommend_system/blob/master/sparse_classifier.py.
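A minimal sketch of a LIBSVM conversion script is shown below. The feature keys "label", "ids", and "values" and the file names are illustrative assumptions; check the repository's own conversion scripts for the exact format expected by sparse_classifier.py.

# convert_libsvm_to_tfrecords.py -- minimal sketch (TensorFlow 1.x API)
import tensorflow as tf

def convert(libsvm_file, tfrecords_file):
    writer = tf.python_io.TFRecordWriter(tfrecords_file)
    with open(libsvm_file) as f:
        for line in f:
            tokens = line.strip().split(" ")
            label = float(tokens[0])                    # LIBSVM: first token is the label
            ids, values = [], []
            for item in tokens[1:]:                     # remaining tokens are index:value pairs
                index, value = item.split(":")
                ids.append(int(index))
                values.append(float(value))
            example = tf.train.Example(features=tf.train.Features(feature={
                "label": tf.train.Feature(float_list=tf.train.FloatList(value=[label])),
                "ids": tf.train.Feature(int64_list=tf.train.Int64List(value=ids)),
                "values": tf.train.Feature(float_list=tf.train.FloatList(value=values)),
            }))
            writer.write(example.SerializeToString())
    writer.close()

if __name__ == "__main__":
    # Example file names only; replace with your own LIBSVM data
    convert("train.libsvm", "train.libsvm.tfrecords")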

Implementation methods

When developing TensorFlow applications, define the variable parts as hyperparameters accepted on the command line, as sketched below. Users can then pass different hyperparameters to train different models, which keeps the application general-purpose.
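The flag names below follow the commands shown earlier; the default values and the body of main are illustrative assumptions, not the actual template implementation.

# Sketch of exposing hyperparameters as command-line flags (TensorFlow 1.x flags API)
import tensorflow as tf

flags = tf.app.flags
flags.DEFINE_integer("feature_size", 9, "Number of input features")
flags.DEFINE_integer("label_size", 2, "Number of output classes")
flags.DEFINE_integer("batch_size", 1024, "Training batch size")
flags.DEFINE_integer("epoch_number", 100, "Number of training epochs, -1 for infinite")
flags.DEFINE_float("learning_rate", 0.01, "Initial learning rate")
flags.DEFINE_string("optimizer", "adagrad", "Optimizer name, e.g. sgd, adagrad, adam")
flags.DEFINE_string("model", "dnn", "Model type, e.g. dnn, cnn")
flags.DEFINE_string("model_network", "128 32 8", "Hidden layer sizes for the DNN model")
FLAGS = flags.FLAGS

def main(_):
    # The same graph-building code reads FLAGS, so one template serves many models.
    print("Training a %s model with layers [%s]" % (FLAGS.model, FLAGS.model_network))

if __name__ == "__main__":
    tf.app.run()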

Applications that lack this generality and require considerable customization cannot currently be converted into TensorFlow template applications; these must be developed and maintained by the user.