TFLearn

TF.Learn

TensorFlow's high-level API: TF.Learn

import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets('tmp/data/')
X_train = mnist.train.images
print(X_train.shape)
X_test = mnist.test.images
y_train = mnist.train.labels.astype("int")
y_test = mnist.test.labels.astype("int")

# Standardize the pixel features before training
from sklearn.preprocessing import StandardScaler
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Infer feature columns from the training data, then build a DNN
# with two hidden layers (300 and 100 units) and 10 output classes
feature_columns = tf.contrib.learn.infer_real_valued_columns_from_input(X_train)
dnn_clf = tf.contrib.learn.DNNClassifier(hidden_units=[300, 100], n_classes=10,
                                         feature_columns=feature_columns)
# dnn_clf = tf.contrib.learn.SKCompat(dnn_clf)  # optional scikit-learn wrapper
dnn_clf.fit(X_train, y_train, batch_size=50, steps=40000)

from sklearn.metrics import accuracy_score
y_pred = list(dnn_clf.predict(X_test))
accuracy_score(y_test, y_pred)
dnn_clf.evaluate(X_test, y_test)

TensorFlow's low-level API

The two links below explain the relationship between softmax and cross entropy very clearly: the network first outputs a softmax distribution, then the cross-entropy loss is computed on it.

softmax

Cross entropy

Cross-entropy formula
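To make the relationship concrete: softmax turns the logits into a probability distribution, and cross entropy H(y, p) = -Σᵢ yᵢ log pᵢ scores that distribution against the true label (with a one-hot label this reduces to -log of the probability of the true class). A minimal NumPy sketch of the two steps (illustrative only, not the TF graph code):

```python
import numpy as np

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating
    z = logits - logits.max(axis=-1, keepdims=True)
    exp_z = np.exp(z)
    return exp_z / exp_z.sum(axis=-1, keepdims=True)

def cross_entropy(probs, label):
    # H(y, p) = -sum_i y_i * log(p_i); with a one-hot label this is
    # just -log of the probability assigned to the true class
    return -np.log(probs[label])

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)       # a valid probability distribution
loss = cross_entropy(probs, label=0)
```

In TF this is what `tf.nn.sparse_softmax_cross_entropy_with_logits()` computes in one fused, numerically stable op, which is why the model outputs raw logits rather than applying softmax itself.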

Use TF's built-in fully connected layer: using dense() instead of neuron_layer()

Note: the book uses tensorflow.contrib.layers.fully_connected() rather than tf.layers.dense() (which did not exist when this chapter was written). It is now preferable to use tf.layers.dense(), because anything in the contrib module may change or be deleted without notice. The dense() function is almost identical to the fully_connected() function, except for a few minor differences:

  • several parameters are renamed: scope becomes name, activation_fn becomes activation (and similarly the _fn suffix is removed from other parameters such as normalizer_fn), weights_initializer becomes kernel_initializer, etc.

  • the default activation is now None rather than tf.nn.relu.

Code for wiring up the fully connected layers by hand, without using the provided function
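A hand-rolled fully connected layer is just Z = XW + b with an optional activation, with the weights initialized using a standard deviation of 2/√n_inputs as in the book. A minimal NumPy sketch of that idea, wired into the 300-100-10 architecture used above (an illustration of the computation, not the TF graph construction):

```python
import numpy as np

rng = np.random.default_rng(42)

def neuron_layer(X, n_neurons, activation=None):
    """Fully connected layer: Z = X @ W + b, optionally followed by ReLU."""
    n_inputs = X.shape[1]
    # Initialize weights with stddev 2/sqrt(n_inputs), as in the book
    stddev = 2 / np.sqrt(n_inputs)
    W = rng.normal(0.0, stddev, size=(n_inputs, n_neurons))
    b = np.zeros(n_neurons)
    Z = X @ W + b
    if activation == "relu":
        return np.maximum(Z, 0)
    return Z

X = rng.normal(size=(5, 784))          # a fake batch of 5 MNIST images
hidden1 = neuron_layer(X, 300, activation="relu")
hidden2 = neuron_layer(hidden1, 100, activation="relu")
logits = neuron_layer(hidden2, 10)     # no activation on the output layer
```

The output layer deliberately has no activation: it produces raw logits, which are then fed to the softmax cross-entropy loss as described in the previous section.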

