In this notebook, we demonstrate the most compact way to build a Keras model for binary classification.
import tensorflow as tf
import tensorflow.keras as keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.optimizers import SGD
import numpy as np
import matplotlib.pyplot as pl
Let's first load the data.
data = np.loadtxt('./linear_classifier_data.csv',
delimiter=',',
skiprows=1)
x_data = data[:, [0,1]]
y_data = data[:, -1].astype(int)
Instead of having a separate activation, we can compactly specify the activation function as part of the dense layer.
model = Sequential([
Input(shape=(2,)),
Dense(1, activation='sigmoid'),
])
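Specifying the activation inline is equivalent to attaching a separate `Activation` layer after the `Dense` layer. A minimal sketch of the comparison (the model names here are illustrative, not part of the notebook):

```python
import numpy as np
from tensorflow.keras.layers import Activation, Dense, Input
from tensorflow.keras.models import Sequential

# Verbose form: linear projection and activation as two layers.
verbose_model = Sequential([
    Input(shape=(2,)),
    Dense(1),                 # computes w.x + b
    Activation('sigmoid'),    # squashes the output into (0, 1)
])

# Compact form: activation folded into the Dense layer.
compact_model = Sequential([
    Input(shape=(2,)),
    Dense(1, activation='sigmoid'),
])

# With identical weights, both compute the same function.
verbose_model.set_weights(compact_model.get_weights())
x = np.array([[0.5, -1.0]], dtype=np.float32)
print(np.allclose(verbose_model.predict(x, verbose=0),
                  compact_model.predict(x, verbose=0)))  # prints True
```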
Now, we can compile the model with the appropriate loss function and optimizer.
Note that we can use string shorthand names instead of actual loss, optimizer, and metric objects.
model.compile(
loss = 'binary_crossentropy',
optimizer = SGD(learning_rate=1e-3),
metrics = ['acc'],
)
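For reference, the same `compile` call can be written with explicit objects in place of the string shortnames. This sketch assumes the default constructor arguments match the strings used above ('binary_crossentropy' and 'acc'):

```python
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.losses import BinaryCrossentropy
from tensorflow.keras.metrics import BinaryAccuracy
from tensorflow.keras.models import Sequential
from tensorflow.keras.optimizers import SGD

# Same architecture as above, compiled with explicit objects
# rather than string shortnames.
explicit_model = Sequential([
    Input(shape=(2,)),
    Dense(1, activation='sigmoid'),
])
explicit_model.compile(
    loss=BinaryCrossentropy(),
    optimizer=SGD(learning_rate=1e-3),
    metrics=[BinaryAccuracy(name='acc')],
)
```

The object form is useful when a loss or metric needs non-default arguments (for example, label smoothing), which the string shorthand cannot express.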
model.summary()
model.fit(x_data, y_data, epochs=10)
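The single sigmoid unit trained here is logistic regression, and the loss being minimized is binary cross-entropy, -[y log p + (1 - y) log(1 - p)] averaged over the batch. A minimal NumPy sketch (these helper functions are illustrative, not the Keras internals):

```python
import numpy as np

def sigmoid(z):
    # Map a raw score to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def binary_crossentropy(y_true, p):
    # Mean of -[y log p + (1 - y) log(1 - p)], with clipping
    # to avoid log(0).
    eps = 1e-7
    p = np.clip(p, eps, 1 - eps)
    return float(np.mean(-(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))))

# A confident correct prediction incurs almost no loss;
# a confident wrong one is heavily penalised.
print(binary_crossentropy(np.array([1.0]), np.array([0.99])))  # ~0.01
print(binary_crossentropy(np.array([1.0]), np.array([0.01])))  # ~4.6
```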
We can plot the model's decision boundary by evaluating it on a grid of points.
xs = np.linspace(-10, 10, 100)
ys = np.linspace(-10, 10, 100)
xx, yy = np.meshgrid(xs, ys)
grid_points = np.concatenate([
    xx.reshape((100, 100, 1)),
    yy.reshape((100, 100, 1)),
], axis=-1).reshape(-1, 2)
z = model.predict(grid_points).reshape(100, 100)
pl.contourf(xx, yy, z);
pl.scatter(x_data[y_data == 0, 0], x_data[y_data == 0, 1], color='red')
pl.scatter(x_data[y_data == 1, 0], x_data[y_data == 1, 1], color='blue')
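The concatenate/reshape step above can also be written with `np.stack` on the raveled grids, which produces the same (N, 2) array of model inputs. A small self-contained sketch:

```python
import numpy as np

# Build the same evaluation grid as above.
xs = np.linspace(-10, 10, 100)
ys = np.linspace(-10, 10, 100)
xx, yy = np.meshgrid(xs, ys)

# Flatten each coordinate grid and pair them column-wise:
# row k is [x_k, y_k], ready to feed to model.predict.
grid = np.stack([xx.ravel(), yy.ravel()], axis=-1)
print(grid.shape)  # (10000, 2)
```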