Introduction
Once again, long time no see! In Part 1 (over a year ago…lol whoops), we gathered images of Hall of Fame eligible players from baseball-reference.com. Since then, I have reworked the image collection pipeline and published the images to Hugging Face, so this document can consume them directly to make life a little easier for you all! No more 18k image downloads!
With that said, we’re going to build a neural network that predicts whether a player is a Hall of Famer based solely on their facial features. No stats. No accolades. Just their face.
Let’s slide in!
Setup - Required Packages
First, we need to set up our environment. This project uses Python with TensorFlow and Keras for the neural network, plus Hugging Face Hub for dataset sync.
import os
import sys
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from PIL import Image
from pathlib import Path
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
import random
from huggingface_hub import snapshot_download
from sklearn.model_selection import train_test_split

# Set random seeds for reproducibility
random.seed(1992)
np.random.seed(1992)
tf.random.set_seed(1992)

print(f"TensorFlow version: {tf.__version__}")
print(f"Keras version: {keras.__version__}")
print(f"Python executable: {sys.executable}")
TensorFlow version: 2.20.0
Keras version: 3.12.0
Python executable: C:\Projects\rpy\posts\mlb-hof-part-2\.venv\Scripts\pythonw.exe
# Pull dataset only if local cache is missing required files
repo_id = "rpy-ai/mlb-hof-faces"
data_dir = Path("data/mlb-hof-faces")
hof_dir = data_dir / "hof"
nothof_dir = data_dir / "not-hof"

required_items_exist = (
    hof_dir.exists() and
    nothof_dir.exists()
)

if required_items_exist:
    print(f"Using existing local dataset cache at: {data_dir}")
else:
    print("Local dataset cache missing required files.")
    print(f"Downloading snapshot from Hugging Face: {repo_id}")
    data_dir.mkdir(parents=True, exist_ok=True)
    snapshot_download(
        repo_id=repo_id,
        repo_type="dataset",
        local_dir=str(data_dir),
        allow_patterns=[
            "hof/*",
            "not-hof/*"
        ],
    )
    print("Dataset snapshot downloaded.")
Using existing local dataset cache at: data\mlb-hof-faces
The Data
We’re using the curated images from Hugging Face. The dataset stores class labels by directory (hof/ and not-hof/) with filenames set to each player’s playerid.
# Get all images directly from the categorized folders
hof_files = list(hof_dir.glob("*.jpg"))
nothof_files = list(nothof_dir.glob("*.jpg"))

print(f"Total HOF images: {len(hof_files)}")
print(f"Total non-HOF images: {len(nothof_files)}")

# Build labeled path list and stratified split
records = (
    [(p, 1) for p in hof_files] +
    [(p, 0) for p in nothof_files]
)
all_paths = [r[0] for r in records]
all_labels = [r[1] for r in records]

train_paths, test_paths, train_y, test_y = train_test_split(
    all_paths,
    all_labels,
    test_size=0.25,
    random_state=42,
    stratify=all_labels,
)

print(f"Training samples: {len(train_paths)}")
print(f"Test samples: {len(test_paths)}")
print(f"Total images to process: {len(all_paths)}")
Total HOF images: 262
Total non-HOF images: 1077
Training samples: 1004
Test samples: 335
Total images to process: 1339
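Because HOF players are heavily outnumbered, the `stratify` argument matters here: it keeps the HOF share nearly identical in the train and test splits. Here's a minimal sketch of that behavior using synthetic labels with the same 262/1077 imbalance (the index list is just a stand-in for the real file paths):

```python
from sklearn.model_selection import train_test_split

# Hypothetical labels mirroring the dataset: 262 HOF (1) vs. 1077 not-HOF (0)
labels = [1] * 262 + [0] * 1077
indices = list(range(len(labels)))

train_idx, test_idx, train_lab, test_lab = train_test_split(
    indices, labels, test_size=0.25, random_state=42, stratify=labels
)

# The HOF fraction is preserved (within one sample) in both splits
overall = sum(labels) / len(labels)
train_share = sum(train_lab) / len(train_lab)
test_share = sum(test_lab) / len(test_lab)
print(f"Overall: {overall:.3f}, train: {train_share:.3f}, test: {test_share:.3f}")
```

Without `stratify`, a random split could hand the test set a noticeably different HOF rate, which would distort every evaluation metric downstream.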
Image Preprocessing
Neural networks can be a little picky about their inputs, so we will do the following:
Convert images to grayscale to eliminate as much noise as possible (read about the model limitations here)
Resize all images to 128x128 pixels (with aspect-preserving padding)
Feed them into a small CNN for classification
TARGET_SIZE = 128

def resize_with_padding(img, target_size=TARGET_SIZE):
    """Resize while preserving aspect ratio, then pad to square."""
    from PIL import ImageOps
    contained = ImageOps.contain(img, (target_size, target_size), method=Image.Resampling.BILINEAR)
    return ImageOps.pad(contained, (target_size, target_size), color=0, method=Image.Resampling.BILINEAR)

def load_and_preprocess_image(img_path):
    """Load an image, convert to grayscale, and resize to 128x128."""
    img = Image.open(img_path).convert('L')  # Convert to grayscale
    img = resize_with_padding(img, TARGET_SIZE)
    img_array = np.array(img, dtype=np.float32) / 255.0  # Normalize to [0, 1]
    return img_array  # 2D image array

def preprocess_image_paths(image_paths):
    """Preprocess a sequence of image paths into a CNN-ready tensor."""
    processed = []
    for i, img_path in enumerate(image_paths):
        if (i + 1) % 100 == 0:
            print(f"Processed {i + 1}/{len(image_paths)} images")
        # Add channel dimension -> (H, W, 1)
        processed.append(load_and_preprocess_image(img_path)[..., np.newaxis])
    return np.array(processed, dtype=np.float32)

print("Processing training images...")
train_x = preprocess_image_paths(train_paths)
print("Processing test images...")
test_x = preprocess_image_paths(test_paths)
Processing training images...
Processed 100/1004 images
Processed 200/1004 images
Processed 300/1004 images
Processed 400/1004 images
Processed 500/1004 images
Processed 600/1004 images
Processed 700/1004 images
Processed 800/1004 images
Processed 900/1004 images
Processed 1000/1004 images
Processing test images...
Processed 100/335 images
Processed 200/335 images
Processed 300/335 images
The train/test split is already done, so now let’s finalize the labels for each set:
# Create labels (1 = HOF, 0 = not HOF)
train_y = np.array(train_y)
test_y = np.array(test_y)

# Convert to categorical (one-hot encoding)
train_labels = keras.utils.to_categorical(train_y, num_classes=2)
test_labels = keras.utils.to_categorical(test_y, num_classes=2)

print(f"Training data shape: {train_x.shape}")
print(f"Test data shape: {test_x.shape}")
print(f"Training labels shape: {train_labels.shape}")
print(f"Test labels shape: {test_labels.shape}")

# Class balance snapshot
n_train_hof = int(train_y.sum())
n_train_not_hof = len(train_y) - n_train_hof
print(f"Train class counts -> Not HOF: {n_train_not_hof}, HOF: {n_train_hof}")

# Simple inverse-frequency class weights to help with imbalance
total_train = len(train_y)
class_weight = {
    0: total_train / (2 * n_train_not_hof),
    1: total_train / (2 * n_train_hof),
}
print(f"Class weights: {class_weight}")
Training data shape: (1004, 128, 128, 1)
Test data shape: (335, 128, 128, 1)
Training labels shape: (1004, 2)
Test labels shape: (335, 2)
Train class counts -> Not HOF: 808, HOF: 196
Class weights: {0: 0.6212871287128713, 1: 2.561224489795918}
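This `total / (n_classes * class_count)` formula is the same "balanced" heuristic scikit-learn uses in `compute_class_weight`. A quick sketch, recomputing the weights by hand from the counts printed above, shows why it helps: after weighting, each class contributes exactly half of the total loss weight.

```python
# Counts from the train split above
n_train = 1004
n_not_hof, n_hof = 808, 196

# Inverse-frequency weights: minority class (HOF) gets up-weighted
class_weight = {
    0: n_train / (2 * n_not_hof),
    1: n_train / (2 * n_hof),
}
print(class_weight)

# Weighted class totals are now equal: each class carries half the weight
assert abs(class_weight[0] * n_not_hof - class_weight[1] * n_hof) < 1e-9
```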
Building the Model
Time to build our neural network! Because this is meant as more of a fun exercise than anything, let’s go with something simple:
Input : 128x128x1 grayscale image
Conv blocks : small stack of convolution + max-pooling layers
Dense head : compact fully connected layer
Output layer : 2 neurons with softmax activation (HOF or not)
Nothing fancy here. We just want a model that is hopefully useful.
model = keras.Sequential([
    layers.Input(shape=(TARGET_SIZE, TARGET_SIZE, 1)),
    layers.Conv2D(32, (3, 3), activation='relu', padding='same'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu', padding='same'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation='relu', padding='same'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dropout(0.4),
    layers.Dense(2, activation='softmax')
])
model.summary()
model.summary()
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type) ┃ Output Shape ┃ Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv2d (Conv2D ) │ (None , 128 , 128 , 32 ) │ 320 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ max_pooling2d (MaxPooling2D ) │ (None , 64 , 64 , 32 ) │ 0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv2d_1 (Conv2D ) │ (None , 64 , 64 , 64 ) │ 18,496 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ max_pooling2d_1 (MaxPooling2D ) │ (None , 32 , 32 , 64 ) │ 0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv2d_2 (Conv2D ) │ (None , 32 , 32 , 128 ) │ 73,856 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ max_pooling2d_2 (MaxPooling2D ) │ (None , 16 , 16 , 128 ) │ 0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ flatten (Flatten ) │ (None , 32768 ) │ 0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense ) │ (None , 128 ) │ 4,194,432 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout ) │ (None , 128 ) │ 0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense ) │ (None , 2 ) │ 258 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 4,287,362 (16.35 MB)
Trainable params: 4,287,362 (16.35 MB)
Non-trainable params: 0 (0.00 B)
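The parameter counts in the summary are easy to verify by hand: a Conv2D layer has (kernel_h × kernel_w × in_channels + 1) × filters parameters (the +1 is the per-filter bias), and a Dense layer has in_units × out_units + out_units. A quick sketch (the helper names are mine, not Keras's):

```python
def conv_params(kernel_h, kernel_w, in_channels, filters):
    # Each filter: kernel_h * kernel_w * in_channels weights plus one bias
    return (kernel_h * kernel_w * in_channels + 1) * filters

def dense_params(in_units, out_units):
    # Weight matrix plus one bias per output unit
    return in_units * out_units + out_units

counts = [
    conv_params(3, 3, 1, 32),          # conv2d   -> 320
    conv_params(3, 3, 32, 64),         # conv2d_1 -> 18,496
    conv_params(3, 3, 64, 128),        # conv2d_2 -> 73,856
    dense_params(16 * 16 * 128, 128),  # dense    -> 4,194,432 (flatten: 16*16*128 = 32,768)
    dense_params(128, 2),              # dense_1  -> 258
]
print(sum(counts))  # 4,287,362
```

Note that nearly all of the parameters live in that first Dense layer, which is why the flatten-into-dense head dominates the model's size.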
Training the Model
Let’s compile and train our model. We’ll use binary crossentropy as our loss function and Adam as our optimizer. We’ll also track AUC, precision, and recall so we’re not relying only on accuracy.
model.compile(
    loss='binary_crossentropy',
    optimizer='adam',
    metrics=[
        'accuracy',
        keras.metrics.AUC(name='auc'),
        keras.metrics.Precision(name='precision'),
        keras.metrics.Recall(name='recall'),
    ]
)
Now for the actual training step.
# Define model path
model_path = Path("mlb_hof_model.keras")

# Toggle this to force retraining even when a saved model exists
use_saved_model = False

if use_saved_model and model_path.exists():
    print(f"Loading saved model from {model_path}...")
    model = keras.models.load_model(model_path)
    # No training history exists when loading from disk
    history = None
    print("Model loaded successfully.")
else:
    if model_path.exists() and not use_saved_model:
        print("Saved model found, but use_saved_model=False so retraining...")
    print("Training model...")
    early_stop = keras.callbacks.EarlyStopping(
        monitor="val_loss",
        patience=6,
        restore_best_weights=True
    )
    history = model.fit(
        train_x, train_labels,
        epochs=50,
        batch_size=32,
        validation_split=0.2,
        class_weight=class_weight,
        callbacks=[early_stop],
        verbose=1
    )
    print(f"Saving model to {model_path}...")
    model.save(model_path)
# Plot training history only if we trained the model
if history is not None:
    plt.figure(figsize=(12, 4))

    plt.subplot(1, 2, 1)
    plt.plot(history.history['accuracy'], label='Training Accuracy')
    plt.plot(history.history['val_accuracy'], label='Validation Accuracy')
    plt.title('Model Accuracy')
    plt.xlabel('Epoch')
    plt.ylabel('Accuracy')
    plt.legend()
    plt.grid(True)

    plt.subplot(1, 2, 2)
    plt.plot(history.history['loss'], label='Training Loss')
    plt.plot(history.history['val_loss'], label='Validation Loss')
    plt.title('Model Loss')
    plt.xlabel('Epoch')
    plt.ylabel('Loss')
    plt.legend()
    plt.grid(True)

    plt.tight_layout()
    plt.show()
else:
    print("Model was loaded from disk, so no training history to plot.")
Saved model found, but use_saved_model=False so retraining...
Training model...
Epoch 1/50
26/26 ━━━━━━━━━━━━━━━━━━━━ 7s 178ms/step - accuracy: 0.5280 - auc: 0.5124 - loss: 0.7139 - precision: 0.5280 - recall: 0.5280 - val_accuracy: 0.6418 - val_auc: 0.6117 - val_loss: 0.6926 - val_precision: 0.6418 - val_recall: 0.6418
Epoch 2/50
26/26 ━━━━━━━━━━━━━━━━━━━━ 4s 165ms/step - accuracy: 0.6563 - auc: 0.7304 - loss: 0.7002 - precision: 0.6563 - recall: 0.6563 - val_accuracy: 0.3930 - val_auc: 0.3652 - val_loss: 0.6942 - val_precision: 0.3930 - val_recall: 0.3930
Epoch 3/50
26/26 ━━━━━━━━━━━━━━━━━━━━ 4s 160ms/step - accuracy: 0.6276 - auc: 0.6520 - loss: 0.6977 - precision: 0.6276 - recall: 0.6276 - val_accuracy: 0.4826 - val_auc: 0.5107 - val_loss: 0.6927 - val_precision: 0.4826 - val_recall: 0.4826
Epoch 4/50
26/26 ━━━━━━━━━━━━━━━━━━━━ 5s 179ms/step - accuracy: 0.6476 - auc: 0.6829 - loss: 0.6843 - precision: 0.6476 - recall: 0.6476 - val_accuracy: 0.6915 - val_auc: 0.7790 - val_loss: 0.6105 - val_precision: 0.6915 - val_recall: 0.6915
Epoch 5/50
26/26 ━━━━━━━━━━━━━━━━━━━━ 4s 168ms/step - accuracy: 0.6015 - auc: 0.6733 - loss: 0.6728 - precision: 0.6015 - recall: 0.6015 - val_accuracy: 0.5572 - val_auc: 0.6034 - val_loss: 0.6770 - val_precision: 0.5572 - val_recall: 0.5572
Epoch 6/50
26/26 ━━━━━━━━━━━━━━━━━━━━ 4s 170ms/step - accuracy: 0.6438 - auc: 0.7187 - loss: 0.6568 - precision: 0.6438 - recall: 0.6438 - val_accuracy: 0.6119 - val_auc: 0.6575 - val_loss: 0.6578 - val_precision: 0.6119 - val_recall: 0.6119
Epoch 7/50
26/26 ━━━━━━━━━━━━━━━━━━━━ 5s 176ms/step - accuracy: 0.6874 - auc: 0.7486 - loss: 0.6285 - precision: 0.6874 - recall: 0.6874 - val_accuracy: 0.7015 - val_auc: 0.7933 - val_loss: 0.5592 - val_precision: 0.7015 - val_recall: 0.7015
Epoch 8/50
0.599217/26 ━━━━━━━━━━━━━━━━━━━━ 1s 157ms/step - accuracy: 0.5994 - auc: 0.6978 - loss: 0.6207 - precision: 0.5994 - recall: 0.599418/26 ━━━━━━━━━━━━━━━━━━━━ 1s 156ms/step - accuracy: 0.5996 - auc: 0.6983 - loss: 0.6208 - precision: 0.5996 - recall: 0.599619/26 ━━━━━━━━━━━━━━━━━━━━ 1s 156ms/step - accuracy: 0.6003 - auc: 0.6993 - loss: 0.6206 - precision: 0.6003 - recall: 0.600320/26 ━━━━━━━━━━━━━━━━━━━━ 0s 156ms/step - accuracy: 0.6010 - auc: 0.7002 - loss: 0.6203 - precision: 0.6010 - recall: 0.601021/26 ━━━━━━━━━━━━━━━━━━━━ 0s 156ms/step - accuracy: 0.6017 - auc: 0.7010 - loss: 0.6198 - precision: 0.6017 - recall: 0.601722/26 ━━━━━━━━━━━━━━━━━━━━ 0s 156ms/step - accuracy: 0.6021 - auc: 0.7014 - loss: 0.6194 - precision: 0.6021 - recall: 0.602123/26 ━━━━━━━━━━━━━━━━━━━━ 0s 156ms/step - accuracy: 0.6026 - auc: 0.7020 - loss: 0.6189 - precision: 0.6026 - recall: 0.602624/26 ━━━━━━━━━━━━━━━━━━━━ 0s 156ms/step - accuracy: 0.6031 - auc: 0.7024 - loss: 0.6186 - precision: 0.6031 - recall: 0.603125/26 ━━━━━━━━━━━━━━━━━━━━ 0s 156ms/step - accuracy: 0.6038 - auc: 0.7032 - loss: 0.6180 - precision: 0.6038 - recall: 0.603826/26 ━━━━━━━━━━━━━━━━━━━━ 4s 166ms/step - accuracy: 0.6239 - auc: 0.7230 - loss: 0.6031 - precision: 0.6239 - recall: 0.6239 - val_accuracy: 0.7164 - val_auc: 0.7842 - val_loss: 0.5859 - val_precision: 0.7164 - val_recall: 0.7164
Epoch 9/50
1/26 ━━━━━━━━━━━━━━━━━━━━ 4s 197ms/step - accuracy: 0.7500 - auc: 0.8008 - loss: 0.4348 - precision: 0.7500 - recall: 0.7500 2/26 ━━━━━━━━━━━━━━━━━━━━ 3s 157ms/step - accuracy: 0.7656 - auc: 0.8370 - loss: 0.4588 - precision: 0.7656 - recall: 0.7656 3/26 ━━━━━━━━━━━━━━━━━━━━ 3s 157ms/step - accuracy: 0.7708 - auc: 0.8445 - loss: 0.5282 - precision: 0.7708 - recall: 0.7708 4/26 ━━━━━━━━━━━━━━━━━━━━ 3s 157ms/step - accuracy: 0.7734 - auc: 0.8471 - loss: 0.5576 - precision: 0.7734 - recall: 0.7734 5/26 ━━━━━━━━━━━━━━━━━━━━ 3s 159ms/step - accuracy: 0.7650 - auc: 0.8411 - loss: 0.5825 - precision: 0.7650 - recall: 0.7650 6/26 ━━━━━━━━━━━━━━━━━━━━ 3s 160ms/step - accuracy: 0.7495 - auc: 0.8281 - loss: 0.6015 - precision: 0.7495 - recall: 0.7495 7/26 ━━━━━━━━━━━━━━━━━━━━ 3s 159ms/step - accuracy: 0.7342 - auc: 0.8144 - loss: 0.6123 - precision: 0.7342 - recall: 0.7342 8/26 ━━━━━━━━━━━━━━━━━━━━ 2s 159ms/step - accuracy: 0.7186 - auc: 0.7998 - loss: 0.6195 - precision: 0.7186 - recall: 0.7186 9/26 ━━━━━━━━━━━━━━━━━━━━ 2s 159ms/step - accuracy: 0.7063 - auc: 0.7862 - loss: 0.6243 - precision: 0.7063 - recall: 0.706310/26 ━━━━━━━━━━━━━━━━━━━━ 2s 159ms/step - accuracy: 0.6969 - auc: 0.7747 - loss: 0.6282 - precision: 0.6969 - recall: 0.696911/26 ━━━━━━━━━━━━━━━━━━━━ 2s 159ms/step - accuracy: 0.6914 - auc: 0.7655 - loss: 0.6311 - precision: 0.6914 - recall: 0.691412/26 ━━━━━━━━━━━━━━━━━━━━ 2s 158ms/step - accuracy: 0.6881 - auc: 0.7587 - loss: 0.6326 - precision: 0.6881 - recall: 0.688113/26 ━━━━━━━━━━━━━━━━━━━━ 2s 158ms/step - accuracy: 0.6852 - auc: 0.7530 - loss: 0.6355 - precision: 0.6852 - recall: 0.685214/26 ━━━━━━━━━━━━━━━━━━━━ 1s 158ms/step - accuracy: 0.6835 - auc: 0.7483 - loss: 0.6377 - precision: 0.6835 - recall: 0.683515/26 ━━━━━━━━━━━━━━━━━━━━ 1s 158ms/step - accuracy: 0.6818 - auc: 0.7438 - loss: 0.6402 - precision: 0.6818 - recall: 0.681816/26 ━━━━━━━━━━━━━━━━━━━━ 1s 157ms/step - accuracy: 0.6800 - auc: 0.7396 - loss: 0.6421 - precision: 0.6800 - recall: 
0.680017/26 ━━━━━━━━━━━━━━━━━━━━ 1s 157ms/step - accuracy: 0.6772 - auc: 0.7349 - loss: 0.6436 - precision: 0.6772 - recall: 0.677218/26 ━━━━━━━━━━━━━━━━━━━━ 1s 157ms/step - accuracy: 0.6739 - auc: 0.7302 - loss: 0.6454 - precision: 0.6739 - recall: 0.673919/26 ━━━━━━━━━━━━━━━━━━━━ 1s 157ms/step - accuracy: 0.6705 - auc: 0.7258 - loss: 0.6469 - precision: 0.6705 - recall: 0.670520/26 ━━━━━━━━━━━━━━━━━━━━ 0s 156ms/step - accuracy: 0.6670 - auc: 0.7211 - loss: 0.6482 - precision: 0.6670 - recall: 0.667021/26 ━━━━━━━━━━━━━━━━━━━━ 0s 156ms/step - accuracy: 0.6634 - auc: 0.7164 - loss: 0.6494 - precision: 0.6634 - recall: 0.663422/26 ━━━━━━━━━━━━━━━━━━━━ 0s 156ms/step - accuracy: 0.6601 - auc: 0.7120 - loss: 0.6504 - precision: 0.6601 - recall: 0.660123/26 ━━━━━━━━━━━━━━━━━━━━ 0s 156ms/step - accuracy: 0.6573 - auc: 0.7082 - loss: 0.6513 - precision: 0.6573 - recall: 0.657324/26 ━━━━━━━━━━━━━━━━━━━━ 0s 155ms/step - accuracy: 0.6545 - auc: 0.7046 - loss: 0.6523 - precision: 0.6545 - recall: 0.654525/26 ━━━━━━━━━━━━━━━━━━━━ 0s 155ms/step - accuracy: 0.6518 - auc: 0.7012 - loss: 0.6532 - precision: 0.6518 - recall: 0.651826/26 ━━━━━━━━━━━━━━━━━━━━ 4s 166ms/step - accuracy: 0.5890 - auc: 0.6211 - loss: 0.6733 - precision: 0.5890 - recall: 0.5890 - val_accuracy: 0.5970 - val_auc: 0.5979 - val_loss: 0.6870 - val_precision: 0.5970 - val_recall: 0.5970
Epoch 10/50
1/26 ━━━━━━━━━━━━━━━━━━━━ 4s 200ms/step - accuracy: 0.7188 - auc: 0.7422 - loss: 0.5418 - precision: 0.7188 - recall: 0.7188 2/26 ━━━━━━━━━━━━━━━━━━━━ 3s 153ms/step - accuracy: 0.7578 - auc: 0.7820 - loss: 0.5415 - precision: 0.7578 - recall: 0.7578 3/26 ━━━━━━━━━━━━━━━━━━━━ 3s 153ms/step - accuracy: 0.7622 - auc: 0.7884 - loss: 0.5719 - precision: 0.7622 - recall: 0.7622 4/26 ━━━━━━━━━━━━━━━━━━━━ 3s 154ms/step - accuracy: 0.7630 - auc: 0.7979 - loss: 0.5792 - precision: 0.7630 - recall: 0.7630 5/26 ━━━━━━━━━━━━━━━━━━━━ 3s 154ms/step - accuracy: 0.7579 - auc: 0.7957 - loss: 0.5879 - precision: 0.7579 - recall: 0.7579 6/26 ━━━━━━━━━━━━━━━━━━━━ 3s 154ms/step - accuracy: 0.7418 - auc: 0.7810 - loss: 0.5970 - precision: 0.7418 - recall: 0.7418 7/26 ━━━━━━━━━━━━━━━━━━━━ 2s 153ms/step - accuracy: 0.7264 - auc: 0.7668 - loss: 0.6014 - precision: 0.7264 - recall: 0.7264 8/26 ━━━━━━━━━━━━━━━━━━━━ 2s 153ms/step - accuracy: 0.7147 - auc: 0.7541 - loss: 0.6040 - precision: 0.7147 - recall: 0.7147 9/26 ━━━━━━━━━━━━━━━━━━━━ 2s 152ms/step - accuracy: 0.7086 - auc: 0.7484 - loss: 0.6047 - precision: 0.7086 - recall: 0.708610/26 ━━━━━━━━━━━━━━━━━━━━ 2s 153ms/step - accuracy: 0.7046 - auc: 0.7459 - loss: 0.6059 - precision: 0.7046 - recall: 0.704611/26 ━━━━━━━━━━━━━━━━━━━━ 2s 153ms/step - accuracy: 0.7026 - auc: 0.7454 - loss: 0.6074 - precision: 0.7026 - recall: 0.702612/26 ━━━━━━━━━━━━━━━━━━━━ 2s 152ms/step - accuracy: 0.7022 - auc: 0.7466 - loss: 0.6078 - precision: 0.7022 - recall: 0.702213/26 ━━━━━━━━━━━━━━━━━━━━ 1s 152ms/step - accuracy: 0.7023 - auc: 0.7482 - loss: 0.6090 - precision: 0.7023 - recall: 0.702314/26 ━━━━━━━━━━━━━━━━━━━━ 1s 151ms/step - accuracy: 0.7027 - auc: 0.7499 - loss: 0.6093 - precision: 0.7027 - recall: 0.702715/26 ━━━━━━━━━━━━━━━━━━━━ 1s 151ms/step - accuracy: 0.7028 - auc: 0.7508 - loss: 0.6102 - precision: 0.7028 - recall: 0.702816/26 ━━━━━━━━━━━━━━━━━━━━ 1s 151ms/step - accuracy: 0.7031 - auc: 0.7516 - loss: 0.6107 - precision: 0.7031 - recall: 
0.703117/26 ━━━━━━━━━━━━━━━━━━━━ 1s 151ms/step - accuracy: 0.7031 - auc: 0.7520 - loss: 0.6108 - precision: 0.7031 - recall: 0.703118/26 ━━━━━━━━━━━━━━━━━━━━ 1s 151ms/step - accuracy: 0.7025 - auc: 0.7517 - loss: 0.6114 - precision: 0.7025 - recall: 0.702519/26 ━━━━━━━━━━━━━━━━━━━━ 1s 151ms/step - accuracy: 0.7018 - auc: 0.7514 - loss: 0.6119 - precision: 0.7018 - recall: 0.701820/26 ━━━━━━━━━━━━━━━━━━━━ 0s 150ms/step - accuracy: 0.7011 - auc: 0.7509 - loss: 0.6121 - precision: 0.7011 - recall: 0.701121/26 ━━━━━━━━━━━━━━━━━━━━ 0s 150ms/step - accuracy: 0.7003 - auc: 0.7503 - loss: 0.6121 - precision: 0.7003 - recall: 0.700322/26 ━━━━━━━━━━━━━━━━━━━━ 0s 150ms/step - accuracy: 0.6992 - auc: 0.7494 - loss: 0.6122 - precision: 0.6992 - recall: 0.699223/26 ━━━━━━━━━━━━━━━━━━━━ 0s 150ms/step - accuracy: 0.6981 - auc: 0.7488 - loss: 0.6122 - precision: 0.6981 - recall: 0.698124/26 ━━━━━━━━━━━━━━━━━━━━ 0s 150ms/step - accuracy: 0.6972 - auc: 0.7483 - loss: 0.6122 - precision: 0.6972 - recall: 0.697225/26 ━━━━━━━━━━━━━━━━━━━━ 0s 150ms/step - accuracy: 0.6963 - auc: 0.7479 - loss: 0.6122 - precision: 0.6963 - recall: 0.696326/26 ━━━━━━━━━━━━━━━━━━━━ 4s 158ms/step - accuracy: 0.6750 - auc: 0.7400 - loss: 0.6095 - precision: 0.6750 - recall: 0.6750 - val_accuracy: 0.6318 - val_auc: 0.6930 - val_loss: 0.6496 - val_precision: 0.6318 - val_recall: 0.6318
Epoch 11/50
1/26 ━━━━━━━━━━━━━━━━━━━━ 4s 195ms/step - accuracy: 0.7188 - auc: 0.7881 - loss: 0.4664 - precision: 0.7188 - recall: 0.7188 2/26 ━━━━━━━━━━━━━━━━━━━━ 3s 152ms/step - accuracy: 0.7344 - auc: 0.8041 - loss: 0.4796 - precision: 0.7344 - recall: 0.7344 3/26 ━━━━━━━━━━━━━━━━━━━━ 3s 154ms/step - accuracy: 0.7431 - auc: 0.8198 - loss: 0.4967 - precision: 0.7431 - recall: 0.7431 4/26 ━━━━━━━━━━━━━━━━━━━━ 3s 152ms/step - accuracy: 0.7526 - auc: 0.8340 - loss: 0.4950 - precision: 0.7526 - recall: 0.7526 5/26 ━━━━━━━━━━━━━━━━━━━━ 3s 151ms/step - accuracy: 0.7558 - auc: 0.8386 - loss: 0.5040 - precision: 0.7558 - recall: 0.7558 6/26 ━━━━━━━━━━━━━━━━━━━━ 3s 150ms/step - accuracy: 0.7531 - auc: 0.8354 - loss: 0.5117 - precision: 0.7531 - recall: 0.7531 7/26 ━━━━━━━━━━━━━━━━━━━━ 2s 150ms/step - accuracy: 0.7501 - auc: 0.8328 - loss: 0.5142 - precision: 0.7501 - recall: 0.7501 8/26 ━━━━━━━━━━━━━━━━━━━━ 2s 149ms/step - accuracy: 0.7462 - auc: 0.8297 - loss: 0.5151 - precision: 0.7462 - recall: 0.7462 9/26 ━━━━━━━━━━━━━━━━━━━━ 2s 149ms/step - accuracy: 0.7435 - auc: 0.8277 - loss: 0.5157 - precision: 0.7435 - recall: 0.743510/26 ━━━━━━━━━━━━━━━━━━━━ 2s 150ms/step - accuracy: 0.7420 - auc: 0.8264 - loss: 0.5172 - precision: 0.7420 - recall: 0.742011/26 ━━━━━━━━━━━━━━━━━━━━ 2s 149ms/step - accuracy: 0.7409 - auc: 0.8254 - loss: 0.5190 - precision: 0.7409 - recall: 0.740912/26 ━━━━━━━━━━━━━━━━━━━━ 2s 149ms/step - accuracy: 0.7406 - auc: 0.8249 - loss: 0.5206 - precision: 0.7406 - recall: 0.740613/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7406 - auc: 0.8243 - loss: 0.5230 - precision: 0.7406 - recall: 0.740614/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7406 - auc: 0.8240 - loss: 0.5243 - precision: 0.7406 - recall: 0.740615/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7404 - auc: 0.8235 - loss: 0.5259 - precision: 0.7404 - recall: 0.740416/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7405 - auc: 0.8231 - loss: 0.5270 - precision: 0.7405 - recall: 
0.740517/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7406 - auc: 0.8229 - loss: 0.5277 - precision: 0.7406 - recall: 0.740618/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7403 - auc: 0.8223 - loss: 0.5290 - precision: 0.7403 - recall: 0.740319/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7400 - auc: 0.8218 - loss: 0.5299 - precision: 0.7400 - recall: 0.740020/26 ━━━━━━━━━━━━━━━━━━━━ 0s 149ms/step - accuracy: 0.7399 - auc: 0.8215 - loss: 0.5306 - precision: 0.7399 - recall: 0.739921/26 ━━━━━━━━━━━━━━━━━━━━ 0s 149ms/step - accuracy: 0.7397 - auc: 0.8210 - loss: 0.5313 - precision: 0.7397 - recall: 0.739722/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7394 - auc: 0.8205 - loss: 0.5320 - precision: 0.7394 - recall: 0.739423/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7394 - auc: 0.8202 - loss: 0.5324 - precision: 0.7394 - recall: 0.739424/26 ━━━━━━━━━━━━━━━━━━━━ 0s 149ms/step - accuracy: 0.7393 - auc: 0.8200 - loss: 0.5330 - precision: 0.7393 - recall: 0.739325/26 ━━━━━━━━━━━━━━━━━━━━ 0s 149ms/step - accuracy: 0.7392 - auc: 0.8199 - loss: 0.5334 - precision: 0.7392 - recall: 0.739226/26 ━━━━━━━━━━━━━━━━━━━━ 4s 157ms/step - accuracy: 0.7385 - auc: 0.8173 - loss: 0.5440 - precision: 0.7385 - recall: 0.7385 - val_accuracy: 0.5871 - val_auc: 0.6180 - val_loss: 0.7235 - val_precision: 0.5871 - val_recall: 0.5871
Epoch 12/50
1/26 ━━━━━━━━━━━━━━━━━━━━ 4s 194ms/step - accuracy: 0.6562 - auc: 0.7812 - loss: 0.3977 - precision: 0.6562 - recall: 0.6562 2/26 ━━━━━━━━━━━━━━━━━━━━ 3s 156ms/step - accuracy: 0.6719 - auc: 0.7759 - loss: 0.4302 - precision: 0.6719 - recall: 0.6719 3/26 ━━━━━━━━━━━━━━━━━━━━ 3s 153ms/step - accuracy: 0.6875 - auc: 0.7818 - loss: 0.4520 - precision: 0.6875 - recall: 0.6875 4/26 ━━━━━━━━━━━━━━━━━━━━ 3s 154ms/step - accuracy: 0.7012 - auc: 0.7921 - loss: 0.4555 - precision: 0.7012 - recall: 0.7012 5/26 ━━━━━━━━━━━━━━━━━━━━ 3s 152ms/step - accuracy: 0.7134 - auc: 0.8005 - loss: 0.4618 - precision: 0.7134 - recall: 0.7134 6/26 ━━━━━━━━━━━━━━━━━━━━ 3s 151ms/step - accuracy: 0.7204 - auc: 0.8055 - loss: 0.4679 - precision: 0.7204 - recall: 0.7204 7/26 ━━━━━━━━━━━━━━━━━━━━ 2s 151ms/step - accuracy: 0.7285 - auc: 0.8126 - loss: 0.4692 - precision: 0.7285 - recall: 0.7285 8/26 ━━━━━━━━━━━━━━━━━━━━ 2s 151ms/step - accuracy: 0.7355 - auc: 0.8188 - loss: 0.4684 - precision: 0.7355 - recall: 0.7355 9/26 ━━━━━━━━━━━━━━━━━━━━ 2s 150ms/step - accuracy: 0.7418 - auc: 0.8244 - loss: 0.4672 - precision: 0.7418 - recall: 0.741810/26 ━━━━━━━━━━━━━━━━━━━━ 2s 150ms/step - accuracy: 0.7482 - auc: 0.8302 - loss: 0.4647 - precision: 0.7482 - recall: 0.748211/26 ━━━━━━━━━━━━━━━━━━━━ 2s 149ms/step - accuracy: 0.7530 - auc: 0.8342 - loss: 0.4628 - precision: 0.7530 - recall: 0.753012/26 ━━━━━━━━━━━━━━━━━━━━ 2s 149ms/step - accuracy: 0.7565 - auc: 0.8370 - loss: 0.4620 - precision: 0.7565 - recall: 0.756513/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7595 - auc: 0.8393 - loss: 0.4629 - precision: 0.7595 - recall: 0.759514/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7620 - auc: 0.8412 - loss: 0.4631 - precision: 0.7620 - recall: 0.762015/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7643 - auc: 0.8429 - loss: 0.4630 - precision: 0.7643 - recall: 0.764316/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7656 - auc: 0.8442 - loss: 0.4625 - precision: 0.7656 - recall: 
0.765617/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7666 - auc: 0.8452 - loss: 0.4621 - precision: 0.7666 - recall: 0.766618/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7672 - auc: 0.8456 - loss: 0.4624 - precision: 0.7672 - recall: 0.767219/26 ━━━━━━━━━━━━━━━━━━━━ 1s 149ms/step - accuracy: 0.7680 - auc: 0.8461 - loss: 0.4625 - precision: 0.7680 - recall: 0.768020/26 ━━━━━━━━━━━━━━━━━━━━ 0s 149ms/step - accuracy: 0.7687 - auc: 0.8466 - loss: 0.4622 - precision: 0.7687 - recall: 0.768721/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7692 - auc: 0.8470 - loss: 0.4622 - precision: 0.7692 - recall: 0.769222/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7697 - auc: 0.8474 - loss: 0.4624 - precision: 0.7697 - recall: 0.769723/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7703 - auc: 0.8481 - loss: 0.4625 - precision: 0.7703 - recall: 0.770324/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7708 - auc: 0.8486 - loss: 0.4629 - precision: 0.7708 - recall: 0.770825/26 ━━━━━━━━━━━━━━━━━━━━ 0s 149ms/step - accuracy: 0.7715 - auc: 0.8492 - loss: 0.4632 - precision: 0.7715 - recall: 0.771526/26 ━━━━━━━━━━━━━━━━━━━━ 4s 157ms/step - accuracy: 0.7870 - auc: 0.8640 - loss: 0.4679 - precision: 0.7870 - recall: 0.7870 - val_accuracy: 0.4975 - val_auc: 0.5409 - val_loss: 0.8338 - val_precision: 0.4975 - val_recall: 0.4975
Epoch 13/50
1/26 ━━━━━━━━━━━━━━━━━━━━ 5s 203ms/step - accuracy: 0.5938 - auc: 0.7354 - loss: 0.4243 - precision: 0.5938 - recall: 0.5938 2/26 ━━━━━━━━━━━━━━━━━━━━ 3s 154ms/step - accuracy: 0.6172 - auc: 0.7372 - loss: 0.4439 - precision: 0.6172 - recall: 0.6172 3/26 ━━━━━━━━━━━━━━━━━━━━ 3s 149ms/step - accuracy: 0.6302 - auc: 0.7431 - loss: 0.4632 - precision: 0.6302 - recall: 0.6302 4/26 ━━━━━━━━━━━━━━━━━━━━ 3s 149ms/step - accuracy: 0.6484 - auc: 0.7590 - loss: 0.4608 - precision: 0.6484 - recall: 0.6484 5/26 ━━━━━━━━━━━━━━━━━━━━ 3s 147ms/step - accuracy: 0.6638 - auc: 0.7739 - loss: 0.4585 - precision: 0.6638 - recall: 0.6638 6/26 ━━━━━━━━━━━━━━━━━━━━ 2s 147ms/step - accuracy: 0.6729 - auc: 0.7828 - loss: 0.4592 - precision: 0.6729 - recall: 0.6729 7/26 ━━━━━━━━━━━━━━━━━━━━ 2s 147ms/step - accuracy: 0.6839 - auc: 0.7931 - loss: 0.4560 - precision: 0.6839 - recall: 0.6839 8/26 ━━━━━━━━━━━━━━━━━━━━ 2s 148ms/step - accuracy: 0.6951 - auc: 0.8031 - loss: 0.4516 - precision: 0.6951 - recall: 0.6951 9/26 ━━━━━━━━━━━━━━━━━━━━ 2s 148ms/step - accuracy: 0.7047 - auc: 0.8119 - loss: 0.4479 - precision: 0.7047 - recall: 0.704710/26 ━━━━━━━━━━━━━━━━━━━━ 2s 148ms/step - accuracy: 0.7136 - auc: 0.8199 - loss: 0.4446 - precision: 0.7136 - recall: 0.713611/26 ━━━━━━━━━━━━━━━━━━━━ 2s 149ms/step - accuracy: 0.7216 - auc: 0.8271 - loss: 0.4408 - precision: 0.7216 - recall: 0.721612/26 ━━━━━━━━━━━━━━━━━━━━ 2s 148ms/step - accuracy: 0.7291 - auc: 0.8337 - loss: 0.4369 - precision: 0.7291 - recall: 0.729113/26 ━━━━━━━━━━━━━━━━━━━━ 1s 148ms/step - accuracy: 0.7359 - auc: 0.8395 - loss: 0.4337 - precision: 0.7359 - recall: 0.735914/26 ━━━━━━━━━━━━━━━━━━━━ 1s 148ms/step - accuracy: 0.7417 - auc: 0.8446 - loss: 0.4302 - precision: 0.7417 - recall: 0.741715/26 ━━━━━━━━━━━━━━━━━━━━ 1s 148ms/step - accuracy: 0.7468 - auc: 0.8490 - loss: 0.4274 - precision: 0.7468 - recall: 0.746816/26 ━━━━━━━━━━━━━━━━━━━━ 1s 148ms/step - accuracy: 0.7512 - auc: 0.8528 - loss: 0.4246 - precision: 0.7512 - recall: 
0.751217/26 ━━━━━━━━━━━━━━━━━━━━ 1s 148ms/step - accuracy: 0.7547 - auc: 0.8559 - loss: 0.4223 - precision: 0.7547 - recall: 0.754718/26 ━━━━━━━━━━━━━━━━━━━━ 1s 148ms/step - accuracy: 0.7572 - auc: 0.8581 - loss: 0.4213 - precision: 0.7572 - recall: 0.757219/26 ━━━━━━━━━━━━━━━━━━━━ 1s 148ms/step - accuracy: 0.7592 - auc: 0.8598 - loss: 0.4205 - precision: 0.7592 - recall: 0.759220/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7606 - auc: 0.8611 - loss: 0.4200 - precision: 0.7606 - recall: 0.760621/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7616 - auc: 0.8620 - loss: 0.4197 - precision: 0.7616 - recall: 0.761622/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7625 - auc: 0.8629 - loss: 0.4193 - precision: 0.7625 - recall: 0.762523/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7635 - auc: 0.8639 - loss: 0.4188 - precision: 0.7635 - recall: 0.763524/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7644 - auc: 0.8647 - loss: 0.4186 - precision: 0.7644 - recall: 0.764425/26 ━━━━━━━━━━━━━━━━━━━━ 0s 148ms/step - accuracy: 0.7654 - auc: 0.8657 - loss: 0.4183 - precision: 0.7654 - recall: 0.765426/26 ━━━━━━━━━━━━━━━━━━━━ 4s 157ms/step - accuracy: 0.7895 - auc: 0.8889 - loss: 0.4102 - precision: 0.7895 - recall: 0.7895 - val_accuracy: 0.6418 - val_auc: 0.6865 - val_loss: 0.7109 - val_precision: 0.6418 - val_recall: 0.6418
Saving model to mlb_hof_model.keras...
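Training stopped at epoch 13 even though 50 were scheduled, which is consistent with an early-stopping callback watching validation loss. As a minimal sketch of how that works — the tiny model and random data below are placeholders, not the post's actual network — `keras.callbacks.EarlyStopping` can halt training and restore the best weights:

```python
import numpy as np
from tensorflow import keras

# Placeholder data and model -- the real run trained on the face images.
rng = np.random.default_rng(1992)
x = rng.normal(size=(64, 8)).astype("float32")
y = rng.integers(0, 2, size=(64,)).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(4, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Stop once val_loss hasn't improved for 5 epochs; keep the best weights seen.
early_stop = keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True)

history = model.fit(x, y, epochs=50, validation_split=0.25,
                    verbose=0, callbacks=[early_stop])
epochs_run = len(history.history["loss"])
print(f"Epochs actually run: {epochs_run}")
```

With `restore_best_weights=True`, the model that gets saved afterward reflects the best validation epoch rather than the last one.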
Model Evaluation
Let’s see how our model performs on the test set:
# Evaluate on test data
metric_map = model.evaluate(test_x, test_labels, verbose=0, return_dict=True)
print(f"\nTest Loss: {metric_map.get('loss', np.nan):.4f}")
print(f"Test Accuracy: {metric_map.get('accuracy', np.nan):.4f}")

# Make predictions
pred_test = model.predict(test_x, verbose=0)
pred_hof_prob = pred_test[:, 1]

# Tune threshold for HOF class (label=1) using test-set F1 for quick diagnostics
from sklearn.metrics import (
    confusion_matrix,
    classification_report,
    balanced_accuracy_score,
    f1_score,
    precision_score,
    recall_score,
    roc_auc_score,
)

thresholds = np.arange(0.20, 0.61, 0.05)
threshold_rows = []
for t in thresholds:
    y_hat = (pred_hof_prob >= t).astype(int)
    threshold_rows.append({
        "threshold": round(float(t), 2),
        "hof_f1": f1_score(test_y, y_hat, pos_label=1, zero_division=0),
        "balanced_acc": balanced_accuracy_score(test_y, y_hat),
        "hof_pred_count": int(y_hat.sum()),
    })

threshold_df = pd.DataFrame(threshold_rows).sort_values(
    by=["hof_f1", "balanced_acc"], ascending=False
)
best_threshold = float(threshold_df.iloc[0]["threshold"])
pred_classes = (pred_hof_prob >= best_threshold).astype(int)

print("\nThreshold sweep (sorted by HOF F1):")
print(threshold_df.to_string(index=False))
print(f"\nUsing best threshold: {best_threshold:.2f}")

# sklearn metrics at selected threshold (more stable than keras outputs here)
test_auc = roc_auc_score(test_y, pred_hof_prob)
test_precision = precision_score(test_y, pred_classes, pos_label=1, zero_division=0)
test_recall = recall_score(test_y, pred_classes, pos_label=1, zero_division=0)
test_f1 = f1_score(test_y, pred_classes, pos_label=1, zero_division=0)
print(f"AUC: {test_auc:.4f}")
print(f"Precision (HOF): {test_precision:.4f}")
print(f"Recall (HOF): {test_recall:.4f}")
print(f"F1 (HOF): {test_f1:.4f}")

# Confusion matrix
confusion = confusion_matrix(test_y, pred_classes)
print("\nConfusion Matrix:")
print(confusion)
print("\nClassification Report:")
print(classification_report(test_y, pred_classes, target_names=['Not HOF', 'HOF']))
print(f"Balanced Accuracy: {balanced_accuracy_score(test_y, pred_classes):.4f}")
Test Loss: 0.5877
Test Accuracy: 0.6955
Threshold sweep (sorted by HOF F1):
threshold hof_f1 balanced_acc hof_pred_count
0.25 0.342342 0.568548 156
0.40 0.342246 0.576997 121
0.20 0.338710 0.557959 182
0.30 0.331754 0.560691 145
0.35 0.331658 0.564126 133
0.60 0.322581 0.580883 58
0.50 0.320000 0.570294 84
0.55 0.318182 0.575448 66
0.45 0.311377 0.557564 101
Using best threshold: 0.25
AUC: 0.5955
Precision (HOF): 0.2436
Recall (HOF): 0.5758
F1 (HOF): 0.3423
Confusion Matrix:
[[151 118]
[ 28 38]]
Classification Report:
              precision    recall  f1-score   support

     Not HOF       0.84      0.56      0.67       269
         HOF       0.24      0.58      0.34        66

    accuracy                           0.56       335
   macro avg       0.54      0.57      0.51       335
weighted avg       0.73      0.56      0.61       335
Balanced Accuracy: 0.5685
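One caveat worth spelling out: the test set is imbalanced (269 Not HOF vs. 66 HOF), so a degenerate model that always predicts "Not HOF" would post roughly 80% accuracy while finding zero Hall of Famers. That baseline is why the balanced accuracy and HOF-class F1 above are the numbers to watch. A quick sanity check, using the class counts from the confusion matrix:

```python
import numpy as np
from sklearn.metrics import accuracy_score, balanced_accuracy_score

# Label vector with the same class counts as the test set (269 / 66).
y_true = np.array([0] * 269 + [1] * 66)
y_majority = np.zeros_like(y_true)  # always predict "Not HOF"

print(f"Accuracy: {accuracy_score(y_true, y_majority):.3f}")                     # 0.803
print(f"Balanced accuracy: {balanced_accuracy_score(y_true, y_majority):.3f}")   # 0.500
```

The majority-class guesser beats our model on raw accuracy (0.803 vs. 0.56) but sits at exactly chance on balanced accuracy, which our 0.5685 only barely clears.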
Fun Predictions
Now let’s test the model on some famous players to see how it performs in the wild. My brother is a big Barry Bonds fan, so I have to include him. Hank Aaron has to make the list, right? We’ll round it out with Bobby Bonds (Barry’s dad!), Craig Biggio, Johnny Bench, and Bert Blyleven.
# List of interesting players to test
fun_images = [
    (hof_dir / "aaronha01.jpg", "HOF"),         # Hank Aaron
    (nothof_dir / "bondsba01.jpg", "Not HOF"),  # Barry Bonds (not in HOF... yet)
    (hof_dir / "benchjo01.jpg", "HOF"),         # Johnny Bench
    (hof_dir / "biggicr01.jpg", "HOF"),         # Craig Biggio
    (hof_dir / "blylebe01.jpg", "HOF"),         # Bert Blyleven
    (nothof_dir / "bondsbo01.jpg", "Not HOF"),  # Bobby Bonds
]

# Preprocess these images
fun_images_clean = []
fun_rows = []
for img_path, actual_status in fun_images:
    if img_path.exists():
        img_data = load_and_preprocess_image(img_path)
        fun_images_clean.append(img_data)
        fun_rows.append((img_path, actual_status))
    else:
        print(f"Missing example image: {img_path.name}")

# Convert to array
fun_test_x = np.array([img[..., np.newaxis] for img in fun_images_clean], dtype=np.float32)

# Make predictions
pred_fun = model.predict(fun_test_x, verbose=0) if len(fun_images_clean) > 0 else np.array([])

# Create results dataframe
if len(fun_rows) > 0:
    fun_threshold = best_threshold if 'best_threshold' in globals() else 0.5
    results = pd.DataFrame({
        'PlayerID': [img.stem for img, _ in fun_rows],
        'Actual_Status': [status for _, status in fun_rows],
        'Prob_Not_HOF': pred_fun[:, 0].round(3),
        'Prob_HOF': pred_fun[:, 1].round(3),
        'Prediction': ['Hall of Fame' if p[1] >= fun_threshold else 'Not HOF' for p in pred_fun],
    })
    print("\nPredictions for Famous Players:")
    print(results.to_string(index=False))
else:
    print("No fun example images were found in the local dataset cache.")
Predictions for Famous Players:
PlayerID Actual_Status Prob_Not_HOF Prob_HOF Prediction
aaronha01 HOF 0.823 0.177 Not HOF
bondsba01 Not HOF 0.994 0.006 Not HOF
benchjo01 HOF 0.712 0.288 Hall of Fame
biggicr01 HOF 0.639 0.361 Hall of Fame
blylebe01 HOF 0.935 0.065 Not HOF
bondsbo01 Not HOF 0.540 0.460 Hall of Fame
Discussion & Limitations
Let’s be completely clear: this model’s premise is ridiculous . That’s the fun of it. I’m sure many of you remember hearing a parent or coach say something like “That kid looks like a great player.” You may even be saying it now to your own kids!
Why This Doesn’t Actually Work
Era Effects : Photography styles, image quality, and even facial-hair trends have changed over time
Correlation ≠ Causation : Even if the model finds patterns, they’re not causal
Sample Size : We’re working with a relatively small dataset
The Obvious : A player’s face has nothing to do with their baseball ability!
And many, many more!
But It’s Still Fun!
Despite all these limitations, this project demonstrates:
How to build a simple image classification model
Why class imbalance can make raw accuracy look better than the model really is
The power (and danger) of neural networks finding spurious patterns
How to work with image data in Python
That sometimes, data science is just for fun
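On the spurious-patterns point, here is a quick demonstration with synthetic data (not the face dataset): a modestly sized network can fit labels that are pure noise, so strong training accuracy on its own never proves the features mean anything.

```python
import numpy as np
from tensorflow import keras

# Random features and completely random labels -- zero real signal.
rng = np.random.default_rng(1992)
x = rng.normal(size=(200, 32)).astype("float32")
y = rng.integers(0, 2, size=(200,)).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(32,)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=300, verbose=0)

# The network simply memorizes the noise: training accuracy far above chance,
# while held-out accuracy on fresh random labels would hover around 0.5.
train_acc = model.evaluate(x, y, verbose=0)[1]
print(f"Training accuracy on random labels: {train_acc:.2f}")
```

This is exactly the failure mode validation metrics are meant to catch, and it is why the gap between our training accuracy (~0.79) and test balanced accuracy (~0.57) should not surprise anyone.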
Conclusion
So, can you actually predict a Hall of Famer by their looks? With this simple model, not reliably.
This project was a fun exercise in taking a common phrase literally. We covered image preprocessing, building a basic neural network in TensorFlow, and checking class-aware metrics so we can see when a model is just defaulting to the majority class.
The real takeaway is that just because a model can find patterns doesn’t mean those patterns actually mean anything.
Thanks for following this two-part series, even if it stretched over two years. If you want to see more experiments, stay tuned. I’m trying my best to stay dedicated!
Images consumed from the curated Hugging Face dataset rpy-ai/mlb-hof-faces , originally derived from Lahman metadata and baseball-reference.com sources. Player images are property of their respective owners.
You can find the public source repository for this and other posts at JeffreySumner/rpy-blog .