*Bounty: 50*

I have data with 5 output classes. The training data contains the following number of samples for each class:

[706326, 32211, 2856, 3050, 901]

I am using the following keras (tf.keras) code:

```
from sklearn.utils import class_weight
import numpy as np
import tensorflow as tf

# compute balanced class weights and convert them to the dict
# {class_index: weight} that tf.keras expects in fit()
weights = class_weight.compute_class_weight('balanced',
                                            np.unique(y_train),
                                            y_train)
class_weights = dict(enumerate(weights))

model = tf.keras.Sequential([
    tf.keras.layers.Dense(50, input_shape=(dataX.shape[1],)),
    tf.keras.layers.Dropout(rate=0.5),
    tf.keras.layers.Dense(50, activation=tf.nn.relu),
    tf.keras.layers.Dropout(rate=0.5),
    tf.keras.layers.Dense(50, activation=tf.nn.relu),
    tf.keras.layers.Dropout(rate=0.5),
    tf.keras.layers.Dense(50, activation=tf.nn.relu),
    tf.keras.layers.Dropout(rate=0.5),
    tf.keras.layers.Dense(5, activation=tf.nn.softmax)
])

adam = tf.keras.optimizers.Adam(lr=0.5)
model.compile(optimizer=adam,
              loss='sparse_categorical_crossentropy',
              metrics=['sparse_categorical_accuracy'])
model.fit(X_train, y_train, epochs=5, batch_size=32, class_weight=class_weights)
y_pred = np.argmax(model.predict(X_test), axis=1)
```

The class_weight computation is taken from one of the answers to this question: How to set class weights for imbalanced classes in Keras?
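For reference, here is what the `'balanced'` mode works out to for my class counts, computed by hand with the formula sklearn documents (`n_samples / (n_classes * n_i)`), so the rare classes are weighted hundreds of times more heavily than the majority class:

```python
import numpy as np

# class counts from my training data
counts = np.array([706326, 32211, 2856, 3050, 901])
n_samples, n_classes = counts.sum(), len(counts)

# sklearn's 'balanced' heuristic: w_i = n_samples / (n_classes * n_i)
weights = n_samples / (n_classes * counts)
class_weights = dict(enumerate(weights))
print(class_weights)
```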

I know about this answer: Multi-class neural net always predicting 1 class after optimization. The difference is that in that case class weights weren't used, whereas I am using them.

I am using sparse_categorical_crossentropy, which accepts labels as integers (so there is no need to convert them to one-hot encoding), but I also tried categorical_crossentropy and the problem persists.

I have of course tried different learning rates, batch sizes, numbers of epochs, optimizers, and network depths/widths. But the accuracy always gets stuck at ~0.94, which is essentially what I would get by predicting the first (majority) class all the time.
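A quick sanity check confirms that ~0.94 is exactly the accuracy of a trivial classifier that always predicts the majority class:

```python
import numpy as np

counts = np.array([706326, 32211, 2856, 3050, 901])
# accuracy of a classifier that always predicts the majority class
baseline = counts.max() / counts.sum()
print(f"majority-class baseline accuracy: {baseline:.4f}")
```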

I'm not sure what is missing here. Is there an error in my code, or should I use some specialized network for imbalanced data?