1/29/18

Continued working on my artificial neural network project. The first lecture I followed along with explained how to implement k-fold cross-validation, which estimates the mean accuracy and variance of a classifier rather than relying on a single train/test split. I found that the mean accuracy of my classifier was actually around 79%, although the accuracy I got when I originally trained it once was around 83%.

The second lecture introduced a technique called dropout, which addresses the problem of a neural network "overfitting" to the training data set (the network becomes too "rigid," leading it to make inaccurate predictions on new data). Dropout turns off random neurons in the network on different training iterations so that the network can better "learn" individual features in the data, making sure the neurons aren't too reliant on each other to make predictions. This process will likely be useful in my poker game, as there are many different situations that occur within the game that I want my neural network to recognize.
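The k-fold idea can be sketched without any deep-learning library: shuffle the data, split it into k folds, and train a fresh model k times, each time scoring on the held-out fold. This is a minimal NumPy sketch, not the course's exact code; `train_and_score` stands in for whatever classifier you plug in and is a hypothetical callable here.

```python
import numpy as np

def k_fold_indices(n_samples, k, seed=0):
    """Shuffle sample indices and split them into k roughly equal folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n_samples), k)

def cross_validate(train_and_score, X, y, k=10):
    """Train k times, each time holding out one fold for evaluation.

    `train_and_score` is any callable (hypothetical here) that fits a
    fresh classifier on the training split and returns its accuracy on
    the held-out fold.
    """
    scores = []
    for test_idx in k_fold_indices(len(X), k):
        train_idx = np.setdiff1d(np.arange(len(X)), test_idx)
        scores.append(train_and_score(X[train_idx], y[train_idx],
                                      X[test_idx], y[test_idx]))
    scores = np.array(scores)
    # mean accuracy tells you typical performance; the spread tells
    # you how much it depends on which slice of data was held out
    return scores.mean(), scores.std()
```

The mean/spread pair is the point of the exercise: a single 83% run can be a lucky split, while the k-fold mean (79% in my case) is a steadier estimate.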
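The dropout mechanism itself is small enough to sketch directly. This is a plain-NumPy illustration of "inverted" dropout (the variant common in modern frameworks), not the course's code: during training each unit is zeroed with probability `rate` and the survivors are rescaled so the expected activation is unchanged; at prediction time the layer is a no-op.

```python
import numpy as np

def dropout(activations, rate, training=True, rng=None):
    """Inverted dropout on a layer's activations.

    During training, zero each unit with probability `rate` and scale
    the survivors by 1/(1-rate) so the expected output is unchanged.
    At prediction time, pass the activations through untouched.
    """
    if not training or rate == 0.0:
        return activations
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(activations.shape) >= rate  # True = unit survives
    return activations * mask / (1.0 - rate)
```

Because a different random mask is drawn on every iteration, no neuron can count on any particular neighbor being present, which is exactly the "don't rely on each other" effect described above.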
