Thursday, April 18, 2013

It's like we knew how the NBA would finish!

The NBA season has come to an end, and so have our model predictions.  We developed two models: one to predict the winner of each game and one to predict who would cover the spread.  Starting just two weeks into the season, our model was off to a tremendous start!

To recap, over the entire season we correctly predicted 81.04% of the winners of NBA games.  Since our website debuted in early March, we maintained a respectable 77.03% accuracy.  Against the spread, our season-long accuracy was an astonishing 72.71%.  Since early March, we saw a lower accuracy of 61.54%.  Only three days since March 4 finished with a record below .500.

Although our accuracy dipped at the end of the season, this is to be expected -- especially in the last two weeks.  For most of March, our spread accuracy hovered around the 66% mark.  As key players started to sit and injuries piled up, our model had a difficult time adjusting.  For instance, the Spurs were crushing teams throughout the season -- our model expected them to maintain that consistency until the final day.  Most people would not pick every game -- we chose not to discriminate!  Whether our confidence was 1% or 99%, we made our pick.

We hope you enjoyed following our NBA picks.  Remember, you can check out all our previous picks and results on the NBA Game Predictions page of our website.  We will soon be moving on to MLB -- expect us to roll out our model in late May (our model needs more pitching data).

Tuesday, April 2, 2013

Perduco Sports / Georgia Tech - Great Minds Think Alike!

A Perduco Sports fanatic recently sent me an article about Georgia Tech's use of a Logistic Regression/Markov Chain model to predict NCAA tournament brackets.  Their model has correctly picked the National Champion in 3 of the last 5 years, and Georgia Tech claims it has been the most accurate over the last 10 years, outperforming more than 80 other models and organizations.  This piqued my interest.
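The article describes the approach only at a high level, so here is a rough illustration -- not Georgia Tech's actual model -- of the Markov-chain half of the idea in Python: each game result nudges a random walker from the loser toward the winner, and the chain's stationary distribution serves as a team rating.  The function name, the fixed `p_winner` constant, and the toy schedule are my own assumptions; in the published model, the transition probabilities reportedly come from a logistic regression on game results rather than a constant.

```python
# Illustrative sketch only -- a minimal Markov-chain team ranking.
# Each game sends probability mass from the loser toward the winner;
# the stationary distribution of the resulting chain ranks the teams.

def markov_ratings(games, teams, p_winner=0.75, iters=1000):
    """games: list of (winner, loser) pairs; returns {team: rating}."""
    n = len(teams)
    idx = {t: i for i, t in enumerate(teams)}
    trans = [[0.0] * n for _ in range(n)]  # transition weights
    counts = [0] * n                       # games played per team
    for w, l in games:
        wi, li = idx[w], idx[l]
        # From the loser's state, move to the winner with prob p_winner.
        trans[li][wi] += p_winner
        trans[li][li] += 1 - p_winner
        # From the winner's state, stay put with prob p_winner.
        trans[wi][wi] += p_winner
        trans[wi][li] += 1 - p_winner
        counts[wi] += 1
        counts[li] += 1
    # Normalize each row into a stochastic matrix (self-loop if no games).
    for i in range(n):
        if counts[i]:
            trans[i] = [v / counts[i] for v in trans[i]]
        else:
            trans[i][i] = 1.0
    # Power iteration to approximate the stationary distribution.
    r = [1.0 / n] * n
    for _ in range(iters):
        r = [sum(r[i] * trans[i][j] for i in range(n)) for j in range(n)]
    return {t: r[idx[t]] for t in teams}
```

On a toy schedule where A beats B and C, and B beats C, the ratings come out A > B > C -- the appeal of this family of models is that the same machinery also rewards beating teams that themselves beat good teams.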

Taking a look at their bracket, I was excited but not surprised.  What excited me: their model chose Florida to win the National Championship -- same as Perduco Sports.  Of course, we now know that is impossible, since the Gators were shredded by the Wolverines.  Still, our use of neural networks and their use of regression/Markov models seem to show that, statistically, Florida should have been a serious contender for the title game!

However, what didn't surprise me was their lack of upset picks throughout the tournament.  Their only true upset picks came in the 7-10 and 6-11 matchups.  Not one 12 seed was picked over a 5 seed, which is surprising since nearly 40% of 12 seeds have won their first-round games in the last 10 years.  This lack of upset picks suggests they follow the RPI system very closely -- even though they claim to be better than that system.

Unfortunately, 2013 appears to be a bust for Georgia Tech's model.  Although the "madness" is very difficult to predict, I was pleased to see another advanced analytical model obtain results similar to ours.

Perduco Sports had 5 of the Elite 8 teams (just missing Kansas), with all four of our Final Four picks still fighting.  With four Elite 8 games remaining, each with two possible outcomes, there were 2^4 = 16 different ways that last Sat/Sun could have gone down.  Luck was not on our side: the one combination occurred in which none of our picks reached the Final Four.  There's always next year!