Date posted: 19-Jan-2016
Transfer Market Optimizer
by Colton Freund and Zachary Krepps
Soccer
Each team has 11 players
Attacker
Midfielder
Defender
Goalkeeper
Two 45-minute halves
Field size
Length is between 100 yards and 130 yards
Width is between 50 yards and 100 yards
Player Transfers
Transfer windows for England
Pre-Season: July 1 – September 1
Mid-Season: January 1 – January 31
Recent transfer prices from English Premier League
Raheem Sterling, Midfielder, from Liverpool to Manchester City, €69.4M
Josh Vickers, Goalkeeper, from Swansea to Arsenal, Free
Transfer values are not player wages
Players can be transferred between leagues
How do they work?
Option 1
Club A has a player
Club B wants said player
Club A and Club B work out a price to let said player out of his contract
Club B and the player work out a wage
The player is transferred
Option 2
The player’s contract for Club A expires
Club B works out a wage for said player
The player is transferred
Formal Problem Statement
Formally, we are looking to predict player transfer fees using a backpropagation algorithm, a topological sort, and an averaging algorithm, and to compare the algorithms' efficiency and accuracy at correctly predicting player transfer fees for future seasons. The constraints to be factored into these algorithms will be:
Inputs
Age
Position
# of yellow and red cards
Appearances
Minutes
Tackles
Interceptions per game
Fouls
Offsides
Clearances
Dribbles
Own goals
Goals
Assists
Shots per game
Key passes
# of times fouled
Average passes per game
Passing %
Crosses
Backpropagation
Create Neural Network
Set Random Weights
Create test set
While the error is too large:
Run inputs through the network
Sum inputs × weights
Pass the sum through a sigmoid function
Compute the error
Use the error to re-adjust the weights
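The loop above can be sketched as a minimal, runnable backpropagation example. This is not the authors' actual implementation: it assumes a single hidden layer, a fixed learning rate, and squared error, and all function names and hyperparameters here are illustrative.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(samples, n_hidden=4, lr=0.5, epochs=2000, tol=1e-3):
    """Train a one-hidden-layer network on (inputs, target) pairs."""
    random.seed(0)
    n_in = len(samples[0][0])
    # Set random weights
    w_h = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
    w_o = [random.uniform(-1, 1) for _ in range(n_hidden)]
    for _ in range(epochs):                      # while the error is too large
        total_err = 0.0
        for x, target in samples:
            # Forward pass: sum inputs × weights, output through sigmoid
            h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_h]
            out = sigmoid(sum(w * hi for w, hi in zip(w_o, h)))
            # Compute the error
            err = target - out
            total_err += err * err
            # Use the error to re-adjust the weights
            d_out = err * out * (1 - out)
            d_h = [d_out * w_o[j] * h[j] * (1 - h[j]) for j in range(n_hidden)]
            for j in range(n_hidden):
                w_o[j] += lr * d_out * h[j]
                for i in range(n_in):
                    w_h[j][i] += lr * d_h[j] * x[i]
        if total_err < tol:
            break
    return w_h, w_o

def predict(w_h, w_o, x):
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_h]
    return sigmoid(sum(w * hi for w, hi in zip(w_o, h)))
```

Trained on a toy dataset such as logical OR (with a constant 1 appended to each input as a bias term), the outputs move toward their targets as the total error shrinks.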
Backpropagation Implementation
O(n^2) with respect to the number of neurons
28 (inputs) -> 30 (hidden) -> 60 (hidden) -> 40 (hidden) -> 1 (output, between 0 and 1)
Neuron objects stored in list
Neuron objects responsible for
Taking inputs
Keeping track of all its weights
Computing its error
Computing its output
Updating its weight
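The per-neuron responsibilities listed above might look like the following small class. This is a sketch, not the authors' code: the method names and the sigmoid activation are assumptions.

```python
import math
import random

class Neuron:
    def __init__(self, n_inputs):
        # Keeps track of all its weights
        self.weights = [random.uniform(-1, 1) for _ in range(n_inputs)]
        self.inputs = []
        self.output = 0.0
        self.error = 0.0

    def compute_output(self, inputs):
        # Takes inputs, sums inputs × weights, squashes through sigmoid
        self.inputs = inputs
        total = sum(w * x for w, x in zip(self.weights, inputs))
        self.output = 1.0 / (1.0 + math.exp(-total))
        return self.output

    def compute_error(self, downstream_delta):
        # Error = sigmoid'(net) × error signal arriving from the layer above
        self.error = self.output * (1.0 - self.output) * downstream_delta
        return self.error

    def update_weights(self, learning_rate):
        # Re-adjusts each weight in proportion to its input and the error
        for i, x in enumerate(self.inputs):
            self.weights[i] += learning_rate * self.error * x
```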
Backpropagation Results
Network has output between 0 and 1
Output = (value)/(max transfer value for season)
Value = (Output)*(max transfer value for season)
Currently training on all players transferred in 14/15 transfer window
Once network is trained:
Run stats for players transferred in 15/16 to predict value
Compare to their actual transfer value
Determine accuracy of this trained network
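The scaling between the network's (0, 1) output and a transfer value, as given by the two formulas above, can be written as two small helpers (the function names are ours):

```python
def normalize_fee(value, max_fee):
    # Output = value / (max transfer value for the season)
    return value / max_fee

def denormalize_output(output, max_fee):
    # Value = output * (max transfer value for the season)
    return output * max_fee
```

Because the sigmoid output lies in (0, 1), dividing by the season's maximum fee keeps the training targets in the same range, and multiplying back recovers a fee in the original units.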
Topological Sort
Split the data of players up into four different groups
Keeper
Defender
Midfielder
Attacker
Take a specific player and compare each statistic with the comparable statistic of the players in the data.
Find the player with the closest statistical match across the 26 parameters and insert the player into the list.
Use the surrounding players to match a transfer fee with the specific player
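The matching step above can be sketched as follows, assuming squared Euclidean distance over the statistic vectors and a fee proposed from the two closest neighbours; the distance metric, the neighbour count, and the `(stats, fee)` pair format are our assumptions.

```python
def closest_match_fee(target_stats, players):
    """Propose a fee for target_stats from a list of (stats, fee) pairs."""
    def distance(a, b):
        # Compare each statistic with the comparable statistic
        return sum((x - y) ** 2 for x, y in zip(a, b))
    # Rank players by closeness of statistical match
    ranked = sorted(players, key=lambda p: distance(target_stats, p[0]))
    # Use the surrounding (closest) players to propose a fee
    neighbours = ranked[:2]
    return sum(fee for _, fee in neighbours) / len(neighbours)
```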
Average Algorithm
Split the data of players up into four different groups
Keeper
Defender
Midfielder
Attacker
Take a specific player and compare each statistic with the comparable statistic of the players in the data.
For each statistic, find the closest player(s) and add their fee onto a stack.
Average all the fees together to obtain the specific player's transfer fee.
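The per-statistic averaging described above can be sketched as follows; `players` is assumed to be a list of `(stats, fee)` pairs, one closest player is taken per statistic, and ties are broken arbitrarily.

```python
def average_fee(target_stats, players):
    """Propose a fee by averaging the fee of the closest player per statistic."""
    fees = []  # the 'stack' of fees, one per statistic
    for i, stat in enumerate(target_stats):
        # For each statistic, find the player whose value is closest
        closest = min(players, key=lambda p: abs(p[0][i] - stat))
        fees.append(closest[1])
    # Average all the collected fees for the proposed transfer fee
    return sum(fees) / len(fees)
```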
Findings
The percent difference
Percent Difference = abs(actual fee – proposed fee) / mean of the two fees * 100
The Average Algorithm and Topological Sort both produce percent differences between 7% and 200%
The Average Algorithm is consistently more accurate than the Topological Sort
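Taking "mean" to be the mean of the actual and proposed fees (the standard percent-difference definition), the formula above can be sketched as:

```python
def percent_difference(actual_fee, proposed_fee):
    # |actual - proposed| / mean(actual, proposed) * 100
    mean = (actual_fee + proposed_fee) / 2.0
    return abs(actual_fee - proposed_fee) / mean * 100.0
```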
Questions
Why is there such a wide range of error?
More data
Outliers
Others that have tried fee prediction
What does the term Backpropagation refer to?
The re-adjusting of weights in a neural network based on the output/output error.
What is the complexity of the Neural Network? Why?
O(n^2) with respect to the number of neurons in the largest layer. For each neuron, it is O(n) to iterate through its layer's list. Once a neuron is selected within its layer, its output is computed by iterating through all of its inputs (the same as the number of outputs, i.e. neurons, from the previous layer), which is O(n), and multiplying them by their weights. Hence the complexity of O(n^2).
Questions?