Client

An esports agency dedicated to advertising and promoting the gaming industry.

Background

Goose Gaming runs player rankings for Counter-Strike, Dota 2, World of Tanks, and Apex.

01/

Task

Create a single criterion for evaluating players.

We created an algorithm that determines the strongest players by the number of points earned, taking into account the complexity of each game.

02/

Step 01

Analyzed game statistics for three months

Step 02

Calculated the average duration of a battle in each game

Step 03

Determined the difficulty of each game

Step 04

Identified groups of players with non-standard statistics

Step 05

Wrote an algorithm that evaluates all players: the leaders and everyone else

Step 06

Integrated the algorithm with the Goose Gaming system

We calculated the number of enemy kills among the three most successful players, week by week.

We noticed that CS:GO players get the highest number of kills.

For Apex, data has only been collected since mid-October 2020.


How the algorithm works

We calculated the number of enemy kills among the top ten players, week by week.

CS:GO leads again, but the gap to Dota 2 is smaller, and the number of WoT kills is not significantly lower either; by the end, Dota 2 takes the lead.

Consequently, the very best CS:GO players account for a disproportionate number of kills.

Next, we looked at the average number of kills for the players with the lowest results: from eleventh place down.

As you can see, CS:GO players dominate among the lower-ranked players as well.

Looking at the total number of players each week, Dota 2 has the largest player base, while CS:GO, on the contrary, attracts the fewest players.

The result:

To predict the average kill count per game, we built a linear regression.

The weekly coefficients are calculated as the ratio of the model's predictions for each game.

The coefficients were adjusted as new data was collected.
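The approach above can be sketched as follows. This is a minimal illustration, not the project's actual model: the weekly kill counts are made up, and the choice of taking the game with the fewest predicted kills as the baseline (coefficient 1) is an assumption consistent with "the ratio of the model's predictions for each game".

```python
# Hypothetical sketch: derive per-game difficulty coefficients from a
# linear regression over weekly average kill counts. All numbers below
# are illustrative, not the project's real data.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    return a, mean_y - a * mean_x

def difficulty_coefficients(weekly_kills):
    """Forecast next week's average kills for each game, then take the
    ratio against the game with the fewest predicted kills, which gets
    coefficient 1 (assumed baseline)."""
    predictions = {}
    for game, kills in weekly_kills.items():
        xs = list(range(len(kills)))
        a, b = fit_line(xs, kills)
        predictions[game] = a * len(kills) + b  # next-week forecast
    baseline = min(predictions.values())
    return {game: baseline / p for game, p in predictions.items()}

weekly_kills = {  # average kills per top player, per week (made up)
    "CS:GO":  [48, 52, 55, 60],
    "Dota 2": [20, 22, 21, 24],
    "WoT":    [17, 18, 20, 19],
    "Apex":   [9, 10, 11, 10],
}
coeffs = difficulty_coefficients(weekly_kills)
```

With this construction the game where kills are hardest to get ends up with coefficient 1, and games with easier kills get proportionally smaller weights.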


At the time of delivery of the project, the coefficients were as follows:

Difficulty coefficient: 0.37

Difficulty coefficient: 0.18

Difficulty coefficient: 0.48

Difficulty coefficient: 1
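A sketch of how such coefficients could put kill counts from different games on a single scale. The game-to-coefficient mapping below is an assumption for illustration (the source does not state which coefficient belongs to which game); only the coefficient values themselves come from the case study.

```python
# Hypothetical use of the difficulty coefficients to compare players
# across games. The game-to-coefficient mapping is an assumption.
DIFFICULTY = {"CS:GO": 0.18, "Dota 2": 0.37, "WoT": 0.48, "Apex": 1.0}

def normalized_score(game, kills):
    """Raw kills weighted by the game's difficulty coefficient."""
    return kills * DIFFICULTY[game]

# A CS:GO player with many kills can score the same as an Apex player
# with far fewer, because kills come more easily in CS:GO.
cs_score = normalized_score("CS:GO", 500)
apex_score = normalized_score("Apex", 90)
```

Both players above end up with roughly the same normalized score, which is the point of the single evaluation criterion.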

03/

This number can be explained by the fact that kills come more easily in CS:GO and that the game had a larger pool of skilled players.

04/

development: 3 weeks

system debugging: 4 weeks

launch: 1 week

project duration: 2 months

05/

Problem 01

To train the algorithm, we needed to process a large array of data in a non-standard format.

Solution

We optimized this database and moved it into a separate service we could keep working with.

Problem 02

5% of the players in each game earned an order of magnitude more points than everyone else.

Solution

A separate evaluation algorithm was written for them.


In games there are always participants with results well above average. In Counter-Strike, the average number of enemy kills is 5 thousand; for professionals, this figure reaches 20-50 thousand.
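The split between regular players and these outliers could be sketched like this. The 10x-the-median cutoff and the scores are illustrative assumptions, not the project's actual rule.

```python
# Hypothetical split of players whose scores are an order of magnitude
# above the rest, so they can be rated by a separate algorithm.
# The cutoff (10x the median score) is an assumption for illustration.
from statistics import median

def split_outliers(points_by_player, factor=10):
    """Return (regular, outliers) based on a multiple of the median."""
    cutoff = factor * median(points_by_player.values())
    regular = {p: s for p, s in points_by_player.items() if s <= cutoff}
    outliers = {p: s for p, s in points_by_player.items() if s > cutoff}
    return regular, outliers

scores = {  # made-up point totals
    "p1": 4_800, "p2": 5_200, "p3": 3_900, "p4": 6_100, "p5": 5_500,
    "pro1": 61_000, "pro2": 87_000,
}
regular, outliers = split_outliers(scores)
```

The two professional-level players land in the `outliers` group and would be handed off to the separate evaluation algorithm.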

06/

Project Manager

Konstantin Kubrak

Developer

Rinat Mullakhmetov

Manager

Alexandra Shchetinina

Emotions from the project

07/

Project manager

Konstantin Kubrak

"The problem we were solving looked like a classic example from a machine learning textbook. If such a textbook existed."

Gamer

Alexander

"The algorithm solves the problem of cheating among the ranking participants, because it levels the playing field."

Goose Gaming still uses the system

We plan to add new games to the algorithm

08/

Contact us