Close/Cross position win rates (updated for WCS)

A few weeks ago, I wrote a post comparing win rates at cross and close positions on 4-player maps. Here’s what it looks like with the WCS replays released by Blizzard yesterday.

(Edit: Fragbite also released their replays, so those are included as well)

Records are wins-losses from the first-listed race’s perspective.

Frost

|                 | PvT            | PvZ            | TvZ            |
|-----------------|----------------|----------------|----------------|
| Close Positions | 47-39 (54.65%) | 49-64 (43.36%) | 39-49 (44.32%) |
| Cross Positions | 23-20 (53.49%) | 36-25 (59.02%) | 15-12 (55.56%) |
| Total           | 71-59 (54.62%) | 88-92 (48.89%) | 55-64 (46.22%) |

Alterzim Stronghold

|                 | PvT            | PvZ            | TvZ            |
|-----------------|----------------|----------------|----------------|
| Close Positions | 22-17 (56.41%) | 29-24 (54.72%) | 12-8 (60.00%)  |
| Cross Positions | 4-9 (30.77%)   | 16-10 (61.54%) | 5-5 (50.00%)   |
| Total           | 26-26 (50.00%) | 45-34 (56.96%) | 17-13 (56.67%) |

Methods and Discussion

The replays are mostly from tournament replay packs, with the vast majority coming from WCS. Of course, any conclusions should be tempered by what is still a relatively small sample size.
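To make “small sample size” concrete, a standard Wilson score interval puts error bars on any of the records above. Here’s a minimal sketch (my own illustration, not a statistic the site computes):

```python
import math

def wilson_interval(wins, losses, z=1.96):
    """95% Wilson score interval for a win rate from a win-loss record."""
    n = wins + losses
    p = wins / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    halfwidth = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - halfwidth, center + halfwidth

# Frost PvZ at close positions: 49-64 (43.36%)
lo, hi = wilson_interval(49, 64)
print(f"95% CI: {lo:.1%} to {hi:.1%}")  # roughly 35% to 53%
```

Even the biggest cell in the tables above carries an interval on the order of ±9 percentage points, which is worth keeping in mind for every claim below.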

Note that close positions are twice as likely as cross positions: on a 4-player map there are six possible spawn pairings, and four of them are close (only the two diagonals are cross). I’m not sure that fact is noted as often as it should be.
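The 2:1 ratio is easy to verify by counting, modeling the four spawns as corners of a square (which matches both maps here):

```python
from itertools import combinations

# Model the four spawn locations as corners of a square.
spawns = [(0, 0), (0, 1), (1, 0), (1, 1)]

def is_cross(a, b):
    """Cross positions differ in both coordinates, i.e., diagonal spawns."""
    return a[0] != b[0] and a[1] != b[1]

pairs = list(combinations(spawns, 2))
cross = sum(is_cross(a, b) for a, b in pairs)
print(f"{len(pairs) - cross} close vs {cross} cross")  # 4 close vs 2 cross
```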

A confounding factor here is map bans. I’m not exactly sure what the effect is, but I’m sure it’s relevant: since players can veto maps they dislike, this is not a uniformly random sample of potential games.

Frost appears to be far more favorable for Zerg at close positions; notably, both PvZ and TvZ cross over the 50-50 mark depending on the positions. It’s not immediately clear to me whether that advantage comes early or late in the game, but the game lengths are visible on that page as well if you want to look into that.
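One way to probe the early-versus-late question is to split the win rate by game length. This sketch is hypothetical (the record format and the 12-minute cutoff are my own choices, not Spawning Tool’s data model):

```python
from collections import Counter

# Hypothetical records: (zerg_won, game_length_in_seconds) for close-position games.
games = [(True, 540), (False, 1210), (True, 980), (False, 450)]  # ... and so on

def win_rate_by_length(games, cutoff=720):
    """Compare Zerg's win rate in games shorter vs. longer than `cutoff` seconds."""
    wins, totals = Counter(), Counter()
    for zerg_won, seconds in games:
        bucket = "early" if seconds < cutoff else "late"
        totals[bucket] += 1
        wins[bucket] += zerg_won
    return {bucket: wins[bucket] / totals[bucket] for bucket in totals}

print(win_rate_by_length(games))  # {'early': 0.5, 'late': 0.5} on this toy data
```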

Alterzim is less clear, though that PvT cross-position record is quite dramatic. It’s a super-small sample size, so take it for whatever it does or doesn’t confirm about common knowledge.

Let me know if there are any other analyses you would like to see in the future, whether new ones or ones I have previously done. The data just got a lot richer and more relevant, so hopefully there’s good stuff in there to discover.

Update after a year of extracting build orders from replays

A year ago, I launched Spawning Tool, and it’s grown tremendously in that time. It started out as an experiment in using Blizzard’s replay format to grab build orders. Since then, it has become a site for organizing and labeling replays, not only to steal build orders but also to analyze replays in bulk.
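The core of that extraction is simpler than it might sound. Here’s a minimal sketch using sc2reader (the parsing library Spawning Tool builds on, mentioned in the thanks below); this is my own illustration rather than Spawning Tool’s actual code, so treat the attribute names as assumptions to double-check against the sc2reader docs:

```python
import sc2reader
from sc2reader.events.tracker import UnitBornEvent

# load_level=4 does a full parse, which includes the tracker events used below.
replay = sc2reader.load_replay("example.SC2Replay", load_level=4)

for event in replay.tracker_events:
    # A UnitBornEvent fires when a unit appears fully built; buildings show up
    # via UnitInitEvent/UnitDoneEvent instead. Skip neutral, unowned units.
    if isinstance(event, UnitBornEvent) and event.unit_controller:
        minutes, seconds = divmod(event.second, 60)
        print(f"{minutes:02d}:{seconds:02d} {event.unit_controller.name}: {event.unit.name}")
```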

So where are we now? By the numbers, Spawning Tool has:

  1. 9,051 replays uploaded
  2. 132,364 replay tags
  3. 23,092 lines of code

I like to hope that Spawning Tool has contributed meaningfully to our understanding and analysis of StarCraft. Highlights are:

  1. Comparing win rates by supply difference (part 1, part 2, reddit)
  2. Putting PvT Blink Stalkers in perspective (blog)
  3. Finding close/cross position win rates (blog, reddit)

I ended up going on hiatus for quite a while around the beginning of this year, but I have cranked out a few changes recently to highlight as well:

  1. Tags are now directly searchable so you can understand the hierarchy and dig down into specific builds and players
  2. Added spawn positions for players to mark cross and close positions
  3. Started using machine learning to suggest build order tags for replays (see the sketch after this list)
  4. Added an easy accept/reject option for rapidly labeling build orders
  5. Drag-and-drop file upload
  6. and lots of other bug fixes, optimizations, and changes
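On item 3, I won’t claim this is the actual model (it isn’t; consider it a hypothetical illustration with made-up builds and tag names), but build order suggestion can be approximated as text classification over a replay’s early build steps, e.g. with scikit-learn:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical training data: the first several build steps of labeled replays,
# flattened into space-separated strings, paired with their build order tags.
builds = [
    "Pylon Gateway Assimilator Assimilator CyberneticsCore Stalker TwilightCouncil BlinkTech",
    "Pylon Gateway Nexus Assimilator CyberneticsCore",
    "Pylon Gateway Assimilator CyberneticsCore Stargate Oracle",
]
tags = ["Blink Stalkers", "1 Gate Expand", "Stargate Opening"]

# Bag-of-units features plus naive Bayes: simple, fast, and easy to retrain.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(builds, tags)

new_build = "Pylon Gateway Assimilator CyberneticsCore Stalker TwilightCouncil"
print(model.predict([new_build])[0])  # suggests a tag for a human to accept/reject
```

The point of keeping the model simple is that its suggestions only need to be good enough for the accept/reject flow in item 4, where a human makes the final call.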

Of course, I have an ask for all of you as well:

  1. Label a few build orders just by accepting or rejecting suggested builds. The archive of replays is only as good as its searchability, and build orders still require human expert knowledge
  2. Fill out a survey about your experience with Spawning Tool. I would love to know where to take the site from here

Thanks to everyone in the community for their support. Specifically, I would like to mention GraylinKim (creator of the sc2reader library), dsjoerg (creator of ggtracker), and ChanmanV (host of so many shows) for all of their help in getting Spawning Tool this far. I look forward to seeing what else we can do in the next year!