Box Office Results: Leaderboard Updated as 'Heat' and 'White House Down' Almost $1 Million Off Estimates

I've updated the RopeofSilicon Box Office Challenge leaderboard after processing the results from this weekend's box office, where Monsters University dipped about 45% in its second weekend at #1 with $45.6 million.

The Heat and White House Down both came in $900,000 under their Sunday estimates, with The Heat bringing in $39.1 million and White House Down ending the weekend with a disappointing $24.8 million.

The other film readers were predicting was World War Z, which dropped 55% in its second weekend for $29.7 million.

In other news, Man of Steel crossed the $500 million mark worldwide this weekend -- $248.7 million domestically and $271.7 million internationally -- making it the biggest June release ever. With a reported budget of $225 million, this is good news for Warner Bros., as the film is only 17 days into its overall release.

As for that leaderboard, here is the current top ten. You can see the full list of rankings right here, and for the complete box office top ten and the points awarded for this weekend specifically, click here.

If anything, this weekend proved it's never too late to get in the game. This coming weekend you'll be offering up predictions for The Lone Ranger, Despicable Me 2 and a couple of other, yet-to-be-determined titles. If you didn't get in the game this weekend, be sure to watch for Laremy's Box Office Oracle article on Thursday and get your name on that leaderboard.

  1. Austin Rains (8 points)
  2. Charles Ritter (8 points)
  3. Rick (8 points)
  4. Andrew13 (5 points)
  5. dam94 (5 points)
  6. Driver (5 points)
  7. Duncan Houst (5 points)
  8. GregDinskisk (5 points)
  9. Jack Tyler (5 points)
  10. Ryan Maddux (5 points)
  • GregDinskisk

    I've gotta step it up... Gotta get them points!!!

  • TimmaeXVX

I was 0.2 off on Monsters and posted my predictions pretty early, and I'm still not in the top 5. This game is easily at least 90% luck right now.

I'd recommend a system like this instead:

    Within $5M of actuals: 1 point
    Within $3M: 2 points
    Within $1M: 3 points
    Within $0.5M: 4 points
    Exact: 5 points

That's not as good as Laremy's idea of a golf-inspired system that tracks exactly how close you are every week, but it would require no extra coding; you'd only have to award a lot more points.

The way it is right now it's basically a lottery, and the fact that it's almost impossible to score points will discourage people quickly.
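The fixed-threshold tiers above can be sketched in a few lines of Python (the function name is just for illustration; all figures are in $ millions):

```python
def threshold_points(prediction, actual):
    """Points under the fixed-threshold proposal (all figures in $M)."""
    error = abs(prediction - actual)
    if error == 0:
        return 5                      # exact prediction
    for limit, points in [(0.5, 4), (1, 3), (3, 2), (5, 1)]:
        if error <= limit:            # tightest tier the error fits in
            return points
    return 0                          # more than $5M off: no points

# A $39.5M call against The Heat's $39.1M actual is $0.4M off,
# landing in the "within $0.5M" tier.
```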

    • Brad Brevet

      "[I]t would require no extra coding."

      Well, it would, but I think you may be onto something. I'll take a look into it.

      • Chris Etrata

        I also wish the contest were based on estimated numbers instead of actuals.

        • Brad Brevet

          It's always going to be actuals; why would I base it on estimates?

    • Brad Brevet

      I've been messing around with some numbers and I think I may have come up with a solution because I think you have a fair point.

      The golf idea Laremy had wouldn't really work because it would essentially require everyone to play every week, and if they didn't there would have to be a penalty involved, which I don't really like. So it has to be a point system that goes up.

      Your idea, however, is a better way of looking at it, but it can't be that cut and dried. For example, last weekend only one person was within $5 million on Man of Steel, and The Bling Ring only made $2 million, so in those two cases a bunch of people would have received either no points or a ton of points.

      So... what I'm putting together is a point system where I take the average of all user predictions, find the difference between that number and the actual result, and anyone with a prediction closer than that difference gets at least a point. Here's an example of what I'm working on right now with this week's numbers for The Heat.

      The Heat made $39.1 million, and the average reader prediction was $36.855 million. The difference is $2.245 million, so any prediction that was off by $2.245 million or less would get a point. Taking that $2.245 number and dividing by five, the point breakdown for this example would look like this:

      • 6 Points for perfect prediction
      • 5 Points for prediction within $0.449 million
      • 4 Points for prediction within $0.898 million
      • 3 Points for prediction within $1.347 million
      • 2 Points for prediction within $1.796 million
      • 1 Point for prediction within $2.245 million

      With this example, 41 of the 136 people who predicted would have received at least one point.
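A minimal sketch of this scheme in Python (names are illustrative; all figures are in $ millions):

```python
import math

def average_based_points(predictions, actual):
    """Score each prediction: the margin is |actual - average prediction|,
    split into five tiers, with a sixth point for a perfect call."""
    avg = sum(predictions) / len(predictions)
    margin = abs(actual - avg)        # e.g. 39.1 - 36.855 = 2.245
    step = margin / 5                 # tier width, e.g. 0.449
    scores = []
    for p in predictions:
        error = abs(p - actual)
        if error == 0:
            scores.append(6)          # perfect prediction
        elif error <= margin:
            scores.append(6 - math.ceil(error / step))
        else:
            scores.append(0)          # farther off than the average was
    return scores
```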

      • maja

        I think this would work well -- more people would feel part of the game by scoring points (rather than feeling demotivated) and the scoring at the top would be closer. The only downside is that it would be harder for new people to get into the game later in the year, but I guess that's the bonus of being involved all year round. If this leaderboard is implemented, would it be applied retrospectively for the last couple of weeks?

        In my opinion it would be worth keeping two separate leaderboards, as mentioned below, until the end of the year on a trial basis to see how it pans out for next year.

  • Athar

    Is the leaderboard updated, Brad? My prediction for Monsters University was spot on, and it was also the first prediction on the board, but I haven't gotten any points this weekend.

    • Brad Brevet

      The actual was $45.6 million and your prediction was $46.5 million, $0.9 million off.

      • Athar

        Apologies for the error. It's just that your article lists the actual as $46.5 million, which caused the confusion.

        • Brad Brevet

          Argh, sorry about that. My fault in the article, but the results are accurate. I am working on a new way of scoring the game, as you can see from my comment above, trying to make it a little more even for everyone who plays and gets close predictions.

          • Athar

            I think changing the grading system might create some issues.
            1. After a couple of weeks, the chances of getting new users to predict would drop, since by then the points amassed by the current predictors would have increased.
            2. The emphasis of the changed system (which you have come up with) is more on being closer than the rest rather than on being accurate. So if I am not sure about the potential of, say, Despicable Me 2, all I have to do is take the aggregate of the Top Commentors and then add a few dollars up or down per my judgment. The chances of scoring points would be higher.

            I think TimmaeXVX's system would be better suited, but instead of actual numbers we should consider fixed percentages.
            E.g., for The Heat, those within 1% would get 5 points, those within 2% would get 4 points, and so on.
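Under assumed tier widths continuing the "and so on" (3%, 4%, 5% are my guesses), the percentage idea might look like:

```python
def percent_points(prediction, actual):
    """Points based on percentage error; tiers beyond 2% are assumptions."""
    pct_error = abs(prediction - actual) / actual * 100
    for limit, points in [(1, 5), (2, 4), (3, 3), (4, 2), (5, 1)]:
        if pct_error <= limit:
            return points
    return 0                          # more than 5% off: no points
```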

            • Brad Brevet

              I understand what you're saying about the possibility of someone gaming the system, which is actually why I implemented the "predict first and get the points" option so people couldn't copy others' work.

              I don't like percentages in this case because they make for a pretty wide margin in some cases. Perhaps the best option is to create a second leaderboard calculating prediction accuracy as a percentage and averaging that out over the course of the year. The higher the percentage, the higher you rank.

              At the very least, by the end of the year we'll have two charts to compare and see how close they are. If a change needs to be made for next year I can do that, if not we can keep both boards.
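That second leaderboard could be computed along these lines (a sketch with assumed names; I'm assuming accuracy floors at zero):

```python
def accuracy_pct(prediction, actual):
    """Weekly accuracy: 100% for a perfect call, floored at 0%."""
    return max(0.0, 100 * (1 - abs(prediction - actual) / actual))

def season_average(weekly_accuracies):
    """Rank players by their average accuracy across the year."""
    return sum(weekly_accuracies) / len(weekly_accuracies)

# The average reader prediction for The Heat ($36.855M vs. a $39.1M
# actual) would score roughly 94% accuracy for the week.
```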

              • Athar

                I was a little sceptical about percentages too, because if the same were applied to The Bling Ring predictions, the numbers we would have to consider would be in thousands, which is a pain to predict; calculating is an altogether different issue.

                Maybe we can tweak the percentages per range: say, for $2-5 million predictions, users within 10% would get 5 points and those within 15% would get 4 points.

                I know this concept would be a little difficult to manage. For now, the two-leaderboard concept sounds nice.

              • Brad Brevet

                It's interesting, because I noticed that when Box Office Mojo used to run a prediction contest they worked on points at one point and then switched to accuracy percentages for the same reasons Timmae mentioned above.

  • topyxyz

    Estimates give false hope. Oh well, better luck next time.