Odds, Log Odds, and Logit: One Concept, Three Views

Published on January 4, 2026

Why you should care about odds

Probabilities are intuitive, but they are not always convenient for updates. Odds are often easier because evidence updates become simple multiplication.

Then log odds make the same updates additive, which is useful for combining multiple pieces of evidence.

The logit function is the mapping between probability and log odds. Same concept, three views.

1) Probability to odds

Let p be a probability between 0 and 1.

• odds = p divided by (1 minus p)

Examples:

• p = 0.50 gives odds = 1.0000

• p = 0.20 gives odds = 0.2500

• p = 0.80 gives odds = 4.0000

Odds are a ratio: odds of 4 mean the event is four times as likely to happen as not.
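
A minimal Python sketch of this conversion (the function name is mine, for illustration):

def prob_to_odds(p: float) -> float:
    # odds = p / (1 - p); only defined for 0 < p < 1.
    if not 0.0 < p < 1.0:
        raise ValueError("p must be strictly between 0 and 1")
    return p / (1.0 - p)

print(prob_to_odds(0.50))  # 1.0
print(prob_to_odds(0.20))  # 0.25
print(prob_to_odds(0.80))  # ~4.0 (floating point prints 4.000000000000001)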

2) Odds to probability

To go back:

• p = odds divided by (1 plus odds)

Example:

• odds = 2.0000 gives p = 2 divided by 3 = 0.6667
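
The inverse, in the same illustrative style; round-tripping a value is a quick sanity check:

def odds_to_prob(odds: float) -> float:
    # p = odds / (1 + odds); only defined for positive odds.
    if odds <= 0.0:
        raise ValueError("odds must be positive")
    return odds / (1.0 + odds)

print(odds_to_prob(2.0))          # 0.6666666666666666
print(odds_to_prob(0.80 / 0.20))  # ~0.8, round-tripping the p = 0.80 example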

3) What log odds are

Log odds are simply the natural log of odds:

• log odds = ln(odds)

Key intuition:

• odds multiplication becomes log odds addition.

If evidence multiplies odds by 1.5, then log odds increase by ln(1.5). If you combine two independent evidence updates, you add their log odds shifts.
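
A short check of that identity, assuming natural logs (Python's math.log):

import math

odds = 0.25                # from p = 0.20
log_odds = math.log(odds)  # about -1.3863

# Multiplying odds by 1.5 is the same as adding ln(1.5) to the log odds.
assert math.isclose(math.log(odds * 1.5), log_odds + math.log(1.5))

# Two independent updates: multiply both ratios, or add both log-odds shifts.
assert math.isclose(math.log(odds * 1.5 * 2.0),
                    log_odds + math.log(1.5) + math.log(2.0))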

4) What logit is

Logit is the log odds expressed directly from probability:

• logit(p) = ln(p divided by (1 minus p))

So logit(p) equals log odds. People often say "logit" when they mean the probability-to-log-odds transform.
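
In code, logit and its inverse might look like this (names are mine; the inverse is the standard logistic, or sigmoid, function):

import math

def logit(p: float) -> float:
    # Probability -> log odds.
    return math.log(p / (1.0 - p))

def inv_logit(x: float) -> float:
    # Log odds -> probability; this is the logistic (sigmoid) function.
    return 1.0 / (1.0 + math.exp(-x))

print(logit(0.5))             # 0.0
print(inv_logit(logit(0.4)))  # ~0.4, a round trip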

5) Likelihood ratio updates in each view

In the odds form of Bayes' theorem:

posterior odds = prior odds times likelihood ratio

In log odds form:

• log posterior odds = log prior odds plus ln(likelihood ratio)

This is why log odds are popular in modeling. They convert evidence into additive increments.
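
The same update written both ways; a sketch, assuming one likelihood ratio per piece of evidence:

import math

def update_odds(prior_odds: float, lr: float) -> float:
    # Odds view: posterior odds = prior odds * likelihood ratio.
    return prior_odds * lr

def update_log_odds(prior_log_odds: float, lr: float) -> float:
    # Log-odds view: posterior log odds = prior log odds + ln(LR).
    return prior_log_odds + math.log(lr)

# The two views agree up to floating point.
assert math.isclose(math.log(update_odds(0.6667, 2.0)),
                    update_log_odds(math.log(0.6667), 2.0))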

Worked example

Start with prior probability p = 0.40:

• prior odds = 0.40 divided by 0.60 = 0.6667

• prior log odds = ln(0.6667) = -0.4055

Now suppose evidence implies likelihood ratio = 2.0:

• posterior odds = 0.6667 times 2.0 = 1.3334

• posterior p = 1.3334 divided by 2.3334 = 0.5714

In log odds:

• posterior log odds = -0.4055 plus ln(2.0)

• ln(2.0) = 0.6931

• posterior log odds = 0.2876

• converting back gives p = 0.5714

Same result, different view.
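
Reproducing the worked example end to end (a sketch; variable names are mine):

import math

prior_p = 0.40
prior_odds = prior_p / (1.0 - prior_p)  # ~0.6667
lr = 2.0

# Odds route.
posterior_odds = prior_odds * lr                      # ~1.3333
posterior_p = posterior_odds / (1.0 + posterior_odds)

# Log-odds route.
posterior_log_odds = math.log(prior_odds) + math.log(lr)  # ~0.2877
posterior_p_again = 1.0 / (1.0 + math.exp(-posterior_log_odds))

assert math.isclose(posterior_p, posterior_p_again)
print(round(posterior_p, 4))  # 0.5714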

How this connects to prediction markets

Markets quote prices, but you think in probabilities. Conversions matter because small mistakes can create a false edge.

Key points:

• Convert market price to implied probability and confirm price scale.

• Do not confuse these odds with sportsbook "odds" formats. Here, odds simply means the ratio p divided by (1 minus p).

• If you model updates in log odds, convert back to probability before computing fair price and edge.
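
A sketch of that pipeline, with loudly hypothetical assumptions: the market quotes prices on a 0-to-1 scale (some venues quote in cents, 0 to 100), and fees are a flat per-contract amount; confirm both for your actual venue.

import math

def log_odds_to_price(log_odds: float) -> float:
    # Convert model log odds back to a probability / fair price.
    return 1.0 / (1.0 + math.exp(-log_odds))

market_price = 0.55      # ASSUMED 0-1 scale; confirm your venue's convention
model_log_odds = 0.2876  # from the worked example above
fee = 0.01               # HYPOTHETICAL flat fee, not any venue's real schedule

fair_price = log_odds_to_price(model_log_odds)  # ~0.5714
edge = fair_price - market_price - fee          # ~0.0114
print(round(edge, 4))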

Common mistakes

Mixing percent and probability: 60 percent is p = 0.60, not 60.

Using base-10 logs by accident: log odds and logit conventionally use natural logs. Consistency matters more than the base, but do not mix bases.

Extreme values: p close to 0 or 1 produces huge logit values, and logit(0) and logit(1) are undefined. That is why fake certainty is dangerous and often signals overconfidence; see the clamping sketch below.

Forgetting costs: even perfect conversions do not guarantee profit if you ignore fees and execution costs.
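
For the extreme-values pitfall above, a common defensive pattern is to clamp probabilities away from 0 and 1 before taking the logit. A sketch, with an arbitrary epsilon:

import math

def safe_logit(p: float, eps: float = 1e-9) -> float:
    # Clamp p into [eps, 1 - eps] so the logit stays finite.
    p = min(max(p, eps), 1.0 - eps)
    return math.log(p / (1.0 - p))

print(safe_logit(1.0))  # ~20.7 instead of a division-by-zero error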

Takeaway

Use probabilities to communicate, odds to update, and log odds or logit to combine evidence cleanly. Then convert back to probability to compute fair price, edge, and break-even decisions in prediction markets.
