
Logistic Regression Details Pt 2: Maximum Likelihood

This video follows from where we left off in Part 1 of this series on the details of Logistic Regression. This time we're going to talk about how the squiggly line is optimized to best fit the data using maximum likelihood.
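As a rough illustration of the idea (not the exact example from the video), here's a minimal Python sketch with made-up data and hypothetical variable names: it fits the intercept and slope of a logistic regression by minimizing the negative log-likelihood with scipy.optimize.minimize.

import numpy as np
from scipy.optimize import minimize

# Toy data (made up for illustration): x = a predictor, y = 0/1 outcome
x = np.array([0.5, 1.2, 1.9, 2.3, 2.9, 3.4, 4.0, 4.8])
y = np.array([0,   0,   0,   1,   0,   1,   1,   1  ])

def neg_log_likelihood(params):
    """Negative log-likelihood of the logistic model p = 1 / (1 + exp(-(b0 + b1*x)))."""
    b0, b1 = params
    log_odds = b0 + b1 * x                # on the log-odds scale the model is a straight line
    p = 1.0 / (1.0 + np.exp(-log_odds))   # squashed back to probabilities (the squiggly line)
    # log-likelihood: log(p) for the y=1 observations plus log(1-p) for the y=0 observations
    ll = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return -ll                            # minimizing the negative = maximizing the likelihood

# Find the intercept and slope that maximize the likelihood
result = minimize(neg_log_likelihood, x0=[0.0, 0.0])
b0_hat, b1_hat = result.x
print(f"intercept = {b0_hat:.3f}, slope = {b1_hat:.3f}")

With real data you'd usually reach for something like scikit-learn's LogisticRegression, which performs the same likelihood maximization under the hood (with regularization turned on by default).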

NOTE: This StatQuest assumes that you are already familiar with Part 1 in this series, Logistic Regression Details Pt1: Coefficients:


⭐ NOTE: When I code, I use Kite, a free AI-powered coding assistant that will help you code faster and smarter. The Kite plugin integrates with all the top editors and IDEs to give you smart completions and documentation while you’re typing. I love it!

For a complete index of all the StatQuest videos, check out:


If you'd like to support StatQuest, please consider...
Patreon:
...or...
YouTube Membership:

...a cool StatQuest t-shirt or sweatshirt (USA/Europe):
(everywhere):


...buying one or two of my songs (or go large and get a whole album!)


...or just donating to StatQuest!


Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:


#statquest #logistic #MLE

