Implemented Logistic Regression from scratch using Gradient Descent and compared it with sklearn's `LogisticRegression`.
- Sigmoid activation for probability estimation
- Gradient descent optimization for parameter updates
- Decision boundary visualization comparing the manual and sklearn implementations
- NumPy-based implementation with step-by-step optimization
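The core of the from-scratch approach can be sketched as follows. This is a minimal illustration, not the notebook's actual code: the helper names (`sigmoid`, `fit_logistic`), the learning rate, and the toy data are assumptions for the example.

```python
import numpy as np

def sigmoid(z):
    # Maps any real value to a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, n_iters=2000):
    """Batch gradient descent on the log-loss.

    X: (n_samples, n_features), y: (n_samples,) with labels in {0, 1}.
    Hypothetical helper for illustration; not the notebook's API.
    """
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(n_iters):
        p = sigmoid(X @ w + b)               # predicted probabilities
        grad_w = X.T @ (p - y) / n_samples   # gradient of log-loss w.r.t. weights
        grad_b = np.mean(p - y)              # gradient w.r.t. bias
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy 1-D separable data: class 1 when x > 0 (illustrative only)
X = np.array([[-2.0], [-1.0], [-0.5], [0.5], [1.0], [2.0]])
y = np.array([0, 0, 0, 1, 1, 1])
w, b = fit_logistic(X, y)
preds = (sigmoid(X @ w + b) >= 0.5).astype(int)
print(preds)  # → [0 0 0 1 1 1]
```

On a linearly separable toy set like this, the learned boundary (`sigmoid(X @ w + b) = 0.5`) should closely match what sklearn's `LogisticRegression` finds, which is what the notebook visualizes.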
Both models successfully converge, and their decision boundaries closely match!

Check out the full notebook for implementation details!