Jun-01-2019, 04:58 PM
Hi there. I've been using Python in my calculus class for a few semesters now and have a decent draft collection of notebooks that I'm considering turning into a book, so I wanted to gauge interest in such an endeavor. The main idea is to integrate much more from the world of the approximate, and from statistics, into the traditional content. Here is what my current outline looks like; I'd appreciate any feedback or suggestions. I'd also like to find a way to make the neural network the culmination of the work, demonstrating how this contemporary idea relies on so much from Calc I under the hood.
Section I
- Introduction to Python -- representing functions as tables, graphs, formulas
- Introduction to Python II -- control flows and the Babylonian Algorithm
- The Area Problem -- approximating areas with simple geometries
- Summations I -- finite examples
- Summations II -- moving to the infinite
- Back to Areas -- Riemann Sums
- Probability I -- finite examples
- Probability II -- continuous examples
- Putting it Together -- Filters and Convolution
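To give a flavor of where Section I lands, here is a rough sketch of the kind of code the "Back to Areas" notebook builds toward (the function name and the test integrand are just placeholders, not finalized book code):

```python
def riemann_sum(f, a, b, n):
    """Approximate the area under f on [a, b] with n left-endpoint rectangles."""
    dx = (b - a) / n
    return sum(f(a + i * dx) * dx for i in range(n))

# Area under f(x) = x**2 on [0, 1]; the sum approaches 1/3 as n grows.
approx = riemann_sum(lambda x: x * x, 0.0, 1.0, 1000)
```

Students have already met summations by this point, so the one-line `sum(...)` reads as a direct transcription of the sigma notation.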
Section II
- Approximating Slope -- finite examples
- Invoking Infinity -- definition of derivative
- Linear Approximation
- Moving to 3D -- defining and plotting functions
- Partial Differentiation -- derivatives with respect to anything
- Optimization I -- introduction to finding maximum and minimum values
- Optimization II -- the case of Least Squares
- Finding Zeros -- Newton's Method
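Section II closes with Newton's Method, which ties linear approximation back to root finding. A minimal sketch of what that notebook could contain (names and tolerances are placeholders):

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Find a zero of f via Newton's method: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:  # stop once updates are negligible
            break
    return x

# Zero of f(x) = x**2 - 2 starting from x0 = 1.0, i.e. sqrt(2).
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```

It also sets up a nice callback later: Newton's method and gradient descent are both "follow the local linear picture" algorithms.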
Section III
- Populations I: Arithmetic and Geometric Growth
- Populations II: Logistic Growth
- Solving ODEs: Euler's Method
- Application: Gradient Descent
- Application: Logistic Regression
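Section III's thread is that Euler's Method lets students simulate the logistic model before they ever see its closed-form solution. A sketch of the sort of code I have in mind (parameter values are illustrative only):

```python
def euler(dydt, y0, t0, t1, n):
    """Integrate y' = dydt(t, y) from t0 to t1 with n forward-Euler steps."""
    dt = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += dt * dydt(t, y)  # step along the tangent line
        t += dt
    return y

# Logistic growth y' = r * y * (1 - y / K): the population levels off at K.
r, K = 1.0, 10.0
y_final = euler(lambda t, y: r * y * (1 - y / K), 0.5, 0.0, 20.0, 2000)
```

The same "take a small step in the indicated direction" loop then reappears almost verbatim as gradient descent, which is the point of placing them side by side.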
Section IV
- Introducing ANNs: The Perceptron
- Using Gradients: The Chain Rule
- Derivatives Again: Autodiff
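And here is the payoff I'm aiming for in Section IV: a single sigmoid neuron trained with nothing but the chain rule and gradient descent from earlier chapters. This is a toy sketch (the AND-gate data, learning rate, and epoch count are illustrative choices, not settled book code):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Train one sigmoid neuron on the AND gate.
# The gradient is the chain rule written out by hand:
# with L = (y - target)**2 / 2, dL/dz = (y - target) * y * (1 - y),
# and dz/dw1 = x1, dz/dw2 = x2, dz/db = 1.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = 0.0, 0.0, 0.0
lr = 1.0
for _ in range(10000):
    for (x1, x2), target in data:
        z = w1 * x1 + w2 * x2 + b
        y = sigmoid(z)
        delta = (y - target) * y * (1 - y)  # chain rule: dL/dz
        w1 -= lr * delta * x1               # gradient descent step
        w2 -= lr * delta * x2
        b -= lr * delta
```

Every line here is Calc I in disguise: the sigmoid's derivative, the chain rule in `delta`, and the same descent step from Section III. Autodiff then becomes "the computer doing this bookkeeping for us."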