In this challenge, I expand the linear example into polynomial regression!

💻Challenge:

Links discussed in this challenge:

🔗 TensorFlow.js:

🎥 Linear Regression:

🚂Website:

💡Github:

💖Membership:

🛒Store:

📚Books:

🖋️Twitter:

Video editing by Mathieu Blanchette.

🎥 TensorFlow.js:

🎥 Intelligence and Learning:

🎥Coding Challenges:

🎥Intro to Programming using p5.js:…


I was able to do your variable degree exercise by using an array of tensor scalars, but I couldn't figure out how to do it with the weights stored in a tensor1d, which seems like the natural way to store weights. Anyone get it working with the weights in that format?

Actually, right now you can pass a plain number to the .pow() function.

Thanks for the wonderful video. Where can I get the code for this program?

this is the first time I understand what tensorflow is for, thanks

Love the @roguenasa t-shirt, The Coding Train, and love your work and dedication to education too!

The "fancy high degree" polynomial you're looking for is the Lagrange interpolation polynomial. Its degree is not that high (number of points - 1). And yes, its use is often inappropriate. I think the choice of degree depends on the context.

"-What is the degree of the theoretical function that usually describes what you're studying?

-Oh, it's the trajectory of a falling object so degree should be 2

-Here we go"
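For reference, the Lagrange interpolation mentioned above is easy to sketch in plain JavaScript. This is a generic illustrative sketch, not the video's TensorFlow.js code, and it assumes all x values are distinct:

```javascript
// Lagrange interpolation: returns a function that passes exactly through
// the given points, using a polynomial of degree (points.length - 1).
function lagrange(points) {
  return function (x) {
    let sum = 0;
    for (let i = 0; i < points.length; i++) {
      let term = points[i].y;
      for (let j = 0; j < points.length; j++) {
        if (j !== i) {
          term *= (x - points[j].x) / (points[i].x - points[j].x);
        }
      }
      sum += term;
    }
    return sum;
  };
}

// Three points sampled from y = x^2 reproduce the parabola exactly.
const f = lagrange([{ x: 0, y: 0 }, { x: 1, y: 1 }, { x: 2, y: 4 }]);
console.log(f(3)); // 9
```

As the comment above notes, this fits the points exactly, which is precisely why it can be inappropriate for noisy data: it interpolates, it doesn't regress.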

Or you could do a Bézier curve

this project was very fun to follow along 🙂

Dan, thanks for your effort. I have a question: how can I get the loss without building a model?

One of the insightful things I realised (which may not be such a big deal) is that the "a" term will end up being zero if there are only two points on the canvas. If more than two points are present (close to a line) but they still don't define a line accurately, the "a" term is expected to be very, very small.

great

One of my favourite videos, thank you Dan

I do believe that I have an answer to your challenge of choosing a degree for the polynomial and then finding the regression. I'm not very good with coding, but I hope I can relay the ideas well enough that a coding solution could be found. First, all polynomials should be generated with the binomial theorem: (N choose J) * (ax + b)^(N - J). "N choose J" is the number of combinations of choosing J items out of N items, or: N! / [(N - J)! * J!]. In this example, N is the degree of the polynomial and you iterate over J from 0 to N. Then there need to be as many polynomials generated as N. Place each polynomial into a row of an N x N matrix. From there, sum the coefficients for the respective terms and use these to draw the polynomial. Training would require the a and b terms to be tweaked for each polynomial. My only fear with this: when dealing with an even root, the leading coefficient can't be negative. Maybe a negative needs to be hard-coded somewhere?

Great job! It's awesome! You could make a real chatbot with AI, something like LUIS.

Thanks for the tutorial! https://www.indmind.ga/polynomial-regression

Next, you can make INFINITE parts of this challenge, like:

1. Linear Regression

2. Quadratic Regression

3. Cubic Regression

4. Quartic Regression

5. Quintic Regression, etc.

I wonder whether it might be helpful, with an eye to generalising to higher degree polynomials, to think not about

(a×x^3 + b×x^2 + c×x + d)

so much as

(((a × x + b) × x + c) × x + d)

so you can set up an array of coefficients and simply iterate through them applying .mul() and .add()
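That nested form is Horner's method, and the coefficient-array loop is a few lines. A minimal plain-number sketch (illustrative only; in the TensorFlow.js version each `*` and `+` would become `.mul()` and `.add()` on tensors):

```javascript
// Horner's method: evaluate a polynomial from an array of coefficients,
// highest degree first. With TensorFlow.js you'd run the same loop as
// ys = ys.mul(xs).add(coeffs[i]) on tensors instead of plain numbers.
function predict(coeffs, x) {
  let y = 0;
  for (const c of coeffs) {
    y = y * x + c;
  }
  return y;
}

// 2x^3 + 0x^2 - 3x + 1 at x = 2: 16 - 6 + 1 = 11
console.log(predict([2, 0, -3, 1], 2)); // 11
```

Adding a degree then means nothing more than pushing one extra coefficient onto the array.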

Very nice video, as always. I just wanted to mention that, for instance in physics, linear regression is usually enough, because most of the time you have a function f ~ x^n, and then you can plot your data over x^n instead of x, which should give you a linear distribution (if the formula and data are correct) 😀
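The linearization trick described above, as a generic sketch (not code from the video): if you suspect y ~ x^2, transform each x to x^2 and fit a straight line with ordinary least squares:

```javascript
// Simple least-squares line fit: returns slope m and intercept b for y = m*t + b.
function fitLine(ts, ys) {
  const n = ts.length;
  const meanT = ts.reduce((a, v) => a + v, 0) / n;
  const meanY = ys.reduce((a, v) => a + v, 0) / n;
  let num = 0, den = 0;
  for (let i = 0; i < n; i++) {
    num += (ts[i] - meanT) * (ys[i] - meanY);
    den += (ts[i] - meanT) ** 2;
  }
  const m = num / den;
  return { m, b: meanY - m * meanT };
}

// Data sampled from y = 3x^2: transform x -> x^2, then the fit is linear.
const xs = [1, 2, 3, 4];
const ys = [3, 12, 27, 48];
const { m, b } = fitLine(xs.map((x) => x * x), ys);
console.log(m, b); // ~3, ~0
```

The recovered slope is the physical constant (here 3), which is exactly why physicists like plotting against x^n.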

So if we have some data point coordinates, this algorithm can draw the graph and finally return the resulting function to us? Nice video!!!!!

Later in the series will you be covering subjects like generative adversarial networks?

Could you do a Dwitter challenge?

You can approach this using linear algebra instead, so that the degree can be dynamic like you want. Least squares is a method that uses a matrix to find the coefficients of your polynomial. It is exactly the method you need to avoid overfitting as well (any set of n points can be fit by a polynomial of degree n-1, but it may not follow the real trend of your data because it is trying too hard to precisely fit the data given), because the dimensions of the matrix are exactly equal to the degree of the polynomial that you attempt to use, and can be increased and decreased by hand or using error analysis.

The formula that you use is inverse(A(transpose)*A)*A(transpose)*vector_of_y_values, where the matrix A is the Vandermonde matrix, in which each row is the successive powers of one x value. This gives you the vector of your coefficients for that particular degree of polynomial.

https://en.wikipedia.org/wiki/Vandermonde_matrix

So instead of having to hard-code adding new variables, you can dynamically change the size (drop-down menu?) of a 2-D array for the matrix and a 1-D array for the vector of y-values used (this couldn't be larger than one less than the number of points that you've currently drawn) and then compute your coefficients that way. (Does JS use arrays? I don't code in JS.)

Then you'd just use your coefficients in a loop with your TensorFlow adds and pow operations.
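(To answer the aside: yes, JS has arrays.) Here's a rough plain-JavaScript sketch of that pipeline: build the Vandermonde matrix, form the normal equations (A^T A) c = A^T y, and solve by Gaussian elimination. Illustrative only; for numerical stability a QR or SVD solver from a library would be preferable:

```javascript
// Fit a degree-d polynomial to points by least squares via the normal equations.
// Returns coefficients [c0, c1, ..., cd] for c0 + c1*x + ... + cd*x^d.
function polyfit(xs, ys, degree) {
  const n = degree + 1;
  // Vandermonde matrix: each row is [1, x, x^2, ..., x^d] for one x value.
  const A = xs.map((x) => Array.from({ length: n }, (_, j) => x ** j));
  // Normal equations: M = A^T * A (n x n), v = A^T * y (length n).
  const M = Array.from({ length: n }, (_, i) =>
    Array.from({ length: n }, (_, j) =>
      A.reduce((s, row) => s + row[i] * row[j], 0)
    )
  );
  const v = Array.from({ length: n }, (_, i) =>
    A.reduce((s, row, k) => s + row[i] * ys[k], 0)
  );
  // Gauss-Jordan elimination with partial pivoting on [M | v].
  for (let col = 0; col < n; col++) {
    let piv = col;
    for (let r = col + 1; r < n; r++) {
      if (Math.abs(M[r][col]) > Math.abs(M[piv][col])) piv = r;
    }
    [M[col], M[piv]] = [M[piv], M[col]];
    [v[col], v[piv]] = [v[piv], v[col]];
    for (let r = 0; r < n; r++) {
      if (r === col) continue;
      const factor = M[r][col] / M[col][col];
      for (let j = col; j < n; j++) M[r][j] -= factor * M[col][j];
      v[r] -= factor * v[col];
    }
  }
  return v.map((val, i) => val / M[i][i]);
}

// Points sampled from y = 1 - 2x + x^2 recover those coefficients.
console.log(polyfit([0, 1, 2, 3], [1, 0, 1, 4], 2)); // ≈ [1, -2, 1]
```

Because the solve is direct (no gradient descent), changing the degree really is just changing one number, which is the dynamic behavior the comment is after.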

P.S. I love that you nerded out doing polynomial regression even though most other people would say "Ew, math"

Right after watching this video I was like, "Pffff, I can do better than that. I will make it approximate the given points with a conic." For those of you who aren't math nerds: conics are a generalization of ellipses, parabolas and hyperbolas. I copied the code and made some tweaks. All of a sudden I realized I have no idea how to draw a conic (of course I can draw it pixel by pixel, but that is clearly too slow). I found that Bézier curves can easily help with drawing a parabola, but nothing about the hyperbola… Guess I'm stuck now.


It would be nice to print the R² value as well; perhaps you could also use that to choose the order of the equation? E.g., if R² < 0.9, increase the order by 1. It would also be interesting to plot R² vs. order.
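Computing R² itself is only a few lines: 1 - SS_res / SS_tot. A plain JavaScript sketch (function name is illustrative) that could be evaluated after each training step:

```javascript
// Coefficient of determination R^2 = 1 - SS_res / SS_tot.
// ys are observed values, preds are the model's predictions at the same xs.
function rSquared(ys, preds) {
  const mean = ys.reduce((a, v) => a + v, 0) / ys.length;
  let ssRes = 0, ssTot = 0;
  for (let i = 0; i < ys.length; i++) {
    ssRes += (ys[i] - preds[i]) ** 2;  // residual sum of squares
    ssTot += (ys[i] - mean) ** 2;      // total sum of squares
  }
  return 1 - ssRes / ssTot;
}

console.log(rSquared([1, 2, 3], [1, 2, 3])); // 1 (perfect fit)
```

One could then bump the degree while R² stays below a threshold, though that heuristic will happily overfit if left unchecked.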

Would you try to program a Möbius strip?

OMG!! Great teacher. This is where the math is really applied… loving to learn. Thanks a ton.

Hm, y=mx+d? In Austria it's y=kx+d

Can you use tensors in Processing? I would like to see a coding challenge with tensors in the Processing environment. Keep up the good work!

This is freaking beautiful

You're my favourite youtuber ! 🚂🦄✨

Man… people like you make the world a better place to live. "Greetings from Argentina!"

You are the best teacher in the world.

Huge fan😊

Now you have to do logistic regression too, even though it's classification and not regression.

Love your beard

Dan, I think the next natural step is to redo the Nature of Code with TF, with the rules of motion no longer dictated by the applyForce function, rather using the rules to train the AI to position the boids. The initial algorithm becomes the training wheels, and the AI takes over.

It's starting to look like magic. The black box aspect of this most impressive demonstration is worrisome. It works, but you don't know how. It can tell you the difference between a cat and a dog, but not how it figured it out. This is where ethics become so important, because if I'm rejected for a loan by an AI, I'm going to want a reason, not just a result.

Dan, thanks for this amazing tutorial. I was actually thinking to implement it myself and you noticed my comment asking about it in the chat. Cool, at least now I know how to do it.

First of all, I absolutely love this series on tensorflow.js! Great job! I am learning a lot — I doubt if I would ever have delved into this topic without your hands-on demos.

Anyway, having worked with programs that evaluate polynomials in the past, I thought I'd make a suggestion for efficiency and (IMHO) elegance:

Notice that:

ax^2 + bx + c can be written as (ax + b)x + c, saving one multiply operation.

and..

ax^3 + bx^2 + cx + d can be written as ((ax + b)x + c)x + d, saving 3 multiply operations (if you count cubing as 2 multiplies).

The higher the degree, the more multiply operations you save.

But even better, look at how easy it is to add a degree to the polynomial. Just multiply the previous expression by x and add the next coefficient. And now there's no need to call the square function, etc.

So, your predict code for one degree would be

const ys = a.mul(xs).add(b)

(I'm using a instead of m here, to demo the symmetry that follows.)

For 2nd degree (quadratic) it's

const ys = a.mul(xs).add(b).mul(xs).add(c)

For 3rd degree (cubic) it's

const ys = a.mul(xs).add(b).mul(xs).add(c).mul(xs).add(d)

Notice that there's only one level of parentheses, making it easy to understand, and for each degree you just append a multiply and an add operation.

Whaddya think?

Man, you guys put a ton of work into cleaning up the stream, looks awesome! Fruit train pulling into the station. 🚂 🍇

Dan, if you're looking for a good resource for algebra, there is 3blue1brown's series.

I learn more here than in my actual coding class.