# Coding Challenge #105: Polynomial Regression with TensorFlow.js

In this challenge, I expand the linear example into polynomial regression!

Video editing by Mathieu Blanchette.



### 42 Comments on “Coding Challenge #105: Polynomial Regression with TensorFlow.js”

1. Brett L says:

I was able to do your variable degree exercise by using an array of tensor scalars, but I couldn't figure out how to do it with the weights stored in a tensor1d, which seems like the natural way to store weights. Anyone get it working with the weights in that format?
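Setting the tf.js autograd question aside for a moment, the underlying idea of keeping every coefficient of an arbitrary-degree polynomial together in one flat structure can be sketched in plain JavaScript with hand-rolled gradient descent. This is an illustrative sketch (all names invented here), not the video's code:

```javascript
// Fit a polynomial of arbitrary degree by gradient descent,
// with all weights stored together in one flat array.
function fitPolynomial(xs, ys, degree, lr = 0.1, steps = 2000) {
  const w = new Array(degree + 1).fill(0); // w[j] is the coefficient of x^j
  const n = xs.length;
  for (let s = 0; s < steps; s++) {
    // Gradient of mean squared error with respect to each weight
    const grad = new Array(degree + 1).fill(0);
    for (let i = 0; i < n; i++) {
      let pred = 0;
      for (let j = 0; j <= degree; j++) pred += w[j] * xs[i] ** j;
      const err = pred - ys[i];
      for (let j = 0; j <= degree; j++) grad[j] += (2 / n) * err * xs[i] ** j;
    }
    for (let j = 0; j <= degree; j++) w[j] -= lr * grad[j];
  }
  return w; // coefficients, lowest power first
}
```

With points drawn from y = 2x + 1 and degree 1, this converges to roughly [1, 2]; bumping `degree` changes nothing else in the code, which is the appeal of keeping the weights in one array.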

2. Franco TheHating says:

Actually, right now you can pass a plain number straight to the .pow() function.

3. Thanks for the wonderful video. Where can I get the code for this program?

4. 丘杰 says:

this is the first time I understand what tensorflow is for, thanks

5. james wolf says:

Love the @roguenasa t-shirt, The Coding Train! Love your work and dedication to education too!

6. Pierre Ardouin says:

The "fancy high degree" polynomial you're looking for is the Lagrange interpolation polynomial. Its degree is not that high (number of points − 1). And yes, its use is often inappropriate. I think the choice of degree depends on the context:
"-What is the degree of the theoretical function that usually describes what you're studying?
-Oh, it's the trajectory of a falling object, so the degree should be 2.
-Here we go"
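For reference, the interpolation mentioned above fits in a few lines of plain JavaScript. A minimal sketch (not production numerical code):

```javascript
// Evaluate the Lagrange interpolation polynomial through the given
// points at position x. With n points, the degree is at most n - 1.
function lagrange(points, x) {
  let sum = 0;
  for (let i = 0; i < points.length; i++) {
    // Basis polynomial for point i: 1 at x_i, 0 at every other x_j
    let term = points[i].y;
    for (let j = 0; j < points.length; j++) {
      if (j !== i) term *= (x - points[j].x) / (points[i].x - points[j].x);
    }
    sum += term;
  }
  return sum;
}
```

Through (0, 0), (1, 1), (2, 4), three points on y = x², it reproduces the parabola exactly, e.g. evaluating at x = 3 gives 9.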

7. Holobrine says:

Or you could do a Bézier curve

8. Chris James says:

This project was very fun to follow along with 🙂

9. Dadosza says:

Dan, thanks for your effort. I have a question: how can I get the loss without building the model?

10. Rupin Chheda says:

One of the insightful things I realised (which may not be such a big deal) is that the "a" term will end up being zero if there are only two points on the canvas. If more than two points are present (close to a line) but still don't define a line accurately, the "a" term is expected to be very, very small.

11. tensor flow says:

great

12. GoddersGaming says:

One of my favourite videos, thank you Dan

13. Chris Townsend says:

I do believe that I have an answer to your challenge of choosing a degree for the polynomial and then finding the regression. I'm not very good with coding, but I hope I can relay the ideas enough that a coding solution could be found. First, all polynomials should be generated with the binomial theorem: (N choose J) * (ax + b)^(N − J). "N choose J" is the number of combinations of choosing J items out of N items, or: N! / [(N − J)! * J!]. In this example, N is the degree of the polynomial and you iterate over J from 0 to N. Then there needs to be the same number of polynomials generated as N. Place each polynomial into a row of an N x N matrix. From there, sum the coefficients for the respective terms and use these to draw the polynomial. Training would require the a and b terms to be tweaked for each polynomial. My only fear with this is that, when dealing with an even root, the leading coefficient can't be negative. Maybe a negative needs to be hard-coded somewhere?
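As an aside, the "N choose J" count in the comment above can be computed without evaluating huge factorials. A small plain-JavaScript sketch:

```javascript
// Number of ways to choose j items out of n: n! / ((n - j)! * j!).
// Computed multiplicatively to avoid large intermediate factorials.
function choose(n, j) {
  let result = 1;
  for (let k = 1; k <= j; k++) result = result * (n - j + k) / k;
  return Math.round(result); // round away floating-point dust
}
```

For example, choose(5, 2) is 10 and choose(6, 3) is 20.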

14. elinaxyz86 says:

Great job! It's awesome! You can make a real chatbot with AI, something like LUIS.

15. Dhakshithraam S says:

Next, you can make INFINITE parts of this challenge, like:
1. Linear Regression
2. Quadratic Regression
3. Cubic Regression
4. Quartic Regression
5. Quintic Regression, etc.

16. Phil Boswell says:

I wonder whether it might be helpful, with an eye to generalising to higher degree polynomials, to think not about

(a×x^3 + b×x^2 + c×x + d)

so much as

(((a × x + b) × x + c) × x + d)

so you can set up an array of coefficients and simply iterate through them applying .mul() and .add()
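That iteration over an array of coefficients is exactly Horner's method; a plain-JavaScript sketch of it (the same loop maps directly onto chained .mul() and .add() calls in tf.js):

```javascript
// Horner's method: evaluate a polynomial from an array of
// coefficients given highest power first, e.g. [a, b, c, d]
// for a*x^3 + b*x^2 + c*x + d.
function horner(coeffs, x) {
  let result = 0;
  for (const c of coeffs) result = result * x + c; // the .mul()/.add() step
  return result;
}
```

For example, horner([1, 2, 3, 4], 2) evaluates x³ + 2x² + 3x + 4 at x = 2 and gives 26.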

17. fNktn says:

Very nice video, as always. I just wanted to mention that, for instance, in physics linear regression is usually enough, because most of the time you get a function f ~ x^n, and then you can plot your data over x^n instead of x, which should give you a linear distribution (if the formula and data are correct) 😀

18. Rafael says:

If we have some data point coordinates, could this algorithm draw the graph and finally return the resulting function to us? Nice video!!!!!

19. Dawson Harvey says:

Later in the series will you be covering subjects like generative adversarial networks?

20. 10300 says:

Could you do a dwitter challenge?

21. Charles Clauss says:

You can approach this using linear algebra instead so that the degree can be dynamic like you want it. Least squares is a method that uses a matrix to find the coefficients of your polynomial. It is exactly the method you need to use to be able to avoid overfitting as well (any set of n points can be fit by a polynomial of degree n-1, but it may not follow the real trend of your data because it is trying too hard to precisely fit the data given) , because the dimensions of the matrix are exactly equal to the degree of the polynomial that you attempt to use, and can be increased and decreased by hand or using error analysis.

The formula that you use is inverse(A(transpose)*A)*A(transpose)*vector_of_y_values, where the matrix A is the Vandermonde matrix, in which each row is the successive powers of each x value. This gives you the vector of your coefficients for that particular degree of polynomial.

https://en.wikipedia.org/wiki/Vandermonde_matrix

So instead of having to hard code adding new variables, you can dynamically change the size (drop down menu?) of a 2-D array for the matrix and a 1-D array for the vector of y-values used (this couldn't be larger than one less than the number of points that you've currently drawn) and then compute your coefficients that way. (Does js use arrays? I don't code in js)

P.S. I love that you nerded out doing polynomial regression even though most other people would say "Ew, math"
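(And yes, JavaScript does have arrays.) The normal-equations formula described above can be sketched in plain JavaScript; this is a minimal, unoptimized illustration with naive Gaussian elimination, not robust numerical code:

```javascript
// Least-squares polynomial fit via the normal equations:
// coefficients = inverse(AᵀA) * Aᵀ * y, where A is the Vandermonde matrix.
function polyfit(xs, ys, degree) {
  const n = degree + 1;
  // Vandermonde matrix: row i is [1, x_i, x_i^2, ..., x_i^degree]
  const A = xs.map(x => Array.from({ length: n }, (_, j) => x ** j));
  // Accumulate AᵀA (n x n) and Aᵀy (length n)
  const AtA = Array.from({ length: n }, () => new Array(n).fill(0));
  const Aty = new Array(n).fill(0);
  for (let i = 0; i < xs.length; i++) {
    for (let r = 0; r < n; r++) {
      Aty[r] += A[i][r] * ys[i];
      for (let c = 0; c < n; c++) AtA[r][c] += A[i][r] * A[i][c];
    }
  }
  return gaussSolve(AtA, Aty); // coefficients, lowest power first
}

// Solve M v = b by Gauss-Jordan elimination with partial pivoting.
function gaussSolve(M, b) {
  const n = b.length;
  const a = M.map((row, i) => [...row, b[i]]); // augmented matrix
  for (let col = 0; col < n; col++) {
    let pivot = col;
    for (let r = col + 1; r < n; r++)
      if (Math.abs(a[r][col]) > Math.abs(a[pivot][col])) pivot = r;
    [a[col], a[pivot]] = [a[pivot], a[col]];
    for (let r = 0; r < n; r++) {
      if (r === col) continue;
      const f = a[r][col] / a[col][col];
      for (let c = col; c <= n; c++) a[r][c] -= f * a[col][c];
    }
  }
  return a.map((row, i) => row[n] / row[i][i]);
}
```

For example, polyfit([0, 1, 2, 3], [1, 1, 3, 7], 2) recovers approximately [1, -1, 1], i.e. y = x² − x + 1, and the degree is just a parameter, as the comment suggests.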

22. Mike Lezhnin says:

Right after watching this video I was like, "Pffff, I can do better than that. I will make it approximate the given points with a conic." For those of you who aren't math nerds: conics are a generalization of ellipses, parabolas and hyperbolas. I copied the code and made some tweaks to it. All of a sudden I realized I had no idea how to draw a conic (of course I can draw it pixelwise, but that is clearly too slow). I found that Bézier curves can easily help with drawing a parabola, but nothing about the hyperbola… Guess I'm stuck now.

23. fgbhrl says:

It would be nice to print the R² value as well; perhaps you could also use that to choose the order of the equation? E.g., if R² < 0.9, increase the order by 1. It would also be interesting to plot R² vs. order.
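Computing R² from the predictions is a small addition; a plain-JavaScript sketch (names are illustrative):

```javascript
// Coefficient of determination: 1 - SS_res / SS_tot, where SS_res is
// the residual sum of squares and SS_tot is the total variance of the
// data around its mean.
function rSquared(yActual, yPredicted) {
  const mean = yActual.reduce((a, b) => a + b, 0) / yActual.length;
  let ssRes = 0, ssTot = 0;
  for (let i = 0; i < yActual.length; i++) {
    ssRes += (yActual[i] - yPredicted[i]) ** 2;
    ssTot += (yActual[i] - mean) ** 2;
  }
  return 1 - ssRes / ssTot;
}
```

A perfect fit gives 1, and predicting the mean everywhere gives 0, which makes it a reasonable stopping criterion for increasing the order.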

24. Joan Vila says:

Would you try to program a mobius strip??

25. Srini R says:

OMG!! Great teacher. This is where the math is really applied… loving to learn. Thanks a ton.

26. Kim says:

Hm, y=mx+d? In Austria it's y=kx+d

27. Log Out says:

Can you use tensor in processing? I would like to see a coding challenge with tensor in the processing environment. Keep up the good work!

28. Naej says:

This is freaking beautiful

29. Loïc Bertrand says:

You're my favourite youtuber ! 🚂🦄✨

30. Dario Benitez says:

Man… People like you make the world a better place to live. "Saludos desde Argentina" (greetings from Argentina)

31. Wolf Rage says:

U r the best teacher in the world.

Huge fan😊

32. Muhammed Shameel says:

Now you have to do logistic regression too, even though it's classification and not regression.

33. coderhonnybord15 says:

34. kustomweb says:

Dan, I think the next natural step is to redo the Nature of Code with TF, with the rules of motion no longer dictated by the applyForce function, but rather using those rules to train the AI to position the boids. The initial algorithm becomes the training wheels, and the AI takes over.

35. kustomweb says:

It's starting to look like magic. The black box aspect of this most impressive demonstration is worrisome. It works, but you don't know how. It can tell you the difference between a cat and a dog, but not how it figured it out. This is where ethics become so important, because if I'm rejected for a loan by an AI, I'm going to want a reason, not just a result.

36. Lev Mizgirev says:

Dan, thanks for this amazing tutorial. I was actually thinking to implement it myself and you noticed my comment asking about it in the chat. Cool, at least now I know how to do it.

37. Ken Haley says:

First of all, I absolutely love this series on tensorflow.js! Great job! I am learning a lot — I doubt if I would ever have delved into this topic without your hands-on demos.

Anyway, having worked with programs that evaluate polynomials in the past, I thought I'd make a suggestion for efficiency and (IMHO) elegance:
Notice that:
ax^2 + bx + c can be written as (ax + b)x + c, saving one multiply operation.
and..
ax^3 + bx^2 + cx + d can be written as ((ax + b)x + c)x + d, saving 3 multiply operations (if you count cubing as 2 multiplies).
The higher the degree, the more multiply operations you save.

But even better, look at how easy it is to add a degree to the polynomial. Just multiply the previous expression by x and add the next coefficient. And now there's no need to call the square function, etc.
So, your predict code for one degree would be
const ys = a.mul(xs).add(b) (I'm using a instead of m here, to demo the symmetry that follows)
For 3rd degree (cubic) it's
const ys = a.mul(xs).add(b).mul(xs).add(c).mul(xs).add(d)
Notice that there are no nested parentheses, making it easy to understand, and for each degree you just append a multiply and an add operation.

38. YellowNovaCrew says:

Man, you guys put a ton of work into cleaning up the stream, looks awesome! Fruit train pulling into the station. 🚂 🍇

39. GABRIELFILMSTUDIOS says:

Dan, if you're looking for a good resource for algebra, there is 3blue1brown's series.

40. No Name says: