home.social

#numericalmethods — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #numericalmethods, aggregated by home.social.

  1. Alright, future engineers!
    **Euler's Method:** Approximates solutions to differential equations by taking small linear steps.
    Formula: `y_{n+1} = y_n + h * f(x_n, y_n)`
    Pro-Tip: Smaller step size (h) improves accuracy but increases computational cost!

    #NumericalMethods #ODEs #STEM #StudyNotes
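The update rule above can be sketched in a few lines of Python. This is a minimal illustration, not library-grade code; the function name `euler` and the test problem `y' = y` (exact solution `e^x`) are my own choices.

```python
def euler(f, x0, y0, h, n):
    """Advance y' = f(x, y) from (x0, y0) by n steps of size h."""
    x, y = x0, y0
    for _ in range(n):
        y = y + h * f(x, y)   # one linear step along the tangent
        x = x + h
    return y

# y' = y with y(0) = 1: exact solution is e^x, so y(1) should be close to e
approx = euler(lambda x, y: y, 0.0, 1.0, h=0.001, n=1000)
```

With `h = 0.001` the result lands within about 0.002 of `e`, illustrating the pro-tip: halving `h` (and doubling `n`) roughly halves the error, at twice the cost.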

  2. Alright, future engineers!
    **Newton-Raphson:** Iteratively finds function roots (where f(x)=0).
    Formula: `x_{n+1} = x_n - f(x_n) / f'(x_n)`
    Pro-Tip: A poor initial guess can lead to divergence or finding the wrong root!
    #NumericalMethods #RootFinding #STEM #StudyNotes
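A minimal Python sketch of the iteration, using `sqrt(2)` (the positive root of `x^2 - 2 = 0`) as an example; the helper name `newton` and the tolerance defaults are assumptions of this sketch.

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Iterate x <- x - f(x)/f'(x) until |f(x)| < tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / fprime(x)
    raise RuntimeError("did not converge; try a better initial guess")

# sqrt(2) as the positive root of x^2 - 2 = 0, starting from x0 = 1
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

Starting from `x0 = -1.0` instead would converge to the *negative* root, which is the divergence/wrong-root caveat in practice.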

  3. **Truncation Error:** Inaccuracy from approximating infinite math processes (e.g., series) with finite steps.
    Ex: Using `x` for `sin(x)` near 0.
    Pro-Tip: It's an error in the *method*, not computer precision.
    #NumericalMethods #Error #STEM #StudyNotes
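The `sin(x) ≈ x` example can be checked numerically: the error left over is the truncated tail of the Taylor series, whose leading term is `x^3/6` near 0. Variable names here are mine.

```python
import math

x = 0.1
truncation_error = abs(math.sin(x) - x)   # error of the method itself
leading_term = x ** 3 / 6                 # first omitted Taylor term
```

The measured error agrees with `x^3/6` to within the next omitted term (`x^5/120`), showing it is a property of the approximation, not of floating-point precision.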

  4. Alright, future engineers!
    **Bisection Method:** Finds f(x)=0 roots by repeatedly halving intervals where sign changes.
    Ex: If `f(a)f(b)<0`, root's in `[a,b]`. `x_new = (a+b)/2`.
    Pro-Tip: Guaranteed convergence if root is bracketed, but can be slow!
    #NumericalMethods #RootFinding #STEM #StudyNotes
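The halving loop above translates almost directly into code. A minimal sketch (function name `bisect` and the test equation `cos(x) = x` are my choices):

```python
import math

def bisect(f, a, b, tol=1e-10):
    """Halve [a, b] until it is shorter than tol; requires f(a)*f(b) < 0."""
    if f(a) * f(b) >= 0:
        raise ValueError("root is not bracketed")
    while b - a > tol:
        mid = (a + b) / 2
        if f(a) * f(mid) <= 0:
            b = mid          # sign change in left half
        else:
            a = mid          # sign change in right half
    return (a + b) / 2

# cos(x) = x has a single root near 0.739 inside [0, 1]
root = bisect(lambda x: math.cos(x) - x, 0.0, 1.0)
```

Reaching `tol = 1e-10` from an interval of width 1 takes about 34 halvings, which is the "guaranteed but slow" trade-off in action.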

  5. Alright, future engineers!

    **Newton-Raphson** finds roots of `f(x)=0` by iteratively refining guesses.
    Ex: `x_new = x_old - f(x_old)/f'(x_old)`.
    Pro-Tip: A good initial guess speeds up convergence & prevents divergence!

    #NumericalMethods #RootFinding #STEM #StudyNotes

  6. Alright, future engineers!
    **Truncation Error** occurs when an exact mathematical procedure is replaced by an approximation, often by cutting off an infinite series.
    Ex: Using only the first few terms of a Taylor series for `e^x`.
    Pro-Tip: This error is *predictable* and *controllable* by refining your approximation!
    #NumericalMethods #ErrorAnalysis #STEM #StudyNotes

  7. Alright, future engineers!
    **Newton-Raphson** is an iterative method to find function roots (where f(x)=0).
    Formula: `x_{n+1} = x_n - f(x_n)/f'(x_n)`
    Pro-Tip: Your initial guess `x_0` matters! Pick one close to the root for faster convergence.
    #NumericalMethods #RootFinding #STEM #StudyNotes

  8. Alright, future engineers!
    **Trapezoidal Rule:** Approximates `∫f(x)dx` by summing areas of trapezoids under the curve.
    Ex: For one segment, `∫f(x)dx ≈ (b-a)/2 * (f(a) + f(b))`.
    Pro-Tip: Use more segments (smaller `h`) for better accuracy!
    #NumericalMethods #Integration #STEM #StudyNotes
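Summing the single-segment formula over many segments gives the composite rule. A minimal Python sketch (the name `trapezoid` and the test integral are assumptions of this example):

```python
def trapezoid(f, a, b, n):
    """Composite trapezoidal rule over [a, b] with n equal segments."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))      # endpoints weighted 1/2
    for i in range(1, n):
        total += f(a + i * h)        # interior points shared by two trapezoids
    return h * total

# Integral of x^2 over [0, 1] is exactly 1/3
approx = trapezoid(lambda x: x * x, 0.0, 1.0, n=1000)
```

The composite rule's error shrinks like `h^2`, so the pro-tip holds quantitatively: ten times more segments buys roughly a hundred times less error.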

  9. Alright, future engineers!
    **Relative Error:** Quantifies how much an approximation deviates, *relative to the true value*. Ex: `RE = |(Approx - True) / True|`. Pro-Tip: Essential for evaluating precision and setting engineering tolerances!
    #NumericalMethods #ErrorAnalysis #STEM #StudyNotes
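The formula is one line of code; note it is undefined when the true value is zero. The function name here is my own.

```python
import math

def relative_error(approx, true):
    """|approx - true| / |true|; undefined when true == 0."""
    return abs((approx - true) / true)

# Using 3.14 for pi is off by about 0.05%
re = relative_error(3.14, math.pi)
```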

  10. Alright, future engineers!
    The **Newton-Raphson Method** iteratively finds roots (where f(x)=0) using tangent lines.
    Ex: `x_{n+1} = x_n - f(x_n) / f'(x_n)`.
    Pro-Tip: A good initial guess `x_0` is crucial for quick convergence!
    #NumericalMethods #RootFinding #STEM #StudyNotes

  11. Alright, future engineers!

    **Euler's Method:** Approximates solutions to ODEs by taking small linear steps. Ex: `y_new = y_old + h * f(x_old, y_old)`. Pro-Tip: Accuracy depends heavily on step size `h`. Smaller `h` is better for precision!
    #ODEs #NumericalMethods #STEM #StudyNotes

  12. Alright, future engineers!

    **Truncation Error** is the inaccuracy from ending an infinite math process (like a series or integral) at a finite step. Ex: `e^x` approximated by `1+x`. Pro-Tip: You can *control* it by adjusting step size or terms, unlike round-off error!

    #NumericalMethods #ErrorAnalysis #STEM #StudyNotes

  13. Alright, future engineers!
    The **Bisection Method** finds roots by repeatedly halving an interval where `f(x)` changes sign. Ex: If `f(a)f(b) < 0`, a root is in `[a,b]`. Pro-Tip: Convergence is guaranteed whenever the initial bracket contains a sign change, though it can be slow!
    #NumericalMethods #RootFinding #STEM #StudyNotes

  14. Alright, future engineers!

    **Newton-Raphson:** Finds roots for f(x)=0 using tangent lines. Ex: `x_new = x_old - f(x_old)/f'(x_old)`. Pro-Tip: Needs `f'(x)`, but converges rapidly with a good initial guess!

    #NumericalMethods #RootFinding #STEM #StudyNotes

  15. Released my DIA-format Gauss-Seidel smoother plugin for OpenFOAM v13. MIT licensed.

    Replaces the default LDU smoother on structured hex meshes — DIA stores diagonal bands contiguously, reducing pointer indirection and DRAM pressure. Expecting 10–20% wall-clock gains and better cache utilisation based on standalone profiling. Full OpenFOAM benchmarks incoming.

    github.com/amartyadav/DIAGauss

  16. Alright, future engineers!
    **Fixed-Point Iteration** finds a root by transforming `f(x)=0` into `x=g(x)` & iterating `x_new = g(x_old)`.
    Ex: For `x^2 - x - 1 = 0`, try `x = sqrt(x+1)`.
    Pro-Tip: Choosing the right `g(x)` is CRUCIAL for fast convergence!

    #NumericalMethods #RootFinding #STEM #StudyNotes
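The example in the post converges to the golden ratio. A minimal Python sketch (names and tolerances are my choices):

```python
import math

def fixed_point(g, x0, tol=1e-12, max_iter=200):
    """Iterate x <- g(x) until successive iterates agree within tol."""
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("did not converge; try a different g(x)")

# x^2 - x - 1 = 0 rewritten as x = sqrt(x + 1): converges to the golden ratio
root = fixed_point(lambda x: math.sqrt(x + 1), x0=1.0)
```

The "crucial" part: this `g` works because `|g'(x)| < 1` near the root; the alternative rearrangement `x = x^2 - 1` has `|g'| > 1` there and diverges from the same starting point.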

  17. Alright, future engineers!

    **Newton-Raphson** iteratively refines guesses to find function roots (zeros). Formula: `x_new = x_old - f(x_old)/f'(x_old)`. Pro-Tip: A *good* initial guess is crucial for quick, stable convergence!

    #NumericalMethods #RootFinding #STEM #StudyNotes

  18. **Truncation Error** is the error from approximating an infinite mathematical process with a finite one. Ex: Using a finite Taylor series sum. Pro-Tip: It typically decreases with smaller step sizes (h) or more terms (N)!
    #NumericalMethods #ErrorAnalysis #STEM #StudyNotes

  19. Alright, future engineers!

    **Newton-Raphson:** Iteratively finds roots using tangent lines. Formula: `x_{n+1} = x_n - f(x_n)/f'(x_n)`. Pro-Tip: A good initial guess is KEY to fast convergence!

    #NumericalMethods #RootFinding #STEM #StudyNotes

  20. Alright, future engineers!

    **Condition Number:** Measures how sensitive a system's solution is to small changes in the input data. Ex: A high condition number for `Ax=b` means a small error in `b` can cause a large error in `x`. Pro-Tip: A high condition number means your problem is ill-conditioned, so solutions might be unreliable!

    #NumericalMethods #LinearAlgebra #STEM #StudyNotes
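The `Ax=b` sensitivity can be seen with a tiny hand-rolled example. The 2x2 matrix below (nearly parallel rows) is a hypothetical illustration, solved by Cramer's rule:

```python
def solve2(a11, a12, a21, a22, b1, b2):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Rows are nearly parallel, so the system is ill-conditioned
x1, y1 = solve2(1.0, 1.0, 1.0, 1.0001, 2.0, 2.0)      # b = (2, 2)
x2, y2 = solve2(1.0, 1.0, 1.0, 1.0001, 2.0, 2.0001)   # b perturbed by 1e-4
```

Perturbing `b` by `1e-4` moves the solution from `(2, 0)` to `(1, 1)`: a relative input change of about `5e-5` produces an order-one change in the output, exactly the amplification a large condition number warns about.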

  21. Alright, future engineers!

    **Newton-Raphson** iteratively finds roots of f(x). Formula: `x_{n+1} = x_n - f(x_n)/f'(x_n)`. Pro-Tip: A good initial guess is *key* for fast convergence!

    #NumericalMethods #RootFinding #STEM #StudyNotes

22. Condition Number measures how sensitive a problem's output is to tiny input changes. A large condition number means small input errors can be amplified into huge output errors! Pro-Tip: It's key for judging solution reliability; high values mean your problem is ill-conditioned and results may be unreliable!
    #NumericalMethods #ErrorAnalysis #STEM #StudyNotes

23. Euler's Method approximates ODE solutions by stepping along tangent lines. Formula: `y_{n+1} = y_n + h * f(x_n, y_n)`. Pro-Tip: A smaller step size `h` means more accuracy but more computation. Balance wisely!
    #NumericalMethods #ODEs #STEM #StudyNotes

24. Newton-Raphson finds equation roots iteratively. Formula: `x_{n+1} = x_n - f(x_n)/f'(x_n)`. Pro-Tip: A good initial guess (`x_0`) is crucial for convergence! It can diverge if `x_0` is poor.
    #NumericalMethods #RootFinding #STEM #StudyNotes

25. Numerical Integration approximates definite integrals by summing areas of simple geometric shapes. Ex: Trapezoidal Rule uses trapezoids! For a single trapezoid of width `h`, Area = `(h/2)(y1 + y2)`. Pro-Tip: More subintervals (n) usually yield better accuracy, but increase computation!

    #NumericalMethods #CalcApprox #STEM #StudyNotes

  26. Newton-Raphson iteratively finds roots of f(x)=0 using tangent lines. Formula: `x_new = x_old - f(x_old) / f'(x_old)`. Pro-Tip: Your initial guess is critical; a bad one can lead to divergence!

    #NumericalMethods #RootFinding #STEM #StudyNotes

27. Newton-Raphson iterates to find equation roots by repeatedly following the tangent line to where it crosses the x-axis.
    x_new = x_old - f(x_old)/f'(x_old)
    Pro-Tip: Initial guess is crucial! A bad start can lead to divergence or wrong roots.

    #NumericalMethods #RootFinding #STEM #StudyNotes

28. Whenever I walk to or from home, I have to walk up or down an inclined street; I noticed that the asphalt surface has different curvatures depending on how near it is to a bend, and I try to find a less steep incline while walking.

    This gave me the inspiration for the questions below. Simple explanations and related links are welcome.

    Given a #differentiable surface in R^3 and two distinct points on it, there are infinitely many differentiable paths from one point to the other that remain on the surface. At each point of a #path, one can find the path's local #curvature. Then:

    - Find a path that minimizes the supremum of the curvature. In other words, find the "flattest" path.

    - Find a path that minimizes the variation of the curvature. In other words, find the path that "most resembles" a circular arc.

    Are these tasks always possible within the given conditions? Are any stronger conditions needed? Are there cases with an #analytic solution, or are they solvable only with numerical approximations?

    #Analysis #DifferentialGeometry #Calculus #DifferentialEquations #NumericalMethods
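One way to state the two tasks formally (the notation is mine, not the poster's): writing `Γ(p,q)` for the set of admissible differentiable curves on the surface from `p` to `q`, and `κ_γ` for the curvature along `γ`, the problems are

```latex
\min_{\gamma \in \Gamma(p,q)} \; \sup_{t \in [0,1]} \kappa_\gamma(t)
\qquad \text{and} \qquad
\min_{\gamma \in \Gamma(p,q)} \; \operatorname{Var}\bigl(\kappa_\gamma\bigr)
```

As in the direct method of the calculus of variations, existence of minimizers typically hinges on compactness of `Γ(p,q)` in a suitable topology and lower semicontinuity of the objective, which is where the poster's "stronger conditions" question bites.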

33. Mastodon: TuxRiders is a journey through research experiences with free scientific computing software, aiming to demonstrate its power for real-world scientific research.

    On our channel, we regularly cover a range of scientific computing tools and topics.

    Check out our YT channel: youtube.com/TuxRiders